GribStream

GribStream for AI tools

This page is the starting point for AI integrations with GribStream. It is intended for people and tools building workflows on top of GribStream with Codex, Claude Code, Gemini, Cursor, custom agents, or any other AI runtime that can read instructions and make HTTP requests.

The current foundation is a public, vendor-neutral skill file plus the OpenAPI spec and the regular docs. Later on, this page can also host or point to a GribStream MCP server.

Core resources

Authentication setup

Live GribStream data queries require an API token.

  1. Create or sign in to an account at /auth/login.
  2. Create an API token from /app/dashboard.
  3. Set it as the environment variable GRIBSTREAM_API_TOKEN.
  4. Start the AI tool from that same shell or configure the tool to expose that variable to the runtime.

Recommended shell setup:

export GRIBSTREAM_API_TOKEN='YOUR_TOKEN_HERE'

CLI tools such as Codex or Claude Code typically inherit environment variables from the parent shell that launched them. In practice that means you should export GRIBSTREAM_API_TOKEN first and then start the tool from that same terminal session. If the tool is already running, restart it after setting the variable.
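The inheritance rule above can be checked with a short guard before launching the tool. This is a minimal sketch; it only inspects the current shell, and the commented-out `codex` launcher at the end is just an illustrative example, not a required command:

```shell
#!/bin/sh
# Report whether the token is visible in the current shell before launching a tool.
if [ -z "${GRIBSTREAM_API_TOKEN:-}" ]; then
  echo "GRIBSTREAM_API_TOKEN is not set; export it before launching the tool." >&2
else
  echo "GRIBSTREAM_API_TOKEN is visible to this shell."
fi
# exec codex   # example launcher; any child process inherits the variable
```

Running this immediately before starting the AI tool confirms the child process will inherit the token.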

Prefer an environment variable, or a vendor secret store mapped to that variable. Avoid pasting tokens into prompts, checked-in files, or reusable scripts.

What the skill teaches

The public skill file is intentionally strict about the things AI tools tend to get wrong, such as hallucinating selectors instead of looking them up.

Recommended workflow for AI tools

  1. Read the OpenAPI spec first for the contract.
  2. Use the skill file as behavioral guidance on how to interpret natural-language asks and avoid selector hallucinations.
  3. Use model pages and inventories or the catalog endpoints when exact selectors are not already known.
  4. Use Quick-start and Expressions for request examples and expression syntax.
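Steps 1 and 2 above can be sketched as fetching both artifacts for local reference. The skill-file URL is the public one from this page; the OpenAPI spec path below is an assumption, not a documented URL — verify the real location in the docs:

```shell
# 1. Contract first: the OpenAPI spec.
#    NOTE: this path is an assumption -- check the docs for the actual location.
curl -fsSL https://gribstream.com/openapi.json -o openapi.json

# 2. Behavioral guidance: the shared skill file (public URL from this page).
curl -fsSL https://gribstream.com/skills/gribstream-query.md -o gribstream-query.md
```

With both files on disk, an AI tool can be pointed at them directly instead of re-fetching on every run.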

Raw file

The current public skill file can be fetched directly at:

https://gribstream.com/skills/gribstream-query.md
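For example, to fetch it with curl and keep a local copy for tools that read from disk:

```shell
# Download the public skill file; -f fails on HTTP errors, -L follows redirects.
curl -fsSL https://gribstream.com/skills/gribstream-query.md -o gribstream-query.md
```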

That file is meant to be portable across AI vendors. Tool-specific wrappers can be kept very thin and point back to that shared content.

What comes next

This page is intended to evolve. The next likely addition is an MCP server offering the same capabilities through a more structured interface. When that exists, it can live here alongside the current skill file and OpenAPI contract.