GribStream for AI tools

This page is the starting point for AI integrations with GribStream. It is intended for people and tools building workflows on top of GribStream with ChatGPT, Claude, Gemini, Codex, Claude Code, Cursor, custom agents, or any other AI runtime that can call tools and make HTTP requests.

The recommended integration is the hosted GribStream MCP connector. It helps AI tools discover datasets, resolve exact selectors, build valid /timeseries and /runs requests, validate expressions, and return copy-pasteable HTTP requests.

Hosted MCP connector

Use the hosted MCP connector when your AI tool supports remote MCP over Streamable HTTP.

https://gribstream.com/mcp

The connector is read-only and does not require a GribStream API token. It does not query weather data directly. Instead, it discovers metadata and builds regular GribStream API requests that you can run against /api/v2/{dataset}/timeseries or /api/v2/{dataset}/runs.
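Once the connector has produced a validated request, executing it is a plain authenticated HTTP call. A minimal sketch of that step, assuming bearer-token auth via the GRIBSTREAM_API_TOKEN environment variable (the request body itself should come from the connector's output, not from this example):

```python
import json
import os
import urllib.request

API_BASE = "https://gribstream.com/api/v2"

def build_request(dataset: str, body: dict) -> urllib.request.Request:
    """Build a POST to /api/v2/{dataset}/timeseries for a connector-generated body."""
    url = f"{API_BASE}/{dataset}/timeseries"
    headers = {
        # The Bearer scheme here is an assumption; use whatever auth
        # header the GribStream API docs specify for your token.
        "Authorization": f"Bearer {os.environ['GRIBSTREAM_API_TOKEN']}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        url, data=json.dumps(body).encode("utf-8"), headers=headers, method="POST"
    )

# Execute with:
#   req = build_request("gfs", generated_body)
#   with urllib.request.urlopen(req) as resp:
#       print(resp.read().decode("utf-8"))
```

The same pattern applies to /runs: only the path segment changes.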

Setup examples

ChatGPT

If your ChatGPT account or workspace supports custom MCP connectors, add a connector pointing at https://gribstream.com/mcp.

Claude

In Claude, add a custom connector from the Connectors settings and set its URL to https://gribstream.com/mcp.

Gemini CLI

Add this to ~/.gemini/settings.json:

{
  "mcpServers": {
    "gribstream": {
      "httpUrl": "https://gribstream.com/mcp",
      "timeout": 30000
    }
  }
}

Then start Gemini CLI and run /mcp to confirm the connector is available.

What the MCP provides

The connector discovers datasets, resolves exact selectors, builds valid /timeseries and /runs requests, and validates expressions. Use the MCP to create the request; use the regular GribStream API to execute it.

Chat your way into an analysis

A forecast request can look reasonable and still be wrong. The connector helps the AI check the catalog, use exact selectors, choose the right endpoint, and validate the request before you run it.

A session can start broad:

What models does GribStream support for global forecasts?

Then narrow into a concrete request:

Build a request for temperature, wind speed, and relative humidity in Lisbon tomorrow.

Then keep iterating:

  1. Make it a grid over Portugal at 0.5 degrees.
  2. Show what this would have looked like using only forecasts available 18 hours ago.
  3. Give me the last three model runs forecasting those same valid hours.
  4. Switch from GFS to IFS and re-resolve the selectors.

With a GribStream API token available to a local AI tool, that can turn into actual analysis: compare models, calculate mean absolute error against an analysis dataset, search for weather thresholds, or summarize where two models disagree most.

Core resources

Authentication setup

The hosted MCP connector does not require authentication. Live GribStream data queries do require an API token when you run the generated API request.

  1. Create or sign in to an account at /auth/login.
  2. Create an API token from /app/dashboard.
  3. Set it as the environment variable GRIBSTREAM_API_TOKEN.
  4. Start the AI tool from that same shell or configure the tool to expose that variable to the runtime.

Recommended shell setup:

export GRIBSTREAM_API_TOKEN='YOUR_TOKEN_HERE'

CLI tools typically inherit environment variables from the parent shell that launched them. In practice that means you should export GRIBSTREAM_API_TOKEN first and then start the tool from that same terminal session. If the tool is already running, restart it after setting the variable.

Using an environment variable or a vendor secret store mapped to that variable is preferred. Avoid pasting tokens into prompts, checked-in files, or reusable scripts.
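Because the token is only read from the environment, a tool can fail fast with a clear message when it was not exported in the launching shell. A minimal sketch of that check:

```python
import os

def require_token() -> str:
    """Return GRIBSTREAM_API_TOKEN, or raise with an actionable message."""
    token = os.environ.get("GRIBSTREAM_API_TOKEN")
    if not token:
        # The variable is inherited from the parent shell, so the fix is
        # to export it there and restart the tool from that session.
        raise RuntimeError(
            "GRIBSTREAM_API_TOKEN is not set; export it in your shell "
            "and restart the AI tool from that same session"
        )
    return token
```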

What the skill teaches

The public skill file is intentionally strict about the things AI tools tend to get wrong: exact selectors, the choice between /timeseries and /runs, and expression validation.

Recommended workflow for AI tools

  1. Connect the hosted MCP at https://gribstream.com/mcp if your AI tool supports remote MCP.
  2. Ask the AI tool to use GribStream MCP to discover datasets, resolve selectors, and build a validated request.
  3. Review the generated request, especially time ranges, coordinates, grid size, variables, and expressions.
  4. Run the generated request against the regular GribStream API with your GRIBSTREAM_API_TOKEN.
  5. Use the OpenAPI spec, skill file, Quick-start, and Expressions pages as references or fallback inputs for tools that do not support MCP.

Raw file

The current public skill file can be fetched directly at:

https://gribstream.com/skills/gribstream-query.md

That file is meant to be portable across AI vendors. It is most useful for tools that do not support remote MCP yet.
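For tools without remote MCP support, the skill file can be fetched programmatically and pasted into the tool's context. A minimal stdlib sketch:

```python
import urllib.request

SKILL_URL = "https://gribstream.com/skills/gribstream-query.md"

def fetch_skill() -> str:
    """Download the public GribStream skill file as text."""
    with urllib.request.urlopen(SKILL_URL) as resp:
        return resp.read().decode("utf-8")
```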