@charcoalhq/ai-sdk wraps Charcoal search as a Vercel AI SDK tool. Give your agent one tool and Charcoal handles planning, multi-hop retrieval, and citation over your corpus — your agent reasons over complete, cited results instead of managing a search loop.

Install

npm install @charcoalhq/ai-sdk @charcoalhq/sdk ai zod
Set CHARCOAL_API_KEY in your environment. See Getting started for how to create an API key.
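For example, in a POSIX shell (the key value below is a placeholder — use the key you created):

```shell
# Placeholder value — substitute the API key from your Charcoal account.
export CHARCOAL_API_KEY="your-api-key"
```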

Usage

The package ships two entry points — searchTool (zero-config) and createSearchTool (factory with developer-supplied defaults).

searchTool

Your agent’s model supplies namespace, objective, and context on every call.
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { searchTool } from "@charcoalhq/ai-sdk";

const { text } = await generateText({
  model: anthropic("claude-sonnet-4-6"),
  tools: { search: searchTool },
  prompt:
    "In the `contracts` namespace, find every MSA with auto-renewal longer than 24 months.",
});

createSearchTool

Bake the namespace, auth, filters, or description into the tool. Anything set here is removed from the input schema your agent’s model sees, so it only reasons about objective and context.
import { generateText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { createSearchTool } from "@charcoalhq/ai-sdk";

const contractSearch = createSearchTool({
  namespace: "contracts",
  filters: { status: "active" },
  description: "Search active vendor contracts.",
});

const { text } = await generateText({
  model: anthropic("claude-sonnet-4-6"),
  tools: { search: contractSearch },
  prompt: "Find every MSA with auto-renewal longer than 24 months.",
});
Under the hood, each tool call hits client.namespaces.search.create(namespace, { objective, context, filters? }) on @charcoalhq/sdk and returns a synthesized answer with cited excerpts.

Streaming

Both searchTool and createSearchTool stream by default. execute is an async generator that yields each SearchStreamEvent as it arrives — the AI SDK surfaces intermediate status events as preliminary tool outputs on the data stream, so your UI sees progress in real time. The final yield is the session_result event, which your agent’s model consumes as the tool result. Error events are not yielded; the underlying SDK surfaces them as thrown exceptions.
import { streamText } from "ai";
import { anthropic } from "@ai-sdk/anthropic";
import { createSearchTool } from "@charcoalhq/ai-sdk";

const { fullStream } = streamText({
  model: anthropic("claude-sonnet-4-6"),
  tools: { search: createSearchTool({ namespace: "contracts" }) },
  prompt: "...",
});

for await (const part of fullStream) {
  if (part.type === "tool-output-available") {
    // part.output is a SearchStreamEvent — 'status', 'session_result', etc.
  }
}
Opt into a single-shot non-streaming call with createSearchTool({ stream: false }).

Options

type CreateSearchToolOptions = {
  // Namespace to search. If set, removed from the input schema your agent's model sees.
  namespace?: string;
  // Defaults to process.env.CHARCOAL_API_KEY.
  apiKey?: string;
  // Pre-built Charcoal client (takes precedence over apiKey and baseUrl).
  client?: Charcoal;
  // Filters applied to every search call.
  filters?: SearchCreateParams["filters"];
  // Override the tool description shown to your agent's model.
  description?: string;
  // Stream SSE events through execute as an async generator. Defaults to true.
  stream?: boolean;
  // Override the API base URL. Defaults to https://api.withcharcoal.com.
  baseUrl?: string;
};
See the Search guide for the full search API, the Filters guide for filter syntax, and the source on GitHub for the latest types.