@charcoalhq/ai-sdk wraps Charcoal search as a Vercel AI SDK tool. Give your agent one tool and Charcoal handles planning, multi-hop retrieval, and citation over your corpus — your agent reasons over complete, cited results instead of managing a search loop.
Install
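Install the package from npm (any package manager works; npm shown here):

```shell
npm install @charcoalhq/ai-sdk
```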
Set CHARCOAL_API_KEY in your environment. See Getting started for how to create an API key.
Usage
The package ships two entry points: searchTool (zero-config) and createSearchTool (a factory with developer-supplied defaults).
searchTool
Your agent’s model supplies namespace, objective, and context on every call.
createSearchTool
Bake the namespace, auth, filters, or description into the tool. Anything set here is removed from the input schema your agent’s model sees, so it only reasons about objective and context.
Under the hood, both tools call client.namespaces.search.create(namespace, { objective, context, filters? }) on @charcoalhq/sdk and return a synthesized answer with cited excerpts.
Streaming
Both searchTool and createSearchTool stream by default. execute is an async generator that yields each SearchStreamEvent as it arrives; the AI SDK surfaces intermediate status events as preliminary tool outputs on the data stream, so your UI sees progress in real time. The final yield is the session_result event, which your agent's model consumes as the tool result. Error events are thrown by the underlying SDK.
To disable streaming and receive only the final result, pass createSearchTool({ stream: false }).