# fromLLM()
Creates a unified reactive source for LLM inference via any OpenAI-compatible endpoint.
## Signature

```ts
function fromLLM(opts: LLMOptions): LLMStore
```
| Parameter | Type | Description |
|---|---|---|
| `opts` | `LLMOptions` | Provider configuration (`provider`, `baseURL`, `apiKey`, `model`). |
## Returns

`LLMStore` — a `Store<string>` with `status`, `error`, `tokens`, `toolCalls`, and `generationId` companion stores, plus `generate()` and `abort()` methods.
## Basic Usage

```ts
import { fromLLM, toToolCallRequests } from 'callbag-recharge/ai';
import { effect } from 'callbag-recharge';

const llm = fromLLM({ provider: 'openai', apiKey: 'sk-...', model: 'gpt-4o' });

// Text generation
llm.generate([{ role: 'user', content: 'What is TypeScript?' }]);

// Tool calling (`registry` is a tool registry created elsewhere)
llm.generate(messages, { tools: registry.definitions() });

effect([llm.toolCalls], () => {
  const calls = llm.toolCalls.get();
  if (calls.length > 0) {
    registry.execute(toToolCallRequests(calls));
  }
});
```

## Options / Behavior Details
- Provider-agnostic: Works with OpenAI, Ollama, Anthropic (via proxy), Vercel AI SDK, or any OpenAI-compatible endpoint.
- No hard deps: Uses fetch + SSE line parsing. No SDK imports required.
- Auto-cancel: Calling `generate()` while streaming aborts the previous generation.
- Generation nonce: `generationId` is a monotonically increasing `Store<number>`, incremented on each `generate()` call. Use it to distinguish stale status emissions from previous generations when subscribing to `status`.
- Tool calling: Pass `tools` in `GenerateOptions` to enable function calling. Parsed tool calls accumulate in the `toolCalls` store. Use `toToolCallRequests()` to convert them to `ToolCallRequest[]` for `toolRegistry.execute()`.
- Token tracking: The `tokens` store is populated on stream completion (when usage data is available).
- Status: Uses the `WithStatusStatus` enum (pending → active → completed/errored) for consistent lifecycle tracking.
- Persistent source: This is a long-lived store backed by `state()`. It does not send callbag END — lifecycle is managed imperatively via `generate()`/`abort()`, not via stream completion. Do not wrap it with `withStatus()` or `retry()` — use the built-in `.status` and `.error` companions instead.
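To make the generation-nonce pattern concrete, here is a minimal self-contained sketch of how comparing `generationId` filters out stale emissions. The `Store` interface and `makeStore` helper below are simplified stand-ins for illustration, not the real `callbag-recharge` implementation; with the actual library you would capture `llm.generationId.get()` when a generation starts and compare it inside an `effect` on `llm.status`.

```typescript
// Simplified stand-in for the library's Store<T> shape (assumption, not the real API).
interface Store<T> {
  get(): T;
}

function makeStore<T>(initial: T): Store<T> & { set(v: T): void } {
  let value = initial;
  return { get: () => value, set: (v) => { value = v; } };
}

// The nonce: bumped on each generate() call.
const generationId = makeStore(0);
const seen: string[] = [];

// Capture the nonce when a generation starts; ignore emissions
// whose nonce no longer matches (they belong to an aborted generation).
function subscribeForGeneration(): (status: string) => void {
  const myGen = generationId.get();
  return (status) => {
    if (generationId.get() !== myGen) return; // stale: a newer generate() ran
    seen.push(status);
  };
}

// Simulate: generation 1 starts, then generation 2 preempts it mid-stream.
generationId.set(1);
const onStatus = subscribeForGeneration();
onStatus('active');   // recorded: nonce still matches
generationId.set(2);  // a new generate() call bumps the nonce
onStatus('completed'); // ignored: belongs to the preempted generation 1

console.log(seen); // seen is now ['active']
```

The same capture-and-compare guard works for any companion store (`status`, `tokens`, `toolCalls`) when you need to tie an emission back to the specific `generate()` call that produced it.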