Kitaru

llm

LLM call primitive for tracked model interactions.

kitaru.llm() wraps a single provider SDK completion call with Kitaru tracking. Built-in runtime support covers openai/*, anthropic/*, ollama/*, and openrouter/* models. Ollama and OpenRouter use the OpenAI-compatible API and therefore require the openai package (pip install kitaru[openai]).
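The provider prefixes above can be sketched as follows. This is an illustrative helper, not Kitaru's actual routing code; it shows how a provider/model identifier (the "provider/model" form, not a bare alias) decomposes, and which prefixes rely on the optional openai package per the note above:

```python
# Providers that go through the OpenAI-compatible API and therefore
# need the optional `openai` package (pip install kitaru[openai]).
OPENAI_COMPAT = {"openai", "ollama", "openrouter"}
SUPPORTED = OPENAI_COMPAT | {"anthropic"}

def split_model_id(model_id: str) -> tuple[str, str]:
    """Split a 'provider/model' identifier into its two parts."""
    provider, _, model = model_id.partition("/")
    if provider not in SUPPORTED:
        raise ValueError(f"unsupported provider: {provider!r}")
    return provider, model

provider, model = split_model_id("openai/gpt-5-nano")
needs_openai_pkg = provider in OPENAI_COMPAT
```

Model aliases (a name with no provider prefix) would be resolved before a split like this; the sketch only covers the explicit identifier form.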

llm(prompt, *, model=None, system=None, temperature=None, max_tokens=None, name=None) -> str

Make a tracked LLM call.

Parameters

prompt: str | list[dict[str, Any]]
    User prompt text or a chat-style message list.

model: str | None = None
    Model alias or provider/model identifier (e.g. openai/gpt-5-nano).

system: str | None = None
    Optional system prompt.

temperature: float | None = None
    Optional sampling temperature.

max_tokens: int | None = None
    Optional maximum response tokens.

name: str | None = None
    Optional display name for this call.

Returns

str
    The model response text.
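A minimal usage sketch, assuming kitaru is installed and provider credentials are configured. The model name is the example identifier from the reference above, and the prompt text and display name are illustrative; the try/except stand-in is not part of Kitaru and only lets the call shape run offline:

```python
# Usage sketch. Assumes `pip install kitaru[openai]` plus an API key in
# the environment; the class below merely stands in for the real package
# when it is not installed, so the call shape can be exercised offline.
try:
    import kitaru
except ImportError:
    class kitaru:  # illustrative stand-in, not part of Kitaru
        @staticmethod
        def llm(prompt, *, model=None, system=None, temperature=None,
                max_tokens=None, name=None):
            return f"(stubbed response from {model})"

# Plain-text prompt with the documented keyword-only options.
reply = kitaru.llm(
    "Summarize the benefits of request tracking in one sentence.",
    model="openai/gpt-5-nano",
    system="You are a concise assistant.",
    temperature=0.2,
    max_tokens=64,
    name="summary-demo",
)

# A chat-style message list is accepted in place of a string prompt.
messages = [{"role": "user", "content": "Name one tracked metric."}]
reply2 = kitaru.llm(messages, model="openai/gpt-5-nano")
```

Both calls return the response text as a str, per the Returns section above.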