Methods
Abstract chat
- chat(__namedParameters): llm.LLMStream
Parameters
- __namedParameters: {
    chatCtx: ChatContext;
    fncCtx?: FunctionContext;
    n?: number;
    parallelToolCalls?: boolean;
    temperature?: number;
  }
  - fncCtx?: FunctionContext (Optional)
  - n?: number (Optional)
  - parallelToolCalls?: boolean (Optional)
  - temperature?: number (Optional)
Returns an LLMStream that can be used to push text and receive LLM responses.
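Since `chat` is abstract, a concrete LLM class must implement it. The sketch below shows the call shape under stated assumptions: `ChatContext`, `FunctionContext`, `LLMStream`, and the `EchoLLM` subclass here are minimal hypothetical stand-ins, not the library's real types.

```typescript
// Hypothetical stand-ins for the documented types; the real definitions
// live in the library this page describes.
interface ChatContext {
  messages: { role: string; text: string }[];
}
interface FunctionContext {
  [name: string]: (args: unknown) => unknown;
}

class LLMStream {
  // A real stream would expose async iteration over LLM responses;
  // this stub only records the options it was created with.
  constructor(readonly label: string) {}
}

// The abstract signature from the reference above.
abstract class LLM {
  abstract chat(opts: {
    chatCtx: ChatContext;
    fncCtx?: FunctionContext;
    n?: number;
    parallelToolCalls?: boolean;
    temperature?: number;
  }): LLMStream;
}

// A toy concrete implementation (hypothetical), standing in for a
// provider-backed subclass that would open a streaming request.
class EchoLLM extends LLM {
  chat(opts: {
    chatCtx: ChatContext;
    fncCtx?: FunctionContext;
    n?: number;
    parallelToolCalls?: boolean;
    temperature?: number;
  }): LLMStream {
    return new LLMStream(`n=${opts.n ?? 1}, temperature=${opts.temperature ?? 1}`);
  }
}

const llm = new EchoLLM();
const stream = llm.chat({
  chatCtx: { messages: [{ role: "user", text: "Hello" }] },
  temperature: 0.7,
});
console.log(stream.label); // → "n=1, temperature=0.7"
```

Only `chatCtx` is required; the remaining options (`fncCtx`, `n`, `parallelToolCalls`, `temperature`) are optional and may be omitted, in which case the implementation falls back to its defaults.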