LiveKit Cloud Inference LLM

Constructor options (all optional):
- apiKey?: string
- apiSecret?: string
- baseURL?: string
- model?: string
- provider?: string

model
Get the model name/identifier for this LLM instance. Returns the model name if available, "unknown" otherwise. Plugins should override this property to provide their model information.

chat
Returns an LLMStream that can be used to push text and receive LLM responses. Optional chat options:
- connOptions
- extraKwargs
- parallelToolCalls
- toolChoice
- tools

Static:
- from
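The documented `model` getter behavior (return the model name if available, "unknown" otherwise, with plugins overriding the property) can be sketched as follows. This is an illustrative mock of the described contract, not the actual `@livekit/agents` implementation; the option names and class shapes are assumptions for the sketch.

```typescript
// Hypothetical option shape mirroring the optional constructor
// options listed above (assumed names, for illustration only).
interface InferenceLLMOptions {
  apiKey?: string;
  apiSecret?: string;
  baseURL?: string;
  model?: string;
  provider?: string;
}

abstract class BaseLLM {
  // Base behavior: "unknown" when no model information is available.
  // Plugins should override this property to provide their model information.
  get model(): string {
    return "unknown";
  }
}

class InferenceLLM extends BaseLLM {
  constructor(private opts: InferenceLLMOptions = {}) {
    super();
  }

  // Override: return the configured model name if available,
  // otherwise fall back to the base getter's "unknown".
  get model(): string {
    return this.opts.model ?? super.model;
  }
}

// "openai/gpt-4o-mini" is an illustrative model string, not a
// guaranteed identifier for the Cloud Inference service.
const llm = new InferenceLLM({ model: "openai/gpt-4o-mini", provider: "openai" });
console.log(llm.model);                 // "openai/gpt-4o-mini"
console.log(new InferenceLLM().model);  // "unknown"
```

The getter-with-fallback pattern keeps callers uniform: code that logs or routes by `llm.model` never has to null-check, because an unconfigured instance still yields a well-defined string.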