apiKey must be set to your OpenAI API key, either using the argument or by setting the
OPENAI_API_KEY environment variable.
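The argument-or-environment fallback described above can be sketched as follows. `resolveApiKey` is a hypothetical helper for illustration, not part of the plugin API:

```typescript
// Hypothetical sketch of the "argument or environment variable" fallback the
// docs describe; resolveApiKey is illustrative, not part of the plugin.
function resolveApiKey(explicit?: string, envVar: string = 'OPENAI_API_KEY'): string {
  // Prefer the explicit argument; otherwise fall back to the environment.
  const key = explicit ?? process.env[envVar];
  if (key === undefined || key === '') {
    throw new Error(`API key not provided and ${envVar} is not set`);
  }
  return key;
}
```

The same resolution order (explicit argument first, then environment variable) applies to every provider listed below, each with its own variable name.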
Get the model name/identifier for this LLM instance.
The model name if available, "unknown" otherwise.
Plugins should override this property to provide their model information.
Get the provider name for this LLM instance.
The provider name if available, "unknown" otherwise.
Plugins should override this property to provide their provider information.
Returns an LLMStream that can be used to push text and receive LLM responses.
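The returned stream is consumed as an async iterable of response chunks. A minimal sketch of that consumption pattern, where the hypothetical `fakeStream` stands in for the stream `chat()` would return (the real stream yields structured chat chunks rather than plain strings):

```typescript
// fakeStream stands in for an LLM response stream; it is illustrative only.
async function* fakeStream(): AsyncGenerator<string> {
  yield 'Hello, ';
  yield 'world!';
}

// Drain the stream, concatenating each chunk into the full response text.
async function collect(stream: AsyncIterable<string>): Promise<string> {
  let text = '';
  for await (const chunk of stream) {
    text += chunk;
  }
  return text;
}
```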
Optional parameters: connOptions, extraKwargs, parallelToolCalls, toolChoice, toolCtx.

Static withAzure
Create a new instance of OpenAI LLM with Azure.

Optional parameters: apiKey?: string, apiVersion?: string, azureAdToken?: string, azureAdTokenProvider, azureDeployment?: string, azureEndpoint?: string, baseURL?: string, organization?: string, project?: string, temperature?: number, user?: string.

This automatically infers the following arguments from their corresponding environment variables if they are not provided:
- apiKey from AZURE_OPENAI_API_KEY
- organization from OPENAI_ORG_ID
- project from OPENAI_PROJECT_ID
- azureAdToken from AZURE_OPENAI_AD_TOKEN
- apiVersion from OPENAI_API_VERSION
- azureEndpoint from AZURE_OPENAI_ENDPOINT

Static withCerebras
Create a new instance of Cerebras LLM.
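The environment-variable inference described above can be sketched as follows. `inferAzureOptions` is illustrative (the real implementation may differ) and covers only three of the listed options:

```typescript
// Illustrative sketch of the env-var inference the Azure factory performs;
// inferAzureOptions is not part of the plugin, and only three of the
// documented options are shown.
interface AzureOptions {
  apiKey?: string;
  apiVersion?: string;
  azureEndpoint?: string;
}

function inferAzureOptions(
  opts: AzureOptions,
  env: Record<string, string | undefined> = process.env,
): AzureOptions {
  // Explicitly passed values win; otherwise read the documented env vars.
  return {
    apiKey: opts.apiKey ?? env.AZURE_OPENAI_API_KEY,
    apiVersion: opts.apiVersion ?? env.OPENAI_API_VERSION,
    azureEndpoint: opts.azureEndpoint ?? env.AZURE_OPENAI_ENDPOINT,
  };
}
```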
apiKey must be set to your Cerebras API key, either using the argument or by setting the
CEREBRAS_API_KEY environment variable.
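Each provider factory below follows the same shape: resolve the key from the argument or the provider's environment variable, then point the client at that provider's endpoint. A hedged sketch of the pattern, using a stand-in class (`FakeLLM` and its base URL are illustrative, not the plugin's actual implementation):

```typescript
// Stand-in class illustrating the per-provider factory pattern; FakeLLM and
// the endpoint URL below are illustrative, not the plugin's real values.
class FakeLLM {
  constructor(
    readonly baseURL: string,
    readonly apiKey: string,
  ) {}

  // Mirrors the documented behavior: use the argument if given, otherwise
  // fall back to the CEREBRAS_API_KEY environment variable.
  static withCerebras(apiKey?: string): FakeLLM {
    const key = apiKey ?? process.env.CEREBRAS_API_KEY;
    if (!key) {
      throw new Error('CEREBRAS_API_KEY is not set');
    }
    // Hypothetical endpoint for illustration only.
    return new FakeLLM('https://api.cerebras.example/v1', key);
  }
}
```

Every other `with*` factory differs only in its environment variable name and endpoint.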
Static withDeepSeek
Create a new instance of DeepSeek LLM.
apiKey must be set to your DeepSeek API key, either using the argument or by setting the
DEEPSEEK_API_KEY environment variable.
Static withFireworks
Create a new instance of Fireworks LLM.
apiKey must be set to your Fireworks API key, either using the argument or by setting the
FIREWORKS_API_KEY environment variable.
Static withGroq
Create a new instance of Groq LLM.
apiKey must be set to your Groq API key, either using the argument or by setting the
GROQ_API_KEY environment variable.
Static withMetaLlama
Create a new instance of Meta Llama LLM.
apiKey must be set to your Meta Llama API key, either using the argument or by setting the
LLAMA_API_KEY environment variable.
Static withOVHcloud
Create a new instance of OVHcloud AI Endpoints LLM.
apiKey must be set to your OVHcloud AI Endpoints API key, either using the argument or by setting the
OVHCLOUD_API_KEY environment variable.
Static withOctoAI
Create a new instance of OctoAI LLM.
apiKey must be set to your OctoAI API key, either using the argument or by setting the
OCTOAI_TOKEN environment variable.
Static withPerplexity
Create a new instance of PerplexityAI LLM.
apiKey must be set to your PerplexityAI API key, either using the argument or by setting the
PERPLEXITY_API_KEY environment variable.
Static withTelnyx
Create a new instance of Telnyx LLM.
apiKey must be set to your Telnyx API key, either using the argument or by setting the
TELNYX_API_KEY environment variable.
Static withTogetherAI
Create a new instance of TogetherAI LLM.
apiKey must be set to your TogetherAI API key, either using the argument or by setting the
TOGETHER_API_KEY environment variable.
Static withXAI
Create a new instance of xAI LLM.
apiKey must be set to your xAI API key, either using the argument or by setting the
XAI_API_KEY environment variable.
Create a new instance of OpenAI LLM.