Hierarchy

Constructors

  • Create a new instance of OpenAI LLM.

    Parameters

    Returns LLM

    Remarks

    apiKey must be set to your OpenAI API key, either by passing it as an argument or by setting the OPENAI_API_KEY environment variable.
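
    Example (a minimal sketch: the constructor's parameter list is not shown on this page, and the module path is assumed):

      import { LLM } from '@livekit/agents-plugin-openai'; // module path assumed; adjust to the package this page documents

      // With no arguments, OPENAI_API_KEY is read from the environment.
      const llm = new LLM();
      // Or pass the key explicitly (option name taken from the Remarks above):
      const llmWithKey = new LLM({ apiKey: 'sk-...' }); // placeholder key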

Methods

  • Create a new instance of OpenAI LLM configured for Azure OpenAI.

    Parameters

    • opts: {
          apiKey?: string;
          apiVersion?: string;
          azureAdToken?: string;
          azureAdTokenProvider?: (() => Promise<string>);
          azureDeployment?: string;
          azureEndpoint?: string;
          baseURL?: string;
          model: string;
          organization?: string;
          project?: string;
          temperature?: number;
          user?: string;
      } = defaultAzureLLMOptions
      • Optional apiKey?: string
      • Optional apiVersion?: string
      • Optional azureAdToken?: string
      • Optional azureAdTokenProvider?: (() => Promise<string>)
          • (): Promise<string>
          • Returns Promise<string>

      • Optional azureDeployment?: string
      • Optional azureEndpoint?: string
      • Optional baseURL?: string
      • model: string
      • Optional organization?: string
      • Optional project?: string
      • Optional temperature?: number
      • Optional user?: string

    Returns LLM

    Remarks

    This automatically infers the following arguments from their corresponding environment variables if they are not provided:

    • apiKey from AZURE_OPENAI_API_KEY
    • organization from OPENAI_ORG_ID
    • project from OPENAI_PROJECT_ID
    • azureAdToken from AZURE_OPENAI_AD_TOKEN
    • apiVersion from OPENAI_API_VERSION
    • azureEndpoint from AZURE_OPENAI_ENDPOINT
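
    Example (the member name withAzure and the LLM import from the constructor example are assumed):

      const llm = LLM.withAzure({
        model: 'gpt-4o',                  // required
        azureDeployment: 'my-deployment', // placeholder deployment name
      });
      // apiKey, azureEndpoint and apiVersion fall back to AZURE_OPENAI_API_KEY,
      // AZURE_OPENAI_ENDPOINT and OPENAI_API_VERSION when omitted.
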
  • Create a new instance of Cerebras LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your Cerebras API key, either by passing it as an argument or by setting the CEREBRAS_API_KEY environment variable.
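
    Example (member name withCerebras assumed; model id is illustrative):

      const llm = LLM.withCerebras({
        apiKey: 'csk-...',    // placeholder; omit to use CEREBRAS_API_KEY instead
        model: 'llama3.1-8b',
      });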

  • Create a new instance of DeepSeek LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your DeepSeek API key, either by passing it as an argument or by setting the DEEPSEEK_API_KEY environment variable.
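
    Example (member name withDeepSeek assumed):

      // DEEPSEEK_API_KEY is read from the environment when apiKey is omitted.
      const llm = LLM.withDeepSeek({ model: 'deepseek-chat' }); // illustrative model id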

  • Create a new instance of Fireworks LLM.

    Parameters

    Returns LLM

    Remarks

    apiKey must be set to your Fireworks API key, either by passing it as an argument or by setting the FIREWORKS_API_KEY environment variable.
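
    Example (member name withFireworks assumed; its options are not listed above, so the same option shape as the other factories is assumed):

      // FIREWORKS_API_KEY is read from the environment when apiKey is omitted.
      const llm = LLM.withFireworks({
        model: 'accounts/fireworks/models/llama-v3p1-8b-instruct', // illustrative model id
      });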

  • Create a new instance of Groq LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your Groq API key, either by passing it as an argument or by setting the GROQ_API_KEY environment variable.
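
    Example (member name withGroq assumed):

      // GROQ_API_KEY is read from the environment when apiKey is omitted.
      const llm = LLM.withGroq({
        model: 'llama-3.1-8b-instant', // illustrative model id
        temperature: 0.2,              // optional sampling temperature
      });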

  • Create a new instance of OctoAI LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your OctoAI API key, either by passing it as an argument or by setting the OCTOAI_TOKEN environment variable.
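
    Example (member name withOctoAI assumed):

      // OCTOAI_TOKEN is read from the environment when apiKey is omitted.
      const llm = LLM.withOctoAI({
        model: 'meta-llama-3.1-8b-instruct', // illustrative model id
        user: 'session-1234',                // optional end-user identifier
      });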

  • Create a new instance of Ollama LLM.

    Parameters

    • opts: Partial<{
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
      }> = {}

    Returns LLM
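
    Example (member name withOllama assumed; a local Ollama server needs no API key):

      const llm = LLM.withOllama({
        baseURL: 'http://localhost:11434/v1', // assumed default OpenAI-compatible Ollama endpoint
        model: 'llama3.1',                    // illustrative model id
      });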

  • Create a new instance of PerplexityAI LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your PerplexityAI API key, either by passing it as an argument or by setting the PERPLEXITY_API_KEY environment variable.
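
    Example (member name withPerplexity assumed):

      // PERPLEXITY_API_KEY is read from the environment when apiKey is omitted.
      const llm = LLM.withPerplexity({ model: 'llama-3.1-sonar-small-128k-chat' }); // illustrative model id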

  • Create a new instance of Telnyx LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your Telnyx API key, either by passing it as an argument or by setting the TELNYX_API_KEY environment variable.
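
    Example (member name withTelnyx assumed):

      const llm = LLM.withTelnyx({
        apiKey: 'KEY...',                               // placeholder; omit to use TELNYX_API_KEY instead
        model: 'meta-llama/Meta-Llama-3.1-8B-Instruct', // illustrative model id
      });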

  • Create a new instance of TogetherAI LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your TogetherAI API key, either by passing it as an argument or by setting the TOGETHER_API_KEY environment variable.
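
    Example (member name withTogetherAI assumed; shows supplying a preconfigured OpenAI client via the client option):

      import OpenAI from 'openai';

      const client = new OpenAI({
        apiKey: process.env.TOGETHER_API_KEY,
        baseURL: 'https://api.together.xyz/v1', // assumed TogetherAI OpenAI-compatible endpoint
      });
      const llm = LLM.withTogetherAI({
        client,
        model: 'meta-llama/Meta-Llama-3.1-8B-Instruct-Turbo', // illustrative model id
      });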

  • Create a new instance of xAI LLM.

    Parameters

    • opts: Partial<{
          apiKey?: string;
          baseURL?: string;
          client: OpenAI;
          model: string;
          temperature?: number;
          user?: string;
      }> = {}

    Returns LLM

    Remarks

    apiKey must be set to your xAI API key, either by passing it as an argument or by setting the XAI_API_KEY environment variable.
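
    Example (member name withXAI assumed):

      // XAI_API_KEY is read from the environment when apiKey is omitted.
      const llm = LLM.withXAI({ model: 'grok-2' }); // illustrative model id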