Model
LLMs at your fingertips
AXAR AI supports multiple providers for interacting with different language models. Specify the provider and model identifier with the @model annotation when defining your agent.
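For example, an agent wired to OpenAI's gpt-4o-mini could look like the following minimal sketch. The @axarai/axar import path, the Agent<string, string> base class, and the @systemPrompt decorator are assumptions based on typical AXAR examples; check your installed version for the exact API.

```typescript
import { model, systemPrompt, Agent } from '@axarai/axar'; // import path assumed

// Route this agent's calls to OpenAI's gpt-4o-mini
@model('openai:gpt-4o-mini')
@systemPrompt('Be concise; reply with one sentence.')
export class SimpleAgent extends Agent<string, string> {}
```

Switching providers is then a one-line change: swap the annotation string, e.g. to 'anthropic:claude-3-5-haiku-20241022', and set the matching API key as described below.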
Setting up provider API keys
To use a specific provider, set up the required API key as an environment variable. Each provider has a specific environment variable name. For example, if you're using Anthropic, you can set the API key like this:
```bash
export ANTHROPIC_API_KEY="sk-ant-YOUR-API-KEY"
```

In development, you can simplify API key management by using the dotenv package along with a .env file. Here's how to set it up:

1. Install the dotenv package

   Use npm or yarn to install the dotenv package:

   ```bash
   npm install dotenv
   # or
   yarn add dotenv
   ```

2. Create a .env file

   In the root directory of your project, create a .env file and add your API keys:

   ```
   ANTHROPIC_API_KEY=sk-ant-YOUR-API-KEY
   ```

3. Load the .env file in your code

   Import and configure dotenv at the entry point of your application (e.g., index.ts):

   ```typescript
   import * as dotenv from 'dotenv';

   // Load environment variables from .env file
   dotenv.config();
   ```
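Putting the pieces together, one possible entry point loads the .env file and then runs an agent, so the provider key is available in process.env by the time the model is called. As above, the @axarai/axar import path and Agent base class are assumptions; the dotenv usage follows the steps just shown.

```typescript
// index.ts – hypothetical entry point wiring a .env file to an agent
import * as dotenv from 'dotenv';
import { model, Agent } from '@axarai/axar'; // import path assumed

// Load ANTHROPIC_API_KEY (and any other keys) from .env into process.env
dotenv.config();

@model('anthropic:claude-3-5-haiku-20241022')
class GreetingAgent extends Agent<string, string> {}

async function main() {
  const reply = await new GreetingAgent().run('Say hello in five words.');
  console.log(reply);
}

main().catch(console.error);
```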
OpenAI
Supported models
gpt-4o
@model('openai:gpt-4o')
gpt-4o-mini
@model('openai:gpt-4o-mini')
gpt-4-turbo
@model('openai:gpt-4-turbo')
gpt-4
@model('openai:gpt-4')
o1
@model('openai:o1')
o1-mini
@model('openai:o1-mini')
The API key for OpenAI is read from the OPENAI_API_KEY environment variable.
Anthropic
Supported models
claude-3-5-sonnet-20241022
@model('anthropic:claude-3-5-sonnet-20241022')
claude-3-5-sonnet-20240620
@model('anthropic:claude-3-5-sonnet-20240620')
claude-3-5-haiku-20241022
@model('anthropic:claude-3-5-haiku-20241022')
The API key for Anthropic is read from the ANTHROPIC_API_KEY environment variable.
Google
Prerequisites
The API key for Google is read from the GOOGLE_GENERATIVE_AI_API_KEY environment variable.
Supported models
gemini-2.0-flash-exp
@model('google:gemini-2.0-flash-exp')
gemini-1.5-flash
@model('google:gemini-1.5-flash')
gemini-1.5-pro
@model('google:gemini-1.5-pro')
DeepSeek
Prerequisites
The API key for DeepSeek is read from the DEEPSEEK_API_KEY environment variable.
Supported models
deepseek-chat
@model('deepseek:deepseek-chat')
Cerebras
Prerequisites
The API key for Cerebras is read from the CEREBRAS_API_KEY environment variable.
Supported models
llama3.1-8b
@model('cerebras:llama3.1-8b')
llama3.1-70b
@model('cerebras:llama3.1-70b')
llama3.3-70b
@model('cerebras:llama3.3-70b')
Groq
Prerequisites
The API key for Groq is read from the GROQ_API_KEY environment variable.
Supported models
llama-3.3-70b-versatile
@model('groq:llama-3.3-70b-versatile')
llama-3.1-8b-instant
@model('groq:llama-3.1-8b-instant')
mixtral-8x7b-32768
@model('groq:mixtral-8x7b-32768')
gemma2-9b-it
@model('groq:gemma2-9b-it')
AXAR AI can support all models and providers listed here: https://sdk.vercel.ai/docs/foundations/providers-and-models. If one is missing or not working, let us know.