Model
LLMs at your fingertips
AXAR AI supports multiple providers, each exposing its own set of language models. Specify the provider and model identifier with the @model
annotation when defining your agent.
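As a sketch of how such an identifier might be wired up, the model function below is a hypothetical stand-in, not AXAR's actual implementation: it only parses the 'provider:model' string and records it on the agent class.

```typescript
type ModelRef = { provider: string; modelId: string };

// Split a 'provider:model' identifier at the first colon.
function parseModelId(id: string): ModelRef {
  const sep = id.indexOf(':');
  if (sep < 0) throw new Error(`expected 'provider:model', got '${id}'`);
  return { provider: id.slice(0, sep), modelId: id.slice(sep + 1) };
}

// Hypothetical stand-in for AXAR's @model: attaches the parsed
// reference to the decorated agent class.
function model(id: string) {
  const ref = parseModelId(id);
  return <T extends new (...args: any[]) => object>(target: T) =>
    class extends target {
      static readonly modelRef: ModelRef = ref;
    };
}

// Applied as a plain function call here; with decorator support enabled,
// this is equivalent to writing @model('openai:gpt-4o-mini') above the class.
const SupportAgent = model('openai:gpt-4o-mini')(class {});

console.log(SupportAgent.modelRef); // { provider: 'openai', modelId: 'gpt-4o-mini' }
```

The same agent can be pointed at a different provider by changing only the identifier string.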
OpenAI
Supported models
gpt-4o
@model('openai:gpt-4o')
gpt-4o-mini
@model('openai:gpt-4o-mini')
gpt-4-turbo
@model('openai:gpt-4-turbo')
gpt-4
@model('openai:gpt-4')
o1
@model('openai:o1')
o1-mini
@model('openai:o1-mini')
Anthropic
Supported models
claude-3-5-sonnet-20241022
@model('anthropic:claude-3-5-sonnet-20241022')
claude-3-5-sonnet-20240620
@model('anthropic:claude-3-5-sonnet-20240620')
claude-3-5-haiku-20241022
@model('anthropic:claude-3-5-haiku-20241022')
Google
Prerequisites
Supported models
gemini-2.0-flash-exp
@model('google:gemini-2.0-flash-exp')
gemini-1.5-flash
@model('google:gemini-1.5-flash')
gemini-1.5-pro
@model('google:gemini-1.5-pro')
DeepSeek
Prerequisites
Supported models
deepseek-chat
@model('deepseek:deepseek-chat')
Cerebras
Prerequisites
Supported models
llama3.1-8b
@model('cerebras:llama3.1-8b')
llama3.1-70b
@model('cerebras:llama3.1-70b')
llama3.3-70b
@model('cerebras:llama3.3-70b')
Groq
Prerequisites
Supported models
llama-3.3-70b-versatile
@model('groq:llama-3.3-70b-versatile')
llama-3.1-8b-instant
@model('groq:llama-3.1-8b-instant')
mixtral-8x7b-32768
@model('groq:mixtral-8x7b-32768')
gemma2-9b-it
@model('groq:gemma2-9b-it')
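All identifiers above follow the same 'provider:model' pattern. A hypothetical helper (not part of the AXAR API) that validates an identifier against the lists on this page might look like:

```typescript
// Supported identifiers, copied from the provider lists above.
const SUPPORTED: Record<string, string[]> = {
  openai: ['gpt-4o', 'gpt-4o-mini', 'gpt-4-turbo', 'gpt-4', 'o1', 'o1-mini'],
  anthropic: [
    'claude-3-5-sonnet-20241022',
    'claude-3-5-sonnet-20240620',
    'claude-3-5-haiku-20241022',
  ],
  google: ['gemini-2.0-flash-exp', 'gemini-1.5-flash', 'gemini-1.5-pro'],
  deepseek: ['deepseek-chat'],
  cerebras: ['llama3.1-8b', 'llama3.1-70b', 'llama3.3-70b'],
  groq: [
    'llama-3.3-70b-versatile',
    'llama-3.1-8b-instant',
    'mixtral-8x7b-32768',
    'gemma2-9b-it',
  ],
};

// True when 'provider:model' names a combination listed above.
function isSupported(id: string): boolean {
  const sep = id.indexOf(':');
  if (sep < 0) return false;
  return (SUPPORTED[id.slice(0, sep)] ?? []).includes(id.slice(sep + 1));
}

console.log(isSupported('groq:gemma2-9b-it')); // true
console.log(isSupported('openai:unknown-model')); // false
```

Catching an unsupported identifier this way fails fast at startup rather than at the first request to the provider.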