# Model

AXAR AI supports multiple providers for interacting with different language models. Specify the provider and model identifier with the `@model` annotation when defining your agent.
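The identifier passed to `@model` follows the `provider:model` format, e.g. `'openai:gpt-4o'`. As an illustrative sketch only (the helper below is not part of AXAR), splitting such an identifier looks like this:

```typescript
// Illustrative only: split a 'provider:model' identifier into its parts.
// Everything after the first ':' is treated as the model name.
function parseModelId(id: string): { provider: string; model: string } {
  const separator = id.indexOf(':');
  if (separator === -1) {
    throw new Error(`Invalid model identifier: ${id}`);
  }
  return {
    provider: id.slice(0, separator),
    model: id.slice(separator + 1),
  };
}
```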

<details>

<summary>Setting up provider API keys</summary>

To use a specific provider, set up the required API key as an environment variable. Each provider has a specific environment variable name. For example, if you're using **Anthropic**, you can set the API key like this:

```bash
export ANTHROPIC_API_KEY="sk-proj-YOUR-API-KEY"
```

In development, you can simplify API key management by using the `dotenv` package along with a `.env` file. Here’s how to set it up:

1. **Install the `dotenv` package**\
   Use npm or yarn to install the `dotenv` package:

   ```bash
   npm install dotenv
   # or
   yarn add dotenv
   ```
2. **Create a `.env` file**\
   In the root directory of your project, create a `.env` file and add your API keys:

   ```bash
   ANTHROPIC_API_KEY=sk-proj-YOUR-API-KEY
   ```
3. **Load the `.env` file in your code**\
   Import and configure `dotenv` at the entry point of your application (e.g., `index.ts`):

   ```typescript
   import * as dotenv from 'dotenv';
   // Load environment variables from .env file
   dotenv.config();
   ```

</details>

### OpenAI

#### Supported models

<table data-full-width="false"><thead><tr><th width="307">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>gpt-4o</code></td><td><code>@model('openai:gpt-4o')</code></td></tr><tr><td><code>gpt-4o-mini</code></td><td><code>@model('openai:gpt-4o-mini')</code></td></tr><tr><td><code>gpt-4-turbo</code></td><td><code>@model('openai:gpt-4-turbo')</code></td></tr><tr><td><code>gpt-4</code></td><td><code>@model('openai:gpt-4')</code></td></tr><tr><td><code>o1</code></td><td><code>@model('openai:o1')</code></td></tr><tr><td><code>o1-mini</code></td><td><code>@model('openai:o1-mini')</code></td></tr></tbody></table>

{% hint style="info" %}
The API key for **OpenAI** is read from the `OPENAI_API_KEY` environment variable.
{% endhint %}

### Anthropic

#### Supported models

<table data-full-width="false"><thead><tr><th width="309">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>claude-3-5-sonnet-20241022</code></td><td><code>@model('anthropic:claude-3-5-sonnet-20241022')</code></td></tr><tr><td><code>claude-3-5-sonnet-20240620</code></td><td><code>@model('anthropic:claude-3-5-sonnet-20240620')</code></td></tr><tr><td><code>claude-3-5-haiku-20241022</code></td><td><code>@model('anthropic:claude-3-5-haiku-20241022')</code></td></tr></tbody></table>

{% hint style="info" %}
The API key for **Anthropic** is read from the `ANTHROPIC_API_KEY` environment variable.
{% endhint %}

### Google

#### Prerequisites

```bash
npm install @ai-sdk/google
```

{% hint style="info" %}
The API key for **Google** is read from the `GOOGLE_GENERATIVE_AI_API_KEY` environment variable.
{% endhint %}

#### Supported models

<table data-full-width="false"><thead><tr><th width="315">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>gemini-2.0-flash-exp</code></td><td><code>@model('google:gemini-2.0-flash-exp')</code></td></tr><tr><td><code>gemini-1.5-flash</code></td><td><code>@model('google:gemini-1.5-flash')</code></td></tr><tr><td><code>gemini-1.5-pro</code></td><td><code>@model('google:gemini-1.5-pro')</code></td></tr></tbody></table>

### DeepSeek

#### Prerequisites

```bash
npm install @ai-sdk/deepseek
```

{% hint style="info" %}
The API key for **DeepSeek** is read from the `DEEPSEEK_API_KEY` environment variable.
{% endhint %}

#### Supported models

<table data-full-width="false"><thead><tr><th width="317">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>deepseek-chat</code></td><td><code>@model('deepseek:deepseek-chat')</code></td></tr></tbody></table>

### Cerebras

#### Prerequisites

```bash
npm install @ai-sdk/cerebras
```

{% hint style="info" %}
The API key for **Cerebras** is read from the `CEREBRAS_API_KEY` environment variable.
{% endhint %}

#### Supported models

<table data-full-width="false"><thead><tr><th width="318">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>llama3.1-8b</code></td><td><code>@model('cerebras:llama3.1-8b')</code></td></tr><tr><td><code>llama3.1-70b</code></td><td><code>@model('cerebras:llama3.1-70b')</code></td></tr><tr><td><code>llama3.3-70b</code></td><td><code>@model('cerebras:llama3.3-70b')</code></td></tr></tbody></table>

### Groq

#### Prerequisites

```bash
npm install @ai-sdk/groq
```

{% hint style="info" %}
The API key for **Groq** is read from the `GROQ_API_KEY` environment variable.
{% endhint %}

#### Supported models

<table data-full-width="false"><thead><tr><th width="321">Model</th><th>Usage</th></tr></thead><tbody><tr><td><code>llama-3.3-70b-versatile</code></td><td><code>@model('groq:llama-3.3-70b-versatile')</code></td></tr><tr><td><code>llama-3.1-8b-instant</code></td><td><code>@model('groq:llama-3.1-8b-instant')</code></td></tr><tr><td><code>mixtral-8x7b-32768</code></td><td><code>@model('groq:mixtral-8x7b-32768')</code></td></tr><tr><td><code>gemma2-9b-it</code></td><td><code>@model('groq:gemma2-9b-it')</code></td></tr></tbody></table>

{% hint style="success" %}
We can support all models and providers listed here: <https://sdk.vercel.ai/docs/foundations/providers-and-models>. If one is missing or not working, let us know.
{% endhint %}
