Available AIM containers

Cohere Labs

111B parameter language model with configurable reasoning and tool use capabilities.

Meta Llama

Multilingual 405B parameter instruction-tuned language model for dialogue use cases.

Multilingual 8B parameter instruction-tuned language model for dialogue use cases.

Multilingual 1B parameter instruction-tuned language model for dialogue and on-device use cases.

Multilingual 3B parameter instruction-tuned language model for dialogue and on-device use cases.

Multilingual 70B parameter instruction-tuned language model for dialogue use cases.

Mistral AI

14B parameter instruction-tuned language model with vision and function calling capabilities.

675B parameter granular MoE multimodal model with 41B active parameters and vision capabilities.

24B parameter instruction-tuned language model with vision and function calling capabilities.

Sparse MoE language model with 141B total parameters across 8 experts and function calling support.

Sparse MoE language model with 47B total parameters across 8 experts.

OpenAI

Open-weight 117B parameter MoE model with 5.1B active parameters and configurable reasoning.

Open-weight 21B parameter MoE model with 3.6B active parameters for lower-latency use cases.

Qwen

235B parameter MoE language model with 22B active parameters and dual thinking modes.

Qwen/Qwen3-32B (stable)

32.8B parameter dense language model with dual thinking modes and multilingual support.

deepseek-ai

671B parameter MoE reasoning model with 37B active parameters and 128K context length.

671B parameter MoE reasoning model with 37B active parameters; an updated version of DeepSeek-R1.

671B parameter MoE model with 37B active parameters supporting thinking and non-thinking modes.

671B parameter MoE model with 37B active parameters, refined for language consistency and agentic tasks.