Mistral AI has launched Mistral Large 2, a 123-billion-parameter model featuring a 128k-token context window, support for dozens of natural languages, and compatibility with more than 80 coding languages.

Designed for single-node inference and long-context applications, Mistral Large 2 shows improved performance across a range of benchmarks: the pretrained version reaches 84.0% accuracy on MMLU, and the model delivers stronger code generation and reasoning. It also exhibits better instruction following and conversational abilities, while reducing its tendency to "hallucinate," that is, to generate plausible-sounding but inaccurate information.

Mistral Large 2 features enhanced function calling and retrieval skills, making it a versatile tool for a wide range of applications. The model is available under the Mistral Research License for non-commercial use, with a commercial license option for self-deployment. Researchers and businesses can access Mistral Large 2 via Mistral AI's la Plateforme and through partnerships with major cloud service providers including Google Cloud Platform, Azure AI Studio, Amazon Bedrock, and IBM watsonx.ai.
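For readers who want to try the model programmatically, the sketch below shows roughly what a chat completion request against la Plateforme looks like. It assumes the official `mistralai` Python SDK (v1) and the `mistral-large-latest` model alias; exact method names may differ between SDK versions.

```python
import os

from mistralai import Mistral

# Minimal sketch of a chat completion call against la Plateforme.
# Assumes the v1 "mistralai" Python SDK; "mistral-large-latest" is
# assumed to alias Mistral Large 2.
client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="mistral-large-latest",
    messages=[
        {"role": "user", "content": "Summarize the key features of Mistral Large 2."}
    ],
)

print(response.choices[0].message.content)
```

The same model is also reachable through the listed cloud partners, where the request shape follows each provider's own SDK rather than the snippet above.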

Additionally, Mistral AI has expanded fine-tuning capabilities on la Plateforme to Mistral Large, Mistral Nemo, and Codestral, offering users more flexibility in customizing these models for specific tasks.
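As a rough illustration of what kicking off a fine-tuning job on la Plateforme might look like, the hedged sketch below calls the REST API directly with `requests`. The endpoint paths, field names, and the `open-mistral-nemo` model identifier are assumptions based on Mistral's public fine-tuning documentation and may not match the current API exactly.

```python
import os

import requests

API_BASE = "https://api.mistral.ai/v1"  # assumed base URL for la Plateforme
HEADERS = {"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"}

# 1. Upload a JSONL training file (field names here are assumptions).
with open("training_data.jsonl", "rb") as f:
    upload = requests.post(
        f"{API_BASE}/files",
        headers=HEADERS,
        files={"file": f},
        data={"purpose": "fine-tune"},
    )
file_id = upload.json()["id"]

# 2. Create a fine-tuning job on the uploaded file. The model name and
#    hyperparameter keys are assumptions and may differ in practice.
job = requests.post(
    f"{API_BASE}/fine_tuning/jobs",
    headers=HEADERS,
    json={
        "model": "open-mistral-nemo",
        "training_files": [{"file_id": file_id, "weight": 1.0}],
        "hyperparameters": {"training_steps": 100, "learning_rate": 1e-4},
    },
)
print(job.json())
```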


