Cloud Blog: Announcing Mistral AI’s Large-Instruct-2411 and Codestral-2411 on Vertex AI

Source URL: https://cloud.google.com/blog/products/ai-machine-learning/announcing-mistral-ais-large-instruct-2411-and-codestral-2411-on-vertex-ai/
Source: Cloud Blog
Title: Announcing Mistral AI’s Large-Instruct-2411 and Codestral-2411 on Vertex AI

Feedly Summary: In July, we announced the availability of Mistral AI’s models on Vertex AI: Codestral for code generation tasks, Mistral Large 2 for high-complexity tasks, and the lightweight Mistral Nemo for reasoning tasks like creative writing. Today, we’re announcing the availability of Mistral AI’s newest models on Vertex AI Model Garden: Mistral-Large-Instruct-2411 is now generally available, and the new Codestral-2411 will be available in the coming weeks.

Large-Instruct-2411: Mistral AI’s latest version of Mistral Large is an advanced dense large language model (LLM) with 123B parameters and strong reasoning, knowledge, and coding capabilities. It extends its predecessor with better long-context handling, function calling, and system prompt support. The model is ideal for use cases such as complex agentic workflows with precise instruction following and JSON outputs, long-context applications that require strong adherence to retrieved content for retrieval-augmented generation (RAG), and code generation.
Codestral-2411: Mistral AI’s latest version of Codestral is designed explicitly for code generation tasks and improves on its predecessor’s latency. It helps developers write and interact with code through a shared instruction and completion API endpoint.

You can access and deploy the new Mistral AI Large-Instruct-2411 model on Vertex AI through our Model-as-a-Service (MaaS) or self-service offering today.
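
As a quick reference, here is a minimal Python sketch of calling the managed model. It assumes the publisher rawPredict pattern Vertex AI uses for partner models and a Mistral chat-completions-style request body; the model ID, region, and response shape are assumptions, so verify them against the Model Garden documentation.

```python
# Minimal sketch: calling Mistral-Large-Instruct-2411 as a managed (MaaS) model on Vertex AI.
# Assumes the publisher rawPredict pattern and a Mistral chat-completions-style payload;
# the model ID, region, and response shape below are assumptions to verify against the docs.
import requests
import google.auth
import google.auth.transport.requests

PROJECT_ID = "your-project-id"   # placeholder
REGION = "us-central1"           # placeholder: pick a region where the model is offered
MODEL_ID = "mistral-large-2411"  # assumed model ID; confirm in Model Garden

# Get an OAuth access token from Application Default Credentials.
credentials, _ = google.auth.default(
    scopes=["https://www.googleapis.com/auth/cloud-platform"]
)
credentials.refresh(google.auth.transport.requests.Request())

url = (
    f"https://{REGION}-aiplatform.googleapis.com/v1/projects/{PROJECT_ID}"
    f"/locations/{REGION}/publishers/mistralai/models/{MODEL_ID}:rawPredict"
)
payload = {
    "model": MODEL_ID,
    "messages": [
        {"role": "user", "content": "Summarize the benefits of RAG in two sentences."}
    ],
}

response = requests.post(
    url, json=payload, headers={"Authorization": f"Bearer {credentials.token}"}
)
response.raise_for_status()
# Assuming a chat-completions-style response body.
print(response.json()["choices"][0]["message"]["content"])
```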

What can you do with the new Mistral AI models on Vertex AI?
By building with Mistral’s models on Vertex AI, you can:

Select the best model for your use case: Choose from a range of Mistral AI models, including efficient options for low-latency needs and powerful models for complex tasks like agentic workflows. Vertex AI makes it easy to evaluate and select the optimal model.
Experiment with confidence: Mistral AI models are available as fully managed Model-as-a-Service on Vertex AI. You can explore Mistral AI models through simple API calls and comprehensive side-by-side evaluations within our intuitive environment.
Manage models without overhead: Simplify how you deploy the new Mistral AI models at scale with fully managed infrastructure designed for AI workloads and the flexibility of pay-as-you-go pricing.
Tune the models to your needs: In the coming weeks, you will be able to fine-tune Mistral AI’s models to create bespoke solutions with your unique data and domain knowledge.
Craft intelligent agents: Create and orchestrate agents powered by Mistral AI models, using Vertex AI’s comprehensive set of tools, including LangChain on Vertex AI. Integrate Mistral AI models into your production-ready AI experiences with Genkit’s Vertex AI plugin. A minimal function-calling payload sketch follows this list.
Build with enterprise-grade security and compliance: Leverage Google Cloud’s built-in security, privacy, and compliance measures. Enterprise controls, such as Vertex AI Model Garden’s new organization policy, provide the right access controls to make sure only approved models can be accessed.
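
To make the agentic use case concrete, the sketch below shows what a function-calling request body could look like, assuming the managed endpoint accepts Mistral’s chat-completions schema via the same rawPredict call shown earlier. The tool definition, model ID, and order number are purely illustrative.

```python
# Illustrative function-calling payload for an agentic workflow (assumed Mistral
# chat-completions schema, sent with the same rawPredict call sketched above).
# The "get_order_status" tool, the model ID, and the order number are hypothetical.
import json

payload = {
    "model": "mistral-large-2411",  # assumed model ID; confirm in Model Garden
    "messages": [
        {"role": "user", "content": "Where is order 1017?"},
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_order_status",
                "description": "Look up the shipping status of an order by its ID.",
                "parameters": {
                    "type": "object",
                    "properties": {"order_id": {"type": "string"}},
                    "required": ["order_id"],
                },
            },
        }
    ],
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

# If the model decides to call the tool, it returns a tool call whose arguments are
# JSON (e.g. {"order_id": "1017"}). Your agent runs the function, appends the result
# as a "tool" role message, and calls the model again to get the final answer.
print(json.dumps(payload, indent=2))
```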

Get started with Mistral AI models on Google Cloud
These additions continue Google Cloud’s commitment to open and flexible AI ecosystems that help you build solutions best-suited to your needs. Our collaboration with Mistral AI is a testament to our open approach, within a unified, enterprise-ready environment. Vertex AI provides a curated collection of first-party, open-source, and third-party models, many of which, including the new Mistral AI models, can be delivered as a fully managed Model-as-a-Service (MaaS) offering, providing you with the simplicity of a single bill and enterprise-grade security on our fully managed infrastructure.
To start building with Mistral’s newest models, visit Model Garden and select the Mistral Large or Codestral model tiles. The models are also available on Google Cloud Marketplace here: Mistral Codestral and Mistral Large.
You can check out our sample code and documentation to help you get started.

AI Summary and Description: Yes

**Summary:** The text details the recent availability of Mistral AI’s models on Google Cloud’s Vertex AI, including advanced language models for various tasks such as code generation and complex reasoning. It emphasizes the models’ capabilities, deployment ease, and enterprise-grade security features, showcasing their relevance to professionals in AI and cloud computing sectors.

**Detailed Description:**
The announcement is centered around the integration of Mistral AI’s latest models into Google Cloud’s Vertex AI, providing significant advancements in AI capabilities for users. Here are the major points highlighted:

– **Model Availability:**
– Mistral AI’s models such as Large-Instruct-2411 and Codestral-2411 are positioned to enhance capabilities in applications ranging from code generation to complex reasoning.
– The Large-Instruct-2411 model features 123 billion parameters with enhanced reasoning and coding capabilities suitable for complex workflows.

– **Model Optimization:**
– Improvements in the Codestral-2411 model focus on latency for code generation tasks, which could accelerate development workflows.
– The models are designed to support a variety of output formats, including JSON, and are adept at understanding and processing longer context inputs.

– **Deployment and Management:**
– Mistral AI models are accessible via a Model-as-a-Service (MaaS) framework, allowing users to deploy models effortlessly and experiment through API calls.
– The infrastructure is fully managed, eliminating overhead for developers, and supports flexible payment options.

– **Customization and Control:**
– Users will soon have the ability to fine-tune these models according to their specific domain needs and data.
– Google Cloud emphasizes strong security and compliance measures with tools designed to enforce organizational policies and controls over model access.

– **Integration with Ecosystem:**
– The announcement highlights integration with ecosystem tools such as LangChain on Vertex AI and Genkit’s Vertex AI plugin for building intelligent agents.

– **Enterprise Features:**
– Focus on enterprise-grade security, privacy, and compliance, which are paramount for organizations when utilizing cloud-based AI solutions.
– The announcement reinforces Google Cloud’s commitment to providing open and flexible environments for AI development, making it easier for enterprises to incorporate these technologies.

**Key Insights:**
– **Relevance for Security Professionals:** The integration of strong security and compliance frameworks in the deployment of AI models is crucial. It reflects a growing demand for robust governance in AI applications, particularly in cloud environments where data sensitivity and regulatory compliance are paramount.
– **Industry Implications:** The AI models’ enhancements in terms of capability and ease of use suggest a push towards more sophisticated AI applications in business, potentially impacting various sectors such as software development, data science, and IT operations.

Overall, the release and capabilities of Mistral AI’s models on Vertex AI signify significant advancements in both functionality and usability, providing a comprehensive solution for organizations looking to implement cutting-edge AI solutions securely and efficiently.