
Mistral AI's back-to-back releases: a 7B model for mathematical reasoning and a Mamba2-architecture code model

王林
Release: 2024-07-19 09:54:11
Netizens are curious whether Mathstral can answer the question "Which is bigger, 9.11 or 9.9?"

Yesterday, the AI circle was consumed by a deceptively simple question: "Which is bigger, 9.11 or 9.9?" Large language models including OpenAI's GPT-4o and Google's Gemini all got it wrong.

This episode shows that, on certain numerical problems, large language models do not understand the question and produce the correct answer the way humans do.
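Part of why the question is tricky: under a different, equally common convention, 9.11 really is "bigger" than 9.9. A minimal Python sketch (our illustration, not from the article) makes the two readings explicit:

```python
# As decimal numbers, 9.9 (= 9.90) is larger than 9.11.
print(9.9 > 9.11)  # True

# As software version numbers, the comparison flips: minor version 11 > 9.
version_a = tuple(map(int, "9.11".split(".")))  # (9, 11)
version_b = tuple(map(int, "9.9".split(".")))   # (9, 9)
print(version_a > version_b)  # True: release 9.11 is newer than 9.9
```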

For numerical and complex mathematical problems, specialized models are better suited.

Today, the French large-model unicorn Mistral AI released Mathstral, a 7B model focused on mathematical reasoning and scientific discovery, built to solve advanced mathematical problems that require complex, multi-step logical reasoning.

The model is built on Mistral 7B, supports a 32k-token context window, and is released under the Apache 2.0 open-source license.

Mathstral exemplifies the excellent performance/speed tradeoff that comes from building models for specific purposes, a development philosophy Mistral AI actively promotes, particularly through its fine-tuning capabilities.

Mathstral is an instruction model that can be used as-is or fine-tuned. The model weights are available on Hugging Face.

  • Model weights: https://huggingface.co/mistralai/mathstral-7B-v0.1
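For readers who want to try it, here is a minimal sketch of loading the published weights with the Hugging Face transformers library. The repo id comes from the link above; the rest is the standard transformers chat workflow, not code published by Mistral, so treat it as an assumption to verify:

```python
# Minimal sketch: load Mathstral from the Hugging Face Hub and ask it the
# question from the article. Assumes a recent `transformers` release and
# enough GPU/CPU memory for a 7B model.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/mathstral-7B-v0.1"  # repo id from the link above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Mathstral is an instruction model, so we format the prompt with its chat template.
messages = [{"role": "user", "content": "Which is bigger, 9.11 or 9.9?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```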

The chart below shows the MMLU performance difference between Mathstral 7B and Mistral 7B, broken down by subject.

[Figure: MMLU performance by subject, Mathstral 7B vs. Mistral 7B]

Mathstral achieves state-of-the-art reasoning performance for its size across a variety of industry-standard benchmarks: in particular, it scores 56.6% on the MATH dataset and 63.47% on MMLU.

Mathstral's pass rate on MATH (56.6%) is also more than 20 percentage points higher than Minerva 540B's. In addition, Mathstral scores 68.4% on MATH with majority voting over 64 samples and 74.6% when a reward model selects among the 64 candidates.
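"Majority voting over 64 samples" means generating 64 candidate solutions per problem and keeping the final answer that appears most often; the reward-model variant instead lets a separate scoring model pick the best of the 64. A schematic sketch of the voting half, with hypothetical generate_solution and extract_final_answer helpers since the article does not describe Mistral's evaluation harness:

```python
from collections import Counter

def majority_vote(problem, generate_solution, extract_final_answer, n=64):
    """Sample n solutions and return the most frequent final answer.

    generate_solution and extract_final_answer are hypothetical placeholders
    for the model call and the answer parser, respectively.
    """
    answers = []
    for _ in range(n):
        solution = generate_solution(problem)            # one sampled reasoning chain
        answers.append(extract_final_answer(solution))   # e.g. the boxed final value
    # The answer produced most often across the n samples wins.
    return Counter(answers).most_common(1)[0][0]
```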

This result also made netizens curious whether Mathstral can solve the question "Which is bigger, 9.11 or 9.9?"

The code model: Codestral Mamba

  • Model weights: https://huggingface.co/mistralai/mamba-codestral-7B-v0.1

Released alongside Mathstral 7B is Codestral Mamba, a model dedicated to code generation. It uses the Mamba2 architecture and, like Mathstral, is released under the Apache 2.0 open-source license. It is an instruction model with more than 7 billion parameters that researchers can use, modify, and distribute for free.

It is worth mentioning that Codestral Mamba was designed with the help of Mamba authors Albert Gu and Tri Dao.

For a long time, the Transformer architecture has carried much of the AI field. Unlike Transformers, however, Mamba models offer linear-time inference and can, in theory, model sequences of unbounded length. The architecture lets users interact with the model extensively and get quick responses without being limited by input length, an efficiency that matters especially for code generation.
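The intuition can be seen in a toy state-space recurrence. This is a deliberate simplification, not Mamba2 itself (which adds input-dependent, "selective" parameters and a hardware-efficient scan), but it shows why per-token cost stays constant: each token updates a fixed-size state, whereas attention must keep and re-scan a cache that grows with every token.

```python
import numpy as np

# Toy linear state-space model: h_t = A @ h_{t-1} + B @ x_t, y_t = C @ h_t.
# The state h has a fixed size, so each step costs O(1) in sequence length.
d_state, d_in = 16, 8
rng = np.random.default_rng(0)
A = rng.normal(size=(d_state, d_state)) * 0.05  # small weights keep the state stable
B = rng.normal(size=(d_state, d_in))
C = rng.normal(size=(d_in, d_state))

h = np.zeros(d_state)                          # the entire "memory" of the sequence
for x_t in rng.normal(size=(100_000, d_in)):   # an arbitrarily long input stream
    h = A @ h + B @ x_t                        # constant-time state update
    y_t = C @ h                                # output for this step
print(h.shape)                                 # the state never grows: (16,)
```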

In benchmark testing, Codestral Mamba outperformed the rival open-source models CodeLlama 7B, CodeGemma-1.1 7B, and DeepSeek on the HumanEval test.

Mistral has tested the model, which is available for free on Mistral's la Plateforme API and can handle inputs of up to 256,000 tokens, twice as many as OpenAI's GPT-4o.
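A hedged sketch of calling it through la Plateforme's chat-completions endpoint. The endpoint path is Mistral's standard one, but the model name "codestral-mamba-latest" and the exact response layout are assumptions to check against Mistral's API documentation:

```python
import os
import requests

# Query Codestral Mamba via la Plateforme. Requires MISTRAL_API_KEY in the
# environment; the model id below is an assumption, not confirmed by the article.
resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "codestral-mamba-latest",
        "messages": [
            {"role": "user",
             "content": "Write a Python function that reverses a singly linked list."},
        ],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```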

Since the release of Codestral Mamba, some netizens have already tried it in VS Code and found it to run very smoothly.

Reference links:
https://mistral.ai/news/mathstral/
https://mistral.ai/news/codestral-mamba/

Source: jiqizhixin.com