Updated: 3 Dec | Published: 2 Dec
Mistral AI has launched the Mistral 3 family of ten open-source multilingual and multimodal AI models, headlined by Mistral Large 3, which uses a Mixture of Experts (MoE) architecture and is optimized for deployment on the NVIDIA GB200 NVL72 supercomputing system.

The Mistral 3 lineup includes nine smaller "Ministral 3" variants designed for NVIDIA's edge platforms. Co-founder Guillaume Lample highlighted that even the smallest Ministral 3 3B model outperforms models four times its size and can run locally on consumer devices without an internet connection.

Mistral AI, whose funding exceeds $1.05 billion and includes investment from ASML, differentiates itself from proprietary U.S. competitors by releasing its models under the Apache 2.0 license and supporting public-sector initiatives such as "AI for Citizens".
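For readers curious what "runs locally on consumer devices" looks like in practice, here is a minimal Python sketch using the Hugging Face transformers library. The model ID "mistralai/Ministral-3-3B" is an assumption for illustration only; check Mistral AI's official Hugging Face page for the actual Apache 2.0 checkpoint names.

# Minimal local-inference sketch with Hugging Face transformers.
# NOTE: the model ID below is a placeholder assumption; substitute the
# actual Ministral 3 3B checkpoint name published by Mistral AI.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Ministral-3-3B"  # hypothetical ID, not confirmed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place weights on GPU if available, else CPU
    torch_dtype="auto",  # use the checkpoint's native precision
)

prompt = "Summarize the Mistral 3 model family in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))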
Tags
#France #Paris #GuillaumeLample #MistralAI #MistralLarge3 #NVIDIAGB200NVL72 #Ministral3 #NVIDIA #Ministral33B #ASML #Apache20 #AIForCitizens

Read more: VentureBeat, Nvidia, Semafor