Switzerland Unveils First Open and Multilingual AI Model: Apertus

Tue 2nd Sep, 2025

Switzerland has announced the development of an innovative open-source multilingual AI model named Apertus. This initiative, a collaboration among several research institutions, aims to enhance the country's digital sovereignty.

Apertus is a large language model (LLM) developed jointly by ETH Zurich, EPF Lausanne (EPFL), and the Swiss National Supercomputing Centre (CSCS) in Lugano. It is available in two sizes, with the smaller version aimed at individual users.

The model is trained on a dataset spanning more than 1,000 languages, including Swiss German and Romansh, which have been largely overlooked in existing AI models. The training corpus comprises roughly 15 trillion tokens, about 40% of which are non-English.

According to the researchers, Apertus performs at a level comparable to Llama 3, although it has yet to attract the kind of attention garnered by models such as China's DeepSeek, which previously disrupted an AI market dominated by American firms.

The name Apertus reflects the model's core principle of openness. Key components, including the architecture, model weights, intermediate checkpoints, and training datasets, are freely accessible. Furthermore, Apertus is the first major AI model to comply with the transparency requirements set forth by the EU AI Act. The developers emphasize that the model respects user privacy by honoring opt-out requests and by removing personal data and unwanted content before training.

One of the primary objectives of this initiative is to foster expertise in AI across research, society, and the economy in Switzerland. While the launch of Apertus is just the initial step, its creators view it as a foundational platform for future advancements. To showcase the model's capabilities, a hackathon is scheduled to be held during the Swiss AI Weeks in September.

Apertus will be available for download on the Hugging Face platform, with versions containing 8 and 70 billion parameters to cater to different user needs.
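As a rough sketch of what obtaining the model might involve: Hugging Face checkpoints are addressed by a repository id, conventionally `organization/model-name`. The organization and exact repository names below are assumptions for illustration only; the article does not specify them, so the actual model card on Hugging Face should be consulted.

```python
# Hypothetical helper: builds a plausible Hugging Face repo id for an
# Apertus checkpoint. The "swiss-ai" organization name and the
# "Apertus-<N>B" naming scheme are assumptions, not confirmed by the article.
MODEL_SIZES = {"small": 8, "large": 70}  # billions of parameters, per the article

def repo_id(size: str, org: str = "swiss-ai") -> str:
    """Return an assumed repo id such as 'swiss-ai/Apertus-8B'."""
    return f"{org}/Apertus-{MODEL_SIZES[size]}B"

# Loading would then follow the standard transformers pattern
# (shown as comments to avoid a multi-gigabyte download):
#   from transformers import AutoModelForCausalLM, AutoTokenizer
#   tok = AutoTokenizer.from_pretrained(repo_id("small"))
#   model = AutoModelForCausalLM.from_pretrained(repo_id("small"))

print(repo_id("small"))
print(repo_id("large"))
```

The two dictionary entries mirror the two advertised sizes, 8 and 70 billion parameters.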
