Snowflake launches Arctic: An open enterprise-grade large language model

Snowflake adds a truly open large language model with unmatched intelligence and efficiency to its Arctic model family

Snowflake, the Data Cloud company (NYSE: SNOW), has announced the launch of Snowflake Arctic, a large language model (LLM) designed to be the most open, enterprise-ready LLM available. Built on a unique Mixture-of-Experts (MoE) architecture, Snowflake Arctic delivers top-tier intelligence with unprecedented efficiency at scale, and it already leads open models on several enterprise benchmarks, including SQL code generation and instruction following.

In a move to set new standards for openness in enterprise AI technology, Snowflake is releasing Arctic’s weights under the Apache 2.0 license, along with comprehensive details of its training. This open approach aims to encourage collaboration and innovation across the AI community. Arctic is part of Snowflake’s Arctic model family, which also includes text-embedding models optimized for retrieval use cases.

“This marks a monumental achievement for Snowflake, with our AI research team pushing the boundaries of AI innovation,” said Sridhar Ramaswamy, CEO of Snowflake. “Arctic not only offers industry-leading intelligence and efficiency but does so in a truly open manner, enhancing the potential of open source AI. Our work with Arctic will significantly boost our ability to provide reliable, efficient AI solutions to our customers.”

Arctic: The New Benchmark in Resource-Efficiency

Snowflake’s AI research team, leveraging a unique blend of industry-leading researchers and system engineers, developed Arctic in less than three months at approximately one-eighth of the cost typically associated with similar models. Trained on Amazon Elastic Compute Cloud (Amazon EC2) P5 instances, Arctic sets a new standard for the speed and cost-efficiency of training state-of-the-art enterprise-grade models.

Arctic’s MoE architecture is a game-changer, enhancing both training systems and model performance. It employs a carefully designed data composition tailored to enterprise requirements. Arctic is remarkably efficient, activating only 17 billion of its 480 billion parameters at a time. As a result, Arctic uses roughly 50% fewer active parameters than DBRX and 75% fewer than Llama 3 70B during both inference and training, while outperforming leading open models in coding, SQL generation, and general language understanding.
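The efficiency described above comes from sparse routing: each input activates only a few experts, so far fewer parameters run per token than the model contains overall. A minimal sketch of that idea (a toy top-k gating layer, not Arctic's actual architecture or dimensions):

```python
import numpy as np

def moe_forward(x, experts, gate_w, top_k=2):
    """Toy Mixture-of-Experts layer: route the input to its top_k experts.

    Only the selected experts run, so the active parameter count per
    token is a small fraction of the layer's total parameters.
    """
    logits = x @ gate_w                    # routing score per expert
    top = np.argsort(logits)[-top_k:]      # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the selected experts
    # Weighted sum of the chosen experts' outputs; unselected experts never run
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top)), top

rng = np.random.default_rng(0)
d, num_experts = 8, 16
experts = [rng.standard_normal((d, d)) for _ in range(num_experts)]
gate_w = rng.standard_normal((d, num_experts))
x = rng.standard_normal(d)

y, active = moe_forward(x, experts, gate_w, top_k=2)
# Only 2 of 16 experts ran, i.e. 1/8 of the expert parameters were active
```

Arctic applies this principle at much larger scale, which is how 480 billion total parameters can serve a token while activating only 17 billion of them.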

Accelerating AI Innovation for Enterprises

With Arctic, Snowflake continues to empower enterprises with cutting-edge AI capabilities. Integrated into Snowflake Cortex, Arctic will expedite the development of production-grade AI applications at scale, all within the secure and governed environment of the Data Cloud.

In addition to Arctic, Snowflake’s Arctic family includes Arctic embed, a suite of state-of-the-art text-embedding models available under the Apache 2.0 license. These models, optimized for retrieval performance, are available on Hugging Face and will soon be integrated into the Snowflake Cortex embed function.
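Embedding models like these map text to dense vectors so that semantically similar passages land near each other, and retrieval then reduces to a nearest-neighbor ranking. A minimal sketch of that ranking step using cosine similarity (the vectors here are stand-ins, not actual Arctic embed outputs):

```python
import numpy as np

def top_matches(query_vec, doc_vecs, k=2):
    """Rank document embeddings by cosine similarity to a query embedding."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                          # cosine similarity per document
    order = np.argsort(scores)[::-1][:k]    # best-matching documents first
    return order, scores[order]

# Stand-in embeddings: in practice these would come from an embedding model
docs = np.array([[1.0, 0.0],   # nearly identical to the query
                 [0.0, 1.0],   # unrelated
                 [0.9, 0.1]])  # close to the query
query = np.array([1.0, 0.0])

order, scores = top_matches(query, docs, k=2)
# order → [0, 2]: the two documents most similar to the query
```

In a retrieval-augmented application, this ranking step selects which passages to feed to an LLM such as Arctic alongside the user's question.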

Industry Reactions

Industry experts have lauded Snowflake’s Arctic as a significant step forward in democratizing AI and fostering open collaboration. Leaders from AI21 Labs, AWS, Coda, Hugging Face, Lamini, Landing AI, Microsoft, Perplexity, Reka, and Together AI have expressed excitement and support for Arctic’s release, highlighting its potential to drive AI innovation, democratization, and transformation across various sectors.

With its unique MoE architecture, Snowflake Arctic promises to accelerate AI innovation and empower enterprises to unlock the full potential of their data. As Snowflake continues to invest in AI research and partnerships, the future looks promising for businesses seeking to leverage the power of AI to drive innovation and create value.

Disclaimer: The views expressed in this article are those of the author and do not necessarily reflect the views of ET Edge Insights, its management, or its members
