
Cirrascale Boosts AI with Ai2’s OLMo, Molmo, Tülu Models


by: GlobeNewswire | July 8, 2025

Cirrascale Cloud Services, a leader in AI infrastructure solutions, has partnered with the Allen Institute for Artificial Intelligence (Ai2) to bring state-of-the-art open-source AI models—OLMo, Molmo, and Tülu—to its Inference Platform. This collaboration enables organizations to rapidly deploy and commercialize advanced AI solutions, leveraging instant scalability and production-ready endpoints without building complex infrastructure.

Quick Intel

  • Cirrascale launches Ai2’s OLMo, Molmo, and Tülu models on its Inference Platform.

  • OLMo offers transparent, open-source language models in 7B, 13B, and 32B sizes.

  • Molmo delivers high-performance multimodal AI for image, text, and speech.

  • Tülu matches or surpasses models like GPT-4o in instruction-following.

  • Platform enables instant deployment and accelerator-optimized performance.

  • Showcased at RAISE Summit 2025 in Paris, booth No. 11.

Seamless AI Deployment with Cirrascale’s Inference Platform

Cirrascale’s Inference Platform simplifies AI model deployment by offering instant, scalable endpoints for enterprises and developers. The integration of Ai2’s OLMo, Molmo, and Tülu models eliminates the need for businesses to build costly infrastructure, enabling rapid experimentation and production-scale use. “Our new Inference Platform is designed for two core audiences: developers building differentiated models and needing an endpoint offering in order to commercialize quickly and enterprise customers with customized or fine-tuned models looking to deploy them at scale,” said Dave Driggers, CEO and Co-Founder, Cirrascale Cloud Services.

OLMo: Transparent Language Models for Research

OLMo, available in 7B, 13B, and 32B variants, is a fully open-source language model released under the Apache 2.0 license. Its transparency—complete with open weights, training data, and code—empowers researchers and developers to advance AI innovation. OLMo’s compact design ensures efficient performance, making it ideal for enterprises seeking cost-effective solutions without compromising quality.

Molmo: High-Performance Multimodal AI

Molmo, Ai2’s multimodal model family, excels in processing image, text, and speech data. By leveraging high-quality, curated datasets, Molmo outperforms proprietary models despite its smaller size. This efficiency makes it a powerful choice for businesses aiming to integrate advanced multimodal AI into applications like smart devices and customer service platforms.

Tülu: Leading Instruction-Following Model

Tülu 3, built by applying a fully open post-training recipe to Llama-405B, achieves performance competitive with or superior to DeepSeek v3 and GPT-4o. Its fully transparent pipeline, including open-source data and code, sets a new standard for instruction-following models. Tülu's capabilities make it suitable for complex enterprise tasks requiring precise AI responses.

Platform Benefits for Enterprises

The Cirrascale Inference Platform supports multi-model deployment, allowing users to integrate custom or pre-existing models like OLMo, Molmo, and Tülu. Its accelerator optimization automatically selects the best hardware, ensuring faster innovation. The platform also simplifies management, enabling enterprises to maintain low-volume models on-premises while scaling demanding workloads in the cloud. “Since launching our family of truly open models last year, the AI community has been asking for API access. Today, in partnership with Cirrascale, we’re excited to deliver just that – an API to enable scalable, flexible, and cost-efficient integration,” said Sophie Lebrecht, COO for Ai2.
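The announcement does not specify the endpoint URL, model identifiers, or authentication scheme for the new API, so the sketch below is purely illustrative. It assumes an OpenAI-compatible chat-completions interface (a common convention for hosted open-model endpoints); the base URL, model name, and API key shown are hypothetical placeholders, not published values.

```python
import json
import urllib.request

# Hypothetical values -- the real endpoint, model IDs, and auth scheme
# are not given in the announcement; substitute your actual credentials.
BASE_URL = "https://inference.example.com/v1/chat/completions"
API_KEY = "YOUR_API_KEY"

# Build a chat-completions style payload for an OLMo-family model.
payload = {
    "model": "olmo-2-13b-instruct",  # hypothetical model identifier
    "messages": [
        {"role": "user", "content": "Summarize the OLMo model family."}
    ],
    "max_tokens": 256,
}

# Construct (but do not send) the HTTP request.
request = urllib.request.Request(
    BASE_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "Authorization": f"Bearer {API_KEY}",
    },
    method="POST",
)
# response = urllib.request.urlopen(request)  # uncomment with a real endpoint
print(payload["model"])
```

Because the endpoint is hosted, switching between OLMo, Molmo, and Tülu would in this scheme amount to changing the `model` field rather than provisioning different hardware.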

Cirrascale’s partnership with Ai2 marks a significant step in democratizing access to advanced AI models. By offering scalable, production-ready endpoints, the Inference Platform empowers organizations to innovate rapidly and deploy AI solutions efficiently. Attendees at the RAISE Summit 2025 in Paris can explore these capabilities at Cirrascale’s booth No. 11.


About Cirrascale Cloud Services

Cirrascale Cloud Services is a leading cloud and managed services provider dedicated to deploying tailored, state-of-the-art compute resources and high-speed storage solutions at scale. Our AI Innovation Cloud and Inference Platform services are purpose-built to enable clients to scale their training and inferencing workloads for generative AI, large language models, and high-performance computing.
