Mirantis MOSK 25.2 Powers AI-Ready Private Clouds


October 7, 2025

In an era where AI is transforming enterprise operations, Mirantis has unveiled MOSK 25.2, the latest release of Mirantis OpenStack for Kubernetes, designed to streamline cloud management and strengthen support for GPU-intensive AI applications alongside conventional business workloads. The release comes at a critical time, as organizations grapple with the need for scalable, secure infrastructure that preserves data sovereignty and operational efficiency.

Quick Intel

  • Mirantis MOSK 25.2 supports fully offline OpenStack deployments, ideal for regulated sectors like finance and defense requiring air-gapped AI infrastructure.
  • Integrates OpenStack 2025.1 "Epoxy" for enhanced performance in private cloud environments handling GPU workloads.
  • Upgrades to OVN 24.03 for superior networking scalability and security in Kubernetes-native setups.
  • Introduces scale-out L3 networking on bare metal, eliminating VLAN dependencies for multi-rack AI deployments.
  • Enables hybrid AI operations by recovering bare-metal GPU servers during network disruptions and integrating them seamlessly with virtual machines.
  • Provides proactive network monitoring with alerts to ensure reliability in sovereign and private cloud infrastructures.

Key Features and Enhancements

MOSK 25.2 addresses the escalating demands of AI adoption by enabling organizations to scale infrastructure for high-throughput training while ensuring data control and efficient orchestration. As highlighted by Deloitte's insights on infrastructure evolution, enterprises are prioritizing solutions that support data locality and performance in hybrid, sovereign, and private clouds. This version of MOSK introduces capabilities tailored for compute, networking, and storage management, allowing seamless handling of both AI-driven and traditional applications.

A standout advancement is the support for disconnected operations, permitting entire OpenStack clouds to function without internet access. This is particularly valuable for industries where security protocols mandate scanning and approval of all datacenter artifacts, such as government and defense. By facilitating alignment with upstream innovations while retaining full data control, MOSK 25.2 becomes essential for AI model training in sensitive environments.
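
For teams validating a disconnected deployment, one simple sanity check is to confirm that every Keystone service endpoint points at an internal address rather than a public one. The sketch below is a generic openstacksdk check, not a MOSK-specific tool, and assumes a clouds.yaml entry named "mosk-airgap" (the name is hypothetical).

```python
# Sanity check for an air-gapped cloud: list Keystone service endpoints and
# flag any URL that is not a private (RFC 1918) or loopback address.
# Assumes a clouds.yaml entry named "mosk-airgap" (hypothetical name).
import ipaddress
from urllib.parse import urlparse

import openstack

conn = openstack.connect(cloud="mosk-airgap")

for endpoint in conn.identity.endpoints():
    host = urlparse(endpoint.url).hostname
    try:
        verdict = "internal" if ipaddress.ip_address(host).is_private else "PUBLIC - review"
    except ValueError:
        # Hostname rather than a literal IP; needs a manual DNS check.
        verdict = "hostname - verify resolution stays on-site"
    print(f"{endpoint.interface:10} {endpoint.url:60} {verdict}")
```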

Networking and Scalability Improvements

Networking receives significant upgrades in MOSK 25.2, promoting smarter and more resilient infrastructure. The inclusion of Open Virtual Network (OVN) 24.03 brings performance boosts and the latest security updates, offering a validated migration path from the legacy Open vSwitch (OVS) backend to a more contemporary one suitable for large-scale OpenStack deployments. For those seeking alternatives, OpenSDN 24.1 provides a refreshed codebase with broader IPv6 support, enhancing compatibility in modern networks.
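
After a migration to the OVN backend, a quick way to confirm the switchover is to inspect which Neutron agents the cloud reports. A minimal openstacksdk sketch, assuming an admin-scoped clouds.yaml entry named "mosk" (the name is hypothetical):

```python
# List Neutron agents and their liveness; on an OVN-backed cloud the legacy
# Open vSwitch, L3, and DHCP agents are replaced by OVN controller agents.
import openstack

conn = openstack.connect(cloud="mosk")  # hypothetical clouds.yaml entry

for agent in conn.network.agents():
    status = "alive" if agent.is_alive else "DOWN"
    print(f"{agent.agent_type:30} {agent.host:25} {status}")
```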

Scale-out networking features full Layer 3 capabilities on bare metal, enabling expansion across racks without relying on VLAN stretching. Coupled with proactive health monitoring—including connectivity checks and early alerts for switch or routing anomalies—this ensures uninterrupted operations in AI-ready private clouds. These enhancements reduce downtime and optimize resource allocation for Kubernetes-orchestrated environments.
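
The scale-out work concerns the fabric underneath; from a project's point of view, carving out connectivity stays the ordinary OpenStack workflow of creating a network, a subnet, and a router, however many racks sit below. An illustrative openstacksdk sketch, where names such as "ai-train-net" and the external network "public" are placeholders:

```python
# Create a project network, subnet, and router for an AI training project.
# Network, subnet, and external-network names are placeholders.
import openstack

conn = openstack.connect(cloud="mosk")  # hypothetical clouds.yaml entry

net = conn.network.create_network(name="ai-train-net")
subnet = conn.network.create_subnet(
    network_id=net.id,
    name="ai-train-subnet",
    ip_version=4,
    cidr="10.20.0.0/24",
)

external = conn.network.find_network("public")  # assumed pre-existing provider network
router = conn.network.create_router(
    name="ai-train-router",
    external_gateway_info={"network_id": external.id},
)
conn.network.add_interface_to_router(router, subnet_id=subnet.id)
print(f"Network {net.id} routed via {router.id}")
```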

Hybrid AI and Infrastructure Management

For hybrid deployments combining virtual machines and bare metal, MOSK 25.2 simplifies AI cloud management. It includes recovery mechanisms for bare-metal GPU servers even during network disruptions, ensuring continuous availability for intensive training tasks. Additionally, these servers can be effortlessly connected to project-specific networks alongside VMs, facilitating high-performance workflows without compromising security.
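
In standard OpenStack, Ironic-managed bare-metal nodes are booted through the same Nova API as virtual machines, so landing both on one project network is a matter of choosing the right flavor for each. A hedged sketch using openstacksdk, where the bare-metal flavor "bm.gpu.a100", the VM flavor "m1.large", and the image and network names are all hypothetical placeholders:

```python
# Boot a bare-metal GPU server and a companion VM on the same project network.
# Flavor, image, and network names are hypothetical placeholders.
import openstack

conn = openstack.connect(cloud="mosk")  # hypothetical clouds.yaml entry

net = conn.network.find_network("ai-train-net")
image = conn.image.find_image("ubuntu-22.04")
bm_flavor = conn.compute.find_flavor("bm.gpu.a100")  # Ironic-backed flavor
vm_flavor = conn.compute.find_flavor("m1.large")

gpu_node = conn.compute.create_server(
    name="gpu-train-01",
    image_id=image.id,
    flavor_id=bm_flavor.id,
    networks=[{"uuid": net.id}],
)
worker_vm = conn.compute.create_server(
    name="prep-worker-01",
    image_id=image.id,
    flavor_id=vm_flavor.id,
    networks=[{"uuid": net.id}],
)

for server in (gpu_node, worker_vm):
    server = conn.compute.wait_for_server(server)  # blocks until ACTIVE
    print(server.name, server.status)
```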

The platform maintains its strength in on-premises private clouds, supporting both cloud-native and legacy workloads through automated lifecycle management—from bare-metal provisioning to software configuration. Centralized tools for logging, monitoring, and alerting further empower enterprises to maintain reliability and sovereignty over application data in any deployment scenario.

As organizations navigate the complexities of AI infrastructure, MOSK 25.2 positions Mirantis as a leader in delivering Kubernetes-native solutions that balance innovation with control. This release not only future-proofs private and sovereign clouds but also empowers businesses to deploy scalable, secure environments that drive AI initiatives forward without operational hurdles.

About Mirantis

Mirantis delivers the fastest path to enterprise AI at scale, with full-stack AI infrastructure technology that removes GPU infrastructure complexity and streamlines operations across the AI lifecycle, from Metal-to-Model. Today, all infrastructure is AI infrastructure, and Mirantis provides the end-to-end automation, enterprise security and governance, and deep expertise in Kubernetes orchestration that organizations need to reduce time to market and efficiently scale cloud native, virtualized, and GPU-powered applications across any environment – on-premises, public cloud, hybrid, or edge.

  • Mirantis, Private Cloud, Sovereign Cloud, GPU Workloads, Cloud Security