
Gcore, a global provider of edge AI, cloud, network, and security solutions, has launched AI Cloud Stack, a software platform powered by NVIDIA accelerated computing that enables the rapid creation of private AI clouds with hyperscaler-grade capabilities. Tailored for cloud service providers, telcos, and large enterprises, the solution converts raw NVIDIA GPU clusters into multi-tenant, cloudified infrastructure, unlocking faster time to revenue and higher GPU utilization.
As AI demands evolve, organizations increasingly require hybrid setups that blend public cloud flexibility with on-premises control to meet regulatory and operational needs. Gcore AI Cloud Stack addresses these complexities by delivering a comprehensive reference architecture and operational framework, from infrastructure-as-a-service to model-as-a-service. This allows service providers and enterprises to deploy large-scale AI environments without the typical time-to-market delays or ROI challenges, while ensuring seamless integration of compute, storage, and networking.
Seva Vayner, Product Director for Edge Cloud and AI at Gcore, said: "The future of AI infrastructure will be hybrid, spanning both public cloud and on-premises environments. Many industries face regulatory and operational requirements that demand on-prem deployments. Yet building and managing AI infrastructure at scale is complex and slow to deliver ROI. With Gcore AI Cloud Stack, we remove those barriers by providing a complete reference architecture, operational blueprint, and cloudification software solution spanning everything from IaaS to MaaS. This enables organizations to transform accelerated computing resources into performant cloud environments quickly and at scale, accelerating their AI adoption, deployment, and profit."
Through partnerships with leaders such as VAST Data and Nokia, the stack incorporates advanced multi-tenant storage via VAST AI OS and open, reliable networking architectures, enabling faster client onboarding and revenue generation.
Gcore AI Cloud Stack equips users with a full-spectrum toolkit for constructing, managing, and monetizing private AI infrastructure. Its cloudification layer converts bare-metal setups into flexible, usage-based models, while the operational core—powered by VAST AI OS—streamlines governance across scalable resources, secure connectivity, and AI-optimized services. Integration with Gcore's AI suite allows effortless deployment of training pipelines and serverless inference, complemented by hyperscaler-equivalent features such as automated billing, real-time observability, and orchestration support.
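To make the cloudification idea concrete, here is a minimal sketch of what consuming such a platform might look like for a tenant. The endpoint paths, resource names, and fields are hypothetical placeholders, not Gcore's published API; the flow simply illustrates carving a GPU cluster into a usage-billed pool and attaching a serverless inference endpoint to it.

```python
# Illustrative only: every URL, path, and field below is a hypothetical
# placeholder, not Gcore's actual API. The sketch shows the general pattern
# the cloudification layer implies: a tenant-scoped, usage-billed GPU pool
# plus a serverless inference endpoint deployed against it.
import requests

API = "https://ai-cloud.example-operator.com/v1"   # hypothetical operator URL
HEADERS = {"Authorization": "Bearer <tenant-api-token>"}

# 1. Request a tenant-scoped GPU pool with usage-based billing.
pool = requests.post(
    f"{API}/gpu-pools",
    json={
        "name": "team-a-training",
        "gpu_type": "nvidia-hopper",   # placeholder accelerator class
        "gpu_count": 8,
        "billing": "per-gpu-hour",
    },
    headers=HEADERS,
).json()

# 2. Deploy a serverless inference endpoint that scales to zero when idle.
endpoint = requests.post(
    f"{API}/inference/endpoints",
    json={
        "pool_id": pool["id"],
        "model": "llama-3-8b-instruct",  # example pretrained model
        "min_replicas": 0,
        "max_replicas": 4,
    },
    headers=HEADERS,
).json()

print("Inference endpoint URL:", endpoint["url"])
```

The point of the sketch is the operating model rather than any specific call: the operator's billing, observability, and orchestration features sit behind whatever API the cloudification layer exposes.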
"Gcore brings together the key pieces, compute, networking, and storage, into a usable stack," said Dan Chester, CSP Director EMEA, VAST Data. "That integration helps service providers stand up AI clouds faster and onboard clients sooner, accelerating time to revenue. Combined with the advanced multi-tenant capabilities of VAST's AI Operating System, it delivers a reliable, scalable, and future-proof AI infrastructure. Gcore offers operators a valuable option to move quickly without building everything themselves."
Mark Vanderhaegen, Head of Business Development, Data Center Networks at Nokia, commented: "We're pleased to collaborate with Gcore, a strong European ISV, to advance a networking reference architecture for AI clouds. Combining Nokia's open, programmable, and reliable networking with Gcore's cloud software accelerates deployable blueprints that customers can adopt across data centers and the edge."
White-label capabilities further empower providers to offer services under their own branding, backed by Gcore's global infrastructure for low-latency, compliant performance.
NVIDIA AI Enterprise compatibility ensures quick incorporation of pretrained models, chatbots, and blueprints, minimizing development hurdles and accelerating commercialization. Already proven in production across thousands of NVIDIA Hopper GPUs in Europe, the stack includes full abstraction and monetization layers, positioning it as a turnkey path to profitable AI clouds. This launch reinforces Gcore's role in fostering AI-native environments that span public, private, and hybrid deployments, helping organizations harness AI's potential without infrastructural bottlenecks.
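As an illustration of that compatibility, NIM-style microservices from NVIDIA AI Enterprise expose an OpenAI-compatible API, so incorporating a pretrained model can be as simple as pointing a standard client at the operator's endpoint. The base URL and model ID below are assumptions for the sketch, not confirmed details of a Gcore deployment.

```python
# A minimal sketch, assuming the operator serves a model through a NIM-style,
# OpenAI-compatible endpoint. The base URL is a hypothetical example and the
# model ID is only illustrative.
from openai import OpenAI

client = OpenAI(
    base_url="http://nim.example-operator.com/v1",  # hypothetical endpoint
    api_key="placeholder-key",                      # auth depends on the deployment
)

resp = client.chat.completions.create(
    model="meta/llama-3.1-8b-instruct",  # example pretrained model
    messages=[{"role": "user", "content": "Summarize our GPU usage policy."}],
    max_tokens=128,
)
print(resp.choices[0].message.content)
```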
Gcore AI Cloud Stack emerges as a pivotal enabler for the hybrid AI era, delivering the speed, scalability, and security needed to turn GPU investments into thriving, revenue-generating ecosystems for forward-thinking providers and enterprises.
Gcore is a global infrastructure and software provider for AI, cloud, network, and security solutions. Headquartered in Luxembourg, Gcore operates its own sovereign infrastructure across six continents, delivering ultra-low latency and compliance-ready performance for mission-critical workloads. Its AI-native cloud stack combines software innovation with hyperscaler-grade functionality, enabling enterprises and service providers to build, train, and scale AI everywhere, across public, private, and hybrid environments. By integrating AI, compute, networking, and security into a single platform, Gcore accelerates digital transformation and empowers organizations to unlock the full potential of AI-driven services.