
GMI Cloud Joins NVIDIA DGX Cloud Lepton with Series A Funding

by PR Newswire | June 19, 2025

GMI Cloud, a leading GPU-as-a-Service provider, has partnered with NVIDIA to contribute to the DGX Cloud Lepton platform and has secured Series A funding. The collaboration expands access to high-performance GPU resources, enabling developers to build advanced AI solutions with speed and scalability.

Quick Intel

  • GMI Cloud joins NVIDIA DGX Cloud Lepton for AI infrastructure.
  • Secures Series A funding to expand GPU-based AI solutions.
  • Offers NVIDIA Blackwell GPUs for low-latency AI workloads.
  • Ensures compliance with regional data sovereignty requirements.
  • Integrates NVIDIA’s software stack for faster AI development.
  • 16-node GPU clusters available on Lepton marketplace in 2025.

Strategic NVIDIA DGX Cloud Partnership

GMI Cloud is among the first to contribute to NVIDIA DGX Cloud Lepton, a platform connecting developers to global GPU compute capacity. As an NVIDIA Cloud Partner, it provides high-performance infrastructure, including NVIDIA Blackwell GPUs, for low-latency inference and sovereign AI workloads. “DGX Cloud Lepton reflects everything we believe in at GMI Cloud: speed, sovereignty, and scale without compromise,” said Alex Yeh, CEO of GMI Cloud, emphasizing the partnership’s focus on scalable AI innovation.

Optimized AI Development Solutions

The DGX Cloud Lepton platform addresses the challenge of securing reliable GPU resources by integrating with NVIDIA’s software stack, including NIM microservices, NeMo, Blueprints, and Cloud Functions. GMI Cloud’s infrastructure supports cost-effective, high-performance GPU clusters with strategic regional availability to meet compliance and latency needs. This enables developers to streamline AI development, from large language models to autonomous systems, with fast deployment pipelines.

Series A Funding Fuels Expansion

GMI Cloud’s Series A funding will accelerate the growth of its U.S.-based, AI-optimized GPU infrastructure. The investment supports enterprise-grade inference services and rapid scaling for developers worldwide. By owning its full-stack infrastructure, GMI Cloud delivers cost-efficient solutions, enabling developers to build AI without limits while leveraging NVIDIA’s advanced tools and Lepton’s marketplace, starting with 16-node clusters.

Empowering Global AI Innovation

The partnership enhances GMI Cloud’s ability to deliver scalable, high-performance AI solutions. Its integration with DGX Cloud Lepton provides developers with unparalleled access to GPU resources, meeting the demands of diverse AI applications. This collaboration, combined with the funding, positions GMI Cloud as a key player in driving the next era of AI development across industries.

GMI Cloud’s collaboration with NVIDIA and its Series A funding mark a significant step toward advancing AI infrastructure. By offering optimized GPU resources and seamless integration with NVIDIA’s software, GMI Cloud empowers developers to create innovative AI solutions efficiently, shaping the future of AI-driven technologies.


About GMI Cloud

GMI Cloud delivers full-stack, U.S.-based GPU infrastructure and enterprise-grade inference services built to scale AI products. Whether training foundation models or deploying real-time agents, GMI gives teams full control of performance, costs, and launch velocity. With on-demand and reserved GPU clusters for all workloads and projects, GMI helps AI teams build without limits. GMI Cloud is based out of Mountain View, CA.
