
Lumai Launches World's First Optical Computing System for Billion-Parameter LLM Inference
by: GlobeNewswire | April 29, 2026

Lumai, the optical compute company addressing scalable AI, today announced its Lumai Iris inference server – the world's first optical computing system to successfully run billion-parameter large language models (LLMs) in real time. This marks a milestone in AI infrastructure, demonstrating for the first time the commercial viability of optical compute for large-scale AI inference workloads. Lumai Iris servers accelerate inference workloads using light instead of silicon-based processing.

Quick Intel

  • Lumai Iris Nova delivers up to 90% lower energy consumption than conventional architectures.

  • Runs real-time inference on Llama 8B and 70B using a hybrid processor.

  • Uses light in a three-dimensional volume to overcome the 2D constraints of conventional chips.

  • Executes millions of operations simultaneously through massive spatial parallelism.

  • Spun out of University of Oxford optics research in 2021.

  • Backed by UK government's Advanced Research and Invention Agency (ARIA).

Powering the Inference Era with Optical Computing

AI has entered a new phase in which deployment, not training, is defining real-world impact. As inference workloads surge, however, data centers are running up against hard power and scalability limits. According to the International Energy Agency, global data center power demand will double by 2030, and the finite power available to data centers is forcing the industry to seek more efficient approaches to compute. Traditional silicon architectures are hitting fundamental physical limits in scaling, power, and thermal efficiency: each new generation offers only modest improvements while requiring significantly more power and cost to scale.

A New Architecture for AI Compute

Optical computing enables radically more efficient execution of core AI operations. Lumai's optical computing technology, born from years of research at the University of Oxford, uses light in a three-dimensional volume to overcome the two-dimensional constraints of conventional chips. By exploiting massive spatial parallelism, millions of operations are executed simultaneously, delivering high token throughput at low cost for compute-bound workloads. Lumai's technology also excels in the prefill stage of disaggregated inference architectures, processing tokens at maximum scale and efficiency. Iris Nova runs real-time inference on Llama 8B and 70B using a hybrid processor: its architecture combines digital processing for system control and software with an optical tensor engine that performs the core mathematical operations, ensuring seamless integration into data centers.
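The division of labor described above — digital control logic wrapped around an offloaded tensor operation — can be illustrated with a toy sketch. This is purely hypothetical: Lumai has not published an API, and the `optical_matmul` function below is a NumPy stand-in that only models where the optical engine would sit in the computation, not how it works physically.

```python
import numpy as np

def optical_matmul(weights, activations):
    """Stand-in for the optical tensor engine. In the real hardware,
    light propagating through a 3D optical volume would compute all
    multiply-accumulates of this product simultaneously; here we just
    use NumPy matrix multiplication as a placeholder."""
    return weights @ activations

def digital_layer(x, W, b):
    """Digital side of the hybrid processor: control flow, bias add,
    and the nonlinearity stay in electronics, while the dense matrix
    math is offloaded to the (simulated) optical engine."""
    return np.maximum(0.0, optical_matmul(W, x) + b)  # ReLU activation

# Toy dimensions for illustration only.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # layer weights
b = np.zeros(4)                   # layer bias
x = rng.standard_normal(8)        # input activations

y = digital_layer(x, W, b)
print(y.shape)  # (4,)
```

In an LLM, the offloaded operation would be the large matrix multiplications that dominate inference cost, which is why accelerating only that step can move the overall performance and energy numbers so much.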

Availability

The Lumai Iris Nova inference server is available now for evaluation. Future systems in the Iris family (Aura and Tetra) will extend performance and efficiency further, supporting broader deployment across hyperscale and enterprise environments.

As Dr. Xianxin Guo, CEO and Co-Founder of Lumai, stated: "As the industry transitions into the inference era, we are simultaneously crossing the threshold into the post-silicon era. By shifting the computation paradigm from electrons to photons, Lumai can deliver an order-of-magnitude increase in performance with significant energy savings."

Suraj Bramhavar, Program Director at ARIA, added: "The demands on existing AI processors necessitate an urgent search for alternative scaling pathways. Lumai is leading the charge in demonstrating that optical processors could provide one such pathway, and ARIA is excited to partner with them to explore the shift beyond our traditional digital computing paradigm."

About Lumai

Lumai, the optical compute company, is building the next-generation AI infrastructure for the Inference Era. Spun out of world-leading optics research at the University of Oxford in 2021, Lumai's mission is to unlock sustainable intelligence at global scale – delivering materially faster inference, significantly higher execution efficiency, and up to 90% lower energy consumption than conventional GPU architectures.

Tags: LLM Inference, AI Hardware