Moreh, an AI infrastructure software company, unveiled its distributed inference system optimized for AMD hardware and highlighted collaborations with Tenstorrent and SGLang at the AI Infra Summit 2025, held September 9–11 in Santa Clara, California. The event drew 3,500 attendees and over 100 partners, focusing on full-stack AI infrastructure for hardware providers, hyperscalers, and enterprise IT specialists.
Event: AI Infra Summit 2025, world's largest AI infrastructure conference, September 9–11, Santa Clara, CA.
Key Announcement: Moreh's distributed inference system on AMD hardware, delivering higher efficiency than comparable NVIDIA-based systems for models like DeepSeek.
Collaborations: Joint development with SGLang for AMD-based systems; next-gen AI semiconductor with Tenstorrent.
Market Focus: Provides cost-competitive alternatives to NVIDIA for deep learning inference.
Leadership: CEO Gangwon Jo presented benchmarks and PoC projects with LLM companies.
Company: Moreh develops AI engines and foundation LLMs via subsidiary Motif Technologies.
During the Enterprise AI session on September 10, Moreh CEO Gangwon Jo demonstrated the company's distributed inference system, which runs the latest deep learning models more efficiently than comparable NVIDIA-based systems. Benchmarks showed superior performance for models like DeepSeek, emphasizing high efficiency and scalability. Jo highlighted Moreh's position as AMD's strongest global software partner, currently conducting proof-of-concept (PoC) projects with leading large language model (LLM) companies.
Moreh co-hosted a presentation with SGLang, a leader in deep learning inference software, and organized booth and networking sessions to strengthen ties with the North American AI ecosystem. The partnership aims to jointly develop an AMD-based distributed inference system, accelerating Moreh's expansion in the growing deep learning inference market. Additionally, Moreh unveiled a next-generation AI semiconductor system integrating its software with Tenstorrent's hardware, offering a diverse, cost-effective alternative to NVIDIA-dominated infrastructure.
“Through close collaboration with AMD, Tenstorrent, and SGLang, we aim to establish ourselves as a global company providing customers with diverse AI computing alternatives,” said CEO Gangwon Jo. Moreh is advancing its core AI infrastructure engine while, through its foundation LLM subsidiary Motif Technologies, building comprehensive capabilities across the model domain. These efforts position Moreh to capture significant market share in the evolving AI infrastructure landscape.
Moreh's presentations and partnerships at the AI Infra Summit underscore its commitment to efficient, scalable AI solutions, empowering enterprises with innovative hardware-software integrations for the next era of deep learning.
Moreh is an AI infrastructure software company developing core engines for distributed inference and foundation LLMs. Through collaborations with AMD, Tenstorrent, and SGLang, Moreh provides high-efficiency AI solutions, optimizing performance for advanced models and enabling cost-competitive alternatives in the global AI market.