NeuReality Unveils 1.6 Tbps NR2 AI-SuperNIC for Scalable AI Infrastructure
September 10, 2025

NeuReality, a leader in AI infrastructure, has revealed its next-generation 1.6 Tbps NR2 AI-SuperNIC at the AI Infrastructure Summit, marking a significant advancement in scale-out AI networking. Designed to address the growing demands of AI inference and training, the NR2 AI-SuperNIC supports Ultra Ethernet Consortium (UEC) specifications and introduces in-network computing capabilities, setting a new standard for performance and scalability in AI data centers.

Quick Intel

  • NeuReality launches 1.6 Tbps NR2 AI-SuperNIC for AI infrastructure.

  • Supports Ultra Ethernet Consortium (UEC) 1.0 for low-latency networking.

  • Enhances scalability for AI training and inference workloads.

  • Features in-network computing to optimize GPU and XPU performance.

  • NR2 AI-SuperNIC available to select customers in H2 2026.

  • NR1 solution receives UEC-compliant software upgrade.

Next-Generation NR2 AI-SuperNIC: Redefining AI Networking

The NR2 AI-SuperNIC builds on the foundation of NeuReality’s NR1 AI-NIC, delivering 1.6 Tbps of wire-speed throughput and integrating advanced in-network computing capabilities. It pairs an upgraded AI-Hypervisor with DSP processors to support scalable AI training and inference, addressing bottlenecks in high-performance networking. By optimizing Ethernet throughput and latency, the NR2 AI-SuperNIC enables efficient data movement across AI clusters, in deployments ranging from single-rack setups to large-scale AI factories.
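As a rough illustration of what 1.6 Tbps of wire speed means for data movement (a back-of-envelope sketch, not NeuReality benchmark data), the short Python calculation below estimates transfer times for a few payload sizes; the payload sizes and the link-efficiency factor are assumptions chosen purely for illustration.

# Back-of-envelope estimate of transfer time over a 1.6 Tbps link.
# The payload sizes and the 85% effective-efficiency factor are
# illustrative assumptions, not NeuReality specifications or measurements.

LINK_TBPS = 1.6                      # advertised wire speed
EFFICIENCY = 0.85                    # assumed protocol/encoding overhead
effective_gbytes_per_s = LINK_TBPS * 1e12 * EFFICIENCY / 8 / 1e9

payloads_gb = {
    "Multimodal inference batch (1 GB)": 1,
    "KV-cache shard (8 GB)": 8,
    "70B-parameter model, FP16 (~140 GB)": 140,
}

for name, size_gb in payloads_gb.items():
    seconds = size_gb / effective_gbytes_per_s
    print(f"{name}: ~{seconds * 1e3:.0f} ms at {effective_gbytes_per_s:.0f} GB/s effective")

Under these assumptions the link sustains roughly 170 GB/s, so even a full FP16 copy of a 70B-parameter model moves in under a second, which is the scale of data movement that cluster-wide AI networking has to keep up with.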

UEC Compliance for Seamless AI Scalability

With full support for UEC 1.0 specifications, the NR2 AI-SuperNIC ensures ultra-low latency and end-to-end interoperability in AI inference clusters. In addition to the TCP and RoCEv2 support already available on the NR1, UEC Ethernet compatibility strengthens the NR2’s ability to handle multimodal AI data, including images, audio, and video. A software upgrade for the NR1 solution also brings UEC 1.0 compliance, enabling existing systems to benefit from improved networking performance.

Addressing AI Infrastructure Challenges

As AI models grow in complexity, traditional infrastructure struggles with cost, scalability, and efficiency. NeuReality’s NR2 AI-SuperNIC tackles these issues by offloading communication overhead and optimizing GPU and XPU utilization. “Our mission is to drive the architectural shift the AI infrastructure industry needs,” said Moshe Tanach, CEO at NeuReality. “From silicon to systems, our NR1 AI-CPU and the NR2 AI-SuperNIC embody the performance, openness and scalability tomorrow’s AI workloads demand.”

Flexible Deployment for Diverse AI Workloads

The NR2 AI-SuperNIC is designed for versatility, deployable as a standalone NIC card, co-packaged with GPUs, or on micro-server boards. This flexibility supports a range of AI workloads, from generative AI to reasoning and chain-of-thought processes, which demand high compute and data transfer rates. With availability to select customers in the second half of 2026 and mass production in 2027, the NR2 sets a new benchmark for AI infrastructure efficiency.

Future of AI Infrastructure with NR2 AI-CPU

NeuReality’s broader vision includes the NR2 AI-CPU, which will support up to 128 cores based on Arm Neoverse Compute Subsystems V3. Tuned for real-time model coordination, token streaming, and KV-cache optimization, the NR2 AI-CPU complements the AI-SuperNIC to deliver a modular, high-performance solution. This approach aims to eliminate structural inefficiencies in traditional CPU-GPU-NIC architectures, paving the way for cost-effective and energy-efficient AI data centers.
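To show why KV-cache handling matters for token streaming, here is a minimal, generic sketch of the standard transformer KV-cache idea; the class, method names, and model dimensions are illustrative assumptions and do not describe NeuReality’s NR2 AI-CPU implementation.

# Minimal illustration of a per-request transformer KV cache.
# Dimensions and names are illustrative assumptions only.
import numpy as np

class KVCache:
    def __init__(self, num_layers=32, num_heads=32, head_dim=128, dtype=np.float16):
        self.num_layers = num_layers
        self.num_heads = num_heads
        self.head_dim = head_dim
        self.dtype = dtype
        # One (keys, values) list per layer, grown as tokens stream in.
        self.keys = [[] for _ in range(num_layers)]
        self.values = [[] for _ in range(num_layers)]

    def append(self, layer, k, v):
        # Store the key/value vectors produced for the newest token.
        self.keys[layer].append(k.astype(self.dtype))
        self.values[layer].append(v.astype(self.dtype))

    def size_bytes(self):
        # Memory held by the cache; grows linearly with generated tokens.
        tokens = len(self.keys[0])
        per_token = (2 * self.num_layers * self.num_heads * self.head_dim
                     * np.dtype(self.dtype).itemsize)
        return tokens * per_token

cache = KVCache()
for layer in range(cache.num_layers):
    cache.append(layer,
                 np.random.randn(cache.num_heads, cache.head_dim),
                 np.random.randn(cache.num_heads, cache.head_dim))
print(f"Cache after 1 token: {cache.size_bytes() / 1e6:.2f} MB; "
      f"a 4096-token context would need ~{4096 * cache.size_bytes() / 1e9:.1f} GB per request")

Because per-request cache memory grows linearly with context length, deciding where that cache lives and how it moves between devices is exactly the kind of orchestration work the article attributes to the AI-CPU.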

NeuReality’s advancements signal a transformative shift in AI infrastructure, prioritizing performance, scalability, and energy efficiency. By addressing the limitations of legacy systems, the NR2 AI-SuperNIC and AI-CPU solutions position NeuReality as a key player in enabling the next generation of AI applications. Join NeuReality at the AI Infrastructure Summit 2025 in Santa Clara to explore live demos and see the future of AI infrastructure firsthand.

About NeuReality

Founded in 2019, NeuReality is a pioneer in purpose-built AI inferencing architecture powered by the NR1® Chip – the first AI-CPU for inference orchestration. Based on an open, standards-based approach, the NR1 is fully compatible with any AI accelerator. NeuReality’s mission is to make AI accessible and ubiquitous by lowering barriers associated with prohibitive cost, power consumption, and complexity, and to scale AI inference adoption through its disruptive technology. It employs 80 people across facilities in Israel, Poland, and the U.S.

Tags: AI Infrastructure, NR2, AI-SuperNIC, Ultra Ethernet Consortium, AI Data Centers, Scalable AI