Arrcus Achieves 3x Growth in 2025, Launches AINF Fabric


February 19, 2026

Arrcus has announced exceptional 3x bookings growth in 2025, driven by strong demand for its ArcOS network operating system and ACE platform across datacenter, telco, and enterprise customers. Deployed on thousands of nodes globally, these solutions deliver flexibility, rapid feature innovation, and substantial capital and operating cost reductions compared to traditional incumbents. Building on this momentum, Arrcus introduced the Arrcus Inference Network Fabric (AINF), a purpose-built networking solution designed to optimize real-time and agentic AI inferencing by intelligently steering traffic across distributed inference nodes, caches, and datacenters.

Quick Intel

  • Arrcus achieves 3x bookings growth in 2025 for mission-critical switching and routing applications across thousands of production nodes.
  • Launches Arrcus Inference Network Fabric (AINF), an AI-policy-aware fabric built to accelerate inferencing with higher throughput (tokens per second), reduced Time to First Token (TTFT), lower end-to-end latency (E2EL), and up to 30% cost savings.
  • AINF enables dynamic routing based on policies for latency targets, data sovereignty, model preferences, and power constraints, integrating with frameworks like vLLM, SGLang, and Triton.
  • Supports distributed AI inferencing requirements including low latency, availability, power grid limits, sovereignty, and cost efficiency for enterprises and network operators.
  • Quote from Shekar Ayyar, Chairman and CEO of Arrcus: “AINF extends Arrcus’ leadership in distributed networking by delivering the first fabric designed to meet the latency, sovereignty, and power constraints of large-scale AI inferencing.”
  • Analyst Roy Chua (AvidThink): “As inferencing scales across distributed environments, this kind of workload-aware networking will be essential to maximizing AI-enabled application performance.”

Driving Growth Through Open Networking

Arrcus’ success in 2025 stems from its open, software-defined approach that supports best-of-breed hardware and delivers superior economics and agility. Customers benefit from reduced dependency on legacy vendors, faster innovation cycles, and scalable deployments that meet demanding production requirements in datacenters, telco networks, and enterprise environments.

Optimizing Distributed AI Inferencing with AINF

As agentic and physical AI drive explosive growth in inferencing workloads, traditional networking struggles with latency sensitivity, model diversity, power constraints, data sovereignty, and cost. AINF addresses these by introducing a policy abstraction layer that translates application intent into real-time infrastructure decisions. Operators define rules—such as latency SLAs, geographic boundaries, or power limits—and AINF steers inference queries to the optimal node or cache, ensuring the right model is served from the right location at the right time.
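The article does not describe how AINF's policy abstraction layer is implemented or configured. As a rough illustration of the general idea of policy-driven query steering — every name, field, and threshold below is hypothetical, not Arrcus' actual API — a minimal sketch might look like this:

```python
from dataclasses import dataclass

@dataclass
class Policy:
    """Hypothetical per-query policy: latency SLA, sovereignty, power limit."""
    max_latency_ms: float
    allowed_regions: set
    max_node_power_kw: float

@dataclass
class InferenceNode:
    """Hypothetical view of a distributed inference node."""
    name: str
    region: str
    latency_ms: float        # measured latency to this node
    power_kw: float          # current power draw at the node
    cost_per_1k_tokens: float

def steer(policy: Policy, nodes: list) -> InferenceNode:
    """Pick the cheapest node that satisfies every policy constraint."""
    eligible = [
        n for n in nodes
        if n.latency_ms <= policy.max_latency_ms
        and n.region in policy.allowed_regions
        and n.power_kw <= policy.max_node_power_kw
    ]
    if not eligible:
        raise RuntimeError("no node satisfies the policy")
    return min(eligible, key=lambda n: n.cost_per_1k_tokens)

nodes = [
    InferenceNode("eu-edge-1", "eu", 12.0, 40.0, 0.8),
    InferenceNode("us-dc-1",   "us",  5.0, 90.0, 0.5),
]
policy = Policy(max_latency_ms=20.0, allowed_regions={"eu"},
                max_node_power_kw=50.0)
print(steer(policy, nodes).name)  # eu-edge-1: the cheaper US node fails sovereignty
```

The point of the sketch is only the shape of the decision: intent (SLA, geography, power) is declared once per query, and the fabric resolves it against live node state rather than static routes.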

Key technical elements include query-based inference routing, interconnect routers, edge networking, Kubernetes orchestration, and prefix-aware KV cache optimization. The fabric integrates seamlessly with leading inference engines and supports partner solutions for load balancing, security, and power management.
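The article names prefix-aware KV cache optimization without detailing it. In general terms, the technique routes a query to the node that already holds the key-value cache for the longest matching prompt prefix, so that prefix need not be recomputed. A hedged, self-contained sketch (all node names and the cache-map structure are invented for illustration):

```python
def route_by_prefix(prompt: str, cache_map: dict, default: str) -> str:
    """Route to the node holding the KV cache for the longest matching
    prompt prefix; fall back to a default node on a cache miss."""
    best_node, best_len = default, 0
    for prefix, node in cache_map.items():
        if prompt.startswith(prefix) and len(prefix) > best_len:
            best_node, best_len = node, len(prefix)
    return best_node

# Hypothetical map of cached prompt prefixes to the nodes that hold them.
cache_map = {
    "You are a support agent.": "node-a",
    "You are a support agent. Answer in French.": "node-b",
}
print(route_by_prefix(
    "You are a support agent. Answer in French. Hi!", cache_map, "node-c"))
# node-b: the longest cached prefix wins, maximizing KV cache reuse
```

A production fabric would track cached prefixes with something like a trie and evict stale entries, but the routing criterion — longest warm prefix wins — is the same.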

Industry Validation and Ecosystem Support

Analysts highlight AINF’s role in the evolving AI infrastructure landscape, where Ethernet-based fabrics are projected to dominate as inferencing overtakes training. Partners including Broadcom, Fujitsu, Lightstorm, UfiSpace, Edgecore, Lanner, and investors such as Hitachi Ventures and Prosperity7 Ventures endorse Arrcus’ vision for intelligent, policy-aware networking that unlocks scalable, efficient AI inferencing across distributed environments.

This dual announcement reinforces Arrcus’ position as a leader in distributed networking, now extending its expertise to meet the unique demands of next-generation AI workloads.

  • Agentic AI, AI Networking, AI Infrastructure