Aria Networks today announced the general availability of the Networks that Think – the world's first AI-native network built from the ground up to maximize Token Efficiency. At its core is Deep Networking, a fundamentally different approach to how networks operate.

Token Efficiency is the defining metric of the AI factory era and the single best proxy for whether an AI cluster is delivering on its investment. It maps directly to Model FLOPs Utilization (MFU) and cost per token: improvements in either translate directly into improvements in revenue.
Key highlights:
- Aria Networks launches Deep Networking, the world's first AI-native network built to maximize Token Efficiency.
- The company raises $125M from Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures.
- Gavin Baker of Atreides Management joins Aria's board alongside Stefan Dyckerhoff of Sutter Hill Ventures.
- Deep Networking combines hardened SONiC, end-to-end telemetry, and intelligent agents across every layer of the stack.
- A 1% improvement in MFU recoups the entire cost of the network, which comprises only 10-15% of total cluster cost.
- Aria's switch platform delivers leading 800GbE and 1.6T switching in liquid-cooled and air-cooled form factors.
The network sits at the center of this equation – not merely a potential bottleneck, but a multiplier. When the network underperforms, it drags down every other component in the stack; when it is optimized, it lifts them all. And while the network accounts for only 10-15% of total cluster cost, its impact is outsized: a mere 1% improvement in MFU recoups the entire cost of the network.
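To see why the recoup claim is plausible, here is a back-of-envelope sketch. The cluster capex and network share below are illustrative assumptions (the network share is taken from the 10-15% range stated above, the 1% MFU gain from the claim itself); nothing here is an Aria or industry figure.

```python
# Hypothetical back-of-envelope for "1% MFU recoups the network cost".
# All dollar figures are illustrative assumptions, not Aria figures;
# only the 10-15% network share and the 1% MFU gain come from the text.

cluster_capex = 1_000_000_000   # assume a $1B cluster build-out
network_share = 0.12            # within the stated 10-15% range
network_cost = cluster_capex * network_share

mfu_gain = 0.01                 # 1% relative MFU improvement
# If token output (and hence revenue) scales roughly linearly with MFU,
# the improvement is worth mfu_gain * lifetime token revenue. The claim
# therefore holds whenever lifetime revenue exceeds this breakeven:
breakeven_revenue = network_cost / mfu_gain

print(f"network cost: ${network_cost:,.0f}")                      # $120,000,000
print(f"breakeven lifetime revenue: ${breakeven_revenue:,.0f}")   # $12,000,000,000
```

Under these assumptions, the claim holds for any cluster whose lifetime token revenue exceeds roughly 12x the network's cost – i.e., about 1.4x the cluster's own capex.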
Suboptimal network performance caps the return on every other infrastructure investment. In training, it governs how quickly gradients are synchronized; in disaggregated inference, it governs how efficiently KV caches are transferred and how seamlessly jobs are scheduled across thousands of xPUs. Inference clusters in particular are growing larger and more complex, introducing bigger networking challenges – not just for the backend, but for the frontend as well.
Legacy networking solutions treat telemetry as an afterthought and rely on static configurations designed for a different era. Deep Networking changes that. It is built on five pillars, all of which must be present to deliver the desired outcome:
AI-optimized hardware and hardened SONiC. Aria's switch platform, built from the ground up on AI-native SONiC, delivers leading 800GbE and 1.6T switching in liquid-cooled and air-cooled form factors.
Fine-grained, end-to-end telemetry. 100–10,000x finer resolution than traditional tools, collected across switches, transceivers, and hosts in a single unified view.
Intelligent agents at every layer. Specialized agents evaluate signals, extract insights, and take action at the appropriate resolution – from the switching ASIC all the way up to cloud orchestration.
Networking expertise built in. Every agent and every decision is grounded in deep networking domain knowledge – the system doesn't just see data, it understands what it means.
Continuous updates. New capabilities are delivered continuously from the cloud, keeping the network at the forefront of performance for every new workload.
The combination of these five pillars creates a flywheel: the more workloads the system sees, the smarter it gets – delivering a continuously optimized network.
Deep Networking is not just a technology architecture; it is a set of outcomes that operators experience from day one:
Seamless, automatic network fine-tuning. The platform continuously fine-tunes every aspect of the networking fabric for the specific cluster it serves, without manual intervention – across routing, load balancing, congestion management, and failover.
Intent-based configuration. Operators express what they need, and the platform configures the fabric accordingly.
Real-time, adaptive performance optimization. The system continuously evaluates network state and takes action in real time to keep accelerators productive and every token flowing.
Agentic partnership with operators. Operators have fine-grained telemetry at their fingertips, can ask questions about any alert in natural language, and collaborate directly with Aria's agents.
Embedded Field Deployment Engineers. Aria's FDEs are embedded directly within the customer's team, managing the full lifecycle from architecture to performance tuning.
Ethernet has become the dominant fabric for new AI backend deployments, driven by its openness, ubiquity, and multi-vendor scalability. Liquid-cooling adoption is projected to reach 76% of AI servers this year as rack densities rapidly approach 1MW. The transition to 1.6T is accelerating faster than 800G ever did, with over 22 million ports expected to ship by 2027. Aria's switch platform delivers leading 800GbE and 1.6T switching in liquid-cooled and air-cooled form factors with no vendor lock-in. Aria Networks already has customer orders in hand and is actively deploying.
About Aria Networks
The networking industry was built for a different era. Aria Networks was built for this one. Founded in 2025 and headquartered in Palo Alto, Aria Networks is building the networking company for the AI era from scratch, with AI at the center of everything. Its approach, Deep Networking, combines hardened SONiC, end-to-end telemetry, intelligent agents, deep domain context, and continuous cloud-delivered updates to maximize Token Efficiency. Aria Networks is backed by Sutter Hill Ventures, Atreides Management, Valor Equity Partners, and Eclipse Ventures.