Traefik Labs has unveiled its Application Intelligence Platform, a unified platform that takes direct aim at the fragmentation of application architectures across virtual machines, containers, and serverless functions. The platform is designed to provide consistent application-layer routing, security, and observability across modern and legacy infrastructure alike, from public cloud to fully air-gapped environments.
- Traefik Labs launches a platform unifying routing, security, and observability across VMs, containers, and serverless.
- The platform supports any environment, including public cloud and air-gapped facilities.
- Traefik Proxy 3.6 adds native Knative support for serverless Kubernetes workloads.
- New multi-layer routing enables context-aware traffic decisions based on user identity.
- Full support for Kubernetes Gateway API v1.4 ensures standards-based portability.
- The solution aims to eliminate operational complexity from infrastructure-specific gateways.
Enterprises often struggle with fragmented application delivery: VMs, containers, and serverless functions each require a different gateway solution, which leads to operational complexity, inconsistent security, and limited visibility. Traefik's platform instead operates at the application layer, inspecting HTTP requests and user context to make routing decisions that traditional IP-level systems cannot, and it applies the same policies and controls regardless of the underlying infrastructure.
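To make the application-layer idea concrete, the following is a minimal sketch of Traefik dynamic configuration (file provider) that routes on an HTTP host and header rather than on IP addresses. The hostnames, header, service names, and backend address are hypothetical and are not taken from the announcement.

```yaml
# Sketch only: application-layer routing on host and header matchers
# rather than IP-level rules. Names and addresses are illustrative.
http:
  routers:
    premium-api:
      # Traefik v3 rule syntax: match the request host and a request header
      rule: "Host(`api.example.com`) && Header(`X-User-Tier`, `premium`)"
      service: premium-backend
  services:
    premium-backend:
      loadBalancer:
        servers:
          - url: "http://10.0.10.5:8080"
```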
"The industry has sadly accepted infrastructure-specific application delivery as inevitable, forcing teams to manage separate stacks for VMs, containers, and serverless," said Sudeep Goswami, CEO at Traefik Labs. "We're proving that's a false choice. Organizations can have unified application intelligence that works everywhere while remaining open and standards-based."
The platform delivers unified intelligence across three key infrastructure layers. It bridges traditional VM workloads with cloud-native infrastructure, provides seamless integration across the entire container ecosystem from Docker to any Kubernetes distribution, and with the new Traefik Proxy 3.6, adds native Knative support for serverless Kubernetes workloads. This completes the platform's coverage, allowing enterprises to adopt serverless with the same application intelligence applied to VMs and containers.
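For context, a Knative serverless workload on Kubernetes is declared as a Knative Service; per the announcement, Traefik Proxy 3.6 discovers such services automatically and manages traffic to them. The manifest below is a generic, minimal Knative example rather than a Traefik-specific configuration, and the workload name and image are hypothetical.

```yaml
# Minimal Knative Service manifest (illustrative names only).
# A Knative-aware proxy such as Traefik Proxy 3.6 is described as
# discovering services like this and managing traffic to them.
apiVersion: serving.knative.dev/v1
kind: Service
metadata:
  name: order-processor              # hypothetical workload name
  namespace: default
spec:
  template:
    spec:
      containers:
        - image: ghcr.io/example/order-processor:latest  # hypothetical image
          ports:
            - containerPort: 8080
```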
The latest release of the core proxy introduces three significant capabilities that power the platform. The native Knative support brings enterprise-grade application intelligence to serverless workloads, automatically discovering services and managing traffic. Multi-layer routing enables context-aware traffic decisions: users are authenticated at a parent level, and child levels then route requests based on enriched context such as user role or subscription tier. Finally, full conformance with Kubernetes Gateway API v1.4 provides a proven, standards-based approach to application networking.
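The announcement does not spell out the exact configuration behind multi-layer routing, but the sketch below shows one plausible shape of the pattern using existing, documented primitives: a Traefik ForwardAuth middleware that authenticates the request and injects a tier header (the parent-level step), and a standard Kubernetes Gateway API HTTPRoute that routes on that header (the child-level step). All resource names, hostnames, and backends are hypothetical.

```yaml
# Sketch only: parent-level authentication enriches the request with a
# tier header; a child-level HTTPRoute then routes on that header.
# Resource names, hosts, and backends are hypothetical.
apiVersion: traefik.io/v1alpha1
kind: Middleware
metadata:
  name: authenticate-user
spec:
  forwardAuth:
    address: http://auth-service.default.svc:8080/verify  # hypothetical auth service
    authResponseHeaders:
      - X-User-Tier            # header copied from the auth response onto the request
---
apiVersion: gateway.networking.k8s.io/v1
kind: HTTPRoute
metadata:
  name: tiered-checkout
spec:
  parentRefs:
    - name: traefik-gateway          # hypothetical Gateway managed by Traefik
  hostnames:
    - shop.example.com
  rules:
    - matches:
        - headers:
            - name: X-User-Tier
              value: premium
      backendRefs:
        - name: checkout-premium     # premium users reach the premium backend
          port: 8080
    - backendRefs:
        - name: checkout-standard    # all other traffic falls through to standard
          port: 8080
```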
This unified approach positions Traefik as a unique solution in the market, offering operational simplification and consistent policy enforcement across heterogeneous environments. By providing a standards-based framework that works identically from the public cloud to air-gapped facilities, the platform empowers organizations with true operational independence and a path for gradual application modernization.
About Traefik Labs
Traefik Labs empowers organizations to adopt and scale cloud-native architectures through its unified platform for application connectivity, API management, and AI governance. The company delivers seamless integration across multi-cloud, hybrid, on-premises, and air-gapped environments without vendor lock-in.
Traefik Proxy, the company's flagship open-source project, ranks among Docker Hub's top 10 projects with over 3.4 billion downloads and 57,000 stars on GitHub. Traefik's platform extends beyond traditional API management to address AI infrastructure challenges, including AI Gateway with NVIDIA Safety NIMs, MCP Gateway for agent governance, and comprehensive offline deployment.