
Liqid, a global leader in software-defined composable infrastructure, announced new portfolio additions on July 16, 2025, designed to optimize enterprise AI workloads in on-premises datacenters and edge environments. These solutions, including the Liqid Matrix 3.6, EX-5410P GPU platform, EX-5410C memory solution, and LQD-5500 NVMe storage, deliver high performance, agility, and efficiency, achieving up to 2x more tokens per watt and 50% higher tokens per dollar for AI applications.
Liqid launches new composable infrastructure for AI, HPC, and VDI workloads.
Liqid Matrix 3.6 offers unified management of GPU, memory, and storage.
EX-5410P supports 600W GPUs with PCIe Gen5 for high-density performance.
EX-5410C leverages CXL 2.0 for up to 100TB of composable memory.
LQD-5500 provides 128TB NVMe storage with 50GB/s bandwidth.
Cuts power consumption per token by up to half (2x tokens per watt), boosting ROI for AI infrastructure.
Liqid’s new offerings address the growing demand for AI inference, reasoning, and agentic use cases in on-premises and edge settings. “With generative AI moving on-premises, it’s pushing datacenter and edge infrastructure to its limits,” said Edgar Masri, CEO of Liqid. The solutions include:
Liqid Matrix 3.6: A unified software interface for real-time management of GPU, memory, and storage, integrating with Kubernetes, VMware, Slurm, and Ansible for up to 100% resource utilization.
EX-5410P PCIe Gen5 GPU Platform: Supports up to 10 high-power GPUs (e.g., NVIDIA H200, Intel Gaudi 3) with UltraStack (30 GPUs per server) and SmartStack (30 GPUs across 20 nodes) configurations, reducing power and cooling costs.
EX-5410C CXL 2.0 Memory Solution: Enables up to 100TB of composable memory for large language models (LLMs) and in-memory databases, with UltraStack and SmartStack options for dynamic allocation.
LQD-5500 NVMe Storage: Delivers 128TB capacity, 50GB/s bandwidth, and 6M IOPS for AI and real-time analytics, ensuring enterprise-grade scalability.
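To make the efficiency claims above concrete, the sketch below shows how the two headline metrics, tokens per watt and tokens per dollar, are computed. The throughput, power, and cost figures are illustrative placeholders chosen to match the stated "up to 2x" improvement; they are not Liqid benchmark data.

```python
# Illustrative AI inference efficiency metrics.
# All numeric inputs are hypothetical, not Liqid benchmarks.

def tokens_per_watt(tokens_per_second: float, watts: float) -> float:
    """Sustained token throughput per watt of power draw."""
    return tokens_per_second / watts

def tokens_per_dollar(total_tokens: float, total_cost_usd: float) -> float:
    """Tokens generated per dollar of infrastructure spend."""
    return total_tokens / total_cost_usd

# Static cluster: GPUs are overprovisioned and sit partly idle.
static_tpw = tokens_per_watt(tokens_per_second=10_000, watts=8_000)

# Composable cluster: the same throughput from fewer, fully
# utilized GPUs, so less total power is drawn.
composable_tpw = tokens_per_watt(tokens_per_second=10_000, watts=4_000)

print(f"static:      {static_tpw:.2f} tokens/W")
print(f"composable:  {composable_tpw:.2f} tokens/W")
print(f"improvement: {composable_tpw / static_tpw:.1f}x")
```

The point of the arithmetic: the efficiency gain comes not from faster GPUs but from serving the same workload with fewer, fully utilized devices, which is what dynamic composition enables.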
Liqid’s composable infrastructure eliminates the inefficiencies of statically provisioned hardware, achieving up to 100% GPU and memory utilization. By leveraging PCIe Gen5 and CXL 2.0 fabrics, the solutions offer ultra-low-latency, high-bandwidth interconnects, delivering up to 2x more tokens per watt and 50% more tokens per dollar. This positions Liqid to capture growth in the $2.7 billion AI data center market as enterprises seek sustainable, high-performance solutions.
The composable infrastructure market is projected to grow at a 25% CAGR through 2027, driven by AI and hybrid cloud adoption. Liqid’s solutions, validated by ESG for accelerating provisioning and improving ROI, serve industries like healthcare, finance, and government. Partnerships with NVIDIA, Dell, and Samsung, and deployments at Amazon and Alibaba, underscore Liqid’s leadership.
Liqid’s latest portfolio empowers enterprises to scale AI infrastructure efficiently, minimizing costs and environmental impact while meeting dynamic workload demands.
Liqid is a leader in software-defined composable infrastructure, delivering flexible, high-performance, and efficient on-premises datacenter and edge solutions for AI inferencing, VDI, and HPC. Its customers span financial services, higher education, healthcare, telecommunications service providers, media & entertainment, and government organizations.