Liqid Unveils Next-Gen Composable Infrastructure for Enterprise AI


July 16, 2025

Liqid, a global leader in software-defined composable infrastructure, announced new portfolio additions on July 16, 2025, designed to optimize enterprise AI workloads in on-premises datacenters and edge environments. These solutions, including the Liqid Matrix 3.6, EX-5410P GPU platform, EX-5410C memory solution, and LQD-5500 NVMe storage, deliver high performance, agility, and efficiency, achieving up to 2x more tokens per watt and 50% higher tokens per dollar for AI applications.

Quick Intel

  • Liqid launches new composable infrastructure for AI, HPC, and VDI workloads.

  • Liqid Matrix 3.6 offers unified management of GPU, memory, and storage.

  • EX-5410P supports 600W GPUs with PCIe Gen5 for high-density performance.

  • EX-5410C leverages CXL 2.0 for up to 100TB of composable memory.

  • LQD-5500 provides 128TB NVMe storage with 50GB/s bandwidth.

  • Cuts power consumption by up to half (2x tokens per watt), boosting ROI for AI infrastructure.

Innovative AI Infrastructure Solutions

Liqid’s new offerings address the growing demand for AI inference, reasoning, and agentic use cases in on-premises and edge settings. “With generative AI moving on-premises, it’s pushing datacenter and edge infrastructure to its limits,” said Edgar Masri, CEO of Liqid. The solutions include:

  • Liqid Matrix 3.6: A unified software interface for real-time management of GPU, memory, and storage resources, integrating with Kubernetes, VMware, Slurm, and Ansible to drive up to 100% resource utilization (see the sketch after this list).

  • EX-5410P PCIe Gen5 GPU Platform: Supports up to 10 high-power GPUs (e.g., NVIDIA H200, Intel Gaudi 3) with UltraStack (30 GPUs per server) and SmartStack (30 GPUs across 20 nodes) configurations, reducing power and cooling costs.

  • EX-5410C CXL 2.0 Memory Solution: Enables up to 100TB of composable memory for large language models (LLMs) and in-memory databases, with UltraStack and SmartStack options for dynamic allocation.

  • LQD-5500 NVMe Storage: Delivers 128TB capacity, 50GB/s bandwidth, and 6M IOPS for AI and real-time analytics, ensuring enterprise-grade scalability.
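
As a concrete, and entirely hypothetical, illustration of the Kubernetes integration mentioned for Liqid Matrix 3.6, the sketch below schedules an inference pod against GPUs that a composable fabric has attached to a node. It assumes the GPUs are advertised through the standard NVIDIA device plugin as the nvidia.com/gpu extended resource; the pod name, container image, and GPU count are placeholders, and none of this reflects a documented Liqid API.

```python
# Hypothetical sketch: scheduling an inference pod against GPUs that a composable
# fabric has hot-attached to a node. Assumes the standard NVIDIA device plugin is
# exposing them as the "nvidia.com/gpu" extended resource; no Liqid-specific API
# is shown or implied.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig for cluster access

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="llm-inference"),        # placeholder name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="inference",
                image="nvcr.io/nvidia/tritonserver:24.05-py3",  # placeholder image
                resources=client.V1ResourceRequirements(
                    # Two GPUs composed onto this node over the PCIe fabric
                    limits={"nvidia.com/gpu": "2"}
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```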

Performance and Efficiency Gains

Liqid’s composable infrastructure eliminates the inefficiencies of statically provisioned hardware, achieving up to 100% GPU and memory utilization. By leveraging PCIe Gen5 and CXL 2.0 fabrics, the solutions provide ultra-low-latency, high-bandwidth interconnects, cutting power consumption by up to half (2x more tokens per watt) and delivering 50% more tokens per dollar. These gains track the growth of the $2.7 billion AI data center market, as enterprises seek sustainable, high-performance solutions.
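
To make these efficiency metrics concrete, the short calculation below shows how tokens per watt and tokens per dollar are derived and what a 2x / 50% improvement looks like relative to a baseline. All throughput, power, and cost figures are invented for illustration and are not Liqid benchmark results.

```python
# Illustrative arithmetic only: the throughput, power, and cost numbers are made up
# to show how "tokens per watt" and "tokens per dollar" are computed; they are not
# Liqid benchmark data.
def efficiency(tokens_per_sec: float, watts: float, dollars_per_hour: float):
    tokens_per_watt = tokens_per_sec / watts
    tokens_per_dollar = tokens_per_sec * 3600 / dollars_per_hour
    return tokens_per_watt, tokens_per_dollar

baseline = efficiency(tokens_per_sec=10_000, watts=8_000, dollars_per_hour=40.0)
composed = efficiency(tokens_per_sec=10_000, watts=4_000, dollars_per_hour=26.7)

print(f"baseline: {baseline[0]:.2f} tok/W, {baseline[1]:,.0f} tok/$")
print(f"composed: {composed[0]:.2f} tok/W, {composed[1]:,.0f} tok/$")
# In this toy example the composed setup yields ~2x tokens per watt and
# ~1.5x (50% more) tokens per dollar versus the baseline.
```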

Market Context and Impact

The composable infrastructure market is projected to grow at a 25% CAGR through 2027, driven by AI and hybrid cloud adoption. Liqid’s solutions, validated by ESG for accelerating provisioning and improving ROI, serve industries like healthcare, finance, and government. Partnerships with NVIDIA, Dell, and Samsung, and deployments at Amazon and Alibaba, underscore Liqid’s leadership.

Liqid’s latest portfolio empowers enterprises to scale AI infrastructure efficiently, minimizing costs and environmental impact while meeting dynamic workload demands.


About Liqid

Liqid is the leader in software-defined composable infrastructure, delivering flexible, high-performance, and efficient on-premises datacenter and edge solutions for AI inferencing, VDI, and HPC, serving financial services, higher education, healthcare, telecommunications service providers, media & entertainment, and government organizations.
