
Cerebras API Program Boosts AI with Dataiku, Vercel
  • September 10, 2025

Cerebras Systems has introduced its API Certification Program, partnering with Dataiku, Vercel, Portkey, and TrueFoundry to bring its ultra-fast AI inference, up to 70x faster than GPU-based solutions, to enterprise AI deployments.

Quick Intel

  • Cerebras launches API Certification Program with leading AI platforms.

  • Partners include Dataiku, Vercel, Portkey, and TrueFoundry.

  • Offers sub-50 ms inference responses, up to 70x faster than GPU-based solutions.

  • Integrations enable secure, scalable AI deployment for enterprises.

  • Hugging Face and OpenRouter enhance developer access and cost efficiency.

  • Over 200 startups have reduced AI costs by 40% with Cerebras.

Advancing Enterprise AI with Strategic Partnerships

Cerebras Systems, a pioneer in generative AI, announced its API Certification Program on September 10, 2025, in Sunnyvale, California. The program integrates Cerebras’ industry-leading AI inference capabilities, which deliver up to 70x faster performance than GPU-based solutions, with leading platforms like Dataiku, Vercel, Portkey, and TrueFoundry. These partnerships aim to democratize ultra-fast AI inference, enabling enterprises to scale AI workloads seamlessly while maintaining security and compliance.

Dataiku LLM Mesh Integration

Cerebras’ integration with Dataiku’s Universal AI Platform leverages the platform’s LLM Mesh to give enterprise data teams access to the world’s fastest AI inference through a secure API gateway. “With Dataiku, customers have the freedom to run enterprise AI on top of any tech stack—and now they gain the ability to choose Cerebras for inference compute at unprecedented speed,” said Jed Dougherty, VP of Platform Strategy at Dataiku. This alliance streamlines deployment, enhances model accuracy, and supports new AI use cases with Dataiku’s agnostic architecture and Cerebras’ Wafer-Scale Engine (WSE).

Portkey AI Gateway Partnership

Partnering with Portkey, the world’s most popular open-source AI Gateway, Cerebras delivers unmatched inference speed and cost-efficiency, achieving over 1,100 tokens per second with 99.99% uptime. “Cerebras is one of the few inference providers in the market with a rare combination of extremely low latencies, high uptime, and reasonable costs,” said Portkey CEO Rohit Agarwal. The integration provides enterprises with observability, cost management, and guardrails, enabling faster production deployment.
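
For illustration only, the gateway pattern described above can be sketched in TypeScript using the standard OpenAI client pointed at a Portkey-style endpoint. The base URL, header names, environment variables, and model identifier below are assumptions made for the sketch, not confirmed details of the certified integration.

```typescript
// Hypothetical sketch: routing an OpenAI-style chat completion through an AI
// gateway (Portkey-style) to a Cerebras-hosted model. Endpoint, headers, and
// model ID are assumptions, not confirmed details of the certified integration.
import OpenAI from "openai";

const client = new OpenAI({
  apiKey: process.env.CEREBRAS_API_KEY ?? "",               // assumed provider key
  baseURL: "https://api.portkey.ai/v1",                     // assumed gateway endpoint
  defaultHeaders: {
    "x-portkey-api-key": process.env.PORTKEY_API_KEY ?? "", // assumed gateway auth header
    "x-portkey-provider": "cerebras",                       // assumed provider routing header
  },
});

async function main() {
  const completion = await client.chat.completions.create({
    model: "llama3.1-8b",                                   // assumed model identifier
    messages: [{ role: "user", content: "Summarize today's inference metrics." }],
  });
  console.log(completion.choices[0].message.content);
}

main().catch(console.error);
```

In this pattern, observability, cost tracking, and guardrails are applied at the gateway layer rather than in application code, which is what lets teams move to production without rewriting their clients.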

TrueFoundry AI Control Plane

Cerebras’ integration with TrueFoundry’s AI Gateway offers unified access to Cerebras inference via OpenAI-compatible APIs. “By integrating Cerebras with TrueFoundry’s AI Gateway, we’re enabling organizations to combine breakthrough performance with the controls and flexibility they need to confidently run AI in production,” said Anuraag Gutgutia, co-founder of TrueFoundry. Features like rate limiting and on-prem deployment ensure governance and data sovereignty for enterprise AI workloads.
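
Because the gateway speaks the OpenAI chat-completions wire format, any compatible client can target it simply by swapping the base URL. A minimal sketch follows, assuming a hypothetical self-hosted gateway URL, bearer-token authentication, and model route; none of these specifics are confirmed by the announcement.

```typescript
// Hypothetical sketch: calling an OpenAI-compatible /chat/completions endpoint
// exposed by a self-hosted AI gateway (TrueFoundry-style deployment).
// The gateway URL, auth scheme, and model route below are assumptions.
const GATEWAY_URL = "https://ai-gateway.internal.example.com/v1"; // assumed on-prem gateway URL

async function chat(prompt: string): Promise<string> {
  const res = await fetch(`${GATEWAY_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${process.env.GATEWAY_API_KEY ?? ""}`, // assumed bearer auth
    },
    body: JSON.stringify({
      model: "cerebras/llama3.1-8b",                               // assumed model route
      messages: [{ role: "user", content: prompt }],
    }),
  });
  if (!res.ok) throw new Error(`Gateway error: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

chat("Classify this ticket as billing, technical, or other.")
  .then(console.log)
  .catch(console.error);
```

Rate limits and governance policies would be enforced at the gateway itself, so the calling code stays the same whether the deployment is hosted or on-prem.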

Vercel AI Cloud Integration

Cerebras’ partnership with Vercel AI Cloud simplifies high-performance AI inference for web developers. “By connecting Cerebras’ inference infrastructure with our AI SDK and AI Gateway, developers gain the tools to build ultra-responsive, production-ready applications without complexity,” said Harpreet Arora, AI Product Lead at Vercel. This integration enables real-time content personalization and conversational interfaces, streamlining workflows through Vercel’s AI SDK and Gateway.
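
A minimal sketch of this workflow with the Vercel AI SDK is shown below, assuming the generic OpenAI-compatible provider and a hypothetical Cerebras endpoint and model name; Vercel’s AI Gateway and any official Cerebras provider package may differ in their exact configuration.

```typescript
// Hypothetical sketch: using the Vercel AI SDK with an OpenAI-compatible
// provider configured for a Cerebras-style endpoint. The base URL, env var,
// and model ID are assumptions, not confirmed integration settings.
import { createOpenAI } from "@ai-sdk/openai";
import { generateText } from "ai";

const cerebras = createOpenAI({
  baseURL: "https://api.cerebras.ai/v1",      // assumed endpoint
  apiKey: process.env.CEREBRAS_API_KEY ?? "", // assumed env var
});

async function main() {
  const { text } = await generateText({
    model: cerebras.chat("llama3.1-8b"),      // assumed model identifier
    prompt: "Write a one-sentence product update for our changelog.",
  });
  console.log(text);
}

main().catch(console.error);
```

The same model handle can be passed to the SDK’s streaming helpers, which is where fast responses matter most for the real-time personalization and conversational interfaces described above.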

Expanding Ecosystem and Impact

Building on collaborations with Hugging Face and OpenRouter, Cerebras enhances developer access with sub-50 ms responses and up to 50% lower per-token costs. “We are thrilled to add these world-class partners to our ecosystem,” said Alan Chhabra, EVP of Worldwide Partners at Cerebras. Over 200 startups and mid-market customers have reduced AI infrastructure costs by 40% and accelerated model iteration by 5x, showcasing the program’s transformative potential.

About Cerebras Systems

Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to accelerate generative AI by building from the ground up a new class of AI supercomputer. Our flagship product, the CS-3 system, is powered by the world’s largest and fastest commercially available AI processor, our Wafer-Scale Engine-3. CS-3s can be quickly and easily clustered to form the largest AI supercomputers in the world, and they make placing models on those supercomputers dead simple by avoiding the complexity of distributed computing.

  • AI | Enterprise AI | Generative AI | API Certification | Cerebras