Tiiny AI Inc. has unveiled the Tiiny AI Pocket Lab, a pocket-sized device officially verified by Guinness World Records as the "Smallest MiniPC (100B LLM Locally)." The personal AI supercomputer can run large language models (LLMs) of up to 120 billion parameters entirely on-device, with no need for cloud connectivity, servers, or high-end GPUs, marking a significant shift toward private, portable, and energy-efficient intelligence.
Quick Intel
Tiiny AI Pocket Lab earns a Guinness World Record as the smallest miniPC capable of running a 100B+ LLM locally.
The pocket-sized device runs up to 120-billion-parameter LLMs fully offline, eliminating cloud dependency for privacy and reliability.
It operates within a 65W power envelope, offering a sustainable alternative to energy-intensive data centers.
Core tech breakthroughs—TurboSparse and PowerInfer—enable server-grade AI performance on compact hardware.
The device supports a ready-to-use open-source ecosystem with one-click installation of major models like Llama and GPT-OSS.
This launch challenges the cloud-centric AI model, prioritizing personal, private, and portable intelligence.
The Tiiny AI Pocket Lab represents a paradigm shift in the AI industry, directly addressing growing concerns around cloud-based AI, including sustainability, cost, reliability, and privacy. "Cloud AI has brought remarkable progress, but it also created dependency, vulnerability, and sustainability challenges," said Samar Bhoj, GTM Director of Tiiny AI. By bringing the computational power required for advanced LLMs into a portable format, the device makes PhD-level reasoning, multi-step analysis, and secure processing of sensitive information accessible anywhere, without an internet connection.
This achievement is made possible by two core proprietary technologies. TurboSparse, a neuron-level sparse activation technique, dramatically improves inference efficiency without sacrificing model intelligence. PowerInfer, an open-source heterogeneous inference engine, dynamically distributes heavy LLM workloads across the CPU and a dedicated Neural Processing Unit (NPU). Together, they enable the Pocket Lab to deliver performance previously requiring professional GPU setups, all within a 30W Thermal Design Power (TDP) and a typical system power draw of just 65W.
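The announcement does not detail how these technologies work internally, but the general idea behind neuron-level sparse activation can be illustrated with a minimal sketch: a cheap predictor estimates which feed-forward neurons will fire for a given input, and only those rows and columns of the weight matrices are computed. The NumPy sketch below is purely illustrative; the predictor design, layer shapes, and top-k policy are assumptions, not Tiiny AI's implementation.

```python
import numpy as np

def sparse_ffn(x, W_up, W_down, predictor, top_k=256):
    """Predictor-guided sparse feed-forward pass (illustrative only).

    Only the top_k neurons the predictor expects to activate are computed,
    which is the general principle behind neuron-level sparse inference.
    """
    scores = predictor(x)                               # cheap per-neuron activation estimate
    active = np.argpartition(scores, -top_k)[-top_k:]   # indices of predicted-active neurons

    hidden = np.maximum(W_up[active] @ x, 0.0)          # compute only the active rows (ReLU)
    return W_down[:, active] @ hidden                   # down-project the active slice only

# Toy usage with random weights and a small low-rank predictor.
d_model, d_ff, rank = 1024, 4096, 64
rng = np.random.default_rng(0)
W_up = rng.standard_normal((d_ff, d_model))
W_down = rng.standard_normal((d_model, d_ff))
P, Q = rng.standard_normal((d_ff, rank)), rng.standard_normal((rank, d_model))

x = rng.standard_normal(d_model)
y = sparse_ffn(x, W_up, W_down, predictor=lambda v: np.abs(P @ (Q @ v)))
print(y.shape)  # (1024,)
```

In the published PowerInfer design, frequently activated "hot" neurons are kept on the accelerator while rarely activated "cold" neurons are served from the CPU; how the Pocket Lab maps this onto its NPU is not specified in the announcement.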
The device is built around a custom ARMv9.2 12-core CPU and a heterogeneous AI module delivering approximately 190 Tera Operations Per Second (TOPS). It comes equipped with 80GB of LPDDR5X memory and a 1TB SSD, providing ample space for models and local data storage with bank-level encryption. Crucially, Tiiny AI supports an extensive open-source ecosystem, giving users one-click deployment of leading models such as Llama, Qwen, and Mistral, as well as AI agent frameworks. This combination of hardware and software creates a versatile platform for developers, researchers, creators, and professionals.
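Tiiny AI's one-click deployment tooling is not documented in the announcement. As a rough stand-in, the sketch below shows how open-source models are commonly run fully offline today with the llama-cpp-python library; the model file, quantization, and settings are placeholder assumptions, not Tiiny AI's software.

```python
# Not Tiiny AI's tooling: a common way to run a locally stored open-source
# model fully offline using llama-cpp-python. Path and settings are placeholders.
from llama_cpp import Llama

llm = Llama(
    model_path="models/llama-3-8b-instruct.Q4_K_M.gguf",  # any GGUF file on local storage
    n_ctx=4096,    # context window
    n_threads=12,  # e.g. one thread per core on a 12-core CPU
)

reply = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Summarize the case for on-device LLMs."}],
    max_tokens=256,
)
print(reply["choices"][0]["message"]["content"])
```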
The launch strategically positions Tiiny AI in the fast-growing LLM market, which Grand View Research projects will expand from $7.4 billion in 2025 to $35.4 billion by 2030. By offering a sustainable and private alternative to cloud processing, especially for the "golden zone" of personal AI (10B–100B parameters), Tiiny AI taps into demand for AI that is both powerful and personal. The company, founded in 2024 by a team of engineers from leading institutions and firms, is backed by multi-million-dollar seed funding, underscoring investor confidence in its vision to decentralize advanced AI.
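For context, those figures imply a compound annual growth rate of roughly 37%; a quick back-of-the-envelope check:

```python
# Implied compound annual growth rate (CAGR) from the cited market figures.
start, end, years = 7.4, 35.4, 5    # $B in 2025 -> $B in 2030
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # ~36.8%
```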
The Tiiny AI Pocket Lab is more than a technical marvel; it is a statement on the future trajectory of artificial intelligence. It challenges the industry's centralized cloud infrastructure by proving that powerful, large-model intelligence can be personal, portable, and private. As the device prepares for its full feature release at CES in January 2026, it sets a new benchmark for what is possible in edge AI, potentially accelerating a broader shift toward user-owned, energy-conscious, and secure intelligent computing.
About Tiiny AI Inc.
Tiiny AI Inc. is a US deep-tech AI startup pioneering personal AI supercomputing. It brings cloud-grade large-model intelligence fully on-device, making it private, offline, and accessible to everyone. Founded in 2024 by engineers from MIT, Stanford, HKUST, SJTU, Intel, and Meta, the company develops breakthrough technologies such as TurboSparse and PowerInfer that, for the first time, enable LLMs of up to 120 billion parameters to run on pocket-sized consumer devices. Tiiny AI's mission is to make advanced AI accessible, private, and personal.