

IBM’s open source Granite 4.0 Nano AI models are small enough to run locally directly in your browser

In an industry where model size is often seen as a proxy for intelligence, IBM is charting a different course — one that values efficiency over enormity, and accessibility over abstraction. The 114-year-old tech giant’s four new Granite 4.0 Nano models, released today, range from just 350 million to 1.5 billion parameters, a fraction of the size of their server-bound cousins from the likes of OpenAI, Anthropic, and Google. These models are designed to be highly accessible: the 350M variants can run comfortably on a modern laptop CPU with 8–16GB of RAM, while the 1.5B models typically require a GPU with at least 6–8GB of VRAM for smooth performance — or sufficient system RAM and swap for CPU-only inference. This makes them well-suited for developers building applications on consumer hardware or at the edge, without relying on cloud compute.
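Since the models are open source and published on Hugging Face, loading one locally is a few lines of standard `transformers` code. The sketch below assumes the model identifier `ibm-granite/granite-4.0-350m` for the 350M variant; check IBM's `ibm-granite` organization on Hugging Face for the exact names of the Nano releases.

```python
# Minimal sketch: running a small Granite 4.0 Nano model locally with the
# Hugging Face transformers library. The model ID is an assumption based on
# IBM's naming convention; verify it on the ibm-granite Hugging Face page.
from transformers import pipeline

MODEL_ID = "ibm-granite/granite-4.0-350m"  # assumed ID for the 350M variant

generator = pipeline(
    "text-generation",
    model=MODEL_ID,
    device_map="auto",  # uses a GPU if available, otherwise falls back to CPU
)

result = generator(
    "Explain in one sentence why small language models matter at the edge:",
    max_new_tokens=64,
)
print(result[0]["generated_text"])
```

On a laptop CPU this runs entirely offline after the initial model download, which is the deployment scenario the Nano family is aimed at; no cloud endpoint or API key is involved.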

Full report: IBM releases four open-source Granite 4.0 Nano AI models ranging from 350 million to 1.5 billion parameters, designed to run on consumer hardware and even in web browsers.
