Google launches new Ironwood chip to boost performance of AI applications

Alphabet unveiled its seventh-generation AI chip, called Ironwood, on Wednesday. The chip is designed to boost the performance of AI applications and targets "inference" computing, the rapid data processing used when people interact with tools like ChatGPT.

Ironwood handles the quick calculations needed to generate chatbot replies and other AI responses. The chip is one of the few alternatives to Nvidia's powerful AI processors and marks a key part of Google's multi-billion dollar, decade-long investment in custom AI hardware.

Google's tensor processing units (TPUs) are available only to the company's own engineers and its cloud customers. This strategy gives its internal AI teams an edge over some competitors.

In earlier generations, Google offered its TPUs in two variants: one for building large AI models, and a cheaper one for running AI applications that stripped out the model-building features.

Ironwood focuses on running AI applications and can work in clusters of up to 9,216 chips, said Google VP Amin Vahdat. The new design combines features from previous versions and adds more memory, making it more efficient at serving AI tasks.

“Inference is becoming much more important,” said Vahdat.

Ironwood delivers twice the performance per watt compared to last year’s Trillium chip. Google uses its custom chips to build and deploy its Gemini AI models.

The company didn’t say who manufactures the chip.

Alphabet shares jumped 9.7% after President Donald Trump reversed course on tariffs.
