Google Alphabet presents its AI chip, Trillium
Alphabet, the parent company of Google, announced the launch of a new offering in its family of artificial intelligence data center chips, called Trillium.
According to the company, Trillium is its most advanced AI-specific hardware: the sixth generation of its Tensor Processing Units (TPUs), the custom accelerators Google develops for AI workloads.
Google's custom AI data center chips are one of the few options on the market that don't come from Nvidia, and one of the few counterweights to a dangerous monopoly position.
According to reports, Nvidia currently holds about an 80% share of the AI chip market, while Google dominates the remaining 20%.
It's important to note that Google doesn't sell the chips outright; instead, it rents access to them through its cloud computing platform.
Trillium speed and efficiency
“Trillium TPUs achieve an impressive 4.7x increase in peak compute performance per chip compared to v5e TPUs,” Google said in a blog post.
With its latest offering, the company also claims to have doubled High Bandwidth Memory (HBM) capacity and bandwidth, and doubled Interchip Interconnect (ICI) bandwidth, compared with TPU v5e.
“Additionally, Trillium features third-generation SparseCore, an accelerator specialized in processing ultra-large embeddings, common in advanced classification and recommendation workloads,” the blog post reads.
The company also added that Trillium TPUs can train the next wave of foundation models at reduced latency and lower cost.
The sixth-generation Trillium TPUs are more than 67% more energy efficient than TPU v5e, according to Google. Power consumption is one of the key factors in choosing chips for AI.
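Taken together, the two quoted figures imply something about per-chip power draw. A back-of-the-envelope check (the 4.7x and 67% numbers come from the blog post; the derivation below is ours, not Google's):

```python
# Illustrative arithmetic on Google's quoted figures, not official specs.
perf_ratio = 4.7         # Trillium peak compute per chip vs. v5e (from the post)
efficiency_ratio = 1.67  # "67% more energy efficient" = perf per watt vs. v5e

# Since perf/watt = perf / power, the implied power ratio is perf / efficiency.
power_ratio = perf_ratio / efficiency_ratio

print(f"Implied per-chip power vs. v5e: {power_ratio:.2f}x")  # ≈ 2.81x
```

In other words, if the quoted figures hold, each Trillium chip would draw roughly 2.8x the power of a v5e while delivering 4.7x the compute.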
Additional features and how it will help future AI models
Additionally, Trillium can scale up to 256 TPUs in a single high-bandwidth, low-latency pod.
“Beyond this pod-level scalability, with multislice technology and Titanium Intelligence Processing Units (IPUs), Trillium TPUs can scale to hundreds of pods, connecting tens of thousands of chips in a building-scale supercomputer interconnected by a multi-petabit-per-second data center network,” the blog reads.
To achieve the 4.7x increase in compute performance per chip, the company enlarged Trillium's matrix multiply units (MXUs) and increased the clock speed.
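These two levers multiply together. For a systolic-array accelerator like a TPU's MXU, peak throughput scales with the number of multiply-accumulate units (the square of the array dimension) times the clock rate. The dimensions and clocks below are hypothetical, chosen only to show how a 4x area gain and a modest clock bump can compound to roughly 4.7x; they are not Trillium's actual specifications:

```python
def peak_flops(mxu_dim: int, mxus_per_chip: int, clock_hz: float) -> float:
    """Peak FLOPS of a systolic-array accelerator: each of the mxu_dim**2
    multiply-accumulate units performs 2 ops (multiply + add) per cycle."""
    return 2 * mxu_dim**2 * mxus_per_chip * clock_hz

# Hypothetical numbers, for illustration only (not Trillium's real specs):
base = peak_flops(mxu_dim=128, mxus_per_chip=4, clock_hz=940e6)
bigger = peak_flops(mxu_dim=256, mxus_per_chip=4, clock_hz=1.1e9)

print(f"{bigger / base:.1f}x")  # 4x from MXU area times ~1.17x from clock ≈ 4.7x
```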
“Trillium TPUs will power the next wave of AI models and agents, and we look forward to helping our customers unlock these advanced capabilities,” the company said in the blog post.
A boost for Google and Gemini cloud computing services
The new chip will help companies like Deep Genomics, Deloitte and others that use Google Cloud services.
“Support for training and serving long-context multimodal models on Trillium TPUs will also allow Google DeepMind to train and serve future generations of Gemini models more quickly, efficiently, and with lower latency than ever before,” the company said.
Trillium TPUs are part of Google Cloud's AI Hypercomputer, a supercomputing architecture designed specifically for cutting-edge AI workloads.
“Gemini 1.5 Pro is Google's largest and most capable AI model and has been trained using tens of thousands of TPU accelerators,” said Jeff Dean, Chief Scientist of Google DeepMind and Google Research.
“Our team is excited about the announcement of the sixth generation of TPU, and we look forward to increasing performance and efficiency for training and inference at scale of our Gemini models.”
The new chips should allow Google to respond to OpenAI, which recently presented an upgrade to its AI model, asserting its superiority in the field.
This is a machine translation of a post published on Scenari Economici at the URL https://scenarieconomici.it/google-alphabet-presenta-il-suo-chip-per-la-ai-trillium/ on Wed, 15 May 2024 14:52:19 +0000.