While many AI companies have focused on training large language models, Groq seeks to make them run as fast as possible using chips it developed called LPUs, or language processing units.
The gambit is that as AI models get better, inference — the part where the AI makes decisions or answers questions — will demand more computing power than training will, positioning Groq to reap the rewards.
Groq's special (and tightly patented) sauce is its specialized chip design, says Ross.
Dean asked Ross's team to design a chip based on a specific type of integrated circuit they were using, and the result was Google's first tensor processing unit, a chip designed specifically for AI.
Watching the AlphaGo program land a complex "shoulder hit" move on its opponent was validation for Ross that faster inference meant better, smarter AI.