The model, codenamed "Olympus," has 2 trillion parameters, the people said, which could make it one of the largest models ever trained.
OpenAI's GPT-4 model, one of the best models available, is reported to have one trillion parameters.
As head scientist of artificial general intelligence (AGI) at Amazon, Prasad brought in researchers from Alexa AI and the Amazon science team to work on training models, uniting the company's AI efforts behind a project with dedicated resources.
It has also partnered with AI model startups such as Anthropic and AI21 Labs, offering their models to Amazon Web Services (AWS) users.
Training bigger AI models is more expensive because of the computing power required.