Search results for: "Krystal Hu Stephen Nellis"


3 mentions found


May 31 (Reuters) - Specialized cloud computing provider CoreWeave has raised $200 million in funding from its existing investor Magnetar Capital, highlighting investor interest in backing the infrastructure powering the generative AI boom. The funding, which valued the company at more than $2 billion, comes weeks after CoreWeave raised $221 million from investors including Magnetar Capital and Nvidia (NVDA.O). CoreWeave specializes in providing cloud computing services based on graphics processing units (GPUs), the category of chip pioneered by Nvidia that has become central to artificial intelligence (AI) services like OpenAI's ChatGPT. CoreWeave sells computing power to those AI companies, competing with cloud computing service providers such as Microsoft's (MSFT.O) Azure and Amazon's (AMZN.O) AWS. CoreWeave aims to stand out by building its data centers differently for AI work, using a networking technology called InfiniBand to link computers together instead of the Ethernet cables that are the current standard in most data centers, co-founder Brannin McBee said (see the sketch after this entry).
Persons: Brannin McBee, Krystal Hu, Stephen Nellis, Himani Sarkar Organizations: CoreWeave, Magnetar Capital, Nvidia, OpenAI, Microsoft, AWS, Thomson Locations: New York, San Francisco
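The interconnect detail matters because multi-node training synchronizes gradients across machines on every optimization step, so fabric bandwidth can bound throughput. Below is a minimal sketch of that communication pattern, not anything CoreWeave or Reuters published, assuming PyTorch with the NCCL backend; NCCL uses InfiniBand RDMA when the hardware exposes it and falls back to TCP over Ethernet otherwise. The 256 MB buffer size is an illustrative assumption.

    import os
    import torch
    import torch.distributed as dist

    def main():
        # torchrun sets RANK, LOCAL_RANK, WORLD_SIZE and MASTER_ADDR/PORT.
        dist.init_process_group(backend="nccl")
        torch.cuda.set_device(int(os.environ["LOCAL_RANK"]))
        # 256 MB of fp32 stand-in "gradients"; data-parallel training
        # all-reduces a buffer like this (or larger) every step.
        grad = torch.ones(64 * 1024 * 1024, device="cuda")
        dist.all_reduce(grad)  # cross-node traffic rides the cluster fabric
        torch.cuda.synchronize()
        if dist.get_rank() == 0:
            print("all-reduce done across", dist.get_world_size(), "GPUs")
        dist.destroy_process_group()

    if __name__ == "__main__":
        main()

Launched with, for example, torchrun --nnodes=2 --nproc_per_node=8 --rdzv_endpoint=head:29500 allreduce_demo.py. On an InfiniBand fabric the gradient buffer moves at RDMA speeds; over commodity Ethernet the same all-reduce becomes the bottleneck.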
Generative AI gobbles up reams of computing power, amplifying the urgency of Meta's capacity scramble, said five of the sources. That spring, executives also set about reorganizing Meta's AI units, naming two new heads of engineering in the process, including Janardhan, the author of the September memo. According to four of the sources, Meta did not prioritize building generative AI products until after the launch of ChatGPT in November. Zuckerberg announced a new top-level generative AI team in February that he said would "turbocharge" the company's work in the area. Carvill, the Meta spokesperson, said the company has been building generative AI products on different teams for more than a year.
The code is an AI model, an algorithm that is trained on sets of data and can then learn from new data to perform a variety of tasks. Databricks CEO Ali Ghodsi said the release was aimed at demonstrating a viable alternative to training a kind of AI model called a large language model with enormous resources and computing power. OpenAI, valued at $29 billion, trains its AI models with huge troves of data on a supercomputer from investor Microsoft Corp (MSFT.O). Databricks wants enterprises to train their own AI models using its software. "My belief is that in the end, you will make these models smaller, smaller and smaller, and they will be open-sourced," Ghodsi said.
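Since the snippet describes enterprises training models on their own data rather than renting a giant one, here is a minimal sketch of what that looks like in practice. It is not Databricks' actual pipeline; the model name, toy dataset, and hyperparameters are illustrative assumptions. It fine-tunes a small open causal language model with the Hugging Face transformers and datasets libraries.

    from datasets import Dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    model_name = "EleutherAI/pythia-70m"  # small open model as a stand-in
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    tokenizer.pad_token = tokenizer.eos_token  # GPT-style models lack a pad token
    model = AutoModelForCausalLM.from_pretrained(model_name)

    # Toy instruction data; an enterprise would substitute its own text.
    examples = Dataset.from_dict({"text": [
        "Instruction: Summarize the returns policy.\nResponse: Items may be returned within 30 days.",
        "Instruction: Greet a new customer.\nResponse: Welcome! How can we help?",
    ]})

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True, max_length=128)

    tokenized = examples.map(tokenize, batched=True, remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="out", num_train_epochs=1,
                               per_device_train_batch_size=2),
        train_dataset=tokenized,
        # mlm=False selects the standard next-token (causal LM) objective
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()

The same recipe scales in the direction Ghodsi describes: a smaller base model plus domain-specific data, rather than one enormous model trained on everything.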
Total: 3