
Search results for: "Ian Buck"


4 mentions found


Nvidia launches new AI chip configuration
  2023-08-08 | by Max A. Cherney | www.reuters.com | time to read: +2 min
Aug 8 (Reuters) - Nvidia (NVDA.O) announced a new configuration on Tuesday for its advanced artificial intelligence chips that is designed to speed up generative AI applications. The configuration is optimized to perform AI inference functions that effectively power generative AI applications such as ChatGPT. Nvidia's Grace Hopper Superchip design stitches together one of the company's H100 graphics processing units (GPU) with an Nvidia-designed central processor. The underlying AI models that power the generative AI apps that are capable of producing human-like text and images continue to grow in size. Nvidia plans to sell two flavors: a version that includes two chips that customers can integrate into systems, and a complete server system that combines two Grace Hopper designs.
Persons: Robert Galbraith, Grace Hopper Superchip, Ian Buck, Nvidia's Grace Hopper, Buck, Grace Hopper, Max A, Marguerita Choy | Organizations: Nvidia, REUTERS, Thomson | Locations: Santa Clara, California, San Francisco
Currently, Nvidia dominates the market for AI chips with over 80% market share, according to some estimates. But Nvidia's chips are in short supply as tech giants, cloud providers and startups vie for GPU capacity to develop their own AI models. Nvidia's new chip, the GH200, has the same GPU as the company's current highest-end AI chip, the H100. Oftentimes, the process of working with AI models is split into at least two parts: training and inference. Companies including Google and Amazon are also designing their own custom AI chips for inference.
Persons: Jensen Huang, Walid Berrazeg, Bard, OpenAI's, Huang, Ian Buck, Buck | Organizations: Getty, Nvidia, AMD, Google, ARM | Locations: Taiwan
Nvidia is the world's top maker of graphics processing units (GPUs), which are in high demand because they can be used to speed up artificial intelligence work. But Nvidia's GPU chips are typically paired with what is called a central processing unit (CPU), a market that has been dominated by Intel and AMD for decades. The University of Bristol system will be used for climate science and drug discovery research, among other things. "That's actually six times more performance and energy efficiency than the university's previous system, Isambard 2," Ian Buck, general manager and vice president of accelerated computing at Nvidia, said during a press briefing. Reporting by Stephen Nellis in San Francisco; Editing by Bill Berkrot
OAKLAND, Calif., Nov 16 (Reuters) - U.S. chip designer and computing firm Nvidia Corp (NVDA.O) on Wednesday said it is teaming up with Microsoft Corp (MSFT.O) to build a “massive” computer to handle intense artificial intelligence computing work in the cloud. The AI computer will operate on Microsoft’s Azure cloud, using tens of thousands of graphics processing units (GPUs), Nvidia’s most powerful H100 and its A100 chips. “We're seeing a broad groundswell of AI adoption ... and the need for applying AI for enterprise use cases.” In addition to selling Microsoft the chips, Nvidia said it will partner with the software and cloud giant to develop AI models. Buck said Nvidia would also be a customer of Microsoft’s AI cloud computer and develop AI applications on it to offer services to customers. This is important because heavy AI computing work requires thousands of chips to work together across several servers.
Total: 4