
Search results for: "Vultr"


3 mentions found


The company started buying Nvidia GPUs in 2021, getting into the AI cloud computing business earlier than most. CEO J.J. Kardwell describes the company as the largest privately held cloud computing platform in the world. That early start also makes the Vultr boss well-positioned to explain a phrase frequently on the lips of AI luminaries like Nvidia CEO Jensen Huang: sovereign AI. Sovereign cloud is the delivery of cloud infrastructure as a service, typically inside a country. BI: AI computing is extremely energy intensive, and concerns about the US grid abound.
Persons: J.J. Kardwell, Jensen Huang Organizations: Vultr, Service, Nvidia, Singtel, Department of Defense Locations: Singapore, Malaysia
But the real focus will be on whether widening U.S. curbs on sales of its high-end chips to China could hamper that run. The results will also be a major test for the AI-powered rally that has helped drive up the U.S. stock market this year, with the Philadelphia semiconductor index (.SOX) up nearly 50% in 2023. The Biden administration last month banned China sales of the H800 and A800 chips that Nvidia had created after previous curbs on exports to the country. Before the latest China export curbs, demand for Nvidia's H800 chip, a slower version of its flagship AI chip, had outpaced demand for rivals' offerings because it was still better than the alternatives. "It's possible Nvidia's massive growth will make revenue from China less material over time," Morningstar analysts said.
Persons: Dado Ruvic, Kyle Rodda, Biden, Bernstein, Stacy Rasgon, Arsheeya Bajwa, Chavi Mehta, Devika Organizations: NVIDIA, REUTERS, Nvidia, Wall Street Journal, Reuters Graphics, LSEG, AMD, Web Services, Google, Microsoft, Oracle Cloud Infrastructure, Lambda, Morningstar, Thomson Locations: China, Philadelphia, Bengaluru
The H200, as the chip is called, will overtake Nvidia's current top H100 chip. The primary upgrade is more high-bandwidth memory, one of the costliest parts of the chip, which determines how much data it can process quickly. Nvidia dominates the market for AI chips and powers OpenAI's ChatGPT service and many similar generative AI services that respond to queries with human-like writing. The H200 has 141 gigabytes of high-bandwidth memory, up from 80 gigabytes in the previous H100. Nvidia also buys memory from Korea's SK Hynix (000660.KS), which said last month that AI chips are helping to revive sales.
Persons: Stephen Nellis, Sam Holmes Organizations: Nvidia, Google, Oracle, Nvidia's, Micron Technology, Korea's SK Hynix, Amazon Web Services, Microsoft, Oracle Cloud Infrastructure, Lambda, Thomson Locations: San Francisco
Total: 3