Search results for: "Andrew Feldman"


15 mentions found


Valued at $4 billion in 2021, Cerebras is reportedly seeking to roughly double that in its IPO. The customer, G42, is backed by Microsoft, and it's entirely responsible for the $1.43 billion purchase commitment. G42 can pick up $500 million more in Cerebras shares if it commits to spend $5 billion on the company's computing clusters. The major Wall Street banks, for their part, are finding other ways to play in the burgeoning AI infrastructure market. Fitch, who said he sold out of his Nvidia stock years ago, told CNBC that the benefits outweigh the risks.
Persons: Andrew Feldman, Ramsey Cardy, David Golden, Mike Gallagher, Gina Raimondo, Peter Thiel, Eva Marie Uzcategui, Jim Fitch. Organizations: Cerebras Systems, Nvidia, Revolution Ventures, JPMorgan Chase, Goldman Sachs, Morgan Stanley, Microsoft, AstraZeneca, GlaxoSmithKline, Mayo Clinic, Treasury, Committee on Foreign Investment in the United States (CFIUS), Reuters, Chinese Communist Party, Citigroup, Barclays, BDO, KPMG, Deloitte, Ernst & Young, Riverstone Networks, CoreWeave, Clarium Capital Management LLC, Bloomberg, Getty, Mizuho Securities, CNBC. Locations: Toronto, U.S., Sunnyvale, California, Abu Dhabi, China, Miami, Florida, Los Angeles.
There are signs across AI models, chips, and new form factors that the market is getting frothy. Investors spent the summer wondering if top AI stocks could continue to justify soaring valuations in the face of absent returns from their massive AI spending. Now, signs have emerged that they're not yet done with generative AI mania. OpenAI reaches dizzying new heights: Sam Altman's OpenAI secured a $157 billion valuation after raising $6.6 billion in its latest funding round. In short, a lossmaking startup must justify its $157 billion valuation.
Persons: Andrew Feldman, Ramsey Cardy, Sam Altman, Elon Musk, Ilya Sutskever, Gary Marcus, David Sacks, Darius Rafieyan, Mira Murati, Mark Zuckerberg, Andrej Sokolow, Jensen Huang, Alex Heath, Rahul Prasad. Organizations: Cerebras, Nvidia, Microsoft, Saudi Aramco, Bloomberg, OpenAI, xAI, Financial Times, Anthropic, Craft Ventures, Tiger Global, The New York Times, Getty, Meta, Orion, Snapchat. Locations: Sunnyvale, Abu Dhabi, Silicon Valley.
Andrew Feldman, co-founder and CEO of Cerebras Systems, speaks at the Collision conference in Toronto on June 20, 2024. Cerebras competes with Nvidia, whose graphics processing units are the industry's choice for training and running AI models. In addition to selling chips, Cerebras offers cloud-based services that rely on its own computing clusters. For the full year of 2023, Cerebras reported a net loss of $127.2 million on revenue of $78.7 million. Cloud providers Amazon, Google and Microsoft have developed their own AI chips.
Persons: Andrew Feldman. Organizations: Cerebras Systems, Nasdaq, Nvidia, Google, Microsoft. Locations: Toronto, UAE.
Liang told Business Insider he expects 90% of AI computing workloads will be in inference in the not-too-distant future. That's why several startups are charging aggressively into the inference market — emphasizing where they might outperform the goliath in the space. Speed is an important factor when multiple AI models talk to each other and waiting for an answer can dampen the magic of generative AI. The number of tokens per second that can be consumed (when a prompt goes in) and generated (when a response comes out) is a common metric for AI computing speed. Cerebras's AI chip is roughly the size of a dinner plate.
Persons: Rodrigo Liang, Andrew Feldman, Colette Kress, Jensen Huang, Dylan Patel. Organizations: Business Insider, SambaNova Systems, Nvidia, ARM, AMD, Bernstein, Artificialanalysis.ai.
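The tokens-per-second figure described in the excerpt above is straightforward arithmetic: tokens processed divided by elapsed time, usually reported separately for consuming the prompt (prefill) and generating the response (decode). A minimal sketch follows; the token counts and timings are purely illustrative and do not describe any vendor's hardware.

```python
# Illustrative only: how a tokens-per-second throughput number is typically derived.

def tokens_per_second(n_tokens: int, elapsed_s: float) -> float:
    """Throughput: tokens consumed (prefill) or generated (decode) per second."""
    return n_tokens / elapsed_s

prompt_tokens = 1_024    # tokens consumed when the prompt goes in
output_tokens = 256      # tokens generated in the response
prefill_time_s = 0.08    # hypothetical time to process the prompt
decode_time_s = 1.30     # hypothetical time to stream the response

print(f"prefill: {tokens_per_second(prompt_tokens, prefill_time_s):,.0f} tokens/s")
print(f"decode:  {tokens_per_second(output_tokens, decode_time_s):,.0f} tokens/s")
```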
Few would envy the task of competing against market leader Nvidia, which has between 70% and 90% market share, according to Mizuho Securities. Still, Nvidia gets most of the hype — and the revenue. Though the company will not confirm it, The Information reported last month that Cerebras has confidentially filed for an IPO. Still, Feldman expects that even consumer-directed AI models will grow larger, and larger models need faster chips. Competing with that mindshare is an uphill battle as Nvidia faces more competition and goes on defense.
Persons: Andrew Feldman, Ramsey Cardy, David, Susan. Organizations: Cerebras, Nvidia, Mizuho Securities, Intel, Google, AMD, GSK, AstraZeneca, IBM, Eclipse Ventures. Locations: Palo Alto, California.
So far, the only part of Earth AI seems hell-bent on dominating is the power grid. AI data centers are doubling the pace of electricity demand growth in the US to the extent that demand could exceed supply in just two years without action, according to Bernstein Research. The potential shortfall could mean higher prices for the computing power AI developers of all sizes are clamoring for, along with ample opportunity for investors willing to build up supply. How are AI companies planning ahead? It's not just the total power needed for AI computing infrastructure; it's the unique cadence of the power need and the cooling AI chips require. Amazon is clearing some of these hurdles by colocating some data centers with nuclear power sites.
Persons: Jensen Huang, Agrawal, Andrew Feldman. Organizations: Bernstein Research, Nvidia, Vertiv, Lambda, Cerebras Systems. Locations: UAE, South Africa, Kenya, Nigeria, Egypt, Iceland.
The supercomputer, which was unveiled on Thursday by Cerebras, a Silicon Valley start-up, was built with the company’s specialized chips, which are designed to power artificial intelligence products. The chips stand out for their size — like that of a dinner plate, or 56 times as large as a chip commonly used for A.I. Cerebras said it had built the supercomputer for G42, an A.I. G42 said it planned to use the supercomputer to create and power A.I. “What we’re showing here is that there is an opportunity to build a very large, dedicated A.I.
Persons: Andrew Feldman. Organizations: Cerebras. Locations: Santa Clara, Calif.
[1/2] A view of Condor Galaxy supercomputing systems for artificial intelligence work made by Cerebras Systems, in Santa Clara, California, U.S., in this undated handout photo received on July 19, 2023. Courtesy of Rebecca Lewington of Cerebras Systems/Handout via REUTERS. July 20 (Reuters) - Cerebras Systems on Thursday said that it has signed an approximately $100 million deal to supply the first of three artificial intelligence (AI) supercomputers to the United Arab Emirates-based technology group G42. "Cerebras has what they call a 'white glove' service that made it easy for us" to build AI systems on its machines, G42 Cloud CEO Talal AlKaissi told Reuters. The contract to complete the first of the three systems announced on Thursday is worth about $100 million, Cerebras CEO Andrew Feldman said. "What we're saying is that the $100 million contract takes us through Condor Galaxy 1... That's the unit, the building block."
Persons: Rebecca Lewington, Talal AlKaissi, Andrew Feldman, Stephen Nellis, Krystal Hu, Rashmi. Organizations: Cerebras Systems, Nvidia Corp, Condor Galaxy, Mubadala, Thomson Reuters. Locations: Santa Clara, California, U.S., United Arab Emirates, Abu Dhabi, San Francisco, New York.
[1/2] A view of Condor Galaxy supercomputing systems for artificial intelligence work made by Cerebras Systems, in Santa Clara, California, U.S., in this undated handout photo received on July 19, 2023. Abu Dhabi-based G42, a tech conglomerate with nine operating companies that include datacenter and cloud service businesses, says it plans to use the Cerebras systems to sell AI computing services to health care and energy companies. G42 has raised $800 million from U.S. tech investment firm Silver Lake, which has backing from Mubadala, the UAE's sovereign wealth fund. "Cerebras has what they call a 'white glove' service that made it easy for us" to build AI systems on its machines, G42 Cloud CEO Talal AlKaissi told Reuters. G42 Cloud's AlKaissi declined to comment on the terms of the deal.
Persons: Rebecca Lewington, Andrew Feldman, Talal AlKaissi, Stephen Nellis, Krystal Hu, Rashmi. Organizations: Cerebras Systems, Nvidia Corp, Condor Galaxy, Silver Lake, Mubadala, Thomson Reuters. Locations: Santa Clara, California, U.S., United Arab Emirates, Abu Dhabi, San Francisco, New York.
Producer Jennifer Lawrence poses. REUTERS/Sarah Meyssonnier/File Photo. LONDON, June 12 (Reuters) - Oscar-winner Jennifer Lawrence credits her new R-rated comedy "No Hard Feelings" for making her want to get back to work. Lawrence, 32, who took a two-year break from acting between 2019 and 2021 and had a son in early 2022, found the script too good to pass up. "So, I quickly changed my tune and we were on set four months later," Lawrence, who also produced the movie, said. In "No Hard Feelings," Lawrence plays Maddie, an Uber driver who finds herself car-less and at risk of losing her home. "No Hard Feelings" will be in cinemas globally from mid-June.
Persons: Jennifer Lawrence, Sarah Meyssonnier, Maddie, Percy, Gene Stupnitsky, Andrew Feldman, Hanna Rantala, Bill Berkrot. Organizations: Uber, Harvard, Thomson Reuters. Locations: Cannes, France, London.
OAKLAND, California, March 28 (Reuters) - Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration. Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from smaller 111 million parameter language models to a larger 13 billion parameter model. Cerebras said the smaller models can be deployed on phones or smart speakers while the bigger ones run on PCs or servers, although complex tasks like large passage summarization require larger models. Most of the AI models today are trained on Nvidia Corp's (NVDA.O) chips, but more and more startups like Cerebras are trying to take share in that market. The models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, said Feldman.
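For readers who want to experiment with the released checkpoints, here is a hedged sketch of loading one through the Hugging Face transformers library. The model id "cerebras/Cerebras-GPT-111M" is an assumption about how the 111 million parameter checkpoint is published; substitute whatever id the actual release uses.

```python
# Sketch, assuming the checkpoints are published on the Hugging Face Hub under
# ids like "cerebras/Cerebras-GPT-111M" (an assumption, not confirmed by the article).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "cerebras/Cerebras-GPT-111M"  # assumed Hub id for the smallest model
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("Generative AI is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```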
Generative AI and large language models like OpenAI’s ChatGPT require massive amounts of computing power to run, and typically rely on chips like Nvidia’s graphics-processing units, or GPUs, that are specialized for these types of calculations. Graphcore sells primarily to AI startups looking to build and train models at lower cost, he said, and the company is benefiting from the proliferation of those startups. Shane Rau, who leads International Data Corp.’s semiconductor research, said chip startups are increasingly pivoting to focus their products on supporting large language models. Still, he added, “you’re going to see a combination of real adaptation and marketing.”“There will be the pressure to say: ‘Hey, we’re already relevant, our AI chip technology’s already relevant to generative AI’,” said Mr. Rau. Some chip makers say they expect yet another surge in demand once businesses more widely adopt generative AI.
Beneath the buzz, the next-generation developer framework Ray was key in the viral model's training. "ChatGPT combined a lot of the previous work on large language models with reinforcement as well." Before deploying Ray, OpenAI used a hodgepodge set of custom tools built on top of the "neural programmer-interpreter" model. All these tools, Ray and JAX included, are in service to a new generation of combustion engines for the internet called large language models. Multiple companies, both startups and giants, are building their own large language models including Meta, Hugging Face, OpenAI, and Google.
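As an illustration of the Ray programming model the excerpt refers to, the sketch below is my own minimal example, not OpenAI's training code: Ray's @ray.remote primitive turns ordinary Python functions into tasks that can be scheduled in parallel, locally or across a cluster.

```python
# Minimal Ray example: parallel tasks via @ray.remote (illustrative workload only).
import ray

ray.init()  # starts a local Ray runtime; on a cluster this would connect to it

@ray.remote
def score_batch(batch: list[str]) -> int:
    # Stand-in for real work, e.g. tokenizing or scoring a shard of training text.
    return sum(len(text.split()) for text in batch)

shards = [["large language models"], ["reinforcement learning", "with human feedback"]]
futures = [score_batch.remote(shard) for shard in shards]  # scheduled in parallel
print(ray.get(futures))  # -> [3, 5]

ray.shutdown()
```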
[1/2] Startup Cerebras Systems' new AI supercomputer Andromeda is seen at a data center in Santa Clara, California, U.S. October 2022. Rebecca Lewington/Cerebras Systems/Handout via REUTERS. OAKLAND, Calif., Nov 14 (Reuters) - Silicon Valley startup Cerebras Systems, known in the industry for its dinner plate-sized chip made for artificial intelligence work, on Monday unveiled its AI supercomputer called Andromeda, which is now available for commercial and academic research. Andromeda is built by linking up 16 Cerebras CS-2 systems, the company's latest AI computer built around the over-sized chip called the Wafer-Scale Engine 2. "This is less than $35 million," said Andrew Feldman, founder and CEO of Cerebras when asked about the Frontier supercomputer. Feldman said Andromeda is owned by Cerebras and built at a high performance data center in Santa Clara, California called Colovore.
One capable of defying even time has just been announced. The fastest processor ever built has just been unveiled. It is called the CS-1 Cerebras and was built by the company of the same name together with the Department of Energy. It has 400,000 cores and is faster than a supercomputer. For example, it can start a simulation of the reaction in a power plant's reactor core and finish the simulation before that reaction itself has finished.
Persons: Andrew Feldman. Organizations: Department of Energy, Cerebras Systems.
Total: 15