Search results for: "Cerebras"


15 mentions found


G42, founded by the emirate of Abu Dhabi, said it would utilize OpenAI's generative AI models in areas including financial services, energy, healthcare and public services. Mubadala-backed G42 has been at the forefront of AI adoption in the UAE. Abu Dhabi's Advanced Technology Research Council (ATRC) also made its Falcon 40B large language model (LLM) open source for research and commercial use in May. G42 said it would use Microsoft's (MSFT.O) Azure data centers as part of its AI infrastructure to boost regional adoption. OpenAI CEO Sam Altman had visited Abu Dhabi in June as part of his AI "world tour", where he advocated for AI developments to be regulated by international bodies.
Startup Cerebras Systems' new AI supercomputer Andromeda is seen at a data center in Santa Clara, California, U.S., October 2022. Rebecca Lewington/Cerebras Systems/Handout via REUTERS/File Photo. Aug 30 (Reuters) - A group of engineers, researchers and a Silicon Valley-based chip company collaborated to release advanced Arabic-language software that can power generative AI applications. The new large language model, called Jais, contains 13 billion parameters and was trained on a large batch of data combining Arabic and English, a portion of which is computer code. The new language model was created with the help of supercomputers produced by Silicon Valley-based Cerebras Systems, which designs dinner plate-sized chips that compete with Nvidia's (NVDA.O) powerful AI hardware. The group trained the Jais model on a Cerebras supercomputer called Condor Galaxy.
A view of Condor Galaxy supercomputing systems for artificial intelligence work made by Cerebras Systems, in Santa Clara, California, U.S., in this undated handout photo received on July 19, 2023. Courtesy of Rebecca Lewington of Cerebras Systems/Handout via REUTERS. July 20 (Reuters) - Cerebras Systems on Thursday said that it has signed an approximately $100 million deal to supply the first of three artificial intelligence (AI) supercomputers to the United Arab Emirates-based technology group G42. "Cerebras has what they call a 'white glove' service that made it easy for us" to build AI systems on its machines, G42 Cloud CEO Talal AlKaissi told Reuters. The contract to complete the first of the three systems announced on Thursday is worth about $100 million, Cerebras CEO Andrew Feldman said. "What we're saying is that the $100 million contract takes us through Condor Galaxy 1... That's the unit, the building block."
Abu Dhabi-based G42, a tech conglomerate with nine operating companies that include datacenter and cloud service businesses, says it plans to use the Cerebras systems to sell AI computing services to health care and energy companies. G42 has raised $800 million from U.S. tech investment firm Silver Lake, which has backing from Mubadala, the UAE's sovereign wealth fund. "Cerebras has what they call a 'white glove' service that made it easy for us" to build AI systems on its machines, G42 Cloud CEO Talal AlKaissi told Reuters. G42 Cloud's AlKaissi declined to comment on the terms of the deal.
The supercomputer, which was unveiled on Thursday by Cerebras, a Silicon Valley start-up, was built with the company’s specialized chips, which are designed to power artificial intelligence products. The chips stand out for their size — like that of a dinner plate, or 56 times as large as a chip commonly used for A.I. Cerebras said it had built the supercomputer for G42, an A.I. company based in Abu Dhabi. G42 said it planned to use the supercomputer to create and power A.I. products. “What we’re showing here is that there is an opportunity to build a very large, dedicated A.I. ...”
She spoke following a keynote presentation in San Francisco during which Su showed an AI system on the MI300X chip writing a poem about the city. "The more memory that you have, the larger the set of models" the chip can handle, Su said. But unlike past presentations where AMD has talked up a major customer for a new chip, AMD did not say who will adopt the MI300X or a smaller version called the MI300A. Nvidia, whose shares have surged 170% so far this year, dominates the AI computing market with a market share of 80% to 95%, according to analysts. Aside from the AI market, AMD said it has started shipping high volumes of a general purpose central processor chip called "Bergamo" to companies such as Meta Platforms (META.O).
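To make the memory point concrete, here is a rough back-of-the-envelope sketch (the parameter counts and the 2-bytes-per-weight assumption are illustrative, not figures from the article): a model's weights alone occupy roughly the parameter count times the bytes per parameter, before activations or any other overhead.

    # Back-of-the-envelope sketch: accelerator memory needed just to hold model weights.
    # Assumes 2 bytes per parameter (fp16/bf16); activations, optimizer state and
    # KV-cache overhead are ignored, so real requirements are higher.
    def weight_memory_gb(num_params: float, bytes_per_param: int = 2) -> float:
        return num_params * bytes_per_param / 1e9

    for name, params in [("111M", 111e6), ("13B", 13e9), ("40B", 40e9)]:
        print(f"{name:>4} parameters -> ~{weight_memory_gb(params):.1f} GB of weights in fp16")

By that arithmetic, a 13-billion-parameter model such as Jais needs on the order of 26 GB of memory for its weights alone, which is why chips with more on-board memory can hold larger models without splitting them across devices.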
Chief Executive Lisa Su said AMD has started shipping its "Bergamo" central processor. They spoke at an event in San Francisco where AMD was set to introduce its new artificial intelligence chip. Nvidia dominates the AI computing market with 80% to 95% of market share, according to analysts. Last month, Nvidia's market capitalization briefly touched $1 trillion after the company said it expected a jump in revenue after it secured new chip supplies to meet surging demand. Kevin Krewell, principal analyst at TIRIAS Research, said AMD will have to catch up on both software and research trends.
AMD Chief Executive Lisa Su will give a keynote address at an event in San Francisco on the company's strategy in the data center and AI markets. Analysts expect fresh details about a chip called the MI300, AMD's most advanced graphics processing unit (GPU), the category of chips that companies like OpenAI use to develop products such as ChatGPT. Nvidia dominates the AI computing market with 80% to 95% of market share, according to analysts. Last month, Nvidia's market capitalization briefly touched $1 trillion after the company said it expected a jump in revenue after it secured new chip supplies to meet surging demand. Kevin Krewell, principal analyst at TIRIAS Research, said AMD will have to catch up on both software and research trends.
OAKLAND, California, March 28 (Reuters) - Artificial intelligence chip startup Cerebras Systems on Tuesday said it released open source ChatGPT-like models for the research and business community to use for free in an effort to foster more collaboration. Silicon Valley-based Cerebras released seven models, all trained on its AI supercomputer called Andromeda, ranging from smaller 111 million parameter language models to a larger 13 billion parameter model. Cerebras said the smaller models can be deployed on phones or smart speakers while the bigger ones run on PCs or servers, although complex tasks like large passage summarization require larger models. Most of the AI models today are trained on Nvidia Corp's (NVDA.O) chips, but more and more startups like Cerebras are trying to take share in that market. The models trained on Cerebras machines can also be used on Nvidia systems for further training or customization, said Cerebras CEO Andrew Feldman.
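The smaller checkpoints are easy to try on a single machine with the Hugging Face transformers library; the sketch below assumes the 111 million parameter model is published on the Hugging Face Hub under the name cerebras/Cerebras-GPT-111M, which is an assumption rather than something stated in the article.

    # Minimal sketch: generate text with one of the small open-source checkpoints.
    # The model id "cerebras/Cerebras-GPT-111M" is assumed, not confirmed above.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_name = "cerebras/Cerebras-GPT-111M"  # assumed Hugging Face Hub id
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForCausalLM.from_pretrained(model_name)

    inputs = tokenizer("Generative AI is", return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.95)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))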
Generative AI and large language models like OpenAI’s ChatGPT require massive amounts of computing power to run, and typically rely on chips like Nvidia’s graphics-processing units, or GPUs, that are specialized for these types of calculations. Graphcore sells primarily to AI startups looking to build and train models at lower cost, he said, and the company is benefiting from the proliferation of those startups. Shane Rau, who leads International Data Corp.’s semiconductor research, said chip startups are increasingly pivoting to focus their products on supporting large language models. Still, he added, “you’re going to see a combination of real adaptation and marketing.” “There will be the pressure to say: ‘Hey, we’re already relevant, our AI chip technology’s already relevant to generative AI’,” said Mr. Rau. Some chip makers say they expect yet another surge in demand once businesses more widely adopt generative AI.
Beneath the buzz, the next-generation developer framework Ray was key in the viral model's training. "ChatGPT combined a lot of the previous work on large language models with reinforcement as well." Before deploying Ray, OpenAI used a hodgepodge of custom tools built on top of the "neural programmer-interpreter" model. All these tools, Ray and JAX included, are in service to a new generation of combustion engines for the internet called large language models. Multiple companies, both startups and giants, are building their own large language models, including Meta, Hugging Face, OpenAI, and Google.
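For readers unfamiliar with it, Ray is a general-purpose framework for distributing Python work across many machines; the minimal sketch below shows its task API in generic form and is not OpenAI's actual training code.

    # Minimal Ray sketch: fan a function out across a cluster and gather the results.
    # Generic illustration of Ray's task API; not code from OpenAI's pipeline.
    import ray

    ray.init()  # start or connect to a Ray cluster (local by default)

    @ray.remote
    def score_shard(shard_id: int) -> float:
        # Placeholder for real work, e.g. evaluating a model on one data shard.
        return shard_id * 0.1

    futures = [score_shard.remote(i) for i in range(8)]  # tasks run in parallel
    print(ray.get(futures))  # blocks until every task has finished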
OAKLAND, Calif., Nov 14 (Reuters) - Silicon Valley startup Cerebras Systems, known in the industry for its dinner plate-sized chip made for artificial intelligence work, on Monday unveiled its AI supercomputer called Andromeda, which is now available for commercial and academic research. Andromeda is built by linking up 16 Cerebras CS-2 systems, the company's latest AI computer built around the over-sized chip called the Wafer-Scale Engine 2. “This is less than $35 million,” said Andrew Feldman, founder and CEO of Cerebras, when asked about the Frontier supercomputer. Feldman said Andromeda is owned by Cerebras and built at a high performance data center in Santa Clara, California, called Colovore.
Nov 14 (Reuters) - Silicon Valley artificial intelligence (AI) computing startup SambaNova Systems said on Monday it delivered eight units of its latest AI system to the U.S. Argonne National Laboratory, which is expanding its AI offering to researchers. With AI work taking center stage in research, Argonne National Laboratory has been testing out various AI chips and systems, Rick Stevens, associate lab director for computing, environment and life sciences at Argonne lab, told Reuters. In addition to SambaNova, AI systems from startups including Cerebras Systems, Groq Inc, Graphcore, and Intel Corp (INTC.O)-owned Habana Labs have been tested. "We're working with new emerging AI hardware architectures, and we get early hardware and then we play with it, we use it on our science applications," said Stevens. Stevens said the lab was evaluating AI hardware for its next supercomputer to see if SambaNova and others can be included.
Soumith Chintala is a creator and the lead of Facebook's core machine learning tool, PyTorch. Some of today's most popular emerging AI tools, such as OpenAI's text generation tool GPT-3, were made possible by an AI technique called Transformers. And in an industry where machine learning tools and techniques are evolving at a blistering pace, a half-decade might as well be half a century. Nvidia has traditionally held a dominant position thanks to the widespread adoption of GPUs in machine learning. JAX excels at splitting complex machine learning tasks across multiple pieces of hardware, drastically simplifying the unwieldy existing tools and making it easier to manage increasingly large machine learning problems.
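As a concrete illustration of that splitting, the following minimal JAX sketch (generic, not code from any company mentioned above) maps one computation across every locally available device with jax.pmap; on a machine with a single device it simply runs in one place.

    # Minimal JAX sketch: run the same computation on every available device in parallel.
    import jax
    import jax.numpy as jnp

    n_devices = jax.local_device_count()

    # One slice of data per device (the leading axis must equal the device count).
    x = jnp.arange(n_devices * 4.0).reshape(n_devices, 4)

    # pmap compiles the function once and executes it in parallel on each device.
    parallel_square_sum = jax.pmap(lambda v: jnp.sum(v ** 2))
    print(parallel_square_sum(x))  # one partial result per device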
One capable of defying even time has just been announced. The fastest processor ever built has just been presented. It is called the Cerebras CS-1 and was built by the company of the same name together with the Department of Energy. It has 400,000 cores and is faster than a supercomputer. For example, it can start simulating a reaction in a power plant's core and finish the simulation before that reaction itself ends.
Total: 15