Kim Ziesemer
cerebras@zmcommunications.com
(BUSINESS WIRE)--Cerebras Systems today announced the signing of a Memorandum of Understanding (MoU) with Aramco, under which the two companies aim to bring high-performance AI inference to industries, universities, and enterprises in Saudi Arabia. Aramco plans to build, train, and deploy world-class large language models (LLMs) using Cerebras’ industry-leading CS-3 systems to help accelerate AI innovation.
Aramco’s new high-performance AI computing infrastructure is expected to focus on advancing AI adoption and providing local industries, enterprises, and universities with access to Cerebras’ CS-3 AI systems. These organizations aim to use Cerebras’ industry-leading AI systems to develop cutting-edge LLMs, sized and tuned for optimal performance to meet local business requirements.
Andrew Feldman, Cerebras co-founder and CEO, said: “We are privileged to be working with Aramco to bring high performance, low latency compute and new AI applications to local industries, enterprises, and universities. Together, we plan to accelerate the possibilities of AI, helping to enhance capabilities and create new opportunities for local businesses to foster creativity, unlock value, and promote sustainability.”
Nabil Al Nuaim, Aramco SVP of Digital & Information Technology, said: “This MoU with Cerebras aims to accelerate our abilities to develop an AI-powered digital innovation economy in Saudi Arabia by helping to support the integration of advanced AI solutions, unlocking new opportunities for the country and localizing cutting-edge technologies with regional expertise.”
Under the new MoU, Aramco plans to equip its cloud computing business with the new CS-3 systems to accelerate LLM and AI application development.
For more information, please visit https://cerebras.ai/.
About Cerebras Systems
Cerebras Systems is a team of pioneering computer architects, computer scientists, deep learning researchers, and engineers of all types. We have come together to accelerate generative AI by building a new class of AI supercomputer from the ground up. Our flagship product, the CS-3 system, is powered by the world’s largest and fastest AI processor, our Wafer-Scale Engine-3. CS-3s are quickly and easily clustered together to make the largest AI supercomputers in the world, and they make placing models on the supercomputers dead simple by avoiding the complexity of distributed computing. Leading corporations, research institutions, and governments use Cerebras solutions to develop pathbreaking proprietary models and to train open-source models with millions of downloads. Cerebras solutions are available through the Cerebras Cloud and on-premises. For further information, visit www.cerebras.ai or follow us on LinkedIn or X.
View source version on businesswire.com: https://www.businesswire.com/news/home/20240911851268/en/