OpenAI Partners With AWS in $38B Cloud Pact to Scale AI Workloads Using Nvidia GPUs
Amazon Web Services (AWS) and OpenAI on Monday announced a multi-year strategic partnership under which AWS will provide the infrastructure to run and scale the ChatGPT maker's core artificial intelligence (AI) workloads, effective immediately.
The companies said that under the $38 billion agreement, which is expected to grow over the next seven years, OpenAI will access AWS compute comprising hundreds of thousands of state-of-the-art Nvidia (NVDA) graphics processing units (GPUs).
The deployment also gives OpenAI the ability to expand to tens of millions of CPUs to rapidly scale agentic workloads.

“Clustering the NVIDIA GPUs, both GB200s and GB300s, via Amazon EC2 UltraServers on the same network enables low-latency performance across interconnected systems, allowing OpenAI to efficiently run workloads with optimal performance,” Amazon said.