April 28 - OpenAI is making its latest artificial intelligence models and the Codex coding agent available through Amazon's cloud platform, the companies announced on Tuesday. In addition, OpenAI said it is introducing a service that enables developers to create production-ready AI agents using its most advanced models on Amazon Web Services.
At an event in San Francisco, Matt Garman, CEO of AWS, said the change reflects customers' long-standing request to keep their AI workloads and data within the AWS environment. "This is what our customers have been asking for a really long time. Their production applications run in AWS. Their data is AWS," he said, adding that customers will no longer need to leave AWS to access the AI models they want.
The announcement came a day after OpenAI and Microsoft renegotiated a contract that had previously allowed Microsoft to exclusively sell OpenAI's models on its Azure cloud platform. According to the companies, the revised terms create the opportunity for OpenAI to form additional partnerships with cloud providers, including Amazon and Google.
Amazon and OpenAI have deepened commercial ties in recent months. Amazon made a $50 billion investment in OpenAI. In return, OpenAI committed to spending $100 billion on Amazon Web Services over the next eight years. The companies also agreed that OpenAI will be able to access two gigawatts of compute capacity powered by Amazon's in-house Trainium AI chips.
The deal arrives as OpenAI seeks to secure more computing capacity to expand its enterprise business. The company faces competition from Anthropic, whose Claude models have drawn steady enterprise interest and helped establish that startup as a leading rival in the AI market.
At the time of reporting, Microsoft shares were up 0.96% while Amazon shares were down 1.12%.
Context and implications
The moves mean OpenAI's frontier models and developer tools will be available inside Amazon's cloud ecosystem, and OpenAI has committed significant future spending on AWS. AWS has also pledged substantial compute capacity through Trainium chips for OpenAI's use. The companies described the arrangements as catering to customer demand for staying within a single cloud environment for both applications and data.
Details on specific deployment timelines for particular models or the developer service were not provided beyond the general rollout announcement.