OpenAI jumps from Microsoft's bed into Amazon Bedrock • The Register

OpenAI’s best models are officially available on Amazon Web Services’ Bedrock managed inference and agent platform.

The tie-up, announced Tuesday at an AWS event in San Francisco, gives customers an alternative route to the growing GPT library from Sam Altman's outfit without having to expose their data to OpenAI's own APIs.

Amazon says companies want to build AI-enhanced agents and other tools using OpenAI's models, but have been held back by security policies, data privacy rules, and sovereignty concerns.

By offering its models through a trusted third party, OpenAI can sidestep many of these concerns. Bringing its models to AWS also means customers have fewer hoops to jump through to adopt them, since Amazon has already wired its services into Bedrock.

Alongside the managed inference service, OpenAI's models will also be available on Amazon's Bedrock managed agents and AgentCore platforms, which provide tools and blueprints for building enterprise agents and connecting them to corporate data and services.

AWS used the same event to announce a raft of agentic AI tools of its own, including Quick, a personalized assistant similar to Microsoft Copilot but spanning applications from multiple vendors, and new versions of Connect, which began life as Amazon's hosted contact-center product and is expanding to help customers automate tasks in human resources, healthcare, and supply chain management.

Finally, companies will be able to connect OpenAI’s Codex code agent to models running in AWS data centers, providing some assurance that their codebases won’t end up in Altman’s next model.

For now, access to OpenAI's models on AWS remains in limited preview, with the second-most-recent model, GPT-5.4, available now; the latest, GPT-5.5, will arrive in the coming weeks, according to remarks by AWS CEO Matt Garman at Tuesday's event.

Tuesday's announcement delivers on a promise OpenAI made in February to bring its models to AWS in exchange for up to $35 billion in fresh funding. To collect all of that, however, OpenAI will need to run two gigawatts' worth of Amazon's Trainium accelerators.

Much of this also appears to have been made possible by Microsoft's willingness to loosen its exclusive hold on OpenAI in exchange for being released from its revenue-sharing commitments.

Under the new terms, Microsoft remains OpenAI's primary cloud provider and retains access to the model developer's technology. OpenAI, meanwhile, is free to sleep with whomever it likes, be it Amazon or anyone else.

As such, the new terms mean that OpenAI’s partnership with Amazon may not be a one-off, but rather a template for future infrastructure and services deals. ®