AWS adds Mistral, Google, and NVIDIA models to Bedrock in major expansion

AWS has pushed 18 new models onto Amazon Bedrock in what it’s calling its largest single expansion of the platform, a move that reads less like innovation and more like strategic hedging. The company’s pitch is flexibility: swap models without rewriting code, test everything, pick what works. It’s a sensible approach in a market where model performance shifts quarterly and vendor lock-in is a genuine concern.

The headline additions are Mistral AI’s Mistral Large 3 and Ministral 3, both available first on Bedrock. Mistral Large 3 is optimised for long-context work and multimodal tasks, while Ministral 3 is positioned as compact and general-purpose. AWS has also thrown in Google’s Gemma 3, MiniMax’s M2, NVIDIA’s Nemotron, and OpenAI’s GPT OSS Safeguard. It’s a broad spread, touching open-weight models from multiple providers.

Alongside the third-party additions, AWS has released its Nova 2 family, which it claims delivers “industry-leading price-performance” across reasoning, multimodal processing, and conversational AI. That’s a convenient claim when you control the infrastructure and the pricing. Nova 2’s real test won’t be AWS’s internal benchmarks but whether customers find it cheaper and better than running Mistral or Google models on the same platform.

The strategy here isn’t subtle. AWS is positioning Bedrock as the default model marketplace, a place where organisations can test and deploy without committing to a single vendor. It’s a response to the reality that most enterprises don’t want to build their own model infrastructure and don’t trust any single AI provider to dominate their stack. By offering 18 models at once, AWS is making itself indispensable as the distribution layer.

The open-weight focus deserves attention. AWS has clearly decided that open models are commercially viable and that customers want the option to fine-tune and control their deployments. This isn’t altruism; it’s a bet that open-weight models will eat into proprietary offerings and that AWS wants to be the platform where that happens. It also hedges against any single model provider becoming too powerful or too expensive.

For South African organisations evaluating AI infrastructure, Bedrock’s model diversity is appealing but not without trade-offs. Access to 18 models sounds like freedom, but it also means decision fatigue and the operational burden of figuring out which model suits which task. AWS’s swap-without-rewriting promise is useful, but only if your team has the capacity to test and benchmark properly. The platform’s value increases with scale, which means smaller operations might find themselves paying for optionality they don’t use.
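In practice, the swap-without-rewriting promise rests on Bedrock’s unified Converse API, where the request shape is identical across models and only the model ID changes. A minimal sketch in Python with boto3 shows the idea; the model IDs in the dictionary are illustrative placeholders (the real identifiers come from the Bedrock model catalogue), and the live call assumes configured AWS credentials and model access:

```python
# Hypothetical model IDs for illustration only -- look up the actual
# identifiers in the Bedrock model catalogue for your region.
MODEL_IDS = {
    "mistral-large-3": "mistral.mistral-large-3-v1:0",
    "nova-2": "amazon.nova-2-v1:0",
}

def build_request(model_key: str, prompt: str) -> dict:
    """Build a Converse API request; only the modelId varies per model."""
    return {
        "modelId": MODEL_IDS[model_key],
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
    }

def ask(model_key: str, prompt: str) -> str:
    """Send the same prompt to any Bedrock model via the Converse API."""
    import boto3  # AWS SDK; requires configured credentials

    client = boto3.client("bedrock-runtime")
    response = client.converse(**build_request(model_key, prompt))
    return response["output"]["message"]["content"][0]["text"]
```

Benchmarking then reduces to looping `ask` over the same prompts for each key in `MODEL_IDS`, which is exactly the side-by-side testing the platform’s pitch assumes teams will do.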

The Nova 2 release is also worth watching. AWS has a history of undercutting third-party services with its own alternatives, and if Nova 2 proves genuinely competitive on price and performance, it could quietly erode demand for the other models Bedrock hosts. That’s not necessarily a problem for customers, but it does suggest AWS’s neutrality has limits.

Bedrock’s expansion signals that the AI infrastructure layer is consolidating around a few major platforms, and AWS clearly intends to be one of them. The company’s advantage is breadth: more models, more integrations, more infrastructure services. Whether that breadth translates into better outcomes for customers depends on execution, but AWS has at least positioned itself as the safe, flexible option in a market that’s still figuring out what it wants.
