Amazon Bedrock adds support for six fully-managed open weights models

Amazon Bedrock now supports six new models spanning frontier reasoning and agentic coding: DeepSeek V3.2, MiniMax M2.1, GLM 4.7, GLM 4.7 Flash, Kimi K2.5, and Qwen3 Coder Next. These models give customers access to the most capable open weights models available today, delivering frontier-class performance at significantly lower inference costs. Together they cover the full spectrum of enterprise AI workloads: DeepSeek V3.2 and Kimi K2.5 push the frontier on reasoning and agentic intelligence, GLM 4.7 and MiniMax M2.1 set new standards for autonomous coding with massive output windows, and Qwen3 Coder Next and GLM 4.7 Flash offer lightweight, cost-efficient alternatives purpose-built for production deployment.

These models on Amazon Bedrock are powered by Project Mantle, a new distributed inference engine for large-scale machine learning model serving on Amazon Bedrock. Project Mantle simplifies and expedites onboarding of new models onto Amazon Bedrock, and provides highly performant, reliable serverless inference with sophisticated quality-of-service controls. It also unlocks higher default customer quotas through automated capacity management and unified capacity pools, and offers out-of-the-box compatibility with the OpenAI API specification.
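Because these models are served through Bedrock's standard serverless inference, they can be invoked the same way as other Bedrock models, for example through the Converse API. Below is a minimal sketch using boto3; the model ID and region are placeholders rather than confirmed identifiers, so check the Amazon Bedrock console or documentation for the actual values.

```python
# Minimal sketch: invoking one of the newly supported models through the
# Bedrock Converse API. The model ID below is a placeholder, not a
# confirmed identifier for DeepSeek V3.2 on Bedrock.
import boto3

client = boto3.client("bedrock-runtime", region_name="us-west-2")

response = client.converse(
    modelId="deepseek.v3-2",  # placeholder; look up the real ID in the console
    messages=[
        {
            "role": "user",
            "content": [{"text": "Summarize the benefits of open weights models."}],
        }
    ],
    inferenceConfig={"maxTokens": 512, "temperature": 0.2},
)

# The Converse API returns the assistant message under output.message.content
print(response["output"]["message"]["content"][0]["text"])
```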

To learn more and get started, visit the Amazon Bedrock console or the service documentation. To get started with the Amazon Bedrock OpenAI API-compatible service endpoints, see the corresponding documentation.
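For teams already built on OpenAI client libraries, the OpenAI API-compatible endpoints let existing code point at Bedrock with only a base URL and credential change. A minimal sketch follows, assuming the endpoint uses the bedrock-runtime `/openai/v1` path and an Amazon Bedrock API key; the base URL pattern and model ID are assumptions, so confirm both against the documentation for your region.

```python
# Minimal sketch: using the OpenAI Python SDK against Bedrock's
# OpenAI-compatible endpoint. Base URL, API key handling, and model ID
# are assumptions for illustration only.
from openai import OpenAI

client = OpenAI(
    base_url="https://bedrock-runtime.us-west-2.amazonaws.com/openai/v1",
    api_key="<your Amazon Bedrock API key>",
)

completion = client.chat.completions.create(
    model="qwen.qwen3-coder-next",  # placeholder; not a confirmed model ID
    messages=[
        {"role": "user", "content": "Write a Python function that reverses a string."}
    ],
)

print(completion.choices[0].message.content)
```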


Source: Amazon Web Services


