Amazon Lex now allows you to use Large Language Models (LLMs) as the primary option to understand customer intent across voice and chat interactions. With this capability, your voice and chat bots can better understand customer requests, handle complex utterances, maintain accuracy despite spelling errors, and extract key information from verbose inputs. When customer intent is unclear, bots can intelligently ask follow-up questions to fulfill requests accurately. For example, when a customer says “I need help with my flight,” the LLM automatically clarifies whether the customer wants to check their flight status, upgrade their flight, or change their flight.
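The flight example above can be sketched as a toy disambiguation routine. This is purely illustrative and is not the Amazon Lex API or its LLM resolution logic: the intent names and cue lists are hypothetical, and a real LLM would match far more flexibly than keyword lookup.

```python
# Illustrative sketch only -- NOT the Lex API. A toy model of how an
# LLM-backed bot might either resolve an utterance to one intent or
# ask a clarifying follow-up question when the intent is ambiguous.
# Intent names and cue phrases below are hypothetical examples.

FLIGHT_INTENTS = {
    "CheckFlightStatus": ["status", "on time", "delayed"],
    "UpgradeFlight": ["upgrade", "business class", "first class"],
    "ChangeFlight": ["change", "reschedule", "different flight"],
}

def resolve_intent(utterance: str) -> str:
    """Return the single matching intent, or a clarifying question."""
    text = utterance.lower()
    matches = [intent for intent, cues in FLIGHT_INTENTS.items()
               if any(cue in text for cue in cues)]
    if len(matches) == 1:
        return matches[0]
    # Ambiguous (e.g. "I need help with my flight"): ask a follow-up,
    # as the LLM-based resolution described above would.
    options = ", ".join(FLIGHT_INTENTS)
    return f"Would you like to: {options}?"

print(resolve_intent("Is my flight delayed?"))       # single intent matched
print(resolve_intent("I need help with my flight"))  # clarifying question
```

In the real service, this matching and follow-up behavior is handled by the LLM inside Lex itself; client code simply sends the raw utterance and receives either a fulfilled intent or the bot's clarifying prompt.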
This feature is available in all AWS commercial regions where Amazon Connect and Amazon Lex operate. To learn more, visit the Amazon Lex documentation, or explore the Amazon Connect website to see how Amazon Connect and Amazon Lex deliver seamless end-customer self-service experiences.
Categories: marketing:marchitecture/artificial-intelligence,general:products/amazon-lex
Source: Amazon Web Services