Amazon's current SVP of devices and services, Dave Limp, said that the new Alexa can understand conversational phrases and respond appropriately, interpret context more effectively, and complete multiple requests from one command.
Limp explained that the new Alexa LLM "is a true generalisable large language model optimised for the Alexa use case; it's not what you find with a Bard or ChatGPT or any of these things."
Amazon is rolling out the new Alexa slowly through a preview programme "in the coming months" -- and only in the US. The cautious pace is meant to avoid the problems Microsoft and Google ran into with their own chatbot rollouts.
"When you connect an LLM to the real world, you want to minimise hallucinations -- and while we think we have the right systems in place ... there is no substitute for putting it out in the real world," Limp said.
This souped-up Alexa may not stay free in the long term.
"The idea of a superhuman assistant that can supercharge your smart home and more, work complex tasks on your behalf, could provide enough utility that Amazon will end up charging something for it down the road," Limp said.