Amazon wanted to turn Alexa into ChatGPT’s nemesis. The project has ended up being chaos

  • The voice assistant promised a notable overhaul, but months go by without any news

  • Company employees talk about bureaucratic problems and “structural dysfunctions”

  • The theoretical development of an LLM for Alexa is going slower than expected

Last September, Amazon seemed ready to launch its own revolution in the world of AI. At an event in Washington DC, company executives showed us the new version of Alexa. Its voice assistant invited us to chat with it about all kinds of topics. It wanted to stop being dumb as a rock, and it looked like it might succeed. And then, silence.

Where is the new Alexa? It was assumed that after the presentation, the rollout of this new Alexa, which really did seem promising, would quickly reach the roughly 500 million devices running the assistant. However, Amazon gave no further news on the project's progress, and only in January 2024 did rumors emerge that the new Alexa could arrive this year… with a subscription attached. What's going on? Why hasn't that new version been released?

Chaos at Amazon. According to company sources cited by Fortune, the problem lies within Amazon itself. The organization, they say, is plagued by “structural dysfunctions” that have repeatedly delayed the launch of a new Alexa powered by generative AI.

The demo was just that: a demo. Although Alexa’s presentation last September was eye-catching, it was only a demo. That version was not ready for deployment, and it still isn’t today. The LLM it is based on is “far from cutting edge”, and the reasons are varied.

No data, no GPUs. Scientists who worked on the project told Fortune that Amazon had neither enough data to train the model nor enough chips to train and run it competitively.

Better to build a chatbot for the cloud. Instead of continuing to work on Alexa at full speed, Amazon seems to have shifted focus and is developing a generative AI model aimed at Amazon Web Services. The company invested $4 billion in Anthropic, creator of Claude, but so far that investment hasn’t helped either effort.

Amazon denies there are problems. An Amazon spokeswoman told Fortune that the data cited was outdated, adding that the company has access to hundreds of thousands of GPUs. She also denied that Alexa has been deprioritized or that they are not using Claude, but did not elaborate on how they are using it.

Siri overtakes Alexa on the right. That, at least, is how it looks after this week’s announcements at WWDC 2024, where Apple showed off an improved version of Siri with a more natural synthetic voice and the ability to complete actions through the apps it connects to. Integration with ChatGPT is another striking option for a voice assistant that launched in 2011 and that now, even after neglecting some of the devices that could have enhanced it, has a new opportunity. Alexa may be left behind.

Former Amazon employees are not optimistic. Mihail Eric, a machine learning expert who worked on Alexa AI, recently published a long post describing how the company was “riddled with technical and bureaucratic problems.”

What about Olympus? Last November it was revealed that Amazon had a project underway to create a gigantic LLM called Olympus with two trillion parameters. That is twice what GPT-4 is estimated to use, but one of those interviewed by Fortune says that figure “is a joke.” In reality, he states, the largest model they have is around 470 billion parameters (470B), a quarter of what Olympus was theoretically going to have. Meanwhile, the LLM behind that new version of Alexa is around 100B, and they are now apparently working on fine-tuning it.

Slow. Alexa’s LLM progress is still modest. According to published data, Llama 3 was pre-trained on 15 trillion tokens, while the Alexa LLM has only managed 3 trillion. Its fine-tuning also lags Meta’s model, which used 10 million data points compared to just one million in Amazon’s case.

Too many people, too much complexity. The project has strayed far from Jeff Bezos’ famous “two pizza” teams and now brings together some 10,000 employees, who also work across different use cases, such as home, shopping, music, and entertainment. That doesn’t make things easier, but even so, Amazon is confident, as are some of those interviewed, that this LLM for Alexa will eventually reach the market.

Image | Jonathan Borba

In Xataka | Amazon has taken over the name ‘Alexa’. 90% less used to call newborn girls than before the first Echo


