This is “Project Astra”, Google’s most human AI that can see, hear, speak, remember and assimilate


Google presented Project Astra, its artificial intelligence assistant with “human abilities” that they plan to launch at the end of 2024. This assistant can be interrupted, change personality and interact with the objects around it.

Developed by BioBioChile

Google presented Project Astra this Tuesday, its artificial intelligence (AI) assistant with “human abilities” that allow it to see, hear, remember, assimilate and speak; a futuristic tool that the company plans to launch at the end of 2024.

The announcement, which a few years ago would only have been possible in a sci-fi movie script, was partly overshadowed by rival OpenAI, a leading AI company that on Monday unveiled a similar voice assistant feature.

In both cases, users will be able to make a video call to the assistant and ask it all kinds of questions.

Google showed several examples – recorded live and not manipulated in any way, according to the company – in which one of its workers in London asked the assistant what nickname it would give a pet, asked for help with coding and math problems, and asked it to find his glasses after showing it a room.

Another quality these technologies share is that they can be interrupted mid-response to move on to the next point in the conversation, and they can take on different personalities, although a woman’s voice was used in both examples.

“These agents were built on top of our Gemini model and other task-specific models, and were designed to process information faster by continuously encoding video frames, combining video and voice input into a timeline of events, and caching this information for efficient retrieval,” the company explains in a statement.
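As a very rough illustration of the idea described in that statement – encoding incoming frames and audio into a single time-ordered, searchable cache – a minimal sketch might look like the following. All class and function names here are hypothetical assumptions for illustration, not Google’s actual implementation.

```python
# Hypothetical sketch: merge video and audio inputs into a bounded,
# time-ordered event cache that can be searched later ("recall").
from dataclasses import dataclass, field
from collections import deque
import time


@dataclass
class Event:
    timestamp: float
    modality: str   # "video" or "audio"
    encoding: str   # placeholder for a real embedding from a task-specific model


@dataclass
class Timeline:
    max_events: int = 1000
    events: deque = field(default_factory=deque)

    def add(self, modality: str, raw_input: str) -> None:
        # A real system would encode the frame or audio chunk with a model;
        # here the "encoding" is just a descriptive string.
        encoding = f"{modality}-features({raw_input})"
        self.events.append(Event(time.time(), modality, encoding))
        # Keep the cache bounded so retrieval over recent context stays fast.
        while len(self.events) > self.max_events:
            self.events.popleft()

    def recall(self, keyword: str) -> list[Event]:
        # Stand-in for questions like "where did I last see my glasses?"
        return [e for e in self.events if keyword in e.encoding]


timeline = Timeline()
timeline.add("video", "frame showing glasses on the desk")
timeline.add("audio", "user asking for a nickname for the pet")
print(timeline.recall("glasses"))
```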

Google’s “Project Astra” could be even more portable

Google, however, had an ace up its sleeve, surprising with the possibility of using this technology with smart glasses in addition to a phone, although the company did not make any specific announcements in this regard.

At its latest developer event, Meta also noted that it is developing its smart glasses so that they can access its AI and answer users’ questions about what they see.

Many technology companies have this year opted for AI tools that interact with the user without the need for a phone or computer – such as the Rabbit R1 or the Humane AI Pin – but none, so far, has achieved resounding success.
