Project Astra: Our vision for the future of AI assistants

Introducing Project Astra. In this demo, a tester interacts with a prototype AI agent built on our multimodal foundation model, Gemini.

There are two continuous takes: one with the prototype running on a Google Pixel phone and another on a prototype glasses device.

The agent takes in a continuous stream of audio and video input, reasons about its environment in real time, and converses with the tester about what it is seeing.

Learn more about Project Astra:


Watch the full Google I/O 2024 keynote:

