This is big news: Search Live is making its way to the Google app. Again, this is Jarvis-like, but right in Search versus Gemini -> Google’s conversational ‘Search Live’ is now starting to roll out. "For users who have the feature rolled out to them, there are a couple of new https://t.co/ek65AIw60e
We're evolving Vertex AI in @Firebase into Firebase AI Logic, giving you more ways to easily integrate Google’s generative AI models into your apps. 💫 Read the blog for more details from #GoogleIO → https://t.co/WVZYo0qohH https://t.co/UiMvBlPGts
Google launches real-time captions with Gemini Live: here's how to enable them on your device https://t.co/EXb3oKZUGC
Google has begun testing a new feature called 'Search Live' in AI Mode within its Google app, incorporating conversational AI capabilities powered by its Gemini AI models. Users with access to the feature will see a waveform icon with a sparkle beneath the Search bar, similar to the Gemini Live icon, indicating the AI interaction mode. This feature enables users to speak to the Gemini AI and receive real-time search assistance, akin to an advanced voice mode. Additionally, Google has introduced real-time captioning through Gemini Live, which can be activated on compatible devices.

The company is also enhancing its AI development tools by evolving Vertex AI in Firebase into Firebase AI Logic, allowing developers to integrate Google's generative AI models more easily into their applications. Google promotes hybrid large language models (LLMs) that combine the on-device capabilities of Gemma with cloud-based Gemini models to optimize performance and privacy. These developments reflect Google's ongoing expansion of AI functionalities across its platforms and developer tools.
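The hybrid Gemma-plus-Gemini idea can be sketched as a simple routing policy: keep short or privacy-sensitive prompts on the local model, and send larger requests to the cloud. This is a minimal illustrative sketch, not a Google API — the class names (`Backend`), the `route_prompt` helper, and the token threshold are all assumptions:

```python
# Hypothetical hybrid-LLM router: prefer an on-device model (e.g. Gemma)
# for short or privacy-sensitive prompts, and fall back to a cloud model
# (e.g. Gemini) for longer or more complex requests.
from dataclasses import dataclass

ON_DEVICE_TOKEN_LIMIT = 64  # assumed context budget for the local model


@dataclass(frozen=True)
class Backend:
    """Stand-in for an LLM backend; real SDK clients would replace this."""
    name: str

    def generate(self, prompt: str) -> str:
        # Placeholder response; a real backend would call the model here.
        return f"[{self.name}] response to: {prompt[:30]}"


ON_DEVICE = Backend("gemma-on-device")
CLOUD = Backend("gemini-cloud")


def route_prompt(prompt: str, privacy_sensitive: bool = False) -> Backend:
    """Pick a backend: keep private or short prompts local, send the rest out."""
    approx_tokens = len(prompt.split())  # crude whitespace token estimate
    if privacy_sensitive or approx_tokens <= ON_DEVICE_TOKEN_LIMIT:
        return ON_DEVICE
    return CLOUD
```

Routing on prompt size and a privacy flag is only one possible policy; a production system might also weigh battery, connectivity, or model capability.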