Google Maps is getting an all-new experience called 'immersive view' that delivers an enhanced digital model of buildings and streets around the world. The new mode uses advances in computer vision and artificial intelligence (AI) to deliver a rich viewing experience to users, Google said while announcing the feature at its I/O 2022 consumer keynote on Wednesday. Google also announced the second beta of Android 13, its upcoming operating system, along with updates coming to Google Workspace, including automated transcriptions, portrait light, and portrait restore on Google Meet. Additionally, there were announcements related to Google Assistant, YouTube, skin tone representation, and more, detailed below.
With 'immersive view', Google Maps helps provide a rich view of a neighborhood, landmark, restaurant, or popular venue you search for. It fuses together billions of Street View and aerial images, alongside advances in computer vision and AI, to build a rich digital model of the world, the company said.
"Whether you're traveling somewhere new or scoping out hidden local gems, immersive view will help you make the most informed decisions before you go," Google said.
In addition to the immersive experience, the new view includes a time slider that users can use to check out what an area looks like at different times of day and in various weather conditions. Users can also glide down to street level to look at nearby restaurants and see information such as live busyness and nearby traffic.
'Immersive view' uses Google Cloud to serve the digital view, so it is device agnostic and works on practically any phone or device, Google said. The experience is initially rolling out in Los Angeles, London, New York, San Francisco, and Tokyo.
Google Maps is also expanding eco-friendly routing to more places, including Europe. The feature launched earlier in the US and Canada, where Google claims it has been used for 86 billion miles of travel, saving an estimated over half a million metric tons of carbon emissions.
Further, Google is making its Live View technology available to developers through the new ARCore Geospatial API. It uses augmented reality (AR) to display arrows and directions on top of the real-world view, helping users navigate indoor areas such as airports, malls, and train stations.
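For developers curious about what this looks like in practice, here is a minimal Kotlin sketch of enabling Geospatial mode and placing a world-anchored marker with the ARCore Geospatial API. The class and method names (Config.GeospatialMode, Session.earth, Earth.createAnchor) follow the ARCore SDK for Android, but treat this as an illustrative outline under those assumptions rather than production code, and verify the exact signatures against Google's official documentation.

```kotlin
// Sketch: enabling the ARCore Geospatial API and anchoring content at a real-world
// latitude/longitude. Assumes an already-created ARCore Session with ARCore installed.
import com.google.ar.core.Anchor
import com.google.ar.core.Config
import com.google.ar.core.Earth
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

fun enableGeospatial(session: Session) {
    // Turn on Geospatial mode so ARCore can localise the device in global coordinates
    // using GPS, sensor data, and Street View-derived imagery.
    val config = session.config
    config.geospatialMode = Config.GeospatialMode.ENABLED
    session.configure(config)
}

fun placeMarker(session: Session, latitude: Double, longitude: Double): Anchor? {
    val earth: Earth = session.earth ?: return null
    if (earth.trackingState != TrackingState.TRACKING) return null

    // Anchor slightly below the camera's estimated altitude at the given coordinates,
    // with an identity rotation (quaternion x, y, z, w).
    val altitude = earth.cameraGeospatialPose.altitude - 1.5
    return earth.createAnchor(latitude, longitude, altitude, 0f, 0f, 0f, 1f)
}
```

The returned anchor can then be attached to AR content (for example, a directional arrow) that stays pinned to that physical location as the user moves.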
Among other announcements at the I/O 2022 consumer keynote, Google released Android 13 beta 2. The update carries a list of improvements and enhancements over the first beta release that debuted for select Pixel devices last month. Google also announced features including a unified Security & Privacy settings page, a new media control featuring album artwork, and a new photo picker that lets users select the exact photos and videos they want to grant an app access to. Android 13 will also include optimisations for tablets, including better multitasking capabilities and an updated taskbar with the ability to switch from a single full-screen app into a split-screen view.
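To illustrate the photo picker, here is a rough Kotlin sketch of how an app might invoke it on Android 13 so the user shares only the items they pick, instead of granting broad media permissions. The constants (MediaStore.ACTION_PICK_IMAGES, MediaStore.EXTRA_PICK_IMAGES_MAX) come from the Android 13 (API 33) SDK; the surrounding code is an assumption-laden outline, not official sample code.

```kotlin
// Sketch: launching the Android 13 system photo picker from an Activity.
import android.app.Activity
import android.content.Intent
import android.os.Build
import android.provider.MediaStore

private const val REQUEST_PICK_MEDIA = 1001 // arbitrary request code for this example

fun launchPhotoPicker(activity: Activity) {
    if (Build.VERSION.SDK_INT >= 33) {
        val intent = Intent(MediaStore.ACTION_PICK_IMAGES).apply {
            // Let the user choose up to 5 photos/videos; only the selected items
            // are shared with the app, not the whole media library.
            putExtra(MediaStore.EXTRA_PICK_IMAGES_MAX, 5)
        }
        // A production app would typically use the Activity Result API instead of
        // the older startActivityForResult call shown here for brevity.
        activity.startActivityForResult(intent, REQUEST_PICK_MEDIA)
    }
}
```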
Google Meet is getting portrait restore, which uses Google AI to improve video quality even if a user is sitting in a dimly lit room, using an old webcam, or has a poor Wi-Fi connection. Google said the feature can enhance video automatically.
In addition to portrait restore, Google Meet is getting portrait light, which is claimed to use machine learning to simulate studio-quality lighting in a video feed. Users can also adjust the lighting position and brightness.
Google Meet is also getting de-reverberation, which uses machine learning to filter out echoes in spaces with hard surfaces. The company claims the feature helps you sound "like you're in a mic-ed up conference room…even if you're in your basement."
Additionally, Google Meet is getting live sharing to sync content being shared in a virtual call and allow participants to control the media. Developers can also use live sharing APIs to integrate Meet into their apps.
Google is also bringing automated transcription later this year and meeting summarisation next year to enhance conversations on Google Meet.
Google Workspace is getting Google Meet's automated transcriptions to help users transcribe conversations directly in their documents. Google is also extending auto-summaries to Spaces to provide a digest of long conversations. Auto-summaries were introduced on Google Docs earlier this year.
Further, Google is adding security protections that were already part of Gmail to Google Slides, Docs, and Sheets. The company claimed these protections will automatically alert users and help prevent them from opening documents containing phishing links and malware.
At the I/O 2022 consumer keynote, Google announced Look and Talk, which is rolling out in the US on Nest Hub Max to help people access Google Assistant without saying the "OK Google" or "Hey Google" hotword. Users will just need to look at the screen and then ask for what they need. The feature uses Face Match and Voice Match, based on machine learning and AI algorithms, to recognise users and quickly enable voice interactions. Google said the feature will be available as an opt-in offering.
Once enabled, you can use Look and Talk to interact with Google Assistant by just looking at the screen of your Nest Hub Max.
"Video from these interactions is processed entirely on-device, so it isn't shared with Google or anyone else," the company said.
Google also noted that the feature works across a range of skin tones and for people with diverse backgrounds.
In addition to the Look and Talk feature, Google is expanding quick phrases to Nest Hub Max to let users skip saying "Hey Google" for their most common daily tasks. This means that you will be able to turn on the lights in your room by saying, "Turn on the living room lights," without saying the hotword first.
Google Assistant is also getting new speech and language models that can help understand the nuances of human speech — like when someone is pausing, but not finished speaking, the company said.
Google at the I/O keynote announced that it is bringing auto-translated captions on YouTube to mobile users to help users view video captions in as many as 16 languages. Google is also bringing auto-translated captions to all Ukrainian YouTube content next month.
Google is releasing a new skin tone scale called the Monk Skin Tone (MST) Scale, based on research conducted by Harvard professor and sociologist Dr. Ellis Monk. The company said this will help make its products more inclusive of the spectrum of skin tones.
The new 10-shade skin tone scale will be integrated within various Google products over the coming months. Google is also releasing the scale openly to allow others in the tech industry to incorporate a vast range of skin tones into their experiences.
One of the key Google products that will start using the MST Scale is Google Search, which will show an option that lets users further refine results by skin tone. Creators, brands, and publishers will also be able to use an inclusive schema to label their content with attributes like skin tone, hair colour, and hair texture.
Google Photos will also use the MST Scale to allow users to enhance their photos using a new set of Real Tone filters. These filters will be rolling out on Google Photos across Android, iOS, and the Web in the coming weeks.