<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Google I/O 2021: rumors, news, and announcements &#8211; The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2021-05-19T17:10:35+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/22432922/google-io-2021-rumors-news-announcements" />
	<id>https://www.theverge.com/rss/stream/22196963</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/stream/22196963" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Sean Hollister</name>
			</author>
			
			<title type="html"><![CDATA[Android phones can finally tap to pay for public transit in the SF Bay Area]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/19/22443383/google-pay-clipper-card-android-san-francisco-bay-area-silicon-valley" />
			<id>https://www.theverge.com/2021/5/19/22443383/google-pay-clipper-card-android-san-francisco-bay-area-silicon-valley</id>
			<updated>2021-05-19T13:10:35-04:00</updated>
			<published>2021-05-19T13:10:35-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="Mass Transit" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Transportation" />
							<summary type="html"><![CDATA[In 2010, the San Francisco Bay Area introduced a single tap-to-pay NFC card for practically all its public transit - and Google introduced the first NFC-equipped Android smartphone. Now, over a decade later, the two ideas are finally compatible. Today, you can finally digitize a Clipper card into most Android phones, or buy one there [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Photo by Amy Osborne/San Francisco Chronicle via Getty Images" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22524986/1298602323.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>In 2010, the San Francisco Bay Area introduced a single tap-to-pay NFC card for practically all its public transit - and Google introduced <a href="https://www.nfcw.com/2010/12/07/35385/google-unveils-first-android-nfc-phone-but-nexus-s-is-limited-to-tag-reading-only-for-now/">the first NFC-equipped Android smartphone</a>. Now, over a decade later, the two ideas are finally compatible. Today, <a href="https://www.clippercard.com/ClipperWeb/google-pay">you can finally digitize a Clipper card into most Android phones</a>, or buy one there to start, then tap it to ride 24 different transit systems in SF and the greater Silicon Valley.</p>
<p>Apple also <a href="https://www.theverge.com/2021/4/15/22386692/bart-caltrain-muni-ferry-apple-pay-clipper-card-sf-bay-area-iphone-watch">added the same functionality to iPhones and Apple Watches last month</a>, and here's how I described it at the time:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>You can now use almost any recent iPhone or Apple Watch to board BART (which serves the Ea …</p></blockquote>
<p><a href="https://www.theverge.com/2021/5/19/22443383/google-pay-clipper-card-android-san-francisco-bay-area-silicon-valley">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Casey Newton</name>
			</author>
			
			<title type="html"><![CDATA[Google is reinventing Docs to fight a two-front war]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/19/22443297/google-docs-workplace-io-changes" />
			<id>https://www.theverge.com/2021/5/19/22443297/google-docs-workplace-io-changes</id>
			<updated>2021-05-19T09:30:22-04:00</updated>
			<published>2021-05-19T09:30:22-04:00</published>
			<category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Let's talk about some big changes announced to the platform where many of us get a lot of work done: Google Workspace, home to the suite of cloud-based tools that includes Docs. The relative stagnation of Docs in a rapidly evolving world of productivity tools has been an ongoing fascination for me. When I'm writing [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523240/352_2021_05_18_352_chromescreen.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Let's talk about some big changes announced to the platform where many of us get a lot of work done: Google Workspace, home to the suite of cloud-based tools that includes Docs.</p>
<p>The relative stagnation of Docs in a rapidly evolving world of productivity tools has been an ongoing fascination for me. When I'm writing for myself, I use slick, modern tools like Notion, Bear, and (more recently) Substack. But when I write for others, it's most often in Docs, which launched 15 years ago and looks more or less the same as it has since the late 2000s.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>For casual users of Docs, the entire conversation can end here</p></blockquote></figure>
<p>Create a new document in any other  …</p>
<p><a href="https://www.theverge.com/2021/5/19/22443297/google-docs-workplace-io-changes">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Mitchell Clark</name>
			</author>
			
			<title type="html"><![CDATA[Spotify and YouTube Music will bring much needed offline tunes to Google’s Wear watches]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22442880/spotify-youtube-music-download-offline-playlists-podcasts-updates" />
			<id>https://www.theverge.com/2021/5/18/22442880/spotify-youtube-music-download-offline-playlists-podcasts-updates</id>
			<updated>2021-05-18T21:11:25-04:00</updated>
			<published>2021-05-18T21:11:25-04:00</published>
			<category scheme="https://www.theverge.com" term="Entertainment" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Spotify" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Wearable" />
							<summary type="html"><![CDATA[Spotify's product lead for cars and wearables teased an exciting new feature coming to Wear devices during Google's Developer Keynote on Tuesday: the ability for the streaming service's 356 million users to download music directly to their watch, and listen to it at times when they don't want to carry their phone (via XDA Developers). [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22524192/Screen_Shot_2021_05_18_at_3.50.46_PM.png?quality=90&#038;strip=all&#038;crop=4.0037682524729,0,85.963259538389,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Spotify's product lead for cars and wearables teased an exciting new feature coming to Wear devices <a href="https://youtu.be/D_mVOAXcrtc?t=2515">during Google's Developer Keynote</a> on Tuesday: the ability for the streaming service's <a href="https://newsroom.spotify.com/company-info/">356 million users</a> to download music directly to their watch, and listen to it at times when they don't want to carry their phone (<a href="https://www.xda-developers.com/spotify-download-music-podcasts-wear-os/">via <em>XDA Developers</em></a>). The feature isn't included in the redesign that was just released, but Spotify says that it's currently in the works.</p>
<p>The announcement came alongside Google's reveal that it would be <a href="https://www.theverge.com/2021/5/18/22440483/samsung-smartwatch-google-wearos-tizen-watch">merging Wear OS with Samsung's Tizen</a>. During Tuesday's I/O keynote, Google promised that the updated OS would bring faster perfo …</p>
<p><a href="https://www.theverge.com/2021/5/18/22442880/spotify-youtube-music-download-offline-playlists-podcasts-updates">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jay Peters</name>
			</author>
			
			<title type="html"><![CDATA[Google I/O 2021: the 14 biggest announcements]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22435419/google-io-2021-event-recap-biggest-announcements-pixel-android-12-wear-os-workspace" />
			<id>https://www.theverge.com/2021/5/18/22435419/google-io-2021-event-recap-biggest-announcements-pixel-android-12-wear-os-workspace</id>
			<updated>2021-05-18T15:37:30-04:00</updated>
			<published>2021-05-18T15:37:30-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Apps" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Smartwatch" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Wearable" /><category scheme="https://www.theverge.com" term="Web" />
							<summary type="html"><![CDATA[Google just finished its live Google I/O 2021 keynote, where the company unveiled a huge number of announcements, including a new look coming to Android, a bunch of features coming to its Google Workspace productivity suite, and even a new AI that talked as if it were Pluto. Nilay Patel and Dieter Bohn followed the [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523492/Sundar_Pichai_03.jpeg?quality=90&#038;strip=all&#038;crop=0,0,95.152603231598,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Google just finished its live <a href="https://www.theverge.com/22432922/google-io-2021-rumors-news-announcements">Google I/O 2021</a> keynote, where the company unveiled a huge number of announcements, including a new look coming to Android, a bunch of features coming to its Google Workspace productivity suite, and even a new AI that talked as if it were Pluto.</p>
<p>Nilay Patel and Dieter Bohn followed the whole thing in real time right here <a href="https://www.theverge.com/e/22200447">on our live blog</a>. But if you just want to get caught up on the biggest news from the show, read on for our recap.</p>
<h2 class="wp-block-heading" id="DkHZX1"><a href="https://www.theverge.com/e/22203818">Android 12 has a radical and bubbly new look</a></h2><img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22520874/1._Android_12_Keyword_Header.jpeg?quality=90&amp;strip=all&amp;crop=0,0,100,100" alt="Android 12 on the Pixel 5" title="Android 12 on the Pixel 5" data-has-syndication-rights="1" data-caption="" data-portal-copyright="Image: Google">
<p>Google revealed that Android 12 will have a brand-new "Material You" design with a whole lot of new changes. It offers a lot of color an …</p>
<p><a href="https://www.theverge.com/2021/5/18/22435419/google-io-2021-event-recap-biggest-announcements-pixel-android-12-wear-os-workspace">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Allison Johnson</name>
			</author>
			
			<title type="html"><![CDATA[Google is trying to make its image processing more inclusive]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22442515/google-camera-app-inclusive-image-equity-skintones" />
			<id>https://www.theverge.com/2021/5/18/22442515/google-camera-app-inclusive-image-equity-skintones</id>
			<updated>2021-05-18T15:34:48-04:00</updated>
			<published>2021-05-18T15:34:48-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[It's a long-standing problem that dates back to the days of film: image processing tends to be tuned for lighter skin tones and not those of Black and brown subjects. Google announced an effort to address that today in its own camera and imaging products, with a focus on making images of people of color [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google says its tweaked image processing will avoid over-brightening black and brown faces. | Image: Google" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523549/Screen_Shot_2021_05_18_at_12.13.28_PM.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google says its tweaked image processing will avoid over-brightening black and brown faces. | Image: Google	</figcaption>
</figure>
<p>It's a long-standing problem that <a href="https://www.youtube.com/watch?v=d16LNHIEJzs">dates back to the days of film</a>: image processing tends to be tuned for lighter skin tones and not those of Black and brown subjects. Google announced an effort to address that today in its own camera and imaging products, with a focus on making images of people of color "more beautiful and more accurate." These changes will come to Google's own Pixel cameras this fall, and the company says it will share what it learns across the broader Android ecosystem.</p>
<p>Specifically, Google is making changes to its auto-white balance and exposure algorithms to improve accuracy for dark skin tones based on a broader data se …</p>
<p><a href="https://www.theverge.com/2021/5/18/22442515/google-camera-app-inclusive-image-equity-skintones">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jacob Kastrenakes</name>
			</author>
			
			<title type="html"><![CDATA[Google previews Project Starline, a next-gen 3D video chat booth]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22442336/google-project-starline-3d-video-chat-platform" />
			<id>https://www.theverge.com/2021/5/18/22442336/google-project-starline-3d-video-chat-platform</id>
			<updated>2021-05-18T15:10:24-04:00</updated>
			<published>2021-05-18T15:10:24-04:00</published>
			<category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google is working on a next-gen video chat booth that makes the person you're chatting with appear in front of you in 3D. You can see them from different angles by moving around and even make eye contact, Google said during a preview of the project at its I/O conference today. The system is called [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523533/Booth_Blog.max_1000x1000.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Google is working on a next-gen video chat booth that makes the person you're chatting with appear in front of you in 3D. You can see them from different angles by moving around and even make eye contact, Google said during a preview of the project at its I/O conference today.</p>
<p>The system is called "<a href="https://blog.google/technology/research/project-starline/">Project Starline</a>," and it's basically a really, really fancy video chat setup. The platform uses multiple cameras and sensors to capture a person's appearance and shape from different perspectives. It then stitches those together into a 3D model that's broadcast in real time to whomever they're chatting with. In Google's preview, Starline was use …</p>
<p><a href="https://www.theverge.com/2021/5/18/22442336/google-project-starline-3d-video-chat-platform">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Nicole Wetsman</name>
			</author>
			
			<title type="html"><![CDATA[Google announces health tool to identify skin conditions]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22440754/google-health-ai-skin-condition-model-dermatology" />
			<id>https://www.theverge.com/2021/5/18/22440754/google-health-ai-skin-condition-model-dermatology</id>
			<updated>2021-05-18T14:47:35-04:00</updated>
			<published>2021-05-18T14:47:35-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="Health" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google's latest foray into health care is a web tool that uses artificial intelligence to help people identify skin, hair, or nail conditions. The company previewed the tool at I/O today, and it says it hopes to launch a pilot later this year. People can use their phone's camera to take three pictures of the [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google developed a tool to help people identify skin conditions. | Image: Google" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22521173/Copy_of_derm___hero_image_2.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google developed a tool to help people identify skin conditions. | Image: Google	</figcaption>
</figure>
<p>Google's latest foray into health care is a web tool that uses artificial intelligence to help people identify skin, hair, or nail conditions. The company previewed the tool at I/O today, and it says it hopes to launch a pilot later this year.</p>
<p>People can use their phone's camera to take three pictures of the problem area - for example, a rash on their arm. They'll then answer a series of questions about their skin type and other symptoms. The tool then gives a list of possible conditions from a set of 288 that it's trained to recognize. It's not intended to diagnose the problem, the company said in a <a href="https://blog.google/technology/health/ai-dermatology-preview-io-2021/">blog post</a>.</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22521171/Copy_of_01_derm_summary.png?quality=90&amp;strip=all&amp;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="&lt;em&gt;The Google tool asks people to take three photos of a skin problem, and then it offers possible conditions.&lt;/em&gt; | Image: Google" data-portal-copyright="Image: Google">
<p>Google decided to tackle ski …</p>
<p><a href="https://www.theverge.com/2021/5/18/22440754/google-health-ai-skin-condition-model-dermatology">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Chris Welch</name>
			</author>
			
			<title type="html"><![CDATA[Google Photos will soon make animated photos from your still shots]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22442466/google-photos-cinematic-moments-animated-android-ios-machine-learning" />
			<id>https://www.theverge.com/2021/5/18/22442466/google-photos-cinematic-moments-animated-android-ios-machine-learning</id>
			<updated>2021-05-18T14:41:10-04:00</updated>
			<published>2021-05-18T14:41:10-04:00</published>
			<category scheme="https://www.theverge.com" term="Apps" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google Photos will soon have a cool new trick: if you take two similar images with your phone's camera, the app will be able to create an animated, moving shot that combines them. It does this by using machine learning to synthesize movement between the two shots. Google creates new frames between them, resulting in [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523420/cinematic.gif?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Google Photos will soon have a cool new trick: if you take two similar images with your phone's camera, the app will be able to create an animated, moving shot that combines them. It does this by using machine learning to synthesize movement between the two shots, generating new frames between them to produce a "vivid moving picture." Google's Shimrit Ben-Yair pitched it as a feature parents will love, since multiple near-identical attempts at the same shot now yield this added benefit.</p>
<p>The new feature is called "cinematic moments," and it will work on both Android and iOS, Ben-Yair said.</p>
<p>Google also announced a new locke …</p>
<p><a href="https://www.theverge.com/2021/5/18/22442466/google-photos-cinematic-moments-animated-android-ios-machine-learning">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>James Vincent</name>
			</author>
			
			<title type="html"><![CDATA[Google Maps’ Live View feature now offers more useful information about restaurants and businesses]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22442451/google-maps-live-view-update-features-ai-ar" />
			<id>https://www.theverge.com/2021/5/18/22442451/google-maps-live-view-update-features-ai-ar</id>
			<updated>2021-05-18T14:37:27-04:00</updated>
			<published>2021-05-18T14:37:27-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google announced a bunch of new features for Google Maps at its 2021 I/O developer conference today, including upgrades to its handy Live View tool, which helps you navigate the world through augmented reality. Live View launched in beta in 2019, projecting walking directions through your camera's viewfinder, and was rolled out to airports, transit [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523430/Screen_Shot_2021_05_18_at_7.29.07_PM.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Google announced a <a href="https://blog.google/products/maps/five-maps-updates-io-2021">bunch of new features for Google Maps</a> at its 2021 I/O developer conference today, including upgrades to its handy Live View tool, which helps you navigate the world through augmented reality.</p>
<p>Live View launched <a href="https://www.theverge.com/2019/8/8/20776247/google-maps-live-view-ar-walking-directions-ios-android-feature">in beta in 2019</a>, projecting walking directions through your camera's viewfinder, and was <a href="https://www.theverge.com/2021/3/30/22357528/google-maps-directions-indoor-ar-live-view-fuel-efficient-weather-air-quality-layer">rolled out</a> to airports, transit stations, and malls earlier this year. Now, Live View will be accessible directly from Google Maps and will collate a lot of handy information, including how busy shops and restaurants are, recent reviews, and any uploaded photos.</p>
<p>It sounds particularly handy for exploring new destinations remot …</p>
<p><a href="https://www.theverge.com/2021/5/18/22442451/google-maps-live-view-update-features-ai-ar">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Chris Welch</name>
			</author>
			
			<title type="html"><![CDATA[Google and Samsung are merging Wear OS and Tizen]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/18/22440483/samsung-smartwatch-google-wearos-tizen-watch" />
			<id>https://www.theverge.com/2021/5/18/22440483/samsung-smartwatch-google-wearos-tizen-watch</id>
			<updated>2021-05-18T14:36:34-04:00</updated>
			<published>2021-05-18T14:36:34-04:00</published>
			<category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2021" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Samsung" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[In yet another sign of the growing alliance between Google and Samsung, today both companies announced that they are essentially combining Wear OS - Google's operating system - and the Tizen-based software platform that has been foundational to Samsung's wearables for many years. The resulting platform is currently being referred to simply as "Wear," though [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22523438/Screen_Shot_2021_05_18_at_2.31.35_PM.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>In yet another sign of the growing alliance between Google and Samsung, today both companies announced that they are essentially combining Wear OS - Google's operating system - and the Tizen-based software platform that has been foundational to Samsung's wearables for many years. The resulting platform is currently being referred to simply as "Wear," though that might not be the final name.</p>
<p>Benefits of the joint effort include significant improvements to battery life, 30 percent faster loading times for apps, and smoother animations. It also simplifies life for developers and will create one central smartwatch OS for the Android platform. G …</p>
<p><a href="https://www.theverge.com/2021/5/18/22440483/samsung-smartwatch-google-wearos-tizen-watch">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
