<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Google I/O 2025: All the news and announcements &#8211; The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2025-06-10T20:44:01+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/google/670250/google-io-news-announcements-gemini-ai-android-xr" />
	<id>https://www.theverge.com/rss/stream/670250</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/stream/670250" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Wes Davis</name>
			</author>
			
			<title type="html"><![CDATA[Android Auto will get Spotify Jam and support for video apps and web browsers]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/674126/android-auto-spotify-jam-web-browsers-video-apps" />
			<id>https://www.theverge.com/?p=674126</id>
			<updated>2025-05-25T12:55:38-04:00</updated>
			<published>2025-05-25T12:55:38-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Apps" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Transportation" />
							<summary type="html"><![CDATA[Android Auto is getting more than just Google's Gemini assistant after the Google I/O developer conference. The company has also announced or otherwise shown off a slew of changes coming to the infotainment operating system, including an updated Spotify app, a light mode, and the introduction of web browsers and video apps. Let's start with [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Spotify gets new templates for Android Auto." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/Spotify-Jam.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Spotify gets new templates for Android Auto.	</figcaption>
</figure>
<p class="has-text-align-none">Android Auto is getting more than just <a href="https://www.theverge.com/news/670954/volvo-google-gemini-ai-cars-android-automotive">Google's Gemini assistant</a> after the <a href="https://www.theverge.com/google/670250/google-io-news-announcements-gemini-ai-android-xr">Google I/O developer conference</a>. The company has also announced or otherwise shown off a slew of changes coming to the infotainment operating system, including an updated Spotify app, a light mode, and the introduction of web browsers and video apps.</p>
<p class="has-text-align-none">Let's start with Spotify. Google <a href="https://www.youtube.com/watch?v=ud09zuXHst4">revealed in a video</a> last week that the Spotify app for Android Auto is getting an overhaul through new media app templates the company is making available to developers. One feature the music service is adding to Android Auto is Spotify Jam, a feature that lets users share control of an a …</p>
<p><a href="https://www.theverge.com/news/674126/android-auto-spotify-jam-web-browsers-video-apps">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Allison Johnson</name>
			</author>
			
			<title type="html"><![CDATA[Google’s Veo 3 AI video generator is a slop monger’s dream]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/ai-artificial-intelligence/673719/google-veo-3-ai-video-audio-sound-effects" />
			<id>https://www.theverge.com/?p=673719</id>
			<updated>2025-05-23T19:50:40-04:00</updated>
			<published>2025-05-24T07:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Even at first glance, there's something off about the body on the street. The white sheet it's under is a little too clean, and the officers' movements are totally devoid of purpose. "We need to clear the street," one of them says with a firm hand gesture, though her lips don't move. It's AI, alright. [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="These boots were not made for walking, but you can use AI to make them do it anyway." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/ai-label-3.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	These boots were not made for walking, but you can use AI to make them do it anyway.	</figcaption>
</figure>
<p class="has-text-align-none">Even at first glance, there's something off about the body on the street. The white sheet it's under is a little too clean, and the officers' movements are totally devoid of purpose. "We need to clear the street," one of them says with a firm hand gesture, though her lips don't move. It's AI, alright. But here's the kicker: my prompt didn't include any dialogue.</p>
<p class="has-text-align-none">Veo 3, Google's new AI video generation model, added that line all on its own. Over the past 24 hours I've created a dozen clips depicting news reports, disasters, and goofy cartoon cats with convincing audio - some of which the model invented all on its own. It's more than a little …</p>
<p><a href="https://www.theverge.com/ai-artificial-intelligence/673719/google-veo-3-ai-video-audio-sound-effects">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Alex Heath</name>
			</author>
			
			<title type="html"><![CDATA[I/O versus io: Google and OpenAI can’t stop messing with each other]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/command-line-newsletter/673837/i-o-versus-io-google-openai-jony-ive" />
			<id>https://www.theverge.com/?p=673837</id>
			<updated>2025-05-23T16:02:44-04:00</updated>
			<published>2025-05-23T16:02:44-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Command Line" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="OpenAI" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[The leaders of OpenAI and Google have been living rent-free in each other's heads since ChatGPT caught the world by storm. Heading into this week's I/O, Googlers were on edge about whether Sam Altman would try to upstage their show like last year, when OpenAI held an event the day before to showcase ChatGPT's advanced [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/Command-Line-Site-Post-ALTMAN-PICHAI.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p class="has-text-align-none">The leaders of OpenAI and Google have been living rent-free in each other's heads since ChatGPT caught the world by storm. Heading into <a href="https://www.theverge.com/news/669408/google-io-2025-biggest-announcements-ai-gemini">this week's I/O</a>, Googlers were on edge about whether <strong>Sam Altman</strong> would try to upstage their show like last year, when OpenAI held an event the day before to showcase ChatGPT's advanced voice mode. </p>
<p class="has-text-align-none">This time, OpenAI dropped its bombshell the day after.</p>
<p class="has-text-align-none">OpenAI buying the "io" hardware division of <strong>Jony Ive's </strong>design studio, LoveFrom, is a delightfully petty bit of SEO sabotage, though I'm told the name stands for "input output" and was decided a while ago. Even so, <a href="https://www.theverge.com/news/671838/openai-jony-ive-ai-hardware-apple">the news</a> of Ive and Altman teaming up quic …</p>
<p><a href="https://www.theverge.com/command-line-newsletter/673837/i-o-versus-io-google-openai-jony-ive">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Emma Roth</name>
			</author>
			
			<title type="html"><![CDATA[Google I/O revealed more updates for Wallet, Wear OS, Google Play, and more]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/673808/google-io-wallet-wearos-play-store" />
			<id>https://www.theverge.com/?p=673808</id>
			<updated>2025-05-23T17:47:03-04:00</updated>
			<published>2025-05-23T15:38:14-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Wearable" />
							<summary type="html"><![CDATA[The Google I/O keynote may have been all about AI, but there were a handful of other meaningful updates that didn't make it to the main stage. In addition to updates coming to Google Wallet, the company's developer sessions also revealed handy features that will roll out to smartwatches, the Google Play Store, and Google [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/02/STK093_GOOGLE_A.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p class="has-text-align-none">The Google I/O keynote may <a href="https://www.theverge.com/news/669408/google-io-2025-biggest-announcements-ai-gemini">have been all about AI</a>, but there were a handful of other meaningful updates that didn't make it to the main stage. In addition to updates coming to Google Wallet, the company's developer sessions also revealed handy features that will roll out to smartwatches, the Google Play Store, and Google TV.</p>
<p class="has-text-align-none">Here are some of the updates Google didn't highlight during the keynote.</p>
<h2 class="wp-block-heading">Live Updates are coming to your smartwatch</h2>
<p class="has-text-align-none">Google is preparing to bring Live Updates - a feature that lets users <a href="https://www.theverge.com/news/625473/google-maps-android-16-beta-live-updates">track the status of certain activities</a> in delivery, rideshare, and navigation apps - to your smartwatch. We already knew about Google …</p>
<p><a href="https://www.theverge.com/news/673808/google-io-wallet-wearos-play-store">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Allison Johnson</name>
			</author>
			
			<title type="html"><![CDATA[Google’s AI product names are confusing as hell]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/671116/google-ai-product-names-confusing-gemini-deepmind-astra" />
			<id>https://www.theverge.com/?p=671116</id>
			<updated>2025-05-22T15:13:21-04:00</updated>
			<published>2025-05-22T13:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google executives took the stage this week at I/O to unveil their latest AI technology: Deep Think. Or was it Deep Search? Then there's the new subscription plan, Google AI Pro, which used to be Gemini Advanced, plus the new AI Ultra plan. Then there's Gemini in Chrome, which is different from AI Mode in [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="I would like to buy a vowel, please." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/DSC03112.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	I would like to buy a vowel, please.	</figcaption>
</figure>
<p class="has-text-align-none">Google executives <a href="https://www.theverge.com/google/670250/google-io-news-announcements-gemini-ai-android-xr">took the stage this week at I/O</a> to unveil their latest AI technology: Deep Think. Or was it Deep Search? Then there's the new subscription plan, Google AI Pro, which used to be Gemini Advanced, plus the new AI Ultra plan. Then there's Gemini in Chrome, which is different from AI Mode in search. Project Starline is now Google Beam, there are Gems and Jules, Astra and Aura… you get the idea. The products overlap in confusing ways, the naming conventions are diabolical, and I'm begging Google to return some semblance of sanity to its product line before we all lose our DeepMinds.</p>
<p class="has-text-align-none">In Google's defense, at least we're not callin …</p>
<p><a href="https://www.theverge.com/tech/671116/google-ai-product-names-confusing-gemini-deepmind-astra">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Wes Davis</name>
			</author>
			
			<title type="html"><![CDATA[Android 16 adds AI-powered weather effects that can make it rain on your photos]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/671749/android-16-adds-ai-powered-weather-effects-that-can-make-it-rain-on-your-photos" />
			<id>https://www.theverge.com/?p=671749</id>
			<updated>2025-05-21T12:29:45-04:00</updated>
			<published>2025-05-21T12:29:45-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google Pixel" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google's latest Android 16 beta adds a bunch of new wallpaper and lock screen options for Pixel phones, including live-updating weather animations and a feature that automatically frames subjects of photos within a variety of bubbly shapes. When you select an image to use as a wallpaper in the beta, you can tap the sparkly [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Sorry, Mario." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/Android-16-wallpaper-weather.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Sorry, Mario.	</figcaption>
</figure>
<p class="has-text-align-none">Google's latest Android 16 beta adds a bunch of new wallpaper and lock screen options for Pixel phones, including live-updating weather animations and a feature that automatically frames subjects of photos within a variety of bubbly shapes.</p>
<p class="has-text-align-none">When you select an image to use as a wallpaper in the beta, you can tap the sparkly collection of starbursts that has become the de facto symbol for AI features to access the new effects. One of them, "Shape," washes your screen in a solid color, with a punchout frame in the middle centered on the subject of your photo, be it a person, animal, or object. You can choose from five different shape options:  …</p>
<p><a href="https://www.theverge.com/news/671749/android-16-adds-ai-powered-weather-effects-that-can-make-it-rain-on-your-photos">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Emma Roth</name>
			</author>
			
			<title type="html"><![CDATA[Google teases an Android desktop mode, made with Samsung’s help]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/671577/google-android-desktop-mode-samsung-dex" />
			<id>https://www.theverge.com/?p=671577</id>
			<updated>2025-06-10T16:44:01-04:00</updated>
			<published>2025-05-21T09:34:38-04:00</published>
			<category scheme="https://www.theverge.com" term="Android" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Samsung" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google is working with Samsung to bring a desktop mode to Android. During Google I/O's developer keynote, engineering manager Florina Muntenescu said the company is "building on the foundation" of Samsung's DeX platform "to bring enhanced windowing capabilities in Android 16," as spotted earlier by 9to5Google. Samsung first launched DeX in 2017, a feature that [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Windows in Android’s desktop mode can stretch and move across your screen." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/android-desktop.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Windows in Android’s desktop mode can stretch and move across your screen.	</figcaption>
</figure>
<p class="has-text-align-none">Google is working with Samsung to bring a desktop mode to Android. During Google I/O's developer keynote, engineering manager Florina Muntenescu <a href="https://www.youtube.com/live/GjvgtwSOCao?si=AoweLDWrR3haKZg9&amp;t=1250">said the company is</a> "building on the foundation" of Samsung's DeX platform "to bring enhanced windowing capabilities in Android 16," as <a href="https://9to5google.com/2025/05/20/android-16-desktop-mode-samsung-dex/">spotted earlier by <em>9to5Google</em></a>.</p>
<p class="has-text-align-none"><a href="https://www.theverge.com/2017/5/2/15495036/samsung-dex-station-galaxy-s8-review-desktop-dock">Samsung first launched DeX in 2017</a>; the feature automatically adjusts your phone's interface and apps when the phone is connected to a larger display, allowing you to use it like a desktop device.</p>
<p class="has-text-align-none">A demo during the presentation revealed a Samsung DeX-like layout, with apps like Gmail, Chrome, YouTube, and Google Photos centered in the  …</p>
<p><a href="https://www.theverge.com/news/671577/google-android-desktop-mode-samsung-dex">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Emma Roth</name>
			</author>
			
			<title type="html"><![CDATA[Google has a big AI advantage: it already knows everything about you]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/671201/google-personal-context-ai-advantage-data" />
			<id>https://www.theverge.com/?p=671201</id>
			<updated>2025-05-21T09:41:29-04:00</updated>
			<published>2025-05-21T08:30:00-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google's AI models have a secret ingredient that's giving the company a leg up on competitors like OpenAI and Anthropic. That ingredient is your data, and it's only just scratched the surface in terms of how it can use your information to "personalize" Gemini's responses. Google first started letting users opt in to its "Gemini [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Shahram Izadi, Google’s head of Android XR, talking about the advantage of using Gemini." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/lcimg-bb0866b3-a0b7-45fb-9ab9-7663c5205982.webp?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Shahram Izadi, Google’s head of Android XR, talking about the advantage of using Gemini.	</figcaption>
</figure>
<p class="has-text-align-none">Google's AI models have a secret ingredient that's giving the company a leg up on competitors like OpenAI and Anthropic. That ingredient is <em>your data</em>, and it's only just scratched the surface in terms of how it can use your information to "personalize" Gemini's responses.</p>
<p class="has-text-align-none"><a href="https://www.theverge.com/news/629022/gemini-google-search-history-personalization">Google first started letting users</a> opt in to its "Gemini with personalization" feature earlier this year, which lets the AI model tap into your search history "to provide responses that are uniquely insightful and directly address your needs." But now, Google is taking things a step further by unlocking access to even more of your information - all in the name of providing …</p>
<p><a href="https://www.theverge.com/tech/671201/google-personal-context-ai-advantage-data">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jay Peters</name>
			</author>
			
			<title type="html"><![CDATA[Google’s future is Google googling]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/google/671200/google-googling-ai-mode-project-mariner-i-o-2025" />
			<id>https://www.theverge.com/?p=671200</id>
			<updated>2025-05-23T11:58:05-04:00</updated>
			<published>2025-05-21T07:30:00-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Analysis" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google I/O was, as predicted, an AI show. But now that the keynote is over, we can see that the company's vision is to use AI to eventually do a lot of Googling for you. A lot of that vision rests on AI Mode in Google Search, which Google is starting to roll out to [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google CEO Sundar Pichai at Google I/O 2025." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/lcimg-66e6ad53-8637-4218-a1b3-f38200b857cd.jpeg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google CEO Sundar Pichai at Google I/O 2025.	</figcaption>
</figure>
<p class="has-text-align-none">Google I/O was, <a href="https://www.theverge.com/tech/667872/google-io-android-16-ai">as predicted</a>, an AI show. But now that the keynote is over, we can see that the company's vision is to use AI to eventually do a lot of Googling for you.</p>
<p class="has-text-align-none">A lot of that vision rests on AI Mode in Google Search, which Google is starting to roll out to everyone in the US. AI Mode offers a more chatbot-like interface right inside Search, and behind the scenes, Google is doing a lot of work to pull in information instead of making you scroll through a list of blue links.</p>
<p class="has-text-align-none">Onstage, Google presented an example of someone asking for things to do in Nashville over a weekend with friends who like food, music, and "exploring off the be …</p>
<p><a href="https://www.theverge.com/google/671200/google-googling-ai-mode-project-mariner-i-o-2025">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Victoria Song</name>
			</author>
			
			<title type="html"><![CDATA[We tried on Google’s prototype AI smart glasses]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/hands-on/671077/project-moohan-android-xr-google-io-2025-smart-glasses-wearables" />
			<id>https://www.theverge.com/?p=671077</id>
			<updated>2025-05-20T18:31:10-04:00</updated>
			<published>2025-05-20T17:27:12-04:00</published>
			<category scheme="https://www.theverge.com" term="AR" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google I/O 2025" /><category scheme="https://www.theverge.com" term="Hands-on" /><category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Reviews" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Wearable" />
							<summary type="html"><![CDATA[Here in sunny Mountain View, California, I am sequestered in a teeny-tiny box. Outside, there's a long line of tech journalists, and we are all here for one thing: to try out Project Moohan and Google's Android XR smart glasses prototypes. (The Project Mariner booth is maybe 10 feet away and remarkably empty.) While nothing [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Proof that there’s a display." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/05/Sequence-01.00_02_06_15.Still003.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Proof that there’s a display.	</figcaption>
</figure>
<p class="has-text-align-none">Here in sunny Mountain View, California, I am sequestered in a teeny-tiny box. Outside, there's a long line of tech journalists, and we are all here for one thing: to try out <a href="https://www.youtube.com/watch?v=mnSaPcVa2mY">Project Moohan and Google's Android XR smart glasses prototypes</a>. (The Project Mariner booth is maybe 10 feet away and remarkably empty.)</p>
<p class="has-text-align-none">While nothing was going to steal AI's spotlight at this year's keynote - 95 mentions! - Android XR has been generating a lot of buzz on the ground. But the demos we got to see here were notably shorter, with more guardrails, than <a href="https://www.theverge.com/2024/12/12/24319528/google-android-xr-samsung-project-moohan-smart-glasses">what I got to see back in December</a>. Probably because, unlike a few months ago, there are cameras everywher …</p>
<p><a href="https://www.theverge.com/hands-on/671077/project-moohan-android-xr-google-io-2025-smart-glasses-wearables">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
