<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Google’s Search On fall 2021 event: news and announcements &#8211; The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2021-09-29T17:42:17+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22699191/google-search-on-fall-2021-news-announcements" />
	<id>https://www.theverge.com/rss/stream/22463232</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/stream/22463232" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Dieter Bohn</name>
			</author>
			
			<title type="html"><![CDATA[Google search’s next phase: context is king]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22698504/google-search-on-event-ai-mum-google-lens-update-changes" />
			<id>https://www.theverge.com/2021/9/29/22698504/google-search-on-event-ai-mum-google-lens-update-changes</id>
			<updated>2021-09-29T13:42:17-04:00</updated>
			<published>2021-09-29T13:42:17-04:00</published>
			<category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[At its Search On event today, Google introduced several new features that, taken together, are its strongest attempts yet to get people to do more than type a few words into a search box. By leveraging its new Multitask Unified Model (MUM) machine learning technology in small ways, the company hopes to kick off a [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration: Alex Castro / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10745893/acastro_180427_1777_0003.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>At its Search On event today, Google introduced several new features that, taken together, are its strongest attempts yet to get people to do more than type a few words into a search box. By leveraging its new Multitask Unified Model (MUM) machine learning technology in small ways, the company hopes to kick off a virtuous cycle: it will provide more detail and context-rich answers, and in return it hopes users will ask more detailed and context-rich questions. The end result, the company hopes, will be a richer and deeper search experience.</p>
<p>Google SVP Prabhakar Raghavan oversees search alongside Assistant, ads, and other products. He likes …</p>
<p><a href="https://www.theverge.com/2021/9/29/22698504/google-search-on-event-ai-mum-google-lens-update-changes">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Justine Calma</name>
			</author>
			
			<title type="html"><![CDATA[Google Maps is making it easier to see wildfires and tree coverage]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22698533/google-maps-wildfires-tree-coverage-tracker" />
			<id>https://www.theverge.com/2021/9/29/22698533/google-maps-wildfires-tree-coverage-tracker</id>
			<updated>2021-09-29T13:40:00-04:00</updated>
			<published>2021-09-29T13:40:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Environment" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google Maps has new features that should make it easier for users to see wildfires, tree canopy, and locations without formal addresses. It's all aimed at helping communities be "safer, more sustainable, and discoverable," according to the company. A new wildfire layer on Maps will begin rolling out globally this week, Google announced today. It'll [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google’s Tree Canopy Lab | Image: Google" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22885235/Tree_Canopy_Insights.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google’s Tree Canopy Lab | Image: Google	</figcaption>
</figure>
<p>Google Maps has new features that should make it easier for users to see wildfires, tree canopy, and locations without formal addresses. It's all aimed at helping communities be "safer, more sustainable, and discoverable," according to the company.</p>
<p>A new wildfire layer on Maps will begin rolling out globally this week, Google announced today. It'll show most major fires (those prompting evacuations) across the world. Red splotches and pins on the layer will indicate where blazes are and how far they've spread. By tapping on any single wildfire, users can see more information, like how many acres have burned, what percentage of the fire has …</p>
<p><a href="https://www.theverge.com/2021/9/29/22698533/google-maps-wildfires-tree-coverage-tracker">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Antonio G. Di Benedetto</name>
			</author>
			
			<title type="html"><![CDATA[Google expands shopping searches with Lens and in-store inventory checks]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22696646/google-shopping-lens-search-inventory-check-ios-chrome" />
			<id>https://www.theverge.com/2021/9/29/22696646/google-shopping-lens-search-inventory-check-ios-chrome</id>
			<updated>2021-09-29T13:30:11-04:00</updated>
			<published>2021-09-29T13:30:11-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Business" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Web" />
							<summary type="html"><![CDATA[Shopping online isn't always convenient. If you enjoy window shopping or browsing curated collections at a brick-and-mortar store for inspiration, finding something online that you don't yet know you want is tricky if you start with a text search. Google is announcing new shopping search tools to try to alleviate this, [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google’s new search features attempt to make online shopping easier. | GIF: Google" data-portal-copyright="GIF: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22885534/Search_On_Lens_Mode.gif?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google’s new search features attempt to make online shopping easier. | GIF: Google	</figcaption>
</figure>
<p>Shopping online isn't always convenient. If you enjoy window shopping or browsing curated collections at a brick-and-mortar store for inspiration, finding something online that you don't yet know you want is tricky if you start with a text search. Google is announcing new shopping search tools to try to alleviate this, with features that utilize Google Lens for finding products to buy from pictures online, broader search terms to help you browse clothing, and the ability to check in-store inventory from home. It claims the new tools will help shoppers "find what they're looking for in a more visual way." This comes after <a href="https://www.theverge.com/2020/4/21/21228741/google-shopping-free-listing-ads-search-coronavirus-covid">Goog …</a></p>
<p><a href="https://www.theverge.com/2021/9/29/22696646/google-shopping-lens-search-inventory-check-ios-chrome">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>James Vincent</name>
			</author>
			
			<title type="html"><![CDATA[Google Lens will soon search for words and images combined]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22698014/google-lens-update-text-search-desktop-chrome" />
			<id>https://www.theverge.com/2021/9/29/22698014/google-lens-update-text-search-desktop-chrome</id>
			<updated>2021-09-29T13:23:06-04:00</updated>
			<published>2021-09-29T13:23:06-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Google is updating its visual search tool Google Lens with new AI-powered language features. The update will let users further narrow searches using text. So, for example, if you snap a photo of a paisley shirt in order to find similar items online using Google Lens, you can add the command "socks with this pattern" [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Google Lens wants to be more useful by pairing visual searches with accompanying text. | Image: Google" data-portal-copyright="Image: Google" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22884248/google_lens_shirt_sock_search_wide_slow.gif?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Google Lens wants to be more useful by pairing visual searches with accompanying text. | Image: Google	</figcaption>
</figure>
<p>Google is updating its visual search tool Google Lens with new AI-powered language features. The update will let users further narrow searches using text. So, for example, if you snap a photo of a paisley shirt in order to find similar items online using Google Lens, you can add the command "socks with this pattern" to specify the garments you're looking for.</p>
<p>Additionally, Google is launching a new "Lens mode" option in its iOS Google app, allowing users to search using any image that appears while searching the web. This will be available "soon," but it'll be limited to the US. Google is also launching Google Lens on desktop within the Chr …</p>
<p><a href="https://www.theverge.com/2021/9/29/22698014/google-lens-update-text-search-desktop-chrome">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>James Vincent</name>
			</author>
			
			<title type="html"><![CDATA[Google is using AI to help users explore the topics they’re searching for — here’s how]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/9/29/22696268/google-search-on-updates-ai-mum-explained" />
			<id>https://www.theverge.com/2021/9/29/22696268/google-search-on-updates-ai-mum-explained</id>
			<updated>2021-09-29T13:22:40-04:00</updated>
			<published>2021-09-29T13:22:40-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA["Can you get medicine for someone at the pharmacy?" It's a simple enough question for humans to understand, says Pandu Nayak, vice president of search at Google, but such a query represents the cutting edge of machine comprehension. You and I can see that the questioner is asking if they can fill a prescription for [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration by Alex Castro / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/10802091/acastro_180508_1777_google_IO_0003.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>"Can you get medicine for someone at the pharmacy?"</p>
<p>It's a simple enough question for humans to understand, says Pandu Nayak, vice president of search at Google, but such a query represents the cutting edge of machine comprehension. You and I can see that the questioner is asking if they can fill a prescription for <em>another person</em>, Nayak tells <em>The Verge</em>. But until recently, if you typed this question into Google, it would direct you to websites explaining how to fill <em>your</em> prescription. "It missed the subtlety that the prescription was for someone else," he says.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>MUM is Google's biggest, brightest AI language model</p></blockquote></figure>
<p>The key to deliveri …</p>
<p><a href="https://www.theverge.com/2021/9/29/22696268/google-search-on-updates-ai-mum-explained">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
