<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Matt Morales | The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2025-01-28T15:30:40+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/author/matt-morales" />
	<id>https://www.theverge.com/authors/matt-morales/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/authors/matt-morales/rss" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Matt Morales</name>
			</author>
			
			<title type="html"><![CDATA[How to make a telescope out of the sun]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2022/10/4/23380525/solar-gravitational-lens-exoplanets" />
			<id>https://www.theverge.com/2022/10/4/23380525/solar-gravitational-lens-exoplanets</id>
			<updated>2022-10-04T09:00:00-04:00</updated>
			<published>2022-10-04T09:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="NASA" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Space" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[One of the most exciting aspects of the James Webb Space Telescope (JWST) is its ability to image and gather information about exoplanets. But while JWST will give us tons of information about these celestial bodies, there&#8217;s something that it can&#8217;t do: take a high-resolution image of an earth-like exoplanet &#8212; specifically, an image where [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/24079564/Solar_Lens.jpeg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>One of the most exciting aspects of the James Webb Space Telescope (JWST) is its ability <a href="https://blogs.nasa.gov/webb/2022/09/01/nasas-webb-takes-its-first-ever-direct-image-of-distant-world/">to image and gather information about exoplanets</a>. But while JWST will give us tons of information about these celestial bodies, there&rsquo;s something that it can&rsquo;t do: take a high-resolution image of an Earth-like exoplanet &mdash; specifically, an image where we can clearly see evidence of possible life on another world, such as land masses, clouds, and bodies of water.</p>

<p><a href="https://science.jpl.nasa.gov/people/turyshev/">Slava Turyshev of the NASA Jet Propulsion Laboratory</a> is working on a solution that would give us a clearer picture of an exoplanet.<strong> </strong>This method would use a phenomenon called gravitational lensing to capture that kind of an image. <a href="https://hubblesite.org/contents/articles/gravitational-lensing">Gravitational lensing</a> occurs when the gravity of a massive object, like a galaxy or star, bends the space-time around it. This curvature in space-time acts as a lens, causing the light from objects that are much further away to bend around it and become magnified. When viewed at the right angle and distance, the magnified light will appear as a ring, known as an Einstein ring.</p>

<p>Turyshev&rsquo;s proposed solar gravitational lens would use the sun as that massive object, magnifying the light of a distant exoplanet to construct a high-resolution image we couldn&rsquo;t otherwise capture. We sat down with Turyshev to talk about what it would take to reach this goal and how he hopes to achieve it within just a few decades. Watch our video above to see more.</p>
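<p>To put that plan in perspective, the basic numbers fall out of the lensing formula itself. As a rough sketch (using standard textbook constants, not figures from Turyshev&rsquo;s own work), light grazing the sun&rsquo;s edge is deflected by about 1.75 arcseconds, which places the lens&rsquo;s focal region roughly 550 astronomical units away. That distance, more than ten times as far as Pluto, is a big part of why the mission timeline is measured in decades.</p>

<pre><code># A back-of-the-envelope check on the solar gravitational lens.
# Light grazing the sun is deflected by alpha = 4GM / (c^2 R),
# so parallel rays converge at a focal distance of about F = R / alpha.
# Constants are standard values; nothing here comes from the article.

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
R_SUN = 6.957e8    # solar radius, m
AU = 1.496e11      # astronomical unit, m

alpha = 4 * G * M_SUN / (c**2 * R_SUN)  # deflection at the solar limb, radians
focal = R_SUN / alpha                   # distance where grazing rays converge

print(f"deflection: {alpha * 206265:.2f} arcsec")  # about 1.75 arcsec
print(f"focal distance: {focal / AU:.0f} AU")      # about 550 AU
</code></pre>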
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Matt Morales</name>
			</author>
			
			<title type="html"><![CDATA[How researchers are using old phones to screen for Alzheimer’s]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/23167672/google-pixel-health-screening-old-phones-infrared-alzheimers" />
			<id>https://www.theverge.com/23167672/google-pixel-health-screening-old-phones-infrared-alzheimers</id>
			<updated>2025-01-28T10:30:40-05:00</updated>
			<published>2022-06-16T10:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Featured Videos" /><category scheme="https://www.theverge.com" term="Google" /><category scheme="https://www.theverge.com" term="Google Pixel" /><category scheme="https://www.theverge.com" term="Health" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[When we talk about health and tech, usually the conversation is around newer devices like wearables. But newer isn&#8217;t always better, especially for one group of researchers using an old Pixel 4 to screen for neurological diseases using just the selfie cam from the phone. The DigiHealth Lab at UC San Diego, directed by Professor [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/23627053/Eye_Thumb_Container.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>When we talk about health and tech, the conversation is usually about newer devices like wearables. But newer isn&rsquo;t always better, especially for one group of researchers using an old Pixel 4 to screen for neurological diseases with just the phone&rsquo;s selfie camera.</p>

<p><a href="https://digihealth.eng.ucsd.edu/">The DigiHealth Lab at UC San Diego</a>, directed by Professor <a href="https://www.ejaywang.com/">Edward Wang</a>, looks at ubiquitous technology like smartphones to figure out how they can be used to monitor our health. The idea is that, by building digital health tools that work on more common<strong> </strong>devices, they can increase access to more people &mdash; particularly people who might not be able to afford the latest smartwatch or fitness tech.</p>

<p>We spoke with Colin Barry from the lab, who told us more about how the modified Pixel 4 works and even ran some diagnostic tests on us. We also checked in with our reporter Nicole Wetsman, who explained how this technology could have both positive impacts and some unexpected pitfalls.</p>
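<p>The video doesn&rsquo;t walk through the lab&rsquo;s actual code, but the core idea, measuring how the pupil responds using frames from the Pixel 4&rsquo;s infrared face-unlock camera, can be sketched in a few lines. The snippet below is a simplified illustration under that assumption; the filename and threshold value are placeholders, and the real system is far more careful about calibration.</p>

<pre><code>import cv2  # OpenCV (pip install opencv-python)

# Hypothetical input: one grayscale frame from the phone's
# infrared camera. The filename is a placeholder.
frame = cv2.imread("ir_frame.png", cv2.IMREAD_GRAYSCALE)

# Under IR illumination the pupil is the darkest region, so an
# inverted threshold isolates it. The cutoff (40) is an assumed
# value that would need tuning per device and lighting setup.
blurred = cv2.GaussianBlur(frame, (7, 7), 0)
_, mask = cv2.threshold(blurred, 40, 255, cv2.THRESH_BINARY_INV)

# Treat the largest dark blob as the pupil and fit a circle to it.
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
pupil = max(contours, key=cv2.contourArea)
(cx, cy), radius = cv2.minEnclosingCircle(pupil)

# Tracking this diameter frame by frame during a light stimulus
# yields the pupil-response curve used for screening.
print(f"pupil center: ({cx:.0f}, {cy:.0f}), diameter: {2 * radius:.1f} px")
</code></pre>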
						]]>
									</content>
			
					</entry>
	</feed>
