<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Sophia Chen | The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2025-03-22T13:51:10+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/author/sophia-chen" />
	<id>https://www.theverge.com/authors/sophia-chen/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/authors/sophia-chen/rss" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Sophia Chen</name>
			</author>
			
			<title type="html"><![CDATA[Drama over quantum computing&#8217;s future heats up]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/633248/beyond-the-hype-of-quantum-computers" />
			<id>https://www.theverge.com/?p=633248</id>
			<updated>2025-03-22T09:51:10-04:00</updated>
			<published>2025-03-21T16:30:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[On March 18th, Chetan Nayak, a physicist leading Microsoft’s quantum team, presented new data on the company’s quantum computing chip at the American Physical Society’s Global Physics Summit in Anaheim, California. It was meant to calm a raging debate among physicists, but researchers remain skeptical of the results. “I never felt like there would be [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/03/257613_quantum_computing_CVirginia_B.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">On March 18th, Chetan Nayak, a physicist leading Microsoft’s quantum team, presented new data on <a href="https://news.microsoft.com/azure-quantum/">the company’s quantum computing chip</a> at the American Physical Society’s Global Physics Summit in Anaheim, California. It was meant to calm a raging debate among physicists, but researchers <a href="https://www.nature.com/articles/d41586-025-00829-2">remain skeptical</a> of the results. “I never felt like there would be one moment when everyone is fully convinced,” Nayak told <a href="https://www.nature.com/articles/d41586-025-00829-2"><em>Nature</em></a> in a March 18th article.&nbsp;</p>

<p class="has-text-align-none">The <a href="https://physicsworld.com/a/experts-weigh-in-on-microsofts-topological-qubit-claim/">controversy</a> centers on Microsoft’s February claim that it had built a new type of quantum hardware — a topological qubit, made from a pattern of electrons on a tiny wire. Microsoft claimed that the qubit is less prone to errors. That would make quantum computers easier to scale up to something big enough to actually be useful. But in the journal article accompanying the release, the editors wrote that Microsoft had not conclusively shown <a href="https://static-content.springer.com/esm/art%3A10.1038%2Fs41586-024-08445-2/MediaObjects/41586_2024_8445_MOESM2_ESM.pdf">the electrons forming the signature pattern</a>, known as <a href="https://physicsworld.com/a/majorana-modes-continue-to-elude/" data-type="link" data-id="https://physicsworld.com/a/majorana-modes-continue-to-elude/">Majorana zero modes</a>. <em>Nature</em> had <a href="https://www.wired.com/story/microsoft-retracts-disputed-quantum-computing-paper/" data-type="link" data-id="https://www.wired.com/story/microsoft-retracts-disputed-quantum-computing-paper/">retracted</a> <a href="https://www.nature.com/articles/d41586-021-00612-z" data-type="link" data-id="https://www.nature.com/articles/d41586-021-00612-z">a similar paper</a> by <a href="https://www.nature.com/articles/s41586-021-03373-x" data-type="link" data-id="https://www.nature.com/articles/s41586-021-03373-x">a Microsoft-affiliated team</a> in 2021.</p>

<figure class="wp-block-pullquote"><blockquote><p>When quantum computers become useful, ordinary consumers shouldn’t expect them as personal devices. </p></blockquote></figure>

<p class="has-text-align-none">“Discourse and skepticism are all part of the scientific process,” Microsoft spokesperson Craig Cincotta tells <em>The Verge</em>. He points to additional improvements since that accompanying article, where Microsoft says the team controlled and measured a specific aspect of the qubit.</p>

<p class="has-text-align-none">The newest data Microsoft presented on Tuesday is “just noise,” says physicist Sergey Frolov of the University of Pittsburgh. (On Tuesday, Nayak acknowledged that the signal was hard to see because of electrical noise.)&nbsp;</p>

<p class="has-text-align-none">In a statement, Nayak tells <em>The Verge</em> that Microsoft is confident in its device. “It is clear that the interest and excitement level are very high,” he says.&nbsp;</p>

<p class="has-text-align-none">On top of the controversy, the industry suffers from hype. Champions of quantum computing say the machines will revolutionize materials science, encryption, and finance. Theoretical research indicates that they could one day beat regular computers at certain time-consuming tasks and open new realms of computing. But the timeline is uncertain. In January, Nvidia’s Jensen Huang <a href="https://www.cnbc.com/2025/03/20/nvidia-ceo-huang-says-was-wrong-about-timeline-for-quantum-computing.html">expressed doubt</a> that commercial quantum computing would arrive within 15 years, sending quantum computing stocks falling. He tried to walk those comments back on March 20th, when he hosted “Quantum Day” at Nvidia’s GTC conference, but <a href="https://fortune.com/2025/03/21/nvidia-jensen-huang-quantum-computing-stocks-gtc-rigetti-dwave-ionq/" data-type="link" data-id="https://fortune.com/2025/03/21/nvidia-jensen-huang-quantum-computing-stocks-gtc-rigetti-dwave-ionq/">quantum-related stocks fell again</a>.</p>

<p class="has-text-align-none">Nevertheless, quantum computing researchers have been hard at work. In recent months, Google, Amazon, and several startups have announced a series of incremental improvements. We’re left to wonder how much longer consumers will have to wait for quantum computing’s killer applications. Are quantum computers coming to your cloud or phone in the future? What and who are they for?</p>

<figure class="wp-block-pullquote"><blockquote><p>“Discourse and skepticism are all part of the scientific process.”</p></blockquote></figure>

<p class="has-text-align-none">Quantum computers won’t be able to tackle anything useful for at least another decade, says physicist Andrea Morello of the University of New South Wales in Australia. And that’s if investors don’t lose patience and jump ship. The technology remains a full-stack problem, from engineering the materials to make the qubits, to connecting the qubits together, to manufacturing the chips at scale — not to mention the software.&nbsp;</p>

<p class="has-text-align-none">Investors are sticking around because the payoff could be huge. Quantum computers offer a completely new paradigm for computing. Unlike a conventional computer, which encodes information as binary ones and zeros, a quantum computer represents information as a probability of one and zero, known as a superposition. Superposition is a concept from quantum mechanics: for example, an electron can exist as a superposition, or probability, of multiple locations. You can also think of superposition like a coin flipping in the air. Before it lands, it is neither heads nor tails, but in a superposition state of both. Similarly, the qubit can represent information as some probability of both one and zero.&nbsp;</p>
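<p class="has-text-align-none">The coin-flip picture above can be sketched in a few lines of code. This is a rough illustration only, not any company’s actual software: it stores a single qubit as two amplitudes, and squaring an amplitude gives the probability of measuring that outcome.</p>

```python
import random

# A minimal sketch of a qubit in equal superposition, like the coin in the air.
# Squaring an amplitude gives the probability of measuring that outcome.
amp0 = amp1 = 2 ** -0.5          # equal superposition of 0 and 1

def measure():
    # Measurement collapses the superposition to a definite 0 or 1,
    # weighted by the squared amplitudes.
    return 0 if random.random() < amp0 ** 2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)                    # roughly half the "flips" land on each side
```

<p class="has-text-align-none">The interesting physics, of course, lives in what this sketch leaves out: before measurement, amplitudes of many qubits can interfere with one another, which is what gives quantum algorithms their power.</p>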

<p class="has-text-align-none">Researchers make physical qubits from different materials — for Google, Amazon, and IBM, each qubit is a small superconducting circuit; notable startups are using ions, atoms, and photons as qubits. At this point, it’s not clear what material is best.</p>

<p class="has-text-align-none">All qubits obey the mathematics of quantum mechanics. So do molecules. That’s why experts predict that an early useful application of quantum computers could be performing <a href="https://www.nature.com/articles/s41586-021-04351-z">accurate and fast chemistry simulations</a>, for discovering new materials for better batteries, more climate-friendly fertilizers, and new medical drugs. Currently, scientists rely on supercomputers to simulate these reactions, and the results are inexact and slow to produce.&nbsp;</p>

<p class="has-text-align-none">A quantum speedup could upend other industries, as well. Banks are investigating quantum optimization algorithms for <a href="https://arxiv.org/pdf/2403.14436">improving financial forecasts</a>. Quantum algorithms could make AI algorithms more energy-efficient. They should also be able to break existing encryption methods; that prediction has spurred research into <a href="https://www.theverge.com/22523067/nist-challenge-quantum-safe-cryptography-computer-lattice">more robust forms of cryptography</a>.&nbsp;</p>

<p class="has-text-align-none">But first, researchers need to reduce a quantum computer’s overall error rate while making the machines larger.</p>

<p class="has-text-align-none">And when quantum computers become useful, ordinary consumers shouldn’t expect them as personal devices. Experts currently <a href="https://arxiv.org/pdf/2411.10406">envision</a> future quantum computers as a specialized chip in a supercomputer or as <a href="https://www.nature.com/articles/s41586-024-08406-9">a data center</a>. Either way, users would access the machine through the cloud. It’s also unlikely that quantum computers will be useful for everyday tasks like word processing or internet browsing. Its proposed applications are largely specialized for technical fields such as pharmaceuticals and finance.</p>

<p class="has-text-align-none">Recent progress has been heartening. The first quantum computers of note, built in the last decade, were too error-ridden to execute useful algorithms. Lately, researchers have figured out how to correct computing errors by encoding a single unit of information in multiple physical qubits instead of one. Using this approach, <a href="https://www.nature.com/articles/s41586-024-08449-y">Google</a> and <a href="https://www.nature.com/articles/s41586-025-08642-7">Amazon</a> have shown that their quantum computers can more reliably store information without the machines becoming more error-prone as they get bigger. The results could pave the way toward larger, useful quantum computers.&nbsp;</p>

<p class="has-text-align-none">Still, a leap for physicists is an inch forward for the rest of us. Google and Amazon’s quantum “memory” stored only a single unit of quantum information, known as a logical qubit. A useful quantum computer will need thousands, perhaps a million, physical qubits, corresponding to hundreds or thousands of logical qubits. Researchers need to reduce the number of physical qubits required to encode each unit of information. In its recent announcement, Amazon needed only nine physical qubits per unit of information, compared with the 105 physical qubits that Google needed. “We are a long way away from the big, mind-blowing, world-changing results and applications,” says Morello.</p>
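<p class="has-text-align-none">The encoding idea behind these results can be illustrated with a classical analogy: store one logical bit redundantly in several noisy physical bits, then recover it by majority vote. Real quantum error-correcting codes, such as the surface codes Google and Amazon use, are far subtler, since qubits cannot simply be copied, but the redundancy intuition carries over. The numbers below are illustrative, not the companies’ actual parameters.</p>

```python
import random

def encode(bit, n=9):            # one logical bit spread across 9 physical bits
    return [bit] * n

def add_noise(bits, p=0.1):      # each physical bit flips with probability p
    return [b ^ (random.random() < p) for b in bits]

def decode(bits):                # majority vote recovers the logical bit
    return int(sum(bits) > len(bits) / 2)

# Count how often the logical bit is corrupted despite the redundancy.
errors = sum(decode(add_noise(encode(0))) != 0 for _ in range(10_000))
print(errors / 10_000)           # far below the 10% per-bit error rate
```

<p class="has-text-align-none">Redundancy trades quantity for quality, which is why useful machines need so many more physical qubits than logical ones, and why shrinking that ratio, as in Amazon’s nine-to-one result, matters.</p>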

<figure class="wp-block-pullquote"><blockquote><p>“It&#8217;s a very delicate balance. It has a chance of either people getting bored, or getting overexcited and really angry…”</p></blockquote></figure>

<p class="has-text-align-none">The US, European Union, and the UK governments have each pledged funding in the billions to develop quantum computing. For the US, the main rival is China, which has poured <a href="https://merics.org/en/report/chinas-long-view-quantum-tech-has-us-and-eu-playing-catch">$15 billion of public funding</a> into quantum computing, according to the Mercator Institute for China Studies, a Germany-based think tank.&nbsp;</p>

<p class="has-text-align-none">Cash has been flowing in the private sector, as well. <em>Crunchbase</em> reported that <a href="https://news.crunchbase.com/venture/quantum-computing-funding-record-high-ai-quantinuum/">quantum computing received $1.5 billion in venture funding worldwide in 2024</a>, an all-time high compared to the previous record of $963 million in 2022.</p>

<p class="has-text-align-none">But building the technology is difficult. Researchers have to show progress to keep their investors happy, while also tempering their expectations to keep them patient. The worry is a potential “quantum winter,” in which overhype leads to inflated expectations and disappointment, and investors withdraw funding. AI development has been through similar cooling periods. Researchers built the first AI chatbot in the 1960s, but the field was overly optimistic about the speed of development. When the technology didn’t deliver, <a href="https://arxiv.org/pdf/2109.01517">funders</a> <a href="https://www.pet.theclinics.com/article/S1556-8598(21)00053-5/abstract">withdrew</a>, leading to two “AI winters” from the late 1960s to the mid-1990s.</p>

<p class="has-text-align-none">“People would prefer to keep a low-enough profile to be kind of cool and a little bit buzzy, so that they can just continue reaping the benefits slowly,” Frolov says. “But I think it&#8217;s a very delicate balance. It has a chance of either people getting bored, or getting overexcited and really angry” when quantum computers don’t deliver according to their expectations.</p>

<p class="has-text-align-none">The anxiety over losing their funders’ trust has led to physicists’ current furor over Microsoft’s claims. Frolov, along with several other researchers, has spent years calling out what he said were discrepancies between Microsoft’s announcements and its experimental data. The community seems to be more receptive to critiques lately, he says.</p>

<p class="has-text-align-none">Such are the growing pains involved in building a quantum computer. Its potential remains alluring, but the finish line is still far away. In the meantime, physicists will continue squabbling over incremental progress — as long as the cash keeps flowing.&nbsp;</p>

<p class="has-text-align-none"><em><strong>Clarification, March 22nd:</strong> The 2018 Majorana zero-modes paper from a Microsoft-affiliated team was retracted by Nature.</em></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Sophia Chen</name>
			</author>
			
			<title type="html"><![CDATA[The race is on for quantum-safe cryptography]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/22523067/nist-challenge-quantum-safe-cryptography-computer-lattice" />
			<id>https://www.theverge.com/22523067/nist-challenge-quantum-safe-cryptography-computer-lattice</id>
			<updated>2021-06-11T09:00:00-04:00</updated>
			<published>2021-06-11T09:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Security" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[In 2016, Lily Chen started a competition to rewrite the building blocks of encryption. With her team of mathematicians at the US National Institute of Standards and Technology, Chen reached out to academic and industry cryptographers around the world to find algorithms that could resist new threats posed by quantum computers. Five years later, the [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration by Maria Chimishkyan" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22643646/VRG_4614_7_NIST.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>In 2016, Lily Chen started a competition to rewrite the building blocks of encryption.</p>

<p>With her team of mathematicians at the US National Institute of Standards and Technology, Chen reached out to academic and industry cryptographers around the world to find algorithms that could resist new threats posed by quantum computers. Five years later, the project is almost complete. After three rounds of elimination, Chen and her team have now narrowed the 69 submissions down to a final seven algorithms, with several winners to be named at the end of the year. If things go according to plan, the result will be a new set of NIST-certified algorithms &mdash; and a new measure of protection against the chaos of a fully operational quantum computer.&nbsp;</p>

<p>&ldquo;Cryptosystems in devices and communication systems will not be secure anymore&rdquo; when those computers reach their potential, Chen says.&nbsp;&ldquo;It&rsquo;s time to prepare for quantum threats.&rdquo;&nbsp;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s time to prepare for quantum threats.”</p></blockquote></figure>
<p>Chen has technical reasons to be concerned. Existing encryption systems rely on specific mathematical equations that classical computers aren&rsquo;t very good at solving &mdash; but quantum computers may breeze through them. As a security researcher, Chen is particularly interested in quantum computing&rsquo;s ability to solve two types of math problems: factoring large numbers and solving discrete logarithms (essentially solving the problem <em>b</em><sup><em>x</em></sup> = <em>a</em> for <em>x</em>). Pretty much all internet security relies on this math to encrypt information or authenticate users in protocols such as Transport Layer Security. These math problems are simple to perform in one direction, but difficult in reverse, and thus ideal for a cryptographic scheme.</p>
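<p>The one-way nature of that math is easy to see with toy numbers (the prime and base below are illustrative, far smaller than anything used in real cryptography). Computing <em>b</em><sup><em>x</em></sup> mod <em>p</em> is nearly instant, but recovering <em>x</em> from the result, the discrete logarithm, leaves a classical machine with little better than brute force.</p>

```python
# Toy parameters for illustration only; real systems use far larger numbers.
p, b = 2_147_483_647, 5          # a small prime modulus and a base

def forward(x):
    # Easy direction: Python's built-in fast modular exponentiation.
    return pow(b, x, p)

def discrete_log(a):
    # Hard direction: nothing smarter than trying every exponent here.
    for x in range(p):
        if pow(b, x, p) == a:
            return x

secret = 1_234_567
a = forward(secret)              # computed instantly
# discrete_log(a) would grind through up to ~2 billion candidates; Shor's
# algorithm on a large quantum computer would instead solve it efficiently.
print(a)
```

<p>Protocols like Diffie&ndash;Hellman rest on exactly this asymmetry, which is why an efficient quantum discrete-log solver would be so disruptive.</p>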

<p>&ldquo;From a classical computer&rsquo;s point of view, these are hard problems,&rdquo; says Chen. &ldquo;However, they are not too hard for quantum computers.&rdquo;&nbsp;</p>

<p>In 1994, the mathematician Peter Shor outlined in a paper how a future quantum computer could solve both the factoring and discrete logarithm problems, but engineers are still struggling to make quantum systems work in practice. While several companies like Google and IBM, along with startups such as IonQ and Xanadu, have built small prototypes, these devices cannot perform consistently, and they have not conclusively completed any useful task beyond what the best conventional computers can achieve. In 2019, Google reported that its quantum computer had solved a problem faster than the best existing supercomputers, but it was a contrived task with no practical application. And in 2020, academic researchers in China reported that their quantum computer had beaten conventional computers at performing an algorithm that could offer utility for specialized optimization tasks. But so far, quantum computers have only managed to factor tiny numbers like 15 and 21 &mdash; a useful proof of principle, but far from a practical threat.&nbsp;</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“A geometric puzzle in a grid of points, arranged across hundreds or even thousands of dimensions”</p></blockquote></figure>
<p>That hasn&rsquo;t stopped researchers from trying to stay one step ahead of the quantum challenge. Peter Schwabe, a mathematician at the Max Planck Institute for Security and Privacy, and his colleagues have devised several cryptography schemes that have advanced through the third round of NIST&rsquo;s competition. One of his submissions qualifies as a lattice-based protocol, a class of quantum-resistant algorithms that involve a geometric puzzle in a grid of points, arranged across hundreds or even thousands of dimensions. To crack the code, the computer must use given line segments to solve the puzzle, such as finding the most compact way to connect the lines end to end in the grid.</p>

<p>&ldquo;Lattice-based cryptography is, at the moment, considered the most realistic drop-in replacement for the protocols we have today,&rdquo; says Schwabe.</p>

<p>It&rsquo;s important to establish cryptographic standards now because once NIST standardizes a new cryptographic protocol, it will take years for some users to buy and set up the necessary technology. Another worry is that hackers today could intercept and store encrypted information, and then decrypt the messages a decade later with a quantum computer. This is a particular concern for government agencies that create documents intended to remain classified for years.&nbsp;</p>

<p>&ldquo;We have to try and get these cryptosystems ready well in advance of quantum computers,&rdquo; says NIST mathematician Dustin Moody, a member of Chen&rsquo;s team.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>A head start in testing and implementing quantum-safe cryptography</p></blockquote></figure>
<p>In advance of NIST&rsquo;s standards, some companies have already begun experimenting with these new cryptography schemes. In 2019, Google and the security company Cloudflare began <a href="https://blog.cloudflare.com/the-tls-post-quantum-experiment/">testing the speed and security</a> of two quantum computing-resistant protocols. &ldquo;We hope that this experiment helps choose an algorithm with the best characteristics for the future of the internet,&rdquo; wrote cryptographer Kris Kwiatkowski of Cloudflare in a blog post after the tests were performed.</p>

<p>When the winning algorithms are chosen, the hope is that NIST&rsquo;s federal certification will spur more companies to follow suit, and give them a head start in testing and implementing quantum-safe cryptography. Ultimately, NIST researchers see this work as public service. They aim to make these cryptographic standards freely available. The agency doesn&rsquo;t pay cryptographers to participate in the competition, and winners will not receive any money. &ldquo;You just get fame in the cryptographic world, which carries its own weight,&rdquo; says Moody.&nbsp;</p>

<p>And the winners get the satisfaction of knowing they&rsquo;ve completely redesigned swaths of internet infrastructure. The new protocols will alter fundamental interactions on the internet, like how your computer confirms you&rsquo;ve actually accessed the right website and not a hacker&rsquo;s server &mdash; not to mention how companies encrypt your credit card number when you make an online purchase.</p>

<p>But the revolution will be quiet. &ldquo;The average user is not really going to see or notice this,&rdquo; says Moody. &ldquo;Hopefully, it&rsquo;ll all be done behind the scenes by the cryptographers and the people who put this into their products.&rdquo; As with the best security products, you can tell it&rsquo;s working when nobody notices the change.</p>
						]]>
									</content>
			
					</entry>
	</feed>
