<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">IBM | The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2025-12-01T18:09:32+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/ibm" />
	<id>https://www.theverge.com/rss/ibm/index.xml</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/ibm/index.xml" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Nilay Patel</name>
			</author>
			
			<title type="html"><![CDATA[IBM CEO Arvind Krishna says there is no AI bubble after all]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/podcast/829868/ibm-arvind-krishna-watson-llms-ai-bubble-quantum-computing" />
			<id>https://www.theverge.com/?p=829868</id>
			<updated>2025-12-01T13:09:32-05:00</updated>
			<published>2025-12-01T10:00:00-05:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Business" /><category scheme="https://www.theverge.com" term="Decoder" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="Podcasts" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Today, I’m talking with Arvind Krishna, the CEO of IBM. IBM is a fascinating company. It’s still a household name and among the oldest tech firms in the US. Without IBM, we simply wouldn’t have the modern era of computing — it was instrumental to the development of a whole stack of foundational technologies in [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="Stylized portrait of Arvind Krishna" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/11/DCD-Arvind-Krishna.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-drop-cap has-text-align-none">Today, I’m talking with Arvind Krishna, the CEO of IBM. IBM is a fascinating company. It’s still a household name and among the oldest tech firms in the US. Without IBM, we simply wouldn’t have the modern era of computing — it was instrumental to the development of a whole stack of foundational technologies in the 20th century, and it still has a lot of patents to show for it.&nbsp;</p>

<p class="has-text-align-none">But it’s a lot harder for most of us to see what IBM has been up to in this century. Watson, the company’s famous AI supercomputer, <a href="https://www.nytimes.com/2011/02/17/science/17jeopardy-watson.html">won <em>Jeopardy!</em> back in 2011</a>. Yet since then, as far as most consumers are concerned, it’s been mostly ads during football games and not a lot else.&nbsp;</p>

<p class="has-text-align-none">IBM has been busy, though, just not in a way most of us can see. It’s fully an enterprise company now, as Arvind explains, and that business is booming. But there&#8217;s a huge change coming to that business as well. The AI technology that Watson pioneered, all that natural language processing and the beginning of what we now call deep learning? Well, that’s given way to generative AI, and with it, a new way of thinking about how all the systems that run a company should be built and interact with each other.</p>

<div class="wp-block-vox-media-highlight vox-media-highlight"><img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/24792604/The_Verge_Decoder_Tileart.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />


<p><em>Verge</em> subscribers, don&#8217;t forget you get exclusive access to ad-free <em>Decoder</em> wherever you get your podcasts. Head <a href="https://www.theverge.com/account/podcasts">here</a>. Not a subscriber? You can <a href="https://www.theverge.com/subscribe">sign up here</a>. </p>
</div>

<p class="has-text-align-none">So I really wanted to ask Arvind how he felt about IBM investing in all of that Watson technology and showing it off a decade before everyone else, only to have maybe made the wrong technology bet and potentially miss out on the modern AI boom.&nbsp;</p>

<p class="has-text-align-none">You’ll hear Arvind be pretty candid that the way IBM was approaching AI back then was off the mark — he says outright that pushing Watson so early into the healthcare field was “inappropriate.” But his take, as you’ll hear him discuss, is that the infrastructure and research from that era weren’t wasted because developers and companies can still build on top of that foundation. So sure, Arvind says IBM got there a little too early. But he doesn’t seem too concerned that IBM will be stuck on the sidelines.</p>

<p class="has-text-align-none">Of course, I did have to bring up how the AI industry has all the hallmarks of a bubble, and it’s one that I and a lot of other folks, <a href="https://www.theverge.com/ai-artificial-intelligence/759965/sam-altman-openai-ai-bubble-interview">even OpenAI’s Sam Altman</a>, are pretty sure is going to pop. Arvind’s more optimistic — or maybe less cynical — than I am, though, and he’s pretty confident this isn’t a bubble. But you’ll hear us compare the current moment to the dotcom boom and bust of the early 2000s — before the smartphone came along to realize the promise of ubiquitous computing — and how ultimately disruptive all that was in a lot of really negative ways for a lot of people, even though all of the bets from the early dotcom era did eventually prove to be correct.&nbsp;</p>

<p class="has-text-align-none">One other thing I had to ask him was: if this isn’t a bubble, then who’s going to win? Because it feels like Apple and Google managed to keep all the profit from the transition to a digital economy, thanks to their hugely successful ecosystems and app stores that effectively collect rent from the labor and transactions of almost every other player that has an app. If the AI economy goes that way, will there be room for IBM or anyone else to get big from it?</p>

<p class="has-text-align-none">Arvind’s answer seems to be to play a different long-term game, which is where the <a href="https://www.theverge.com/23988271/ibm-quantum-heron-system-two-jerry-chow-qubits">company’s big bet on quantum computing comes in</a>. That bet still isn’t making useful products for most people, but you’ll hear Arvind explain why he still has some faith. This is a good one; we went a lot of places, and Arvind is remarkably candid.&nbsp;</p>

<p class="has-text-align-none">Okay: Arvind Krishna, CEO of IBM. Here we go.</p>

<iframe frameborder="0" height="200" src="https://playlist.megaphone.fm?e=VMP4542090967" width="100%"></iframe>

<p class="has-text-align-none"><em>This interview has been lightly edited for length and clarity.&nbsp;</em></p>

<p class="has-text-align-none"><strong>Arvind Krishna, you&#8217;re the CEO of IBM. Welcome to </strong><strong><em>Decoder</em></strong><strong>.</strong></p>

<p class="has-text-align-none">Nilay, great to be here with you.</p>

<p class="has-text-align-none"><strong>I&#8217;m excited to talk to you. IBM is one of the most famous companies in the world, but candidly, I think most consumers don&#8217;t know why anymore. It&#8217;s very much an enterprise company. It has a lot of businesses. You have been there for 35 years. What has IBM been, and what are you trying to make it today?</strong></p>

<p class="has-text-align-none">You&#8217;re right, IBM is an enterprise. It&#8217;s a B2B company, to use a more common parlance, as opposed to a B2C. Historically, IBM did create a lot of consumer products. We did that iconic typewriter that people kind of knew about. We did the IBM PC — even though it hasn&#8217;t been here for more than 20 years —&nbsp; and a few other consumer things along the way.&nbsp;</p>

<p class="has-text-align-none">I would say candidly that for the last 30 years, we&#8217;ve really had no consumer products. So, what does IBM do? Our role is to help our clients deploy technology that makes their business better. Whether they&#8217;re on multiple public clouds, want to take advantage of their data, or want to get to their customers faster, that&#8217;s what we are really about today.</p>

<p class="has-text-align-none"><strong>A lot of people know the Watson brand, which IBM has talked about for years. Famously, Watson competed on </strong><strong><em>Jeopardy!.</em></strong><strong> Now I think the brand has </strong><a href="https://newsroom.ibm.com/2023-05-09-IBM-Unveils-the-Watsonx-Platform-to-Power-Next-Generation-Foundation-Models-for-Business"><strong>turned into Watsonx</strong></a><strong>. There&#8217;s a lot of what I would call &#8220;airport&#8221; and &#8220;football advertising&#8221; around Watson that&#8217;s aimed directly at CIOs of companies and not at consumers, but we still all experience that advertising. How does Watson fit into the IBM brand? I think that&#8217;s what people really hook onto.</strong></p>

<p class="has-text-align-none">If you don&#8217;t mind, I&#8217;m going to give a slightly longer answer. It&#8217;ll be a few minutes, but stop me and ask questions.</p>

<p class="has-text-align-none">So, if we think about the Watson brand, it did really well initially with putting AI on the map. The Watson computer won <em>Jeopardy!</em> and that shocked people. It was really the first time that a computer could understand human language, think about open-ended questions, and was more right than wrong. I wouldn&#8217;t say perfectly right, but more right than wrong. I think that woke people up to the possibilities of AI. I will take credit and say that it got us going on the current AI journey.&nbsp;</p>

<p class="has-text-align-none">It fell off because we did things that were a little bit wrong for the market at the time. We were trying to be too monolithic, and we picked healthcare, maybe one of the toughest areas to go into, which I think was inappropriate. The world is ready to take these things as building blocks. Engineers want to open them up. They want to see what&#8217;s inside. They want to build their own applications. &#8220;I want to use it for this, but not that.&#8221;&nbsp;</p>

<p class="has-text-align-none">So when LLMs came along, we had a chance to say, &#8220;Let&#8217;s rebrand things. Let&#8217;s really rebuild the stack, and let&#8217;s give people both the pieces, but also a lot easier capability.&#8221; That&#8217;s what Watsonx is. So it builds on the fact that Watson is associated with artificial intelligence. I&#8217;m convinced that AI is a really big unlock for people. I call it the eighth technology, but that&#8217;s a later conversation. So, that&#8217;s what the Watsonx brand is all about.</p>

<p class="has-text-align-none"><strong>Let me push on that a little bit. You described Watson as a computer, and it was a single computer that could go </strong><a href="https://j-archive.com/showplayer.php?player_id=7208"><strong>play </strong><strong><em>Jeopardy!</em></strong></a><strong>. Then, you described the introduction of LLM technology, and this ecosystem of building blocks.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>What was the AI technology bet with the initial Watson computer? Do you think that that was the wrong bet as a technology? Because I have a lot of questions about LLMs as a technology and the bet we&#8217;re making, but I&#8217;m curious now that you&#8217;ve had that experience, what was the technology in the initial Watson computer, and was it the right bet or the wrong bet?</strong></p>

<p class="has-text-align-none">It&#8217;s literally the same technologies. So, LLMs were not known at that time, but various other neural network models were. Neural network models span from what we call machine learning to what was beginning to be called deep learning. What was inside the Watson at that time was a mixture of machine learning and a lot of statistical learning, which was the core of what became deep learning.&nbsp;</p>

<p class="has-text-align-none">Let me just note, the first big deep learning algorithm was a year <em>after</em> Watson won <em>Jeopardy!</em> Watson won <em>Jeopardy!</em> in 2011, and 2012 was when the term came to be. But the early incarnations of those things were in there. Unfortunately, they were not there in a way that you could tune them, take one out, make it modular, and take another one. We were trying to give it to you as a monolith — that&#8217;s what I meant by monolith — and that was the wrong approach, just to be straightforward. Right technology, wrong go-to-market approach.</p>

<p class="has-text-align-none"><strong>Can you draw the connection between that set of technologies and LLMs today? The counterargument that I would give to you is… I&#8217;ll just pick on Google. Google has made a number of bets across machine learning, deep learning, and LLMs for a long time. It showed off LLMs really early. I remember [CEO Sundar Pichai] demoing it and saying something like, &#8220;</strong><a href="https://www.theverge.com/2021/5/18/22442328/google-io-2021-ai-language-model-lamda-pluto"><strong>I can talk to Pluto</strong></a><strong>,&#8221; and no one knew what he was talking about. Then three years later, ChatGPT happened, and Google was like, &#8220;Wait, we invented all of that.&#8221; That was its technology bet, that was its paper: “</strong><a href="https://research.google/pubs/attention-is-all-you-need/"><strong>Attention is all you need</strong></a><strong>.”&nbsp;</strong></p>

<p class="has-text-align-none"><strong>You&#8217;re saying you had it, too, but it feels to me like there was actually an inflection point where the industry picked a different technology, they picked LLMs. So can you just draw the connection for me?</strong></p>

<p class="has-text-align-none">For sure. From 2010 to 2022, around 12 years, deep learning made incredible progress. No question about it. Here was the catch. Deep learning, to me, was incredibly bespoke. You could take a lot of data and employ a lot of people to label that data. It could do one task incredibly well, it really could, but tasks don&#8217;t stay static. The data changes. The tasks change. If I have to redo all that human labeling, relearning, and retraining, I&#8217;m calling that bespoke and fragile. So, the return was always a little bit out there. That applies if you have a massive, singular B2C task, maybe suggesting which photograph or ad you may love. It&#8217;s worth it because in the month or two months I use that model, I can get a lot of return. That&#8217;s a little harder in an enterprise context because it takes a lot more time to make up for all the costs.</p>

<p class="has-text-align-none">To go back to the original work you referred to, when there were massive amounts of data, labeling goes away. Wow, that drops the cost by half. You do a brute force approach using a lot more compute and a lot fewer people. Wow, the cost comes down even more because tech always gets cheaper over time.&nbsp;</p>

<p class="has-text-align-none">So now, half a dozen people and a ton of compute could do what previously may have taken 30 or 40 PhDs and 40 or 50 engineers over six months. You can now do the task in that much less time. That&#8217;s a huge unlock. In short, it looked like a 2x or 4x advantage, but if I compare from the beginning to the end, this is a 100x advantage in terms of speed, tuning, and deployability. That&#8217;s industrial scale. Plus, these models can be tuned for many tasks, not just one. I&#8217;m not saying all tasks, but many, which means that the applicability is massive.</p>

<p class="has-text-align-none">Also, when I want to ingest new data, I don&#8217;t have to restart at the beginning. I can add some. At some stage it makes sense to restart, but I can do a bit more there. All of these are massive unlocks, which is why I think it&#8217;s the right technology to help massively scale AI. By the way, I don&#8217;t think it&#8217;s the end all. We&#8217;ll come back to that, but it is a hundred times better than the prior.</p>

<p class="has-text-align-none"><strong>That&#8217;s the turn that I&#8217;m really interested in. There were all these shots at AI before, deep research being one of them. There were machine learning algorithms deployed broadly across the industry. Apple was talking about neural accelerators in the iPhone years ago, but they didn&#8217;t add up to what LLMs have since added up to in the industry.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>I&#8217;m curious though. You mentioned cost and that the cost can come down, but you and I are talking at the end of an earnings cycle, and everyone&#8217;s costs are skyrocketing. Their CapEx is skyrocketing. There are some layoffs associated with the increased CapEx that I do want to ask you about.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>But just purely on cost, it doesn&#8217;t seem like it&#8217;s that much cheaper, right? It seems like to win, you have to spend vastly more money, and that money does not, at the moment, have a defined ROI. There are a lot of bets. Can you reconcile the idea that there are lower costs in the industrial scale versus the actual expenditures we&#8217;re seeing?</strong></p>

<p class="has-text-align-none">I can, but if you&#8217;ll allow me to say this, there&#8217;s a difference in the B2C world versus the B2B world. First, let&#8217;s just talk about the cost. Are there huge amounts of not just capital but operating expenses being spent on populating data centers with GPUs and building out those infrastructures, and are those amounts being committed now up in the trillions? It&#8217;s absolutely true, and that&#8217;s what you just mentioned: &#8220;Hey, that doesn&#8217;t sound cheap. That doesn&#8217;t sound a lot cheaper than before.&#8221;</p>

<p class="has-text-align-none"><strong>It doesn&#8217;t even sound safe, just to be clear. I don&#8217;t even think that sounds safe based on the potential returns.</strong></p>

<p class="has-text-align-none">Maybe we&#8217;ll come back to that. What I meant when I said it&#8217;s going to get a lot cheaper is that if I take a five-year arc, what has the semiconductor industry shown time over time? Go back to the beginning of the PC. You have half a dozen competing technologies, and some begin to win. That was the beginning of Moore&#8217;s Law really, right?&nbsp;</p>

<p class="has-text-align-none">Every two years you get a 2x advantage in what you can do. I look at the semiconductor side, and I say, &#8220;Over five years, we&#8217;ll probably get a 10x advantage in pure semiconductor capability, or the amount of compute for a dollar you can spend.&#8221; Got it. That&#8217;s one. Second, nobody has said that a GPU is the <em>only</em> architecture that is great for deploying these large language models. It&#8217;s certainly one. There are other companies coming up. We have <a href="https://newsroom.ibm.com/2025-10-20-ibm-and-groq-partner-to-accelerate-enterprise-ai-deployment-with-speed-and-scale">a partnership with Groq</a>, they have a different kind. You have <a href="https://www.cerebras.ai/blog/cerebras-partners-with-ibm-to-accelerate-enterprise-ai-adoption">Cerebras</a>, they have a different kind–</p>

<p class="has-text-align-none"><strong>That&#8217;s Groq the processor company, not Grok, Elon [Musk&#8217;s] AI company.&nbsp;</strong></p>

<p class="has-text-align-none">Correct. Groq, the processor company. Yes, <a href="https://en.wikipedia.org/wiki/Grok">the word</a> comes from computer science. A lot of people use the word. But yes, Groq, the inferencing chip company. At least in these first steps, Groq looks like it&#8217;ll be 10x cheaper. But that, again, is not going to be the only design possible. I think you&#8217;ll get a 10x advantage on the pure silicon side. You&#8217;re going to get a 10x from the design side. Then there&#8217;s the third piece. I think there&#8217;s a lot of work to be done around memory caching and how you deploy these models. Do I quantize them? Do I compress them? Do I always need the biggest?&nbsp;</p>

<p class="has-text-align-none">So, there&#8217;s a 10x advantage from the software side. You put those three 10s together, and that&#8217;s a thousand times cheaper. I&#8217;m simply saying, &#8220;Hey, maybe we won&#8217;t get all of it in the next five years, but even if you get the square root of that, that&#8217;s 30 times cheaper for the same dollar spent.&#8221; That&#8217;s why I believe that this is going to play out. It is going to get a lot cheaper, but it&#8217;ll take five years to play through.</p>

<p class="has-text-align-none"><strong>Five years right now feels like forever to most people living through this disruption. It feels like forever when you can see the hundreds of billions of dollars being deployed today in data centers that are running mostly Nvidia GPUs. You talked about Moore&#8217;s Law. I look at all of that and I actually see a massive disincentive for Nvidia to come out with the next generation of its GPUs. There&#8217;s a lot of equity tied up in the </strong><a href="https://www.theverge.com/2024/2/23/24080975/nvidia-ai-chips-h100-h200-market-capitalization"><strong>H100 being the literal unit of currency</strong></a><strong> that these deals are taking place upon.</strong></p>

<p class="has-text-align-none"><strong>That&#8217;s a weird dynamic, right? It sounds like you say there&#8217;s going to be competitors that upend that dynamic.</strong></p>

<p class="has-text-align-none">Not necessarily upend but provide a lot more competition, and that&#8217;s the nature of it.</p>

<p class="has-text-align-none"><strong>You kind of nodded in agreement when I said there was a disincentive for Nvidia to release the next generation of GPUs. Do you think that&#8217;s true?</strong></p>

<p class="has-text-align-none">I think that when you have an incredibly valuable company that&#8217;s making its profit stream from a few products, there&#8217;s always an inherent or organic disincentive to try to modify that. That said, I would <em>never</em> bet against Jensen [Huang]&#8217;s ability to disrupt himself and go towards the next plateau, if there is one. So, you have both. I think certain companies are able to disrupt themselves, others hesitate to do it, and that is actually what causes the up and down of companies in the tech world.</p>

<p class="has-text-align-none"><strong>I&#8217;m obviously leading towards the big question, which is that this feels like a bubble. </strong><a href="https://www.theverge.com/ai-artificial-intelligence/759965/sam-altman-openai-ai-bubble-interview"><strong>A lot of people think it&#8217;s a bubble</strong></a><strong>. You have a markedly different view of how this industry will play out. You&#8217;re investing, and I want to talk about the fact that you&#8217;re hiring while some of your competitors are doing layoffs at a huge scale. But let me just ask the question directly, and then we can go into everything else. Do you think we&#8217;re in an AI bubble right now?</strong></p>

<p class="has-text-align-none">No. Do I believe that there will be some displacement and some of the capital being spent, especially the debt capital, will not get its payback? Yes, but let&#8217;s just look at it. So, this is a place that is a B2C, and then there is the B2B world. There is a lot of common tech in both, but let&#8217;s just look at the B2C. If you build a set of models that are very attractive in B2C, and half a billion people become consumers of that (which are roughly the current numbers), it makes economic sense to build a slightly better model by spending another $50 billion that can attract another 200 million users.&nbsp;</p>

<p class="has-text-align-none">So, this is a race towards who can get more and more of the world&#8217;s 7.5 billion people to become subscribers of a given model because the next bet becomes that network scale and those economies of scale that will allow you to go succeed. You&#8217;ve seen that movie play out. That was social media in the last generation. So, I react with, &#8220;It makes sense for them.&#8221;&nbsp;</p>

<p class="has-text-align-none">Now, if 10 of them are going to go compete, we know that maybe two or three of them will be the eventual winners, not all 10. To me, it makes economic sense that they&#8217;re chasing that. My point is that not all of that will see a return. By the way, if I look at fiber optics in the ground back in the year 2000, not all of those people got a return.</p>

<p class="has-text-align-none">However, this is the beauty of capitalism, and I&#8217;m calling it a beauty. We spend the money, it gets corrected back to 30 cents on the dollar. At that point, it makes an incredible amount of sense for somebody else to get that asset and turn it into a profit stream, but not all of it will get lost. As I said, two or three are going to make a ton of money, and the others won&#8217;t. So, I think the equity being put in will actually get a return. Some of the debt will not.</p>

<p class="has-text-align-none"><strong>I love the fiber comparison, and if you&#8217;ll indulge me, I want to sit in it for just a minute. I was very young when the fiber rollouts were happening. I was very excited to get faster internet access, and I remember that bubble well. Part of that bubble was wanting to build infrastructure for the internet, and the thing that really drove the bubble was wanting to move the entire economy onto the internet, and that didn&#8217;t work.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>There was the Pets.com IPO, and that was the sign that we hadn&#8217;t quite moved the economy, but we built the infrastructure. The important thing and the important difference is the fiber in the ground didn&#8217;t go bad.</strong></p>

<p class="has-text-align-none"><strong>Earlier this year, I </strong><a href="https://www.theverge.com/24351247/ciena-fiber-optic-internet-subsea-cables-wdm-ai-hyperscale-data-decoder-podcast-interview"><strong>interviewed Gary Smith</strong></a><strong>, who&#8217;s the CEO of Ciena, which does fiber multiplexing. Ciena can still get returns to this day on fiber that was deployed 30 or 40 years ago, and its technology helps build data centers. That was really why he was on the show: he wanted to tell everyone that his technology could build data centers. The GPUs go bad. They&#8217;re already failing at a rate of 3 to 9 percent in the data centers. There also might be an H200, or the chip you&#8217;re investing in with Groq might displace the H100.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>So, all of this CapEx is not going to be here 30 years from now for the next generation of entrepreneurs, like Gary, to build upon and create more capacity with. We&#8217;re just going to throw it away.</strong></p>

<p class="has-text-align-none">No, no, let&#8217;s decompose it. So, you&#8217;re building a physical data center that&#8217;s a lot larger. I think concrete and steel survive. Next to it is a power plant. We need the electricity. Actually, I believe those power plants will even get hooked up to the grid over time, which is even better for national infrastructure. That&#8217;s useful.&nbsp;</p>

<p class="has-text-align-none">Now, the fiber coming out of them — the networking, storage, and CPUs inside these places — are all useful. I&#8217;ll acknowledge right now there is a very high failure rate, but being a bit of a semiconductor geek, though I&#8217;m not anywhere near as deep as some of my friends and competitors in those spaces, if you can run something at 3GHz and you try to run it at 4GHz, it will actually run but has a higher failure rate.&nbsp;</p>

<p class="has-text-align-none">Maybe it&#8217;s great if you try to run it at 300W. If you run it at 400W, it has a higher failure rate. So, if today you just need the performance for training a model that much faster, it actually is worth it to tune it and say, &#8220;I&#8217;m okay to have that failure rate. I got software that worries about moving stuff around.&#8221; But you can de-tune it slightly for higher resilience.</p>

<p class="has-text-align-none">I think that is actually a design point. That&#8217;s not really a bug, so to speak. Do I acknowledge that these will move up over time? I began by saying, &#8220;I think in five years, our semiconductors will be 100 times better.&#8221; So you&#8217;re right, there&#8217;s a five-year depreciation to the GPU or some of the compute infrastructure, but the other half is useful. But in five years, you don&#8217;t throw away all the CapEx. You throw away a little piece, and you replace that with something that is better at that point.</p>

<p class="has-text-align-none"><strong>I think the specific comparison to fiber is worth making — and maybe it&#8217;s too pedantic — but the fiber was in the ground and then it was there. It did not incur a recurring cost to the people who wanted to use it outside of wanting to create more capacity by multiplexing the fiber.</strong></p>

<p class="has-text-align-none">You&#8217;re right, the fiber in the ground is endurable. Maybe not forever, but at least for 100 years. At some point, even glass begins to occlude and do all kinds of weird things, but it&#8217;s good for 100 years. But people also built a lot of end stuff on top, all of which had to be thrown away.</p>

<p class="has-text-align-none">You&#8217;re now forgetting all the failures. People were building <a href="https://en.wikipedia.org/wiki/Asynchronous_Transfer_Mode">Asynchronous Transfer Mode</a> (ATM). People thought that they could build really intelligent video streaming and put the guts of that inside. People were talking about doing <a href="https://www.ciena.com/insights/what-is/What-Is-WDM.html">Wavelength Division Multiplexing</a> (WDM), since you talked about Ciena. Then, it became simpler. Here&#8217;s dark fiber, it&#8217;s a dump pipe. Go throw your bits in it at a terabit, the intelligence belongs at the cloud end. That took 10 years to unfold. So there was actually a change in how it transpired. I&#8217;m sorry to be that geeky.&nbsp;</p>

<p class="has-text-align-none"><strong>No, this is why we&#8217;re here, that&#8217;s why I asked the questions. I would actually argue that was one of the most exciting periods in tech, when no one knew how it would work, and there were many, many more shots being taken. It all did pop in a catastrophic bubble. But it was very exciting.</strong></p>

<p class="has-text-align-none">It did go down, and then today you could turn around and say, &#8220;But all the companies that got built on the back of that clearly proved that that investment was worthwhile.&#8221; If I look at it at a national or an aggregate investor level, while some people did lose a lot of money, some people made a lot of money.</p>

<p class="has-text-align-none"><strong>I want to take the other part of that bubble comparison, which is that we were going to move the entire economy to the internet. You brought up social media. As someone who covered it very deeply from the beginning of the iPhone to now, I would characterize it as wanting to move the entire economy onto your phone.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>First, we were going to put it all online. Maybe it didn&#8217;t have the distribution because we&#8217;re not all going to look at CRT monitors on our desktop, so that didn&#8217;t happen. But then we all got phones, and the idea that we could move an enormous amount of at least the consumer economy onto our phones happened. That occurred. We&#8217;re all living with the results of that today.</strong></p>

<p class="has-text-align-none"><strong>Do you feel like the argument, at least in the consumer space as you&#8217;ve described it, is that we&#8217;re going to move that app economy to AI? Because how I see it is that the same class of investors who got rich moving the economy onto smartphones now think they can run the playbook again with AI. Maybe we&#8217;ll re-architect the applications with [Model Context Protocol] (MCP) and maybe there&#8217;ll be agents using the websites instead of people, but the argument from the same set of characters feels broadly the same to me.</strong></p>

<p class="has-text-align-none">If you don&#8217;t mind, I&#8217;ll go a little bit deeper on your first part.</p>

<p class="has-text-align-none">You&#8217;re absolutely correct that the front end of the economy moved on to the phone. It was definitely a massive unlock the moment the phone gave you access so that it could be with you everywhere and you were not just anchored to a desk with a laptop or a desktop. Let&#8217;s acknowledge that. But there is still a physical economy.&nbsp;</p>

<p class="has-text-align-none">I always talk about how 60 percent of the workers in the United States are still frontline: people who do construction, people who have warehouses. If you&#8217;re buying a tangible good, it&#8217;s still coming from a warehouse. It&#8217;s maybe not from a retail store near you because they had a front end, but in the back, there&#8217;s a warehouse, a truck driver, and maybe multiple routes of distribution. We still go to restaurants, there&#8217;s still food, there&#8217;s still groceries, there&#8217;s physical healthcare, there&#8217;s all of that. It becomes more efficient, easier, and more convenient.</p>

<p class="has-text-align-none">But now if I say, &#8220;I don&#8217;t have to spend that much time, I&#8217;m going to have an agent or a front-end AI that helps to unlock even more and puts together four or five things that I have in my head,&#8221; then I completely agree with you. Why wouldn&#8217;t we want that to happen? That is going to happen. You can see the early instances of that already happening. It&#8217;s so appealing now because it gives a chance to reshuffle who the biggest players are (without me naming any names), and it gives a chance for some disruption. On the other hand, I think it goes beyond the consumer and into the enterprise. I actually believe there&#8217;s going to be a billion new applications written.</p>

<p class="has-text-align-none">Now, if you think about the smartphone ecosystem we talked about, people talked about half a million, a few million apps. I think this could be a billion. There may be a few million that sit on the consumer side, but if each enterprise writes, let&#8217;s say, 1,000 applications and you multiply that across the number of enterprises, then that unlocks a lot more.</p>

<p class="has-text-align-none"><strong>Let me ask you one question there, and then I do want to ask you the </strong><strong><em>Decoder</em></strong><strong> questions and about IBM specifically. The biggest winners of that move to put all economic activity onto the smartphone were in many ways Apple and Google because they collected an enormous amount of rent on the back of that transition with app store taxes and fees.</strong></p>

<p class="has-text-align-none"><strong>Maybe that&#8217;s going to get unwound now with whatever antitrust litigation is happening in Europe, but it happened. They collected a huge amount of fees. They are some of the richest companies in the world on the back of that. Apple just </strong><a href="https://www.cnbc.com/2025/10/30/apple-silences-its-critics-with-strong-iphone-demand-and-blowout-services-revenue.html"><strong>reported its quarterly earnings</strong></a><strong>, and its services revenue is higher than ever on the back of App Store fees. That&#8217;s what that line really is. I think it runs the TV business just to pretend that reality is not the reality.</strong></p>

<p class="has-text-align-none"><strong>Do you see that playing out in AI? Because I look at OpenAI announcing what </strong><a href="https://www.theverge.com/2024/1/10/24032144/openai-chatgpt-gpt-store-ai-launch"><strong>looks like an app store</strong></a><strong>. I look at Google announcing that Google Search will have inbuilt custom developed applications as you search. It&#8217;s very cool, but I see these points of centralization emerging again that don&#8217;t look like Apple and Google, and maybe there&#8217;s competition for that. There might be competition for that in the enterprise. Do you see those same points of centralization?</strong></p>

<p class="has-text-align-none">I wouldn&#8217;t say that we know who the winners are today because we are only in the first innings of the game. There will be some winners. How about I agree with you on that.</p>

<p class="has-text-align-none"><strong>But do you think those winners look like the central points of control that we saw in the smartphone era?</strong></p>

<p class="has-text-align-none">There will be a few different winners. If you go back to the smartphone analogy, you had one who built a vertically integrated stack. It was an easier, more convenient device, and then to get access to that device, people had to come into the App Store. That was that model. The other model said, &#8220;We are completely open,&#8221; with the Android operating system. However, to get access to everything else, you had to go into the Play Store or into Google Search. That was the second model. It wasn&#8217;t identical, but it was similar. So, those became the two entry points to get access to the end individual. That&#8217;s why they could charge the appropriate… you&#8217;re calling it rent, which is an economics term. Let&#8217;s say they could charge an appropriate margin from a business standpoint.</p>

<p class="has-text-align-none"><strong>I think Tim Cook would call it a margin, but the developers I know feel very differently about that margin.&nbsp;</strong></p>

<p class="has-text-align-none">But there is also a massive amount of cost for those who build out that massive infrastructure. It&#8217;s not like they can maintain it forever. As the Chinese have shown, you can build competing products. If you can keep running ahead, then people will prefer these devices. But at the end of the day, the value is in the apps, as you were saying. If that app is available on something else or if the friction and innovation on the main platform slows down, people will switch.&nbsp;</p>

<p class="has-text-align-none">It&#8217;ll take maybe three or five years. It&#8217;s not like there will be guaranteed returns forever. It will switch. As many other companies have seen, that switch takes a few years. It doesn&#8217;t take decades. When it happens though, it&#8217;s disastrous to the original company. Some manage to recover because they wake up and say, &#8220;Hey, wait a moment, I got to change.&#8221; Some don&#8217;t.</p>

<p class="has-text-align-none"><strong>I think this brings me to IBM. This is the process you and IBM have been in for many years now. You took over as CEO in 2020, and you&#8217;d been at the company for almost 30 years when that happened.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>I ask everybody these questions. You have a unique perspective here. You&#8217;d been at the company for a very long time when you took over as CEO. How was IBM structured when you took over, and how have you changed that structure?</strong></p>

<p class="has-text-align-none">It&#8217;s much more about culture, focus, what we do, and how we do it than the formal organization structure. If you say that you&#8217;ve got to be focused on innovation, you&#8217;ve then got to be focused on where you can provide a unique value back to your clients. That&#8217;s the first question. I want to be clear that our sweet spot is helping our B2B clients succeed. You might say, &#8220;Okay, well, that&#8217;s a very big remit. What then?&#8221;&nbsp;</p>

<p class="has-text-align-none">I hold two points of view that are somewhat unique. One, I don&#8217;t believe that the majority of our customers are going to go to a singular public cloud. Some will, but the majority will not. People outside the US tend to want to be somewhat split between an American cloud and something more sovereign. Then, there are people who use plenty of SaaS properties. There&#8217;s a huge amount of economic value in what they&#8217;ve already written in their preexisting applications. I&#8217;ll use the word hybrid to describe that.&nbsp;</p>

<p class="has-text-align-none">Is there a place for a vendor to have leading-edge tech to help our clients in that journey? That&#8217;s the hybrid approach we took, and that has proven to be of incredible value over time. About 60 percent of the total spend is outside the US. Even inside the US, anyone in a regulated industry is going to be hybrid in some sense. So that&#8217;s the first.&nbsp;</p>

<p class="has-text-align-none">The second is focusing on where AI can be deployed in the enterprise. Let&#8217;s not go try to compete. I will not try to compete with Google on building a chatbot that… what&#8217;s the current number? It&#8217;s <a href="https://blog.google/products/gemini/gemini-3/">650 million active users</a>. That&#8217;s not where we have brand permission and credibility. But I can walk into a health insurance company and say, &#8220;I&#8217;ll make sure that your clients&#8217;, your patients&#8217;, health data is protected, but let&#8217;s unlock AI to make those people feel even happier and get quicker, easier answers.&#8221; Those people tend to trust us because in 114 years, we have never misused that data, not even once. You get that, and then you can give them the tech and get it deployed.&nbsp;</p>

<p class="has-text-align-none">So we picked those two. Then, I asked, &#8220;What are we really good at?&#8221; We&#8217;re really good at building systems. I decided early on that the third bet was on quantum. Let&#8217;s see whether we can change it from being a science challenge to an engineering challenge. Once it&#8217;s an engineering challenge, how do we scale it to really get deployed? That was really the big inflection point as opposed to trying to do lots of things. I used the word innovation. That meant commodity services had to leave the company because you can&#8217;t do both. It meant that if we are going to be hybrid, I had to partner with everybody else that I talked about.</p>

<p class="has-text-align-none">So, you begin with the clear view of what should be done, and then you say, &#8220;It doesn&#8217;t matter, I&#8217;ll make all the hard decisions: changing the way the sales teams are paid, changing the incentives of all the executives to align with what&#8217;s needed to make those things succeed.&#8221; Sorry for a really long answer.</p>

<p class="has-text-align-none"><strong>No, that&#8217;s great. A trope on this show is that if you tell me your company structure, I can predict 80 percent of your problems. You might say culture and structure are divorced, but I see the connection, and they feed off each other.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>So, you were at IBM for a long time. Vanishingly few people will ever interview to be the CEO of IBM. What was that process like? Did you come in saying, &#8220;This company is focused all wrong. We&#8217;ve got to let go of the commodity stuff. I&#8217;m going to make these changes&#8221;? Then, once you had decided to do that, how did you actually change the structure of the company to focus on those things?</strong></p>

<p class="has-text-align-none">I probably didn&#8217;t spend 30 years aspiring to this job, just to be upfront. I think it was more of a process of discovery, even for myself, in the couple of years before that. I made the hybrid observation deeply in 2017. As I was making that, I said, &#8220;Okay, how do I test this? &#8221; I actually had a partnership with Red Hat, and I said–</p>

<p class="has-text-align-none"><strong>Is this why you have a red hat? I noticed you have the red hat behind you.</strong></p>

<p class="has-text-align-none">I have a red hat there because when we <a href="https://www.redhat.com/en/about/press-releases/ibm-acquire-red-hat-completely-changing-cloud-landscape-and-becoming-worlds-1-hybrid-cloud-provider">announced the decision in 2018</a>, it took a year to get through regulators and close it. It was 30 percent of our market cap. Very few companies spent 30 percent of their market cap on a conviction and a belief. So, I keep the red hat there because to me it was clear: if that conviction turned out to be wrong, I should be fired. People hesitate to say those things, but I say, &#8220;If I&#8217;m that wrong, I should not be working here.&#8221; That is why I keep the red hat as a reminder to myself that not only must you have the conviction but you must then do the really hard action.&nbsp;</p>

<p class="has-text-align-none">So, that&#8217;s the culture part of making conviction succeed. Otherwise, people will just fall back into the lanes they were in. There&#8217;s comfort in doing things the way they&#8217;ve always done them —</p>

<p class="has-text-align-none"><strong>Put me in the room. It&#8217;s 2020, you&#8217;re going through the interview process with the board. Did you have a deck that said, &#8220;We&#8217;re doing too much commodity stuff. I&#8217;m going to cut it down, and we&#8217;re going to focus on these areas and the big bet with the quantum stack change&#8221;?</strong></p>

<p class="has-text-align-none">My deck was three pages of prose. It was not like 100 pages of analysis. I believe that you should talk about what you want. I said, &#8220;We have to grow, and my view is very simple: you&#8217;ve got to grow well above GDP growth, otherwise you&#8217;re not going to be relevant in the future.&#8221; &#8220;Okay. If you&#8217;re going to grow, where are you going to grow?&#8221;&nbsp;</p>

<p class="has-text-align-none">If you look at us, our brand permission is fundamentally being a technology company. That was code for &#8220;high innovation.&#8221; Now, this is where I think many companies fall short. If you&#8217;re clear about that, then things that don&#8217;t belong should not be in the company. So, that is why the spinouts took a couple of years to get done.&nbsp;</p>

<p class="has-text-align-none">Then, I said, &#8220;We have to grow in software because that is where our clients perceive value.&#8221; You talk about structure. Well, if you&#8217;re going to grow in software that becomes a big fundamental change. That&#8217;s where capital allocation and resource allocation go. That&#8217;s where you&#8217;ve got to put way more investment than you historically had. Then, how do you fundamentally line up with partners? That is organizational change because you got to say, &#8220;How do the sales teams get paid? How do you have the right incentives?&#8221; So, those were maybe the three first really big decisions I made in the first two years.</p>

<p class="has-text-align-none">As you do that, you also realize people tend to be very risk-averse. How do you unlock them so they take that risk? To me, there&#8217;s no risk-free path to success. If you want to be risk-free, you&#8217;re going to almost always be slammed against the bottom end of performance. How do you unlock risk-taking in people so that they feel motivated to do it more often than not?</p>

<p class="has-text-align-none"><strong>This leads me into the second go-to question I ask everybody. I have a sense of it, but I&#8217;m curious how you will describe it. How do you make decisions? What&#8217;s your framework for making decisions?</strong></p>

<p class="has-text-align-none">You always start with whether there is value. If it&#8217;s a decision that&#8217;s going to impact what we do and how we do it, does a client benefit from this new way of doing it? If you&#8217;re pretty convinced of that — and I&#8217;ll come back to where you get your conviction — I always believe that you should triangulate. I will always talk to a number of people on the inside and outside. Maybe not with a full description because sometimes you don&#8217;t want to give that, but with enough to validate my assumptions or what the possible victory would be.&nbsp;</p>

<p class="has-text-align-none">So, you arrive at a conviction, you triangulate it with a few people, and then you ask yourself, &#8220;What needs to change inside if we really want this to go all the way?&#8221; Once you arrive at conviction and all those, you are then able to go execute it.</p>

<p class="has-text-align-none">I build on my own strengths. I think I&#8217;m a reasonably deep technologist. I think I generally understand where the tech can go, but I may not always fully understand what a client can do with the tech. That&#8217;s why the first piece is really important. Then, I triangulate. I don&#8217;t mind reaching 10 levels down in the organization to talk to somebody who I think has an opinion on that topic or knows about it. Talk to possible clients about it. Talk to partners about those things. It just informs your opinion. In any case, when you&#8217;re out talking to them, keep your ears open for what they say. That could actually inform some things later.</p>

<p class="has-text-align-none"><strong>Let&#8217;s put that into practice on the farthest bet you&#8217;re making, which is quantum. All the big tech companies have quantum divisions. I&#8217;ve had Jerry Chow, who runs part of your quantum team, on the show before. That was </strong><a href="https://www.theverge.com/23988271/ibm-quantum-heron-system-two-jerry-chow-qubits"><strong>a great conversation</strong></a><strong>. I&#8217;ve looked at a lot of rooms where someone tells me that this is the coldest place on Earth to run their quantum or whatever qubit they&#8217;re trying to generate on that day.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>None of that has paid off yet. We&#8217;re not close to what they call &#8220;utility-scale computing&#8221; in quantum. That&#8217;s not something your customers are asking for yet. That&#8217;s a decision outside the purview of structure and culture. That&#8217;s a big bet where there will be a massive step change in how we build computers that unlocks vastly more value for everybody. You have to keep that investment even through all the turmoil, all the data center investment everyone else is doing, and Amazon saying, &#8220;</strong><a href="https://www.theverge.com/news/807825/amazon-job-layoffs-2025-ai"><strong>We&#8217;re laying off 14,000 people because of AI</strong></a><strong>&#8221; while you&#8217;re saying, &#8220;</strong><a href="https://fortune.com/2025/11/05/ibm-ceo-arvind-krishna-promise-hire-more-gen-z-college-graduates-but-thousands-laid-off-ai-restructuring/?intcid=CNR-01-0623"><strong>We&#8217;re going to hire more college graduates</strong></a><strong> than anybody else.&#8221;</strong></p>

<p class="has-text-align-none"><strong>What is the decision to stay focused on quantum in that way? How do you maintain that decision?</strong></p>

<p class="has-text-align-none">You are right that you can&#8217;t go check with a customer because they don&#8217;t know what to do with it today. But that&#8217;s not fully true. So, over the first five years, 2015-2020, you&#8217;ve got to have a belief in what it could do. Maybe because of my graduate school math background, I thought, &#8220;Wow, if we can do that, I can immediately see what kind of problems could get unlocked.&#8221; But trying to explain that to anybody but the people excited in the field is impossible. I completely acknowledge that those five years were about an internal bet on a set of people and a possibility.</p>

<p class="has-text-align-none">But from 2020 onwards, we began to say, &#8220;These are not utility scale. Let me acknowledge it. They&#8217;re full of errors. They are small. Could clients still get excited by it?&#8221; I did perform a full check. We have 300 clients working with us in… let&#8217;s call it a research mode. There are 100 who are purely commercial, 100 who are in the world of materials or medicine, and 100 who are pure academics. Those are the rough buckets.&nbsp;</p>

<p class="has-text-align-none">That&#8217;s why HSBC proved to itself we could do bond trading pricing on it. Vanguard proved to itself that if it got big enough, it could build a portfolio that better appeals to your needs. You have Daimler working on EV batteries. You have Boeing looking at corrosion on materials. So, there is a proof point. They&#8217;re not saying they&#8217;ll buy it the way it is today. All they&#8217;re saying is, &#8220;Hey, if you get to that point, this is really interesting to us.&#8221;</p>

<p class="has-text-align-none">There is validation, even from clients. Then I said, &#8220;How do I know there&#8217;s enough interest?&#8221; So, I asked the team to put the software out open source. Now, I&#8217;ll say that for many people, including some currently in AI, that&#8217;s not a common thing to do early on. Why open source? How will developers and universities use this stuff and get any excitement if you put a price on it? So, we put out all our software open source. The fact that there are 650,000 people globally who use it tells me that there is excitement, there is a movement, and that people are hungry for a new approach to solve other kinds of problems.&nbsp;</p>

<p class="has-text-align-none">Those were the two validations on my framework that were useful. If that 650,000 had been 100,000, I might still be okay. The fact that it&#8217;s 650,000 tells me there is real, real traction. But if 650,000 had been 1,000, I would have told my people, &#8220;Guys, these are your physics friends. This is not a market.&#8221;</p>

<p class="has-text-align-none"><strong>I&#8217;m curious about that. That is the kind of long-term bet, and the early interest from people who think, &#8220;This type of computing will let us do many more things.&#8221; It&#8217;s funny on the consumer side. I hear about it in terms of, &#8220;Well, when there&#8217;s quantum computing, we&#8217;ll need quantum proof encryption.&#8221; It&#8217;s like there&#8217;s a secondary market now based on whether or not you will succeed in quantum computing that has almost nothing to do with quantum computing succeeding. It&#8217;s a bet. It&#8217;s a strange hedge against your success, Microsoft&#8217;s success, or whoever else is doing quantum.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>What does actual success look like? Is it a step change in computing that is as big as the re-architecture of all computers around AI that we&#8217;re experiencing today? Is it bigger than that? What does that feel like to you?</strong></p>

<p class="has-text-align-none">I actually think it&#8217;s an add. So, there are CPUs. GPUs did not replace CPUs, it was an add. Now, GPUs are priced much higher than CPUs, so the market is bigger for GPUs than CPUs, but it was a complete add. It didn&#8217;t displace what AMD, Intel, and ARM do.</p>

<p class="has-text-align-none"><strong>I feel like Intel feels differently about that right now. Sure, I agree with you.</strong></p>

<p class="has-text-align-none">Some companies have many other issues. The number of x86 chips being sold per year is as high as it has ever been. How about if I phrase it like that?</p>

<p class="has-text-align-none"><strong>Fair enough.</strong></p>

<p class="has-text-align-none">Okay. So, it&#8217;s an add, but if the next one has more immediate value, you can price it at a different price point. Does that make sense?</p>

<p class="has-text-align-none"><strong>Yeah.</strong></p>

<p class="has-text-align-none">Let&#8217;s just use the term QPU just to keep it simple with quantum. QPUs are going to have an incredible value when they come because they can solve problems that you actually cannot solve on GPUs and CPUs in any economic terms in the near term. Look, everything you can do on a GPU, you could do on a CPU, but it&#8217;s going to be a thousand times slower and not be as economically feasible. So, GPUs opened up a whole class of new problems.&nbsp;</p>

<p class="has-text-align-none">QPUs are similarly going to open those up. It&#8217;s an add, not a displacement. But given there&#8217;s finite dollars in the world, if there&#8217;s an add and we have a first mover advantage, like what one of the companies you named had with GPUs, that opens up a possibility that the market is that big.</p>

<p class="has-text-align-none">So we did work. We asked a couple of our friends in the consulting world, like Boston Consulting Group and McKinsey: &#8220;Tell us what you think the value is if we can arrive at some utility point?&#8221; They both came back and gave us a pretty consistent answer. It was fuzzy, but think of it as, &#8220;We think there&#8217;s $400 billion to $700 billion of value early on per year.&#8221; Great! &#8220;How much do you think the tech world could get out of that?&#8221; &#8220;Probably 20 to 30 percent. Seems reasonable.&#8221; I said, &#8220;Okay, that&#8217;s the size of the prize we&#8217;re going to chase.&#8221; How much of that share we will get versus others is an open question, and that&#8217;s the journey we are on for the next five years.</p>

<p class="has-text-align-none"><strong>You think you&#8217;ll be able to pay off the quantum investment in five years?</strong></p>

<p class="has-text-align-none">It&#8217;s really hard for engineering to put a dot on it and say, &#8220;This is not like building the next mainframe.&#8221; There, I really know what I&#8217;m doing. I know exactly how many months it&#8217;ll take, and I could put a dot on it.&nbsp;</p>

<p class="has-text-align-none">Here, I gave it a spectrum. Will we get to something remarkable in maybe three and a half years? I&#8217;m going to give it low odds. It&#8217;s possible, but the odds are maybe 20 or 30 percent. Can we get there in four years? My odds go way up. Can I get that in five years? My odds go really high. So that&#8217;s why I say five. That&#8217;s not to say it&#8217;s really five years. I think it&#8217;ll be a bit of a spectrum. I&#8217;m hoping you&#8217;ll see some really early adopters in around years three to four. There&#8217;ll be more at the end of year four, and then the risk decreases for people after that.</p>

<p class="has-text-align-none"><strong>That&#8217;s a lot of action in 24 months. That will be a very exciting two-year period if you hit it.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>This is really interesting to talk about in comparison to AI. You&#8217;re talking about how you estimated the market size for a nascent technology that you have to develop actual capabilities for. You estimated how much of that market share you could take, and you&#8217;re making some investments based on the potential return.&nbsp;</strong></p>

<p class="has-text-align-none">So, that last part, why us? I assume others can do all of this. Why would we succeed? Because I think it&#8217;s much more. There&#8217;s so much talk. You mentioned the various qubit technologies, cold rooms, and alternate technologies. I actually love the fact that there is that much, but that&#8217;s not building a computer. I always tell people, &#8220;You absolutely need a great QPU and a great qubit. You also need a way for them all to talk to each other. You also need a way to control all of them. You also need a way for it to function by itself without six quantum physicists standing in the room tuning it.&#8221;</p>

<p class="has-text-align-none"><strong>This is a great employment plan for quantum physicists. Come on.</strong></p>

<p class="has-text-align-none">[<em>Laughs</em>] So, you need all those things, and we are one of the unique players who have a lot of those skills in house. It unlocks people to go do that, and it really motivates and excites them. I think that is an advantage we have today in terms of underlying skills.</p>

<p class="has-text-align-none"><strong>I would call that a very sober, very thoughtful, almost conservative approach to deploying billions of dollars in CapEx against a technology that has not yet proven itself in the market.</strong></p>

<p class="has-text-align-none"><strong>You&#8217;ve made some estimates. You have an idea of what your company can do to add value. You&#8217;re going to do the hard research, and then you&#8217;re going to get there. I would just compare that to OpenAI and the AI market that we see today. Just this week, OpenAI </strong><a href="https://www.theverge.com/news/807875/openai-microsoft-for-profit-agi"><strong>converted to a for-profit company</strong></a><strong>. There&#8217;s </strong><a href="https://www.reuters.com/business/openai-lays-groundwork-juggernaut-ipo-up-1-trillion-valuation-2025-10-29/"><strong>reportedly a trillion-dollar IPO coming</strong></a><strong>. There&#8217;s everything we&#8217;ve talked about in the enterprise space, where you can see how AI and enterprise can help accelerate data use and all this unstructured data that companies have. Fine.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>But the bet is in the consumer space. We&#8217;re just going to build a full-fledged agent that&#8217;s going to run around and do stuff for you, and that will replace your smartphone. None of that seems sober, conservative, based on a real market estimate, or even whether consumers want the product. It&#8217;s just a pipe dream.</strong></p>

<p class="has-text-align-none"><strong>How do you reconcile those two things? The bet is there will be AGI. At the end of the day, the whole market is based on the idea that someone&#8217;s going to figure out AGI, and then all of this would have been worth it. The </strong><a href="https://blogs.microsoft.com/blog/2025/10/28/the-next-chapter-of-the-microsoft-openai-partnership/"><strong>press release from Microsoft</strong></a><strong> announcing the restructured deal with OpenAI mentions several times in bullet points that the terms expire when OpenAI declares AGI.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>I read that and I thought that this is the most remarkable press release I&#8217;ve ever read in my entire life. No one can even define this term, and now two of the richest companies in the world are issuing press releases saying their deal will restructure itself when that happens. That&#8217;s very different from your bet on quantum. How do you read that discrepancy?</strong></p>

<p class="has-text-align-none">Of the ones you mentioned, one has a huge amount of cash flow and ability to invest.&nbsp;</p>

<p class="has-text-align-none"><strong>That&#8217;s Microsoft.</strong></p>

<p class="has-text-align-none">But it&#8217;s something that could be incredibly profitable. The other one is a classic Silicon Valley startup. Some will succeed, some will not. I&#8217;ll offer you an opinion. First, I don&#8217;t think deeply about the consumer side and how much money they&#8217;ll spend. It&#8217;s interesting to observe, but I&#8217;m not going to pretend that I deeply–</p>

<p class="has-text-align-none"><strong>Well, let me ask you this question. Do you think there&#8217;s an enterprise ROI that would justify the spend we have today? Because I look at it and I say, &#8220;Absent AGI, this spend might not be worth it.&#8221;</strong></p>

<p class="has-text-align-none">I&#8217;ll actually put it this way. You said I&#8217;m a little numerical, I&#8217;m a little geeky.&nbsp;</p>

<p class="has-text-align-none"><strong>I&#8217;m having the time of my life in this conversation, by the way. I love it.</strong></p>

<p class="has-text-align-none">So, let&#8217;s ground this in today&#8217;s costs because anything in the future is speculative. It takes about $80 billion to fill up a one-gigawatt data center. That&#8217;s today&#8217;s number. If one company is going to commit 20-30 gigawatts, that&#8217;s $1.5 trillion of CapEx. To the point we just made, you&#8217;ve got to use it all in five years because at that point, you&#8217;ve got to throw it away and refill it. Then, if I look at the total commits in the world in this space, in chasing AGI, it seems to be like 100 gigawatts with these announcements. That&#8217;s $8 trillion of CapEx. It&#8217;s my view that there&#8217;s no way you&#8217;re going to get a return on that because $8 trillion of CapEx means you need roughly $800 billion of profit just to pay for the interest.</p>

<p class="has-text-align-none"><strong>Have you told Sam [Altman]? Because he seems to think he can get both the CapEx and the return.</strong></p>

<p class="has-text-align-none">But that&#8217;s a belief. It&#8217;s a belief that one company is going to be the only company that gets the entire market. I got it, that&#8217;s a belief. That&#8217;s what some people like to chase. I understand that from their perspective, but that&#8217;s different from agreeing with them. &#8220;Understand&#8221; is different from &#8220;agree.&#8221; I think it&#8217;s fine. I mean, they&#8217;re chasing it. Some people will make money, some people will lose money. All the [infrastructure] being built will be useful if it goes away, but if they make it, then they are the sole surviving company.</p>

<p class="has-text-align-none">Nilay, I will be clear. I am not convinced, or rather I give it really low odds — we&#8217;re talking like 0 to 1 percent — that the current set of known technologies gets us to AGI. That&#8217;s my bigger gap. I think that this current set is great. I think it&#8217;s incredibly useful for enterprise. I think it&#8217;s going to unlock trillions of dollars of productivity in the enterprise, just to be absolutely clear.&nbsp;</p>

<p class="has-text-align-none">That said, I think AGI will require more technologies than the current LLM path. I think it&#8217;ll require fusing knowledge with LLMs. We have words, and I&#8217;m not sure that&#8217;s the only way to create knowledge.&nbsp;</p>

<p class="has-text-align-none">People talk about neuro-symbolic AI, but if I just said &#8220;knowledge&#8221; in a broader sense, I mean hard knowledge that people have spent thousands of years discovering. If we can figure out a way to fuse knowledge with LLMs, maybe. Even then I&#8217;m a maybe, I&#8217;m not like 100 percent, but that&#8217;s from a geeky technical view.</p>

<p class="has-text-align-none"><strong>Actually, that was my question, and you started answering before I asked it.</strong></p>

<p class="has-text-align-none"><strong>I&#8217;m on the same path as you. I look at what LLMs can do today. I look at how people talk about the scaling laws they might hit, the need for more data that doesn&#8217;t necessarily exist at the scale it might be needed, and I say, &#8220;I don&#8217;t think LLMs can do it.&#8221; I don&#8217;t see a here-to-there path for this technology to get to what everybody says it can do.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>It sounds like you don&#8217;t think that&#8217;s true either. I would just connect that to what we started with. IBM developed Watson, and it was very good at its tasks, but it wasn&#8217;t the right set of bets at that moment and you had to pivot. Do you see the next technology that LLMs or the AI industry would have to pivot to?</strong></p>

<p class="has-text-align-none">Let&#8217;s look at three examples. Machine learning was not actually replaced. Machine learning is incredibly useful for lots of simple tasks. Your little thermostat in your house uses machine learning, not LLMs.</p>

<p class="has-text-align-none"><a href="https://www.theverge.com/2011/11/14/2559567/tony-fadell-nest-learning-thermostat"><strong>We did the first profile of the Nest</strong></a><strong>, and I remember meeting their machine learning person to talk about the Nest thermostat in 2011.</strong></p>

<p class="has-text-align-none">That&#8217;s incredibly useful. People look at it like golf ball, baseball, tennis ball trajectories. That&#8217;s all machine learning, it&#8217;s not being replaced. It&#8217;s really useful, but it&#8217;s not going to answer questions.&nbsp;</p>

<p class="has-text-align-none">Deep learning will be replaced with LLMs. I actually think LLMs are here to stay, I don&#8217;t think they&#8217;ll go away. But that&#8217;s not the end technology for AI. There is a next one and the next one will be an add, too. There&#8217;s machine learning, which is robust. There are LLMs, which I think are robust but are statistical in nature. So, where&#8217;s the deterministic piece? Where&#8217;s the knowledge piece? Is there something beyond LLMs?&nbsp;</p>

<p class="has-text-align-none">Look, this stuff is eight years old at this point. The first paper, I think, was in 2017, when attention and this approach with transformers came together. Is there another one? I don&#8217;t know. I suspect there is, but we don&#8217;t know. It&#8217;s the same as in 2016, when you couldn&#8217;t predict the current LLM technology.</p>

<p class="has-text-align-none"><strong>A comparison I would make is there&#8217;s now a core technology that everyone feels very invested in. I live in New York, and when I go to San Francisco, I joke that it&#8217;s just a different planet. Everyone is maybe much happier and more optimistic about AI than I am. I look at the companies springing up with the people who have left OpenAI to start super intelligence companies or AGI labs. They are all still betting on LLMs. The core of their work is still LLMs.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>The idea that you can feel the AGI is from a lot of people using Claude to write code and saying they can feel the AGI. Are you worried that there&#8217;s not enough investment in the stuff around the edges that might supplant or augment LLMs?</strong></p>

<p class="has-text-align-none">No, because I think when it is so unknown, it should not be companies that change it. I think that academia should change it. I do think there are enough AI researchers in academia who are going to be working around these topics, but when you don&#8217;t make enough progress, there isn&#8217;t going to be any media coverage or any other coverage. But from me talking to my friends — whether at MIT, at Illinois, or Chicago — there is work going on. It&#8217;s just not occupying attention because the airwaves are completely LLMs only.</p>

<p class="has-text-align-none"><strong>That&#8217;s why I&#8217;m asking. Do you think that there&#8217;s enough work happening? It sounds like you do, even in America in 2025 where there is pressure on universities to not bring in foreign graduates or have other kinds of academics going on. It seems tenuous at best.</strong></p>

<p class="has-text-align-none"><strong>Do you think that investment is still happening there?</strong></p>

<p class="has-text-align-none">I&#8217;m more optimistic than pessimistic. Is there some of what you described happening? Absolutely. But when I look at the number of top faculty in&nbsp; the top 20 engineering schools, it&#8217;s not really decreasing. Are there some funding cuts? We&#8217;re talking like under 10 percent. It&#8217;s not like it&#8217;s massive. Yes, there are much larger numbers in some areas than in others that are not hard sciences — by hard sciences I mean physics, chemistry, math, and engineering — but that&#8217;s not where I spend my energy. If I think about physics and hard engineering, I&#8217;d say there are some cuts, but it&#8217;s not that extreme.</p>

<p class="has-text-align-none">I also look at the national labs. No cuts. So it looks pretty good.</p>

<p class="has-text-align-none"><strong>I&#8217;m happy the frontier is good.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>Let me end by talking about the near term. We spent a lot of time talking about how things might go, how the core technology bets you&#8217;re making might play out over time, whether or not GPUs are dark fiber, which is one of my favorite arguments to have, I don&#8217;t know if you could tell.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>In the short to medium term, what we are seeing is a bunch of companies saying, &#8220;Okay, we have AI. We can just do it. We&#8217;re going to make the job cuts.&#8221; </strong><a href="https://www.ft.com/content/a74f8564-ed5a-42e9-8fb3-d2bddb2b8675"><strong>Accenture had a bunch of job cuts</strong></a><strong>. Amazon had a bunch of job cuts. </strong><a href="https://apnews.com/article/ups-amazon-layoffs-turnaround-85afc1c459883f41a2283c8394ce1eaf"><strong>UPS had a bunch of job cuts</strong></a><strong> just in the week that we&#8217;re talking.</strong></p>

<p class="has-text-align-none"><strong>If I was to be as harsh as possible about the work of your average big consulting firm, I would look at it and say, &#8220;Boy, a lot of that can go.&#8221; You can just let the AI make those decks all day long because the point of this contract is to let the CEO restructure the company. We just need the gloss of external validation to make the changes and the layoffs that we&#8217;re already going to make.&nbsp;</strong></p>

<p class="has-text-align-none"><strong>That is McKinsey&#8217;s function in the world: &#8220;Boy, it&#8217;s a lot cheaper and faster to just let the AI make the deck that no one ever reads in the end.&#8221; I feel like I see that playing out. How do you think people should react to that inside of the timeframes you were talking about where the real change comes?</strong></p>

<p class="has-text-align-none">Could there be up to 10 percent job displacement? I believe that&#8217;ll be likely over the next couple of years. It&#8217;s not 30 or 40 percent, but it is up to 10 percent of the total US employment pool. It is very concentrated in certain areas.&nbsp;</p>

<p class="has-text-align-none">Now, as you get more productive, companies are going to then hire more people but in different places. That was the point I was making. We are hiring more because people say, &#8220;I don&#8217;t need to do the entry-level task because an AI agent can do it.&#8221; I&#8217;m looking at them like, &#8220;Really?&#8221; Think strategically for a moment. Wouldn&#8217;t you rather have an entry-level person and AI makes them more like a 10-year expert? Isn&#8217;t that more useful to me than the other way around? Otherwise, where is the talent who&#8217;s going to come up with the next great product? Where is the person who&#8217;s going to be able to convince a client to deploy technology the way it should be deployed? That&#8217;s why I think some are being shortsighted.</p>

<p class="has-text-align-none">But I also think that some of this is happening right now because if you look at the total employment numbers, I think people gorged on employment. I used that phrase during the pandemic and the year after. Some of the displacement is just people saying, &#8220;I don&#8217;t need so many people because I went up 30, 40, 50, 100 percent from 2020 to 2023.&#8221; There is going to be some natural correction. Business is never completely optimized. I think in engineering terms, it&#8217;s an underdamped system. When there&#8217;s a need, it goes above. Now, it has to correct. It&#8217;s probably going to go below what&#8217;s needed, and then it&#8217;ll hit the correct equilibrium, depending on market demand and growth.</p>

<p class="has-text-align-none"><strong>Do you feel like the broader market is stable or predictable enough at this moment for that natural business correction cycle to play out in a healthy way?</strong></p>

<p class="has-text-align-none">People talk about, &#8220;With all the wars, with all of the cyber attacks, with interest rates, doom is coming. GDP is going to fall.&#8221; I kind of hold the view that if I look at the demand, global GDP growth near 3 percent looks likely. But that ignores inflation, so in nominal terms, we are at like 5 percent. I think that those two together are probably going to hold for quite some time.</p>

<p class="has-text-align-none"><strong>I&#8217;m curious because I hear from our readers who are consumers and some who work at tech companies and build the products. The split between how they feel about AI and what AI is doing to the economy and what people are claiming AI will do to the economy is as vast as any split I&#8217;ve ever experienced in my time covering technology.</strong></p>

<p class="has-text-align-none">I think people who got trained on a certain set of technologies and became experts don&#8217;t acknowledge it, but their expertise is deeply tied to their identity. Now suddenly, the person who&#8217;s been coding the product for 10 years finds that a kid coming in from college using generative AI tools is three times faster than them. They didn&#8217;t know the code, but the AI knows the code, and they don&#8217;t know how to use the AI.</p>

<p class="has-text-align-none"><strong>You&#8217;re the CEO of IBM. Is that your experience at IBM? Because what I hear from our readers is that it would be great, but it&#8217;s not true. It&#8217;s not happening.</strong></p>

<p class="has-text-align-none">We took a tool we built ourselves and that wasn&#8217;t one of the industry tools on code to help our people do software development. Within four months, the 6,000-person team that embraced it — so not a tiny number — was 45 percent more productive. Just to compare, we have 30,000 others who don&#8217;t yet use that tool.&nbsp;</p>

<p class="has-text-align-none">So, those are real numbers. We are going to grow those teams. We&#8217;re not trying to cut any of them because if we can be that much more productive at software development, that means we can build a lot more products, which means we can go get more market share. It doesn&#8217;t mean that it&#8217;s a fixed amount of work. I think the amount of work is infinite. So, we can be more productive.&nbsp;</p>

<p class="has-text-align-none">The calculus always is if it&#8217;s that expensive to build, is there enough margin so that it&#8217;s a viable business? If the answer is it&#8217;s cheaper to build it, I can sell it cheaper and still have a great margin. Does that make sense?</p>

<p class="has-text-align-none"><strong>Oh, it does. Yeah.</strong></p>

<p class="has-text-align-none">That is our lived experience, which is why I&#8217;m leaning into hiring more programmers and more tech people.</p>

<p class="has-text-align-none"><strong>Arvind, this is great. Tell people what&#8217;s next for IBM. What should they be looking for?</strong></p>

<p class="has-text-align-none">Watch what we are going to do on quantum. I think that in about two or three years, you&#8217;ll see some surprising results.</p>

<p class="has-text-align-none"><strong>Well, we&#8217;re going to have to have you back on </strong><strong><em>Decoder</em></strong><strong> very soon as this market shakes out, and then when the quantum bet pays off. That&#8217;s an exciting 24 months that I want to make sure you&#8217;re back for. Thank you so much for being on </strong><strong><em>Decoder</em></strong><strong>.</strong></p>

<p class="has-text-align-none">Pleasure being with you.</p>

<p class="has-text-align-none"><em><sub>Questions or comments about this episode? Hit us up at decoder@theverge.com. We really do read every email!</sub></em></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Verge Staff</name>
			</author>
			
			<title type="html"><![CDATA[MOVEit cyberattacks: keeping tabs on the biggest data theft of 2023]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/23892245/moveit-cyberattacks-clop-ransomware-government-business" />
			<id>https://www.theverge.com/8159/moveit-cyberattacks-clop-ransomware-government-business</id>
			<updated>2025-01-22T08:01:47-05:00</updated>
			<published>2024-11-11T15:22:05-05:00</published>
			<category scheme="https://www.theverge.com" term="Business" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Policy" /><category scheme="https://www.theverge.com" term="Privacy" /><category scheme="https://www.theverge.com" term="Security" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[In May 2023, a ransomware gang called Clop began abusing a zero-day exploit of Progress Software&#8217;s MOVEit Transfer enterprise file transfer tool. Progress quickly issued a patch, but the damage was already extensive. Clop&#8217;s widespread attack saw it steal data from government, public, and business organizations worldwide, including New York City&#8217;s public school system, a [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration: Beatrice Sala" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/23262657/VRG_Illo_STK001_B_Sala_Hacker.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>In May 2023, a ransomware gang called Clop began abusing a <a href="https://www.cisa.gov/news-events/cybersecurity-advisories/aa23-158a">zero-day exploit of Progress Software&rsquo;s MOVEit Transfer</a> enterprise file transfer tool. Progress quickly issued a patch, but the damage was already extensive. Clop&rsquo;s <a href="https://www.theverge.com/2023/6/15/23762583/several-federal-government-agencies-have-been-breached-via-the-moveit-vulnerabilities">widespread attack</a> saw it steal data from <a href="https://www.theverge.com/2023/6/5/23749650/a-security-exploit-for-a-file-transfer-tool-is-behind-data-breaches-at-the-bbc-british-airways-and-m">government, public, and business organizations worldwide</a>, including <a href="https://www.schools.nyc.gov/alerts/alert-regarding-data-incident">New York City&rsquo;s public school system</a>, a UK-based HR solutions and payroll company with clients like <a href="https://techcrunch.com/2023/06/05/microsoft-clop-moveit-hacks-victims/">British Airways and BBC</a>, and others.</p>

<p>How many others? According to a <a href="https://www.emsisoft.com/en/blog/44123/unpacking-the-moveit-breach-statistics-and-analysis/">running tally from Emsisoft</a>, over 2,000 organizations have reported being attacked, with data thefts affecting more than 62 million people. The vast majority of attacks were on US-based entities. Most recently, BORN Ontario, which first <a href="https://www.bornontario.ca/en/news/cybersecurity-incident-moveit.aspx">reported being attacked in June</a>, revealed that data from newborns and pregnant patients in Ontario, <a href="https://www.bornincident.ca/">spanning from January 2010 to May 2023</a>, was stolen, affecting about 3.4 million people.</p>

<p>Progress issued two more patches on June 9th and June 15th, both of which addressed further vulnerabilities that were &ldquo;distinct&rdquo; from the original exploit. In both cases, the company&rsquo;s <a href="https://www.progress.com/security/moveit-transfer-and-moveit-cloud-vulnerability">page announcing those patches</a> says that, while its investigations are ongoing, it doesn&rsquo;t see any evidence they were used for further attacks.</p>

<p>There has been&#8230; so very much legal action after the attacks. Class action lawsuits have been <a href="https://thecyberexpress.com/ibm-class-action-lawsuit-ibm-data-breach/">filed against IBM</a>, which ran servers that were <a href="https://techcrunch.com/2023/08/14/millions-americans-health-data-moveit-hackers-clop-ibm/">breached for multiple organizations</a>, <a href="https://www.thinkadvisor.com/2023/08/24/prudential-sued-over-moveit-hack/">Prudential Financial</a>, <a href="https://www.hbsslaw.com/press/progress-software-moveit-data-breach/multiple-class-action-lawsuits-filed-after-2023-moveit-data-breach-affecting-more-than-40-million-people#:~:text=BOSTON%20%E2%80%93%20Following%20the%202023%20MOVEit,an%20estimated%2040%20million%20people.">Progress Software</a> itself, and others. The MOVEit breach and other high-profile hacks have led to the SEC requiring public companies to <a href="https://www.theverge.com/2023/7/26/23808943/sec-cybersecurity-public-companies-hacks-data-breaches">issue disclosures within four days</a> of discovering a cybersecurity incident, except when the disclosure could be a national security or public safety risk.</p>
<ul>
					<li>
				<a href="https://www.theverge.com/2024/11/11/24293817/amazon-employee-emails-phone-numbers-moveit-data-breach">Amazon confirms employee data breach, but says it’s limited to contact info</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/11/10/23955767/maine-says-moveit-hackers-accessed-the-information-of-1-3-million-people">Maine says MOVEit hackers accessed the information of 1.3 million people.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/10/30/23939291/moveit-hackers-accessed-around-632000-government-emails-last-year">MOVEit hackers accessed around 632,000 government emails last year.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/10/5/23905370/sony-interactive-entertainment-security-breach-confirmation">Sony confirms server security breaches that exposed employee data</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/10/2/23899805/progress-software-fixes-more-bugs-attackers-are-exploiting-in-its-secure-file-transfer-software">Progress Software fixes more bugs attackers are exploiting in its “secure” file-transfer software.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/9/27/23892665/the-biggest-known-moveit-hack-leaked-the-personal-information-of-up-to-11-million-people">The biggest known MOVEit hack leaked the personal information of up to 11 million people.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/9/27/23892458/over-50000-students-data-was-stolen-in-a-recent-moveit-breach">Over 50,000 students’ data was stolen in a recent MOVEit breach.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/7/26/23808943/sec-cybersecurity-public-companies-hacks-data-breaches">New SEC rules put a time limit on reporting hacks and data breaches</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/6/15/23762583/several-federal-government-agencies-have-been-breached-via-the-moveit-vulnerabilities">“Several” federal government agencies have been breached via the MOVEit vulnerabilities.</a>
			</li>
					<li>
				<a href="https://www.theverge.com/2023/6/5/23749650/a-security-exploit-for-a-file-transfer-tool-is-behind-data-breaches-at-the-bbc-british-airways-and-m">A security exploit for a file transfer tool is behind data breaches at the BBC, British Airways, and more.</a>
			</li>
			</ul>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Nilay Patel</name>
			</author>
			
			<title type="html"><![CDATA[IBM’s Jerry Chow on the future of quantum computing]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/23988271/ibm-quantum-heron-system-two-jerry-chow-qubits" />
			<id>https://www.theverge.com/23988271/ibm-quantum-heron-system-two-jerry-chow-qubits</id>
			<updated>2023-12-05T10:00:00-05:00</updated>
			<published>2023-12-05T10:00:00-05:00</published>
			<category scheme="https://www.theverge.com" term="Decoder" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="Podcasts" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Today, I&#8217;m talking with Jerry Chow. He&#8217;s the director of quantum systems at IBM, meaning he&#8217;s trying to build the future one qubit at a time. IBM made some announcements this week about its plans for the next 10 years of quantum computing: there are new chips, new computers, and new APIs. You&#8217;ll hear us [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration by The Verge | Photo by IBM" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/25133027/Chow_Decoder_credit_IBM.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Today, I&rsquo;m talking with Jerry Chow. He&rsquo;s the director of quantum systems at IBM, meaning he&rsquo;s trying to build the future one qubit at a time.</p>

<p>IBM <a href="https://research.ibm.com/blog/quantum-roadmap-2033">made some announcements this week</a> about its <a href="https://www.theverge.com/2023/12/4/23987596/ibm-lays-out-a-ten-year-plan-for-useful-quantum-supercomputing">plans for the next 10 years of quantum computing</a>: there are new chips, new computers, and new APIs. You&rsquo;ll hear us get into more of the details as we go, but the important thing to know upfront is that quantum computers could have theoretically incredible amounts of processing power and could entirely revolutionize the way we think of computers&hellip; if, that is, someone can build one that&rsquo;s actually useful.&nbsp;</p>

<p>Here&rsquo;s Jerry, explaining the basics of what a quantum computer is:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>A quantum computer is basically a fundamentally different way of computing. It relies on the laws of quantum mechanics, but it just changes how information is handled. So instead of using bits, we have quantum bits or qubits.</p>
</blockquote>
<p>A regular computer &mdash; the quantum folks call them &ldquo;classical computers&rdquo; &mdash; like an iPhone or a laptop or even a fancy Nvidia GPU works by encoding data in bits. Bits basically have two states, which we call zero and one. They&rsquo;re on or they&rsquo;re off.&nbsp;</p>

<p>But the laws of quantum mechanics that Jerry just mentioned mean that qubits behave very, very differently. They <em>can</em> be zero or one, but they might also be a whole lot of things in between.</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>You still have two states: a zero and a one. But they can also be in superpositions of zero and one, which means that there&rsquo;s a probability that when you measure it, it will be zero or one with particular probability. In terms of how we physically build these, they&rsquo;re not switches anymore, they&rsquo;re not transistors, but they&rsquo;re actually elements that have quantum mechanical behavior.</p>
</blockquote>
<p>One of my favorite things about all this is that in order to make these new quantum computers work, you have to cool them to within fractions of a degree of absolute zero, which means a lot of companies have had to work very hard on cryogenic cooling systems just so other people could work on quantum chips. Jerry calls early quantum computers &ldquo;science projects,&rdquo; but his goal is to engineer actual products people can use.</p>

<p>You&rsquo;ll hear Jerry talk about making a useful quantum computer in terms of &ldquo;utility,&rdquo; which is when quantum computers start to push against the limits of what regular computers can simulate. IBM has been chasing after utility for a while now. It first made quantum computers available on the cloud in 2016, it&rsquo;s shipped System One quantum computers to partners around the world, and now, this week, it&rsquo;s announcing System Two along with a roadmap for the future. It&rsquo;s <em>Decoder</em>, so I asked Jerry exactly how he and his team sit down and build a roadmap for the next 10 years of applied research in a field that requires major breakthroughs at every level of the product. Oh, and we talked about <em>Ant-Man</em>.&nbsp;</p>

<p>It&rsquo;s a fun one &mdash; very few people sit at the bleeding edge all day like Jerry.</p>

<p>Okay. Jerry Chow, director of quantum systems at IBM. Here we go.</p>
<iframe frameborder="0" height="200" src="https://playlist.megaphone.fm/?e=VMP6513371576" width="100%"></iframe>
<p><em>This transcript has been lightly edited for length and clarity.</em></p>

<p><strong>Jerry Chow, you are an IBM fellow and director of quantum systems. Welcome to <em>Decoder</em>.</strong></p>

<p>Glad to be here.</p>

<p><strong>I&rsquo;m really excited to talk to you. There&rsquo;s quite a lot to talk about &mdash; quantum computing in general, where it is. But you&rsquo;ve got some news to announce today, so I want to make sure we talk about the news right off the bat. What is going on in IBM Quantum?</strong></p>
<div class="wp-block-vox-media-highlight vox-media-highlight alignnone">

<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/24792604/The_Verge_Decoder_Tileart.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="" title="" data-has-syndication-rights="1" data-caption="" data-portal-copyright="" />


<p>Listen to <em>Decoder</em>, a show hosted by <em>The Verge</em>&rsquo;s Nilay Patel about big ideas &mdash; and other problems.&nbsp;Subscribe&nbsp;<a href="https://podcasts.apple.com/us/podcast/welcome-to-decoder/id1011668648?i=1000496212371&amp;itsct=podcast_box&amp;itscg=30200&amp;ls=1&amp;at=1001l7uV&amp;ct=verge091322">here</a>!</p>
</div>
<p>Yeah, so we have our annual <a href="https://www.ibm.com/quantum/summit-2023">Quantum Summit</a> coming up, where we basically invite our network of members and users to come, and we talk about some of the really exciting news. What we&rsquo;re announcing this year is actually a really exciting upgraded quantum processor. It&rsquo;s called the IBM Quantum Heron. It has 133 qubits. It&rsquo;s the highest performance processor that we&rsquo;ve ever built, and it&rsquo;s going to be available for users to access via our cloud services.</p>

<p>We&rsquo;re also launching IBM Quantum System Two and introducing this as a new architecture for scaling our quantum computers into the future. We&rsquo;re also talking about a 10-year roadmap looking ahead. We, at IBM Quantum, like to sort of call our shots, tell everyone what we&rsquo;re doing because that keeps us honest, keeps everyone in the industry on the same benchmark of seeing what progress is. And we&rsquo;re expanding that roadmap, which we actually first introduced a couple of years ago and have hit all our milestones thus far. But we are extending it out to 2033, pushing forward into this next realm where we really want to drive toward pushing quantum computing at scale.</p>

<p><strong>So you&rsquo;ve got a new processor, you&rsquo;ve got a new computing architecture in System Two, you&rsquo;ve got a longer roadmap. Put that in context for me: we&rsquo;ve been hearing about quantum computing for quite a long time. I have stared at a number of quantum computers and been told, &ldquo;This is the coldest piece of the universe that has ever existed.&rdquo; It&rsquo;s been very entertaining, at the very least. We&rsquo;re only now at the point where we&rsquo;re actually solving real problems with quantum computers.</strong></p>

<p>We&rsquo;re not even at the point of solving real problems.</p>

<p><strong>Not even yet?</strong></p>

<p>Not yet. But we are, really excitingly, just this past year, at the point where we&rsquo;re calling this utility-scale quantum computing. We&rsquo;re using 100-plus qubits. We used a processor earlier in the year called Eagle, where we were able to look at a particular problem that you couldn&rsquo;t really solve with brute-force methods using a classical computer, but also it challenged the best classical approximation methods that are used on high-performance computing. So what&rsquo;s interesting there is that now the quantum computer becomes like the benchmark. You almost need it to verify whether your approximate classical methods are working properly. And that just happens when you go over 100 qubits.</p>

<p>At 100 qubits, things all change so that you just can&rsquo;t use, say, GPUs or any kind of classical computers to simulate what&rsquo;s going on accurately. This is why we&rsquo;re in this phase we call utility scale: there&rsquo;s going to be this back and forth between using a quantum computer as a tool compared with what you can still potentially do in classical. But then there&rsquo;s a long road there where we&rsquo;re going to try to drive value using the quantum computer to get toward quantum advantage.</p>

<p><strong>I think the word utility there threw me off. This is the branch point where the problems you solve with a quantum computer start to become meaningfully different than the problems you could solve with a regular computer.</strong></p>

<p>That&rsquo;s right. We see this really as an inflection point. There are a lot of industries that use high-performance computation already, and they are looking at very, very hard problems that use the Oak Ridge supercomputers and whatnot. And now quantum becomes an additional tool that opens up a new lens for them to look at a different area of compute space that they weren&rsquo;t able to look at before.</p>

<p><strong>So IBM has a huge program in quantum. The other big companies do, too &mdash; Microsoft, Google, what have you, they&rsquo;re all investing in this space. Does this feel like a classical capitalist competition, &ldquo;We&rsquo;re all racing forward to get the first product to market&rdquo;? Is it a bunch of researchers who know that there&rsquo;s probably a pot of gold at the end of this rainbow, but we&rsquo;re nowhere close to it yet, so we&rsquo;re all kind of friendly? What&rsquo;s the vibe?</strong></p>

<p>I&rsquo;d say that it&rsquo;s a very exciting time to be in this field. How often do you get to say you&rsquo;re building from the ground floor of a completely new computational architecture? Something that is just fundamentally different from traditional classical computing. And so yeah, I&rsquo;d say that there&rsquo;s certainly a lot of groundswell, there&rsquo;s a lot of buzz. Sometimes a little too much buzz, maybe. But also I think from the perspective of competition, it helps drive the industry forward.&nbsp;</p>

<p>We, at IBM, have been at the forefront of computation for decades. And so it&rsquo;s in our blood. The ideas of roadmaps and pushing the next big development, the next big innovations in computation, have always been native to IBM, and quantum is no different. We&rsquo;ve been in the game with quantum since the early <a href="https://www.wired.com/story/wired-guide-to-quantum-computing/">theoretical foundations</a>, for probably 30-plus years. But now we&rsquo;re really starting to bear a lot of that fruit in terms of building the architectures, building the systems, putting out the hardware, developing the framework for how to make it usable and accessible.</p>

<p><strong>Let me give you just a much dumber comparison. We </strong><a href="https://www.theverge.com/23824200/ai-cloud-amazon-aws-adam-selipsky"><strong>had the CEO of AWS on the show</strong></a><strong>, Adam Selipsky. AWS is furiously competitive with Microsoft Azure and Google Cloud. They are trying to take market share from each other, and they do a lot of innovative things to make better products, but the end goal of that is taking one customer away from Google. You&rsquo;re not there yet, right? There&rsquo;s not market share to be moved around yet?</strong></p>

<p>Certainly not at that scale.</p>

<p><strong>But are there quantum customers that you compete for?</strong></p>

<p>There&rsquo;s certainly a growing quantum community.&nbsp;</p>

<p><strong>[<em>Laughs</em>] It&rsquo;s not a customer; there are people who are interested.</strong></p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“At 100 qubits, things all change”</p></blockquote></figure>
<p>There are people who are interested across the board, from developers, to students, to Fortune 500 companies. We have a lot of interest. So just as an example, we <a href="https://uk.newsroom.ibm.com/2016-May-04-IBM-Makes-Quantum-Computing-Available-on-IBM-Cloud-to-Accelerate-Innovation">first put</a> systems on the cloud in 2016. We put a very simple five-qubit quantum computer on the cloud. But it reflected a real fundamental shift in how quantum could be approached. Before, you had to be sort of a physicist. You had to be in a laboratory turning knobs. You&rsquo;re taking data, you&rsquo;re running physicist code; you&rsquo;re not programming a computer.</p>

<p><strong>Wow. [<em>Laughs</em>] Shout out to physicists.</strong></p>

<p>Well, I&rsquo;m a physicist, and you don&rsquo;t want to see my code. [<em>Laughs</em>] But the whole point is that we developed a whole framework around it to actually deploy it and to make it programmable. And think about the early days of computers and all the infrastructure you needed to build in terms of the right assembly language and compilers and the application layers all above that. We&rsquo;ve been building that for the last seven years since that first launched. And in that time, we&rsquo;ve had over 500,000 users of our platform and of our services.</p>
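To make &ldquo;programmable&rdquo; concrete: below is a toy illustration &mdash; not IBM&rsquo;s actual stack or its Qiskit framework, just plain matrix arithmetic &mdash; of the kind of two-qubit circuit (a Hadamard followed by a CNOT) that those software layers ultimately compile down to:

```python
import numpy as np

# Single-qubit Hadamard and the 4x4 CNOT (control = qubit 0), as matrices.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Start in |00>, apply H to qubit 0, then CNOT: the classic Bell-state circuit.
state = np.zeros(4)
state[0] = 1.0
state = np.kron(H, I) @ state   # superposition on qubit 0
state = CNOT @ state            # entangle the two qubits

# Measurement probabilities: half |00>, half |11>, never |01> or |10>.
probs = np.abs(state) ** 2
print(np.round(probs, 3))  # [0.5 0.  0.  0.5]
```

On real hardware, of course, you can&rsquo;t multiply matrices directly; the compilers and control electronics the interview describes translate a circuit like this into microwave pulses sent to physical qubits.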

<p><strong>I&rsquo;m always curious how things are structured and how decisions are made. That&rsquo;s really what we talk about on the show. And there&rsquo;s a forcing function that comes when it&rsquo;s a business, and there&rsquo;s a growth path. Quantum seems very much like one day it will be a huge business because it will solve problems that regular computers can&rsquo;t. But right now, it&rsquo;s on the very early part of the curve where you&rsquo;re investing a lot into R&amp;D, on an aggressive roadmap, but you&rsquo;re nowhere close to the business yet.</strong></p>

<p>I would say that we&rsquo;re knocking on the door of business value and looking for that business value, because especially when we&rsquo;re in this realm where we know it can be used as a tool pitted against the best classical computers, there&rsquo;s something there to be explored. A lot of times, even with traditional computers, there are very few proven algorithms where we drive all the value. A lot of the value gets driven through heuristics, through trial and error, through having the tool and using it on a particular problem. That&rsquo;s why we see this inflection point toward utility-scale systems of over 100 qubits as a fundamental game-changer: this is now the tool we want users to take and actually go find business advantage with, find the problems that map appropriately onto these systems for exploration.</p>

<p><strong>So put that in the context of IBM. IBM&rsquo;s a huge company, it&rsquo;s over 100 years old, it does a lot of things. This is probably the most cutting-edge thing IBM is doing, I imagine. I&rsquo;m guessing you&rsquo;re not going to disagree with me. But it feels like the most cutting-edge thing that most of the Big Tech companies are doing.</strong></p>

<p>Yes, absolutely.</p>

<p><strong>How is that structured inside of IBM? How does that work?</strong></p>

<p>So we&rsquo;re IBM Quantum within IBM Research. IBM Research has always been the organic growth engine for all of IBM. It&rsquo;s where a lot of the innovative ideas come in, but a particular strategy within IBM and IBM Research is that we&rsquo;re not just doing research, then development, and then sending it on a very linear product journey. It&rsquo;s all integrated together as we move forward. And so within IBM Quantum, we&rsquo;re developing products, we&rsquo;re putting them on the cloud, we&rsquo;re integrating with IBM Cloud. We&rsquo;re actually pushing these things forward to build that user base, build that groundswell, before all the various technology elements are finished. That&rsquo;s sort of this agile methodology of building from the ground up, but also getting it out early and often to drive excitement and to really build up the other parts of the ecosystem.</p>

<p><strong>So how is IBM Quantum structured? How many people is it? How is it organized?</strong></p>

<p>So we don&rsquo;t share explicit numbers, but we have several hundred people. We have parts of the team focused on the actual hardware elements, all the way down to the quantum processor itself and the system around it &mdash; making those processors function by cooling them down in the cryogenic system, talking to them with control electronics, talking to them with classical computing. It all needs to tie together.</p>

<p>Then you have software development teams. We also have a cloud and services team that helps to deliver our offerings as a service. And then we have applications teams looking at the next algorithms, the next novel ways of making use of our quantum services. We also have teams that are more outward-looking for business development &mdash; trying to drive adoption, working with various clients to engage in the problems of their interests. We also have a part of our team which runs an offering called the Quantum Accelerator. It&rsquo;s like a consulting arm, working with the clients to get quantum-ready, start understanding how their problems can be impacted by quantum computing and start using our systems.</p>

<p><strong>Is that all flat? Every one of those teams reports to you, or is there structure in between?</strong></p>

<p>No, so all those different ones report to our vice president of quantum computing, who is Jay Gambetta. I take care of the systems part. Basically, the wrapping of the processor and how it runs, executing problems for the users &mdash; that&rsquo;s the piece that I own.</p>

<p><strong>There&rsquo;s a tension there. It sounds like IBM is designed to attack this tension head-on, which is: &ldquo;We&rsquo;re doing a bunch of pure research in cryogenics to make sure that quantum computing can run because it has to be really cold to run.&rdquo; Then there&rsquo;s a business development team that&rsquo;s just off and running, doing sales stuff, and at some point they&rsquo;re going to come back and say, &ldquo;We sold this thing.&rdquo; And the cryogenics team is going to say, &ldquo;Not yet.&rdquo; Every business has a problem like that. When you&rsquo;re in pure research mode, the &ldquo;not yet&rdquo; is a real problem.</strong></p>

<p>Oh, yeah.</p>

<p><strong>How often do you run into that?</strong></p>

<p>We have a very good strategy across the team. We know our core services and what our core product is. And we also have a roadmap. The concept of the roadmap is great both for the R&amp;D side and for the client-perspective, business-development angle of seeing what&rsquo;s coming next. From the internal side, we know we&rsquo;ve got to continue to drive toward this, and these are our deliverables and these are the new innovations that we need to do. In fact, in the new roadmap that we&rsquo;re releasing, we have that separated: a development roadmap, which is more product focused and more like what the end user and the client are going to get, and an innovation roadmap to show those things where we still need to turn the crank and figure out what feeds in.</p>

<p>I often say the roadmap is our mantra, and it really is our calling card both internally and externally. Not many people really show a lot of detail in their roadmap, but it serves as a guiding tool for us all.</p>

<p><strong>I was looking at </strong><a href="https://www.flickr.com/photos/ibm_research_zurich/53347055153/"><strong>that roadmap</strong></a><strong>, and it is very aggressive. We&rsquo;re at Heron, there are many birds to come from what I understand. And the goal is that a truly functional quantum computer needs thousands or millions of qubits, right?</strong></p>

<p>We have a transition toward what we are calling quantum at scale, which I think what you&rsquo;re referring to is when you will get to the point where you can run quantum error correction, correct for all the errors that are underlying within these qubits, which are noisy. People throw around that number &mdash; millions of qubits &mdash; in a way that almost drives fear into the hearts of people. One actually really exciting thing that we&rsquo;ve done this past year is we&rsquo;ve developed a set of novel error correction codes that brings down that resource count a lot.</p>

<p>So actually, you&rsquo;ll need potentially hundreds of thousands of qubits, 100,000 qubits or so, to build a fault-tolerant, quantum-error-correction-based quantum computer of a particular size to do some of those problems that we&rsquo;re talking about at scale. And that&rsquo;s part of the roadmap, too. That&rsquo;s what we&rsquo;re looking at further out, toward the Blue Jay system in 2033. So there are certainly a number of birds to get there, but we have concrete ideas for the technological hurdles we need to overcome to get there.</p>

<p><strong>That&rsquo;s the goal. You&rsquo;re going to get to some massively larger scale than you are today. Orders of magnitude. Today the chip has 133 qubits, you need to get to thousands. Some people, terrifyingly, are saying millions.</strong></p>

<p><strong>Part of your strategy is linking the chips together into these more modular systems and then putting control circuitry around them. I&rsquo;m a person who came up in what you might call the classical computing environment, that&rsquo;s very familiar. That&rsquo;s a very familiar strategy; we&rsquo;re just going to do more cores. That&rsquo;s what that looks like to me. Lots of companies have run up against a lot of problems here. In that part of the world, there&rsquo;s just Moore&rsquo;s law, and we sit around talking about it all day long. And Nvidia and maybe TSMC have gotten over it this time, and Intel has struggled to get the next process node and increase the transistor density. Is there an equivalent to Moore&rsquo;s law in quantum that you were thinking about?</strong></p>

<p>Our roadmap is showing that type of progression.</p>

<p><strong>I look at that roadmap, and you are definitely assuming a number of breakthroughs along the way &mdash; in a way that Intel just assumed it for years and years and they achieved it, and then kind of hit the end of the road.</strong></p>

<p>Even where we are today with Heron, and actually complementary to Heron this year, we also already built a 1,000-qubit processor, Condor. Its explicit goal was to push the limits of how many qubits could we put on a single chip, push the limits of how much architecture could we put in an entire system. How much could we actually cool down in the dilution refrigerators that we know today, the cryogenic refrigerators that we have today? Push the boundaries of everything to understand where things break. And if you look at the early part of our roadmap, the birds are there with various technological hurdles that we&rsquo;ve already overcome to get toward this thousand-qubit level. And now those next birds that you see in the rest of the innovation roadmap are different types of couplers, different types of technologies, that are those technological hurdles, like in semiconductors, that allow us to bridge the gap.</p>

<p><strong>Are they the same? Is it the same kind of, &ldquo;We need to double transistor density,&rdquo; or is it a different set of challenges?</strong></p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“I’d say, the decades of experience matter”</p></blockquote></figure>
<p>They&rsquo;re different, because with this sort of modular approach, there&rsquo;s some that are like, how many can we place into a single chip? How many can we place into a single package? How many can we package together within the system? So they all require slightly different technological innovations within the whole value chain. But we don&rsquo;t see them as not doable; we see them certainly as things that we will handle over the next few years. We&rsquo;re already starting to test linking between two packages via a cryogenic cable. This is toward our Flamingo demonstration, which we&rsquo;re planning for next year.</p>

<p><strong>Do you get to leverage any of the things that are happening on the process side with classical computers?</strong></p>

<p>Oh, yeah.</p>

<p><strong>Like </strong><a href="https://www.tsmc.com/english/dedicatedFoundry/technology/logic/l_3nm"><strong>TSMC hits three nanometers</strong></a><strong> and you get to pull that forward, or is that different?</strong></p>

<p>Not so explicitly to the newest stuff that&rsquo;s happening today in semiconductors. But IBM has been in the semiconductors game for many, many decades. And a lot of what we&rsquo;ve achieved &mdash; even achieving 100 qubits with Eagle a couple of years ago &mdash; was because we had that deep-rooted semiconductor background. So just to give you an example, at 100 qubits, the challenge is how do you actually wire to 100 qubits in a chip? The standard thing you do in semiconductors is go to more layers, but it&rsquo;s not so easy to do that in these superconducting quantum circuits because the extra layers might mess up the qubits. They might cause them to decohere.</p>

<p>But because of our know-how with packaging, we found the right materials, we found the right way of using our fabrication techniques to implement that type of multilayer wiring and still talk to these 100 qubits. We evolved that further this past year to actually get to 1,000. That type of semiconductor know-how is just ingrained, and it&rsquo;s a place where, I&rsquo;d say, the decades of experience matter.</p>

<p><strong>So you&rsquo;re going to build the next-generation quantum computing chip, Heron. It&rsquo;s got 133 qubits. How is that chip manufactured?&nbsp;</strong></p>

<p>Okay. Well, to build the next-generation quantum computing chip, we rely on advanced packaging techniques that involve multiple layers of superconducting metal to package and to wire up various superconducting qubits. With Heron, we&rsquo;re also using a novel tunable coupler architecture, which allows us to have world-record-performing two-qubit gate qualities. All this is done in a standard fabrication facility that we have at IBM; then we package up the chip and cool it down in a cryogenic environment.</p>

<p><strong>So silicon goes in one side of the building, Heron comes out the other?</strong></p>

<p>I mean, certainly more steps than that. [<em>Laughs</em>] And there&rsquo;s this know-how of how to do it properly to have high-performing qubits, which we&rsquo;ve just built up.</p>

<p><strong>Explain to me what a high-performing qubit is.</strong></p>

<p>Yeah, so the tricky thing with these qubits&hellip; There are different ways of building qubits. There are people who use ions and atoms and electrons and things like that, but ours are actually just metal on a substrate; they&rsquo;re circuits. They&rsquo;re much like the circuits that you might see when you look inside of a standard chip. The way these work is that you basically arrange them in a certain way and use the right materials, and you have a qubit that, in this case, for superconducting qubits, resonates at five gigahertz.</p>

<p>If you choose the wrong materials, the lifetimes of these qubits can be extremely short. So when we first started in the field of building superconducting qubits in 1999, superconducting qubits lasted for maybe two nanoseconds, five nanoseconds. Today, we&rsquo;ve gotten up to close to a millisecond &mdash; hundreds of microseconds to a millisecond. That&rsquo;s already orders of magnitude longer, but it took many years of development. And at a few hundred microseconds, we&rsquo;re able to do all those complex operations we&rsquo;ve been talking about to push the utility scale we discussed earlier. So that know-how to increase that lifetime comes down to engineering, comes down to understanding the core pieces that generate loss in the materials, and that&rsquo;s something we certainly have expertise in.</p>
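To put those numbers side by side &mdash; using the ballpark endpoints quoted above (roughly 5 nanoseconds in 1999, roughly a millisecond today), not precise measurements &mdash; the improvement works out to about five orders of magnitude:

```python
import math

# Ballpark superconducting-qubit lifetimes quoted in the interview.
early = 5e-9   # ~5 nanoseconds, circa 1999
today = 1e-3   # ~1 millisecond, best devices today

factor = today / early
print(f"{factor:.0e}x longer (~{math.log10(factor):.1f} orders of magnitude)")
# 2e+05x longer (~5.3 orders of magnitude)
```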

<p><strong>Tell me about the industry at large. So IBM has one approach: you said you&rsquo;re using metals on a substrate. You&rsquo;re leveraging all of the semiconductor know-how that IBM has. When you&rsquo;re out in the market and you&rsquo;re looking at all your competitors, Microsoft is doing something else, Google something else. Go through the list for me. What are the approaches, and how do you think they&rsquo;re going?</strong></p>

<p>When we think about competitors, you can think about the platform competitors of who&rsquo;s building the services, but I think what you&rsquo;re pointing to more is the hardware side.</p>

<p>When it comes down to it, there&rsquo;s a simple set of metrics for you to compare the performance of the quantum processors. It&rsquo;s scale: what number of qubits can you get to and build reliably? Quality: how long do those qubits live for you to perform operations and calculations on? And speed: how quickly can you actually run executions and problems through these quantum processors? And that speed part is something where it&rsquo;s an interplay between your quantum processor and your classical computing infrastructure because they talk to one another. You don&rsquo;t control a quantum computer without a classical computer. And so you need to be able to get your data in, data out and process it on the classical side.</p>

<p>So scale, quality, speed. Our approach with superconducting qubits, to the best of our knowledge, can hit all three of those in a very strong way. Scale: we know that we can already build up to over 1,000 qubits with the technologies that we&rsquo;ve built. Quality: Heron &mdash; which we&rsquo;re releasing &mdash; has the best gate quality, the best two-qubit gate performance, that has been shown across a large device. And speed: in terms of execution times, we&rsquo;re on the order of microseconds for some of the clock rates, whereas other approaches can be orders of magnitude slower.</p>

<p><strong>What are the other approaches in the industry that you see, and where are they beating you and where are you ahead?</strong></p>

<p>So there are trapped ions: basically they&rsquo;re using atomic ions like caesium and things that you might use for atomic clocks. They can have very good quality. In fact, there are some results that show tremendous performance across a number of those types of trapped-ion qubits in terms of their two-qubit gate qualities. But they&rsquo;re slow in terms of the clock rates of getting your operations in and out, and you sometimes have to do operations to recycle the ion. And that&rsquo;s where, I&rsquo;d say, it has a downside.</p>

<p>I&rsquo;d say, right now, superconducting qubits and trapped ions are the approaches with the most prominence at the moment that have been put out as usable services. Neutral atoms have also emerged; they&rsquo;re very similar to the trapped ions. There, they use these fun little things called <a href="https://www.photometrics.com/learn/single-molecule-microscopy/optical-trapping">optical tweezers</a> to hold atoms in little arrays. And there are some exciting results that have been coming out from various atom groups there. But again, it comes down to that speed. Anytime you have these actual atomic systems, either an ion or an atom, your clock rates end up hurting you.</p>

<p><strong>Alright, let me make a comparison to semiconductors again. So in semiconductors there was </strong><a href="https://semiengineering.com/knowledge_centers/manufacturing/patterning/multipatterning/"><strong>multiple pattern lithography</strong></a><strong> that everyone chased for a minute, and it hit an end state. And then TSMC had bet really big </strong><a href="https://www.reuters.com/technology/tsmc-says-it-will-have-advanced-asml-chipmaking-tool-2024-2022-06-16/"><strong>on EUV</strong></a><strong> and that let them push ahead. And Intel had to make a big shift over there. You&rsquo;re looking at your roadmap, you&rsquo;re doing superconductors, cryogenics, metals on substrates, and over here some guys are doing optical tweezers on atoms. Is there a thought in your head like, &ldquo;We better keep an eye on this because that might be the process innovation that we actually need&rdquo;?</strong></p>

<p>I think overall, in the landscape, we&rsquo;re always keeping track of what&rsquo;s going on. You&rsquo;re always seeing what are the latest innovations in the various different technologies.</p>

<p><strong>Is that even a good comparison to semiconductors in that way?</strong></p>

<p>The whole systems are completely different. The architectures are not so compatible. At some level, with the nodes of your semiconductors, there might be certain kinds of know-how that translate &mdash; how you route and lay out, maybe. And here, above a certain layer, there&rsquo;s also going to be commonality in terms of the compute platform, how the quantum circuits are generated. The software layers might be similar, but the actual physical hardware is very different.</p>

<p><strong>It feels like the thing we&rsquo;re talking about is how do you make a qubit? And it&rsquo;s not settled yet. You have an approach that you&rsquo;re very confident in, but there&rsquo;s not a winner in the market.</strong></p>

<p>I mean, we&rsquo;re pretty confident. We&rsquo;re pretty confident in superconducting qubits.</p>

<p><strong>Fair enough. [<em>Laughs</em>] I was just wondering.</strong></p>

<p>It&rsquo;s why we&rsquo;re able to prognosticate 10 years forward, that we see the direction we&rsquo;re going. And to me it&rsquo;s more that there are going to be innovations within that are going to continue to compound over those 10 years, that might make it even more attractive as time goes on. And that&rsquo;s just the nature of technology.</p>

<p><strong>You&rsquo;ve got to make decisions on maybe the longest timeline of anyone I&rsquo;ve had on the show. It&rsquo;s always the chip people who have the longest timelines. I talk to social media CEOs, and it&rsquo;s like their timeline is like five minutes from now, like, &ldquo;What are we going to ban today?&rdquo; That&rsquo;s the timeline. I talk to chip folks, and your timelines are decades. You just casually mentioned a chip you&rsquo;re going to ship in 2033. That&rsquo;s a long time from now. How do you make decisions on that kind of timeline?</strong></p>

<p>There&rsquo;s the near-term stuff, obviously, and the roadmap serves as that guide. That roadmap is constructed so that all these various things do impact that long-term delivery.</p>

<p><strong>Just walk me through: What does the quantum computing roadmap meeting look like? You&rsquo;re all in a conference room, are you at the whiteboard? Paint the picture for me.</strong></p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s mainly an inertia thing to move entire industries, move banks, move commerce, to adopt those standards”</p></blockquote></figure>
<p>Yeah, that is a great question. I mean, we have a number of us sitting there. We certainly know that there are certain technical foundations that we need to include in these next-generation chips and systems.</p>

<p>For this roadmap, we said, &ldquo;We know at some point we need to get quantum error correction into our roadmap.&rdquo; And with that technical lead, we know what the requirements are. So first we said, &ldquo;Okay, let&rsquo;s put it here. Now let&rsquo;s work backward. It says that we need to do this innovation and this innovation by this date, and this other innovation in the software stack or whatever by this date.&rdquo; And then we say, &ldquo;Oh shoot, we ran out of time. Let&rsquo;s move back a little bit.&rdquo; And so we do a little bit of that planning, because we also want to lay out this roadmap with what we often call no-regrets decisions. We don&rsquo;t want to do things that are just for the near term. We want to really pick strategies that give us this long-term path.</p>

<p>It&rsquo;s why we talk about utility scale so much in terms of what we can do with Herons and soon Flamingos. But everything that we build on top of what we can do there will translate to what we can do when we get those systems at scale, including error correction. And in terms of the roadmap planning&hellip; We&rsquo;re not done, by the way. We have this overall framework for the 10-year roadmap, and then we need to refine it. We&rsquo;ve still got a lot of details to work out across the software layer, the compiler layer, the control electronics layer, and certainly the processor layer.</p>

<p><strong>Is there commercial pressure on this? Again, this is a lot of cost at a big public company. Is the CEO of IBM in that room saying, &ldquo;When&rsquo;s this going to make money? Move it up&rdquo;?</strong></p>

<p>I think the point is, our mission is to bring useful quantum computing to the world. I&rsquo;ve been working in this area for 20 years now. We&rsquo;ve never been this close to being able to build something that is driving real value. And so I think when you look at our team, we are all aligned along that mission. That we want to drive this to something that&hellip; We started with just getting it out there in the cloud in terms of building the community. Now, we fundamentally see this as a tool that will alter how users are going to perform computation. And so there has to be, and I expect there to be, value there. And we&rsquo;ve seen how the HPC community has progressed and we&rsquo;ve seen how supercomputing has&#8230; You could see what&rsquo;s happening with the uptake of AI and everything. We build it, we will build the community around it, we&rsquo;ll drive value.</p>

<p><strong>Let&rsquo;s talk about AI for a second. This is a really good example of this. AI demand is through the roof. The industry is hot. We&rsquo;ll see if the products are long lasting, but there seems to be real consumer demand for them. And that is all translated into </strong><a href="https://www.theverge.com/2023/12/4/23987953/the-gpu-haves-and-have-nots"><strong>a lot of people</strong></a><strong> want a lot of Nvidia H100 chips. It&rsquo;s very narrowly focused on one kind of processor. Do you see quantum systems coming into that zone where we&rsquo;re going to run a lot of AI workloads on them? Like future AI workloads.</strong></p>

<p>What&rsquo;s happened in AI is phenomenal, but we&rsquo;re not at the point where the quantum computer is a commodity item where you&rsquo;re just buying tons of chips. You&rsquo;re not fabricating millions of these chips. But we are going to build this supercomputer based on quantum computing, which is going to be exquisitely good at certain types of tasks. The framework I actually see is this: you&rsquo;re already going to have your AI compute clusters. The way people run workloads today, they run some parts on their regular computers, on their own laptops, but parts of the job get fed out to the cloud, to the hyperscalers, and some of those will use the AI compute nodes.</p>

<p>We see that also being how quantum will feed in. It&rsquo;ll be another part of that overall cloud access landscape, where you&rsquo;re going to take a problem and break it down. Parts of it will run on classical computing, parts might run on AI, and parts will leverage what we call quantum-centric supercomputing, because that&rsquo;s the best place to solve that one part of the problem. Then it comes back, and you&rsquo;ve got to stitch all of that together. So from the IBM perspective, where we often talk about hybrid cloud, that&rsquo;s the hybrid cloud connecting all these pieces. And the differentiation is in building this quantum-centric supercomputer within it.</p>
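<p>The workflow described here can be sketched as a tiny dispatcher: sub-tasks are routed to classical, AI, or quantum backends, and the partial results are stitched back together. This is a minimal illustration of the pattern, not IBM&rsquo;s actual hybrid cloud API; every name in it is a hypothetical stand-in.</p>

```python
# Hypothetical handlers standing in for real backends: ordinary CPU code,
# a model inference call, and a circuit submitted to a quantum processor.
def run_classical(task):
    return sum(task["data"])

def run_ai(task):
    return max(task["data"])

def run_quantum(task):
    return len(task["data"])

BACKENDS = {"classical": run_classical, "ai": run_ai, "quantum": run_quantum}

def solve(subtasks):
    # Route each sub-task to its backend, then stitch the results together.
    return [BACKENDS[t["backend"]](t) for t in subtasks]

results = solve([
    {"backend": "classical", "data": [1, 2, 3]},
    {"backend": "ai", "data": [4, 5]},
    {"backend": "quantum", "data": [6, 7, 8, 9]},
])
print(results)  # [6, 5, 4]
```

<p>The point is only the shape of the orchestration: decompose the problem, dispatch each part to the best-suited resource, recombine.</p>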

<p><strong>So your quantum-centric supercomputer in the cloud. We&rsquo;ve talked a lot about superconducting now. You need a data center that&rsquo;s very cold. This does not seem like a thing that&rsquo;s going to happen locally, for me, ever, unless </strong><a href="https://www.theverge.com/2023/8/10/23827216/superconductor-lk-99-research-findings"><strong>LK-99</strong></a><strong> is real. This isn&rsquo;t going to happen for anyone in their home outside of an IBM data center for quite some time.</strong></p>

<p>I would say this. When I was first working in this area and did <a href="https://rsl.yale.edu/sites/default/files/files/RSL_Theses/jmcthesis.pdf">my PhD</a> &mdash; I worked on superconducting qubits &mdash; we required these large canisters, these refrigerators, where we needed to wheel up huge jugs of liquid helium and refill them every three days to keep them cold. That&rsquo;s a physics experiment. But there have already been innovations in cryogenics that make these systems turnkey: you plug them in, they stay running, they can run for years and maintain your payloads at the right temperatures. You&rsquo;re paying for electricity, obviously, to keep them cold. But we&rsquo;re seeing innovations there, too, in terms of driving infrastructure-scale cryogenics. Honestly, we&rsquo;re going to evolve the data center of the future, just like today&rsquo;s data centers have evolved to handle the increased compute resources they need. We will work hand in hand on how to build these quantum data centers, and we&rsquo;re already doing that. We have a quantum data center up in Poughkeepsie, which hosts the majority of our systems, and we&rsquo;re planning to expand it further.</p>

<p><strong>I think AI has very much complicated the question of what you&rsquo;re allowed to do with a computer chip. The White House just released </strong><a href="https://www.theverge.com/2023/10/30/23914507/biden-ai-executive-order-regulation-standards"><strong>an executive order</strong></a><strong> about AI. And somewhere in there is the idea that you should not be able to do some things with AI. And I </strong><a href="https://www.theverge.com/23894647/amd-ceo-lisa-su-ai-chips-nvidia-supply-chain-interview-decoder"><strong>talked to AMD CEO Lisa Su</strong></a><strong> at the Code Conference, and I said, &ldquo;Would you accept a regulation that limits what people can do on an AMD chip?&rdquo; And she said, &ldquo;Well, yeah, we might have to. There might be some stuff we just don&rsquo;t let these computers do anymore.&rdquo; Which is very challenging when you&rsquo;re talking about someone&rsquo;s laptop.</strong></p>

<p><strong>It is way less challenging when you&rsquo;re talking about a data center. Like AWS can just keep you from doing a workload. IBM, I&rsquo;m sure, has </strong><a href="https://quantum.ibm.com/terms"><strong>rules and regulations</strong></a><strong> about what its cloud is capable of doing and what you will allow to be done with its cloud computing. You fast-forward quantum, people are worried that you&rsquo;re going to break AES encryption with quantum one day, and then the world will fall apart because the world runs on AES encryption. Are you thinking about that yet: there&rsquo;s some stuff we should not allow people to do? And as we build the cloud system, we should make sure we put the controls in place?</strong></p>

<p>There are certainly threads of that type of discourse throughout the community. Personally, on the encryption side, we already know that there are quantum-safe encryption standards. And a fun thing is, in terms of IBM Quantum, our mission is to bring useful quantum computing to the world; the other side of it is to make the world quantum-safe. We want to help clients figure out how to update their encryption standards to those quantum-safe ones. They exist. <a href="https://www.nist.gov/news-events/news/2023/08/nist-standardize-encryption-algorithms-can-resist-attack-quantum-computers">NIST has approved</a> a number of them; at this point, it&rsquo;s mainly an inertia thing to move entire industries, to move banks and commerce, to adopt those standards.</p>
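<p>The migration step he describes, finding where quantum-vulnerable cryptography is still deployed, amounts to an inventory audit. The algorithm lists below are illustrative only (the quantum-safe names follow NIST&rsquo;s post-quantum standardization picks), not an official mapping.</p>

```python
# Public-key algorithms whose hardness assumptions fall to Shor's algorithm
# on a large fault-tolerant quantum computer (illustrative list).
QUANTUM_VULNERABLE = {"RSA-2048", "ECDSA-P256", "ECDH-P256", "DH-2048"}

# Post-quantum schemes from NIST's standardization effort (illustrative list).
QUANTUM_SAFE = {"ML-KEM-768", "ML-DSA-65", "SLH-DSA-128s"}

def audit(inventory):
    # Flag every deployed algorithm that needs a quantum-safe replacement.
    return [alg for alg in inventory if alg in QUANTUM_VULNERABLE]

print(audit(["RSA-2048", "AES-256", "ECDSA-P256"]))  # ['RSA-2048', 'ECDSA-P256']
```

<p>In practice the hard part is exactly the inertia he mentions: building the inventory across an enterprise, not writing the check.</p>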

<p><strong>I can&rsquo;t get people to stop using four-character passwords. Will you talk to them?</strong></p>

<p>Yeah, right. Exactly, that&rsquo;s the challenge. It&rsquo;s almost a social challenge that needs to be overcome to make that happen. Setting that aside, if we look at what you can or cannot do on quantum computers, I honestly think we need to watch what&rsquo;s happening with AI and look at what&rsquo;s been done in the past with high-performance computing. Again, not everybody has a high-performance computer at home, so we expect a lot of the frameworks to be very similar. My concern is that putting too many safeguards around it this early would stifle progress and stifle development.</p>

<p><strong>But this conversation is now happening, I would say, in a much more heated way in the AI space. I mean, it&rsquo;s almost like </strong><a href="https://doctorow.medium.com/the-real-ai-fight-1ce751886457"><strong>two religions</strong></a><strong> are competing to see what the future of AI will be: &ldquo;Just run as fast as you can&rdquo; and &ldquo;We should have more safety.&rdquo; And this culminated potentially in </strong><a href="https://www.theverge.com/2023/11/29/23982046/sam-altman-interview-openai-ceo-rehired"><strong>whatever happened at OpenAI</strong></a><strong>.</strong></p>

<p>That&rsquo;s right.</p>

<p><strong>Who knows? We still don&rsquo;t know. I don&rsquo;t even know if that&rsquo;s the case. But that is one narrative about that chaos that certainly exists. Is there anything like that in quantum? Are there quantum researchers who are like, &ldquo;That person is out of control&rdquo;? Name names. [<em>Laughs</em>]</strong></p>

<p>No, we&rsquo;re not at that stage yet, I&rsquo;d say. But there are <a href="https://www.quantum.gov/">responsible</a> <a href="https://law.stanford.edu/stanford-center-for-responsible-quantum-technology/">quantum</a> <a href="https://www.rti.ox.ac.uk/wp-content/uploads/2022/09/Ten_Holter_et_al_2021_creating_a_responsible.pdf">computing</a> <a href="https://www.weforum.org/projects/quantum-computing-ethics/">initiatives</a>. There are things that are looking at it, and I think there&rsquo;s a lot to lean on in terms of learning from what&rsquo;s happening right now with those AI stories.</p>

<p><strong>What&rsquo;s the thing &mdash; outside of just the pure entertainment value &mdash; about AI accelerationism that you&rsquo;ve pulled into how you&rsquo;re thinking about your roadmap and building the systems?</strong></p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“It’s always cool to see tremendous excitement about computing capabilities”</p></blockquote></figure>
<p>It&rsquo;s actually really cool. Something we&rsquo;re talking about at our computing summit, too, is that we have Watsonx at IBM, and we brought in some generative AI methods to help users program in Qiskit. So there&rsquo;s an engine we built to help users code that we&rsquo;re going to be previewing. Another thing is that translating problems into the right circuits that can run on physical hardware is a very challenging task. It is itself an optimization task: there&rsquo;s a particular problem you want to run, and the hardware is configured in a particular way. We call the mapping from one to the other transpilation, and our teams have used AI methods to find more optimal paths for that mapping. It&rsquo;s really fun in that AI impacts how we can accelerate quantum. And there&rsquo;s a flip side: we&rsquo;re looking into how quantum can boost classification methods for AI. So it&rsquo;s all tied together in some ways.</p>
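<p>The transpilation problem he sketches can be made concrete with a toy example: a two-qubit gate can only execute on physically coupled qubits, so gates between distant qubits cost extra SWAP operations, and a transpiler tries to minimize that overhead. This sketch only counts the naive SWAP cost on a fixed layout; it is not Qiskit&rsquo;s transpiler, which also optimizes layout, routing, and gate synthesis.</p>

```python
from collections import deque

def distances(coupling, start):
    # Breadth-first-search distances over the hardware coupling graph.
    dist = {start: 0}
    queue = deque([start])
    while queue:
        q = queue.popleft()
        for nbr in coupling[q]:
            if nbr not in dist:
                dist[nbr] = dist[q] + 1
                queue.append(nbr)
    return dist

def swap_cost(coupling, gates):
    # Each unit of distance beyond 1 between a gate's qubits costs one SWAP.
    return sum(distances(coupling, a)[b] - 1 for a, b in gates)

# Four qubits coupled in a line: 0-1-2-3.
line = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(swap_cost(line, [(0, 3), (1, 2)]))  # 2: the (0, 3) gate needs two SWAPs
```

<p>Finding qubit assignments and routes that drive this cost down is the optimization task AI methods were applied to.</p>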

<p><strong>Has that changed your roadmap, the explosion of demand for AI systems? A year ago, there was no ChatGPT. Now we&rsquo;re sitting at the end of the year, and I&rsquo;m going to go to CES in a couple of weeks, and everyone&rsquo;s going to tell me that AI is in everything. The industry just sort of reacts to buzzwords. Has this moved your path at all?</strong></p>

<p>This AI transpilation thing did come in all of a sudden, and it&rsquo;s now part of our roadmap. It&rsquo;s an innovation, and we want to feed it into something we can drive toward product. So in that micro sense, it has. In the more macro sense, I&rsquo;d just say that it&rsquo;s always cool to see tremendous excitement about computing capabilities. If the buzz stays on AI and lets quantum off the hook for a little bit, that&rsquo;s not so bad.</p>

<p><strong>Wow. The encryption doomers are like, &ldquo;Pay attention to us.&rdquo; There are some problems that quantum has always been promised to solve: molecular behavior, mapping proteins. Some of those problems have been attacked by AI very directly. We just had </strong><a href="https://www.theverge.com/23778745/demis-hassabis-google-deepmind-ai-alphafold-risks"><strong>Demis Hassabis</strong></a><strong> on the show &mdash; obviously DeepMind, they just did proteins. </strong><a href="https://www.theverge.com/2022/7/28/23280743/deepmind-alphafold-protein-database-alphabet"><strong>It&rsquo;s done now, you can have it</strong></a><strong>, we&rsquo;re going to walk away. Is there an overlap between where AI is expanding to in terms of the problem set or what it can do that is competitive with what you want to accomplish with quantum?</strong></p>

<p>I&rsquo;m not the foremost expert on which molecular problems can be solved here. But I can at least say that we know there are certain sizes and scales of problems that, in terms of supercomputing resources, push Summit and Frontier to the limits of what users can actually simulate. I don&rsquo;t know how much of that can be attacked with AI&rsquo;s approximate methods, but even then, they&rsquo;d still be approximate methods. That&rsquo;s where quantum is really going to be something that allows one to look at the problem differently.</p>

<p><strong>When you&rsquo;re looking at what you have right now &mdash; you have partners, you have potential customers, you have people interested &mdash; what&rsquo;s the largest volume of interest from the community?</strong></p>

<p>There are those working on various materials: for example, the Department of Energy&rsquo;s Oak Ridge National Lab and others that already use high-performance computing. They are super interested in using our platforms. Boeing has been working with us for quite a while, looking at super tough problems like composites and layers of materials and how best to arrange them. They have problems with thousands of variables that basically cannot be handled on classical computers, and we&rsquo;ve been working with them to understand how to map those problems into quantum. And then you have the financial services industry, where a number of players are looking at things like portfolio optimization.</p>

<p><strong>It&rsquo;s always portfolio optimization, man. At the end of the day, it&rsquo;s like Boeing&rsquo;s doing some cool shit and portfolio optimization. It&rsquo;s always lurking in the background somewhere. It&rsquo;s fine, they pay the bills. It&rsquo;s good.</strong></p>

<p><strong>You&rsquo;ve been talking a lot about the cloud. You&rsquo;ve got your cloud systems. You&rsquo;ve also put System Ones on college campuses. How does that work? You buy a System One, it&rsquo;s got some qubits in it. Is there a person rolling the helium up to it?</strong></p>

<p>They&rsquo;re still owned by IBM; they&rsquo;re managed services deployed on the premises of client locations. Earlier this year, for example, we deployed one with the <a href="https://newsroom.clevelandclinic.org/2023/03/20/cleveland-clinic-and-ibm-unveil-first-quantum-computer-dedicated-to-healthcare-research/">Cleveland Clinic in Ohio</a>. That&rsquo;s probably the most interesting place we&rsquo;ve deployed a system, in that it&rsquo;s <a href="https://www.forbes.com/sites/moorinsights/2023/03/21/cleveland-clinic-and-ibm-launch-worlds-first-quantum-computer-dedicated-to-healthcare-research-and-biomedical-discoveries/?sh=16e95c91a843">in their cafeteria</a>.</p>

<p><strong>That&rsquo;s amazing.</strong></p>

<p>People have their morning coffees and eat their lunch around it.</p>

<p><strong>And that&rsquo;s just a self-contained local supercomputer.</strong></p>

<p>You can think of it as a self-contained, local managed service that they&rsquo;re able to build a network and ecosystem around, with their researchers and other partner universities that might want to use it. That&rsquo;s the idea. Again, we have our main data center and cloud-accessible systems, as you mentioned, and then you have these other systems that drive regional ecosystems. We&rsquo;re actually launching a European data center around our system in Germany next year, because in different locations, people care about how their data is handled. That way, you never have to send information overseas. So at that level, we can certainly build that type of flexibility into how we manage the service in terms of user data.</p>

<p><strong>Part of the news today is System Two. Do you have System One customers who are like, &ldquo;Oh, shit, I should have waited&rdquo;? How does that work with a quantum supercomputer? Is there an upgrade cycle?</strong></p>

<p>Even with our System Ones, we&rsquo;ve upgraded those over time. Some of them, in fact, first launched with 27-qubit Falcon processors, and we just announced that our system in Japan with the University of Tokyo was upgraded to a 127-qubit Eagle processor. But in terms of infrastructure, System Two is wholly different from System One. System One is great in that it first showed we can put these things almost anywhere &mdash; a cafeteria, for example. You didn&rsquo;t have to be in a physics laboratory for them to function.</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/25132870/52748207710_a77945921b_o.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" alt="An airy, open, round, windowed multi-story dining area with a large, unmissable glass box, an IBM System One, in the middle of the floor, surrounded by open space." title="An airy, open, round, windowed multi-story dining area with a large, unmissable glass box, an IBM System One, in the middle of the floor, surrounded by open space." data-has-syndication-rights="1" data-caption="&lt;em&gt;It’s hard to miss.&lt;/em&gt; | Image: Ryan Lavine for IBM" data-portal-copyright="Image: Ryan Lavine for IBM" />
<p><strong>And in the cafeteria there&rsquo;s the superconducting, super-cooled cryogenic system?</strong></p>

<p>Yeah. Like I say, you have your morning coffee&hellip;</p>

<p><strong>And you&rsquo;re just looking at it.</strong></p>

<p>&hellip;next to a really, really cold 15 millikelvin quantum processor.&nbsp;</p>

<p><strong>Do people know it&rsquo;s there? Is there a sign?</strong></p>

<p>It&rsquo;s hard to miss. [<em>Laughs</em>] It&rsquo;s this big glass box. Actually, the funny story is that we worked with Goppion, the vendor that builds the glass case that houses the Mona Lisa, to help build the enclosures for our systems.</p>

<p><strong>That&rsquo;s cool. Alright, so System Two.</strong></p>

<p>Yeah. So System Two: a whole new level of infrastructure, designed to scale. That&rsquo;s where upgradeability and modularity are inherently built in. You want to increase the number of processors or the cryogenic cooling capacity? We can do that. Like Lego, like modular blocks. You want to increase the amount of control electronics? We can do that. You want to increase the amount of classical computation interfacing with the quantum computer? We can do that, too. That&rsquo;s the idea behind System Two: it&rsquo;s really designed for scalability, in a modular way, within a data center environment.</p>

<p><strong>So IBM is announcing a new chip, new supercomputer, System Two, new roadmaps. If you&rsquo;re just a regular person and you&rsquo;re looking at the pace of supercomputer development, what should you be looking out for?</strong></p>

<p>I&rsquo;d say look out for the fact that it&rsquo;s actually not hard to get started and learn about this. There&rsquo;s an entire set of resources with which you can go and program a quantum computer tomorrow. With this 10-year roadmap, and with the ecosystem we&rsquo;re building around these new generations of chips and systems, we want to develop the developer of the future. So if you&rsquo;re at all interested in learning to use a quantum computer and getting involved, there&rsquo;s a tremendous opportunity for growth here. And we&rsquo;re going to need that. Building an entire industry, and building this as a computing platform that works seamlessly with today&rsquo;s highest-performance computers, is going to require a groundswell of people. So to me, since you touch so many different people out there: get out there. You can run and program a quantum computer tomorrow. We have freely available systems to run circuits on.</p>
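<p>His &ldquo;program a quantum computer tomorrow&rdquo; point is easy to try on a laptop first: the canonical starter circuit, a Bell state, can be simulated with a plain statevector in a few lines. This is a from-scratch sketch of the underlying math, not any particular IBM library.</p>

```python
import math

def bell_state():
    # Amplitudes over the basis |00>, |01>, |10>, |11>, starting in |00>.
    s = [1.0, 0.0, 0.0, 0.0]
    h = 1 / math.sqrt(2)
    # Hadamard on the first qubit mixes each |0x> amplitude with |1x>.
    s = [h * (s[0] + s[2]), h * (s[1] + s[3]),
         h * (s[0] - s[2]), h * (s[1] - s[3])]
    # CNOT (first qubit controls the second) swaps the |10> and |11> amplitudes.
    s[2], s[3] = s[3], s[2]
    return s

# Measurement probabilities: an even split between |00> and |11>.
print([round(a * a, 3) for a in bell_state()])  # [0.5, 0.0, 0.0, 0.5]
```

<p>Running the same circuit on real hardware only changes where the job executes; the program is the same handful of gates.</p>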

<p><strong>Stop playing around with your LLMs; get on the quantum train. That&rsquo;s what I&rsquo;m taking away from this.</strong></p>

<p>Yeah.</p>

<p><strong>Alright. Last very silly question. When you watch the Ant-Man movies, are you just furious all the time?</strong></p>

<p>I&rsquo;d say that in the first few Ant-Man movies, some of the quantum focus was interesting; it was cute. But the most recent one, where they had an entire civilization inside, oof.</p>

<p><strong>It&rsquo;s a little rough.</strong></p>

<p>That was a little rough.</p>

<p><strong>Alright, Jerry, this was amazing. Thank you so much for coming on <em>Decoder</em>.</strong></p>

<p>Yeah, you&rsquo;re very welcome. Glad to be here.</p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Mark Wilding</name>
			</author>
			
			<title type="html"><![CDATA[IBM promised to back off facial recognition — then it signed a $69.8 million contract to provide it]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2023/8/31/23852955/ibm-uk-government-contract-biometric-facial-recognition" />
			<id>https://www.theverge.com/2023/8/31/23852955/ibm-uk-government-contract-biometric-facial-recognition</id>
			<updated>2023-08-31T09:04:24-04:00</updated>
			<published>2023-08-31T09:04:24-04:00</published>
			<category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Policy" /><category scheme="https://www.theverge.com" term="Privacy" /><category scheme="https://www.theverge.com" term="Security" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[IBM has returned to the facial recognition market - just three years after announcing it was abandoning work on the technology due to concerns about racial profiling, mass surveillance, and other human rights violations. In June 2020, as Black Lives Matter protests swept the US after George Floyd's murder, IBM chief executive Arvind Krishna wrote [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Illustration by Alex Castro / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/12322323/acastro_180730_1777_facial_recognition_0001.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>IBM has returned to the facial recognition market - just three years after <a href="https://www.theverge.com/2020/6/8/21284683/ibm-no-longer-general-purpose-facial-recognition-analysis-software">announcing it was abandoning work on the technology</a> due to concerns about racial profiling, mass surveillance, and other human rights violations.</p>
<p>In June 2020, as Black Lives Matter protests swept the US after George Floyd's murder, IBM chief executive Arvind Krishna <a href="https://web.archive.org/web/20200609031426/https:/www.ibm.com/blogs/policy/facial-recognition-susset-racial-justice-reforms/">wrote a letter to Congress</a> announcing that the company would no longer offer "general purpose" facial recognition technology. "The fight against racism is as urgent as ever," he wrote. "IBM firmly opposes and will not condone uses of any technology, including facial recognition technology offered by othe …</p>
<p><a href="https://www.theverge.com/2023/8/31/23852955/ibm-uk-government-contract-biometric-facial-recognition">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Chaim Gartenberg</name>
			</author>
			
			<title type="html"><![CDATA[IBM’s first 2nm chip previews the processors of tomorrow]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/5/6/22422815/ibm-2nm-chip-processors-semiconductors-power-performance-technology" />
			<id>https://www.theverge.com/2021/5/6/22422815/ibm-2nm-chip-processors-semiconductors-power-performance-technology</id>
			<updated>2021-05-06T17:37:00-04:00</updated>
			<published>2021-05-06T17:37:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Circuit Breaker" /><category scheme="https://www.theverge.com" term="Gadgets" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[IBM has revealed what it says are the world's first 2nm process chips, giving a brief preview of the technology that might eventually power the smartphones, laptops, and gadgets of the future. The big jump here is in transistor count. Compared to today's 7nm chips, the new IBM technology features dramatically more transistors, thanks to [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="IBM" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22496830/IBM_Research_2_nm_Wafer.jpeg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p><a href="https://newsroom.ibm.com/2021-05-06-IBM-Unveils-Worlds-First-2-Nanometer-Chip-Technology,-Opening-a-New-Frontier-for-Semiconductors#assets_all">IBM has revealed</a> what it says are the world's first 2nm process chips, giving a brief preview of the technology that might eventually power the smartphones, laptops, and gadgets of the future.</p>
<p>The big jump here is in transistor count. Compared to today's 7nm chips, the new IBM technology features dramatically more transistors, thanks to the more compact design. That means chips built using that process can potentially offer big gains in performance and battery life: IBM says its 2nm chips are "projected to achieve 45 percent higher performance, or 75 percent lower energy use, than today's most advanced 7 nm node chips."</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>It'll still be years …</p></blockquote></figure>
<p><a href="https://www.theverge.com/2021/5/6/22422815/ibm-2nm-chip-processors-semiconductors-power-performance-technology">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Justine Calma</name>
			</author>
			
			<title type="html"><![CDATA[IBM sets new climate goal for 2030]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2021/2/16/22285669/ibm-climate-change-commitment-cut-greenhouse-gas-emissions" />
			<id>https://www.theverge.com/2021/2/16/22285669/ibm-climate-change-commitment-cut-greenhouse-gas-emissions</id>
			<updated>2021-02-16T12:09:02-05:00</updated>
			<published>2021-02-16T12:09:02-05:00</published>
			<category scheme="https://www.theverge.com" term="Climate" /><category scheme="https://www.theverge.com" term="Energy" /><category scheme="https://www.theverge.com" term="Environment" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[IBM plans to get rid of its planet-heating carbon dioxide emissions from its operations by 2030, the company announced today. And unlike some other tech companies that have made splashy environmental commitments lately, IBM's pledge emphasized the need to prevent emissions rather than developing ways to capture carbon dioxide after it's released. IBM is pledging [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Inside IBM Research Headquarters" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/22307246/468943455.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Inside IBM Research Headquarters	</figcaption>
</figure>
<p>IBM plans to get rid of its planet-heating carbon dioxide emissions from its operations by 2030, the company announced today. And unlike some other tech companies that have made splashy environmental commitments lately, IBM's pledge emphasized the need to prevent emissions rather than developing ways to capture carbon dioxide after it's released.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>IBM is pledging to do "all it can across its operations" to stop polluting</p></blockquote></figure>
<p>The company committed to reaching net zero greenhouse gas emissions by the end of this decade, pledging to do "all it can across its operations" to stop polluting before it turns to emerging technologies that might be able to …</p>
<p><a href="https://www.theverge.com/2021/2/16/22285669/ibm-climate-change-commitment-cut-greenhouse-gas-emissions">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Monica Chin</name>
			</author>
			
			<title type="html"><![CDATA[Hackers are targeting the COVID-19 vaccine supply chain, IBM finds]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/12/3/22151016/hackers-phishing-coronavirus-vaccine-ibm-security" />
			<id>https://www.theverge.com/2020/12/3/22151016/hackers-phishing-coronavirus-vaccine-ibm-security</id>
			<updated>2020-12-03T18:53:20-05:00</updated>
			<published>2020-12-03T18:53:20-05:00</published>
			<category scheme="https://www.theverge.com" term="Coronavirus" /><category scheme="https://www.theverge.com" term="Health" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Science" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[A global phishing campaign has been targeting organizations associated with the distribution of COVID-19 vaccines since September 2020, IBM security researchers say. In a blog post, analysts Claire Zaboeva and Melissa Frydrych of IBM X-Force IRIS announced that the phishing campaign spans six regions: Germany, Italy, South Korea, Czech Republic, greater Europe, and Taiwan. The [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Amelia Krales" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/6264805/akrales_160205_0929_A_0016.0.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>A global phishing campaign has been targeting organizations associated with the distribution of COVID-19 vaccines since September 2020, IBM security researchers say.</p>
<p>In a <a href="https://securityintelligence.com/posts/ibm-uncovers-global-phishing-covid-19-vaccine-cold-chain/">blog post</a>, analysts Claire Zaboeva and Melissa Frydrych of IBM X-Force IRIS announced that the phishing campaign spans six regions: Germany, Italy, South Korea, Czech Republic, greater Europe, and Taiwan.</p>
<p>The campaign appears to be focused on the "cold chain," the segment of the vaccine supply chain that keeps doses cold during their storage and transportation. Some vaccines need to stay at <a href="https://www.theverge.com/21537171/temperature-sensors-covid-vaccine-cold-chain">extremely low temperatures</a> in order to remain potent. Pfizer, for example, recomme …</p>
<p><a href="https://www.theverge.com/2020/12/3/22151016/hackers-phishing-coronavirus-vaccine-ibm-security">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>James Vincent</name>
			</author>
			
			<title type="html"><![CDATA[IBM will spin off legacy business to focus on cloud and AI services]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/10/9/21508974/ibm-split-into-two-companies-newco-hybrid-cloud-legacy-it" />
			<id>https://www.theverge.com/2020/10/9/21508974/ibm-split-into-two-companies-newco-hybrid-cloud-legacy-it</id>
			<updated>2020-10-09T07:23:42-04:00</updated>
			<published>2020-10-09T07:23:42-04:00</published>
			<category scheme="https://www.theverge.com" term="Business" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[IBM is splitting into two public companies, with a spin-off handling the firm's legacy IT infrastructure work, allowing IBM to focus on new high-margin businesses, particularly cloud services and AI. The 109-year-old company announced the news this week, which follows CEO Arvind Krishna's long-term plan to streamline the sprawling business. Krishna took the reins of [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Photo by Jeremy Moeller/Getty Images" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/21947997/1254279163.jpg.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>IBM is splitting into two public companies, with a spin-off handling the firm's legacy IT infrastructure work, allowing IBM to focus on new high-margin businesses, particularly cloud services and AI.</p>
<p>The 109-year-old company <a href="https://www.ibm.com/investor/events/ibm-strategic-update-2020">announced the news</a> this week, which follows CEO Arvind Krishna's long-term plan to streamline the sprawling business. Krishna took the reins of IBM in April 2020 after working on its $34 billion acquisition of open source software firm Red Hat from 2018 onwards. Red Hat's software is key to IBM's new hybrid cloud offerings.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>The split is just the latest divestment of unfavorable businesses in IBM's long history</p></blockquote></figure>
<p>In a call  …</p>
<p><a href="https://www.theverge.com/2020/10/9/21508974/ibm-split-into-two-companies-newco-hybrid-cloud-legacy-it">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Taylor Lyles</name>
			</author>
			
			<title type="html"><![CDATA[Los Angeles settles Weather Channel lawsuit, lets it keep selling location data to advertisers]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/8/19/21376217/los-angeles-the-weather-channel-app-lawsuit-settlement-location-data-selling" />
			<id>https://www.theverge.com/2020/8/19/21376217/los-angeles-the-weather-channel-app-lawsuit-settlement-location-data-selling</id>
			<updated>2020-08-19T19:35:20-04:00</updated>
			<published>2020-08-19T19:35:20-04:00</published>
			<category scheme="https://www.theverge.com" term="Apps" /><category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="Law" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Policy" /><category scheme="https://www.theverge.com" term="Privacy" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Los Angeles has settled its lawsuit against the operator of The Weather Channel app. The city filed litigation against the company in 2019, alleging that the app misled millions of people into granting access to their personal location data and sold that data to third parties. While IBM is celebrating this moment by calling those [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/assets/1123363/iPhone_TWC_app.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Los Angeles has <a href="https://www.lacityattorney.org/post/city-attorney-feuer-settles-digital-privacy-lawsuit-against-the-weather-channel-app">settled its lawsuit against the operator of The Weather Channel app</a>. The city <a href="https://www.theverge.com/2019/1/4/18168373/los-angeles-weather-channel-app-user-location-data">filed litigation against the company</a> in 2019, alleging that the app <a href="https://www.nytimes.com/2019/01/03/technology/weather-channel-app-lawsuit.html">misled millions of people</a> into granting access to their personal location data and sold that data to third parties. </p>
<p>While IBM is celebrating this moment by calling those original claims "baseless" in a statement to <em>The Verge</em>, it sounds like they were largely true, since the only thing the settlement requires is for The Weather Channel to proactively warn users that yes, your location data is for sale.</p>
<p>As part of <a href="https://filedn.com/lOJqn8isbUNJvUBnJTlV5OS/Weather%20Channel%20App%20Aug%202020.pdf">the settlement</a>, the app's operator, TWC Product and Technology LLC, …</p>
<p><a href="https://www.theverge.com/2020/8/19/21376217/los-angeles-the-weather-channel-app-lawsuit-settlement-location-data-selling">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Kim Lyons</name>
			</author>
			
			<title type="html"><![CDATA[Computer scientist Frances Allen, known for her work on compiling, dies at 88]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/2020/8/9/21360722/frances-allen-computer-scientist-compiling-ibm" />
			<id>https://www.theverge.com/2020/8/9/21360722/frances-allen-computer-scientist-compiling-ibm</id>
			<updated>2020-08-09T14:21:03-04:00</updated>
			<published>2020-08-09T14:21:03-04:00</published>
			<category scheme="https://www.theverge.com" term="IBM" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Frances Allen, whose work on computer compiling helped establish a foundation for much of modern computer programming, died on August 4th, her 88th birthday. She was the first woman to win the Turing Award, and the first female IBM fellow. Allen was determined to make the tedious compiling process - converting software programs into ones [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="IBM" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/21712077/Frances_Allen.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Frances Allen, whose work on computer compiling helped establish a foundation for much of modern computer programming, died on August 4th, her 88th birthday. She was the first woman to win the <a href="https://amturing.acm.org/">Turing Award</a>, and the first female IBM fellow. Allen was determined to make the tedious compiling process, converting software programs into ones and zeroes, more efficient. The work became a hallmark of her career.</p>
<p>After receiving a master's degree in mathematics from the University of Michigan, Allen took a job with IBM Research in Poughkeepsie, NY, in 1957, intending only to stay until she had her student loan debt paid off. She taught IBM employe …</p>
<p><a href="https://www.theverge.com/2020/8/9/21360722/frances-allen-computer-scientist-compiling-ibm">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
