<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">AMD | The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2026-03-24T22:28:49+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/amd" />
	<id>https://www.theverge.com/rss/amd/index.xml</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/amd/index.xml" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Richard Lawler</name>
			</author>
			
			<title type="html"><![CDATA[Arm’s first CPU ever will plug into Meta’s AI data centers later this year]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/ai-artificial-intelligence/899823/arm-agi-cpu-meta" />
			<id>https://www.theverge.com/?p=899823</id>
			<updated>2026-03-24T18:28:49-04:00</updated>
			<published>2026-03-24T16:43:14-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Intel" /><category scheme="https://www.theverge.com" term="Meta" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Nvidia" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[After decades of only licensing its chip designs for others to use, UK-based Arm revealed the first chip it's producing on its own, and the first customer. Dubbed the Arm AGI CPU, it's another chip designed for inference, or running the cloud processing for AI tools like AI agents that can continue to spawn more [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="Stylized image showing the edge of a CPU" data-caption="" data-portal-copyright="Image: Arm" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2026/03/20260216_VISION25_ExplodedTight_Chip-01-16x9_16bit_v2-1200x675-1.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">After decades of only licensing its chip designs for others to use, UK-based Arm <a href="https://newsroom.arm.com/news/arm-agi-cpu-launch">revealed the first chip</a> it's producing on its own, and the first customer. Dubbed the Arm AGI CPU, it's another chip designed for inference, or running the cloud processing for AI tools like AI agents that can continue to spawn more and more tasks to run at once. The first company in line to use it is Meta, which has <a href="https://www.ft.com/content/d3b50dfc-31fa-45a8-9184-c5f0476f4504">reportedly struggled</a> to launch its own AI chips.</p>
<p class="has-text-align-none">Meta <a href="https://about.fb.com/news/2026/03/meta-partners-with-arm-to-develop-new-class-of-data-center-silicon/">says it's both the lead partner and co-developer</a>, and plans to work on "multiple generations" of the data center CPUs, for use along with hardware from other vendors like <a href="https://www.theverge.com/ai-artificial-intelligence/880513/nvidia-meta-ai-grace-vera-chips">Nvidia</a> and <a href="https://www.theverge.com/tech/883593/amd-forges-100-billion-deal-with-meta-for-ai-chips">AMD</a>. Arm cus …</p>
<p><a href="https://www.theverge.com/ai-artificial-intelligence/899823/arm-agi-cpu-meta">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Sean Hollister</name>
			</author>
			
			<title type="html"><![CDATA[Future Sony PlayStation games will use AI to imagine new frames]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/898283/future-sony-playstation-games-will-use-ai-to-imagine-new-frames" />
			<id>https://www.theverge.com/?p=898283</id>
			<updated>2026-03-20T17:10:19-04:00</updated>
			<published>2026-03-20T15:46:23-04:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Gaming" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="PlayStation" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Mark Cerny, the lead architect of the PlayStation 5 and PS5 Pro, told Digital Foundry that ML-based frame generation tech is coming to "PlayStation platforms" in the future, letting the game console use AI to imagine new frames between the ones it's actually rendering, which can create smoother perceived image quality while (typically) introducing some [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Photo by Vjeran Pavic / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/25717620/247361_PS5_Pro_VPavic_182.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">Mark Cerny, the lead architect of the PlayStation 5 and PS5 Pro, <a href="https://www.digitalfoundry.net/news/2026/03/mark-cerny-confirms-frame-generation-should-be-seen-at-some-point-on-playstation-platforms">told <em>Digital Foundry</em></a> that ML-based frame generation tech is coming to "PlayStation platforms" in the future, letting the game console use AI to imagine new frames between the ones it's actually rendering, which can create smoother perceived image quality while (typically) introducing some amount of lag. At least, that's how it works on PCs, where critics call them "fake frames." </p>
<p class="has-text-align-none">It's not clear whether Cerny means he'll bring it to the <a href="https://www.theverge.com/reviews/24289319/ps5-pro-review">PS5 Pro</a>, which just got better AI upscaling with <a href="https://www.theverge.com/games/895396/playstation-pssr-upscaling-cyberpunk-2077-silent-hill">an upgraded PlayStation Spectral Super Resolution (PSSR)</a> technique, or whether it'll have to w …</p>
<p><a href="https://www.theverge.com/news/898283/future-sony-playstation-games-will-use-ai-to-imagine-new-frames">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Antonio G. Di Benedetto</name>
			</author>
			
			<title type="html"><![CDATA[HP ZBook Ultra G1a review: a business-class workstation that’s got game]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/877688/hp-zbook-ultra-g1a-laptop-amd-strix-halo-review" />
			<id>https://www.theverge.com/?p=877688</id>
			<updated>2026-02-17T11:57:01-05:00</updated>
			<published>2026-02-12T07:00:00-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Gadgets" /><category scheme="https://www.theverge.com" term="HP" /><category scheme="https://www.theverge.com" term="Laptop Reviews" /><category scheme="https://www.theverge.com" term="Laptops" /><category scheme="https://www.theverge.com" term="Reviews" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Business laptops are typically dull computers foisted on employees en masse. But higher-end enterprise workstation notebooks sometimes get an interesting enough blend of power and features to appeal to enthusiasts. HP's ZBook Ultra G1a is a nice example. It's easy to see it as another gray boring-book for spendy business types, until you notice a [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Photo: Antonio G. Di Benedetto / The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2026/02/268349_HP_ZBook_Ultra_G1a_laptop_ADiBenedetto_0007.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-drop-cap has-text-align-none">Business laptops are typically dull computers foisted on employees en masse. But higher-end enterprise workstation notebooks sometimes get an interesting enough blend of power and features to appeal to enthusiasts. HP's ZBook Ultra G1a is a nice example. It's easy to see it as another gray boring-book for spendy business types, until you notice a few key specs: an AMD Strix Halo APU, lots of RAM, an OLED display, and an adequate number of speedy ports (Thunderbolt 4, even - a rarity on AMD laptops).</p>
<p class="has-text-align-none">I know from my time with the <a href="https://www.theverge.com/reviews/621947/asus-rog-flow-z13-gaming-tablet-laptop-amd-strix-halo-review">Asus ROG Flow Z13</a> and <a href="https://www.theverge.com/reviews/749404/framework-desktop-pc-amd-ryzen-ai-max-385-395-plus-review">Framework Desktop</a> that anything using AMD's high-end Ryzen AI Max chips should make for a co …</p>
<p><a href="https://www.theverge.com/tech/877688/hp-zbook-ultra-g1a-laptop-amd-strix-halo-review">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Tom Warren</name>
			</author>
			
			<title type="html"><![CDATA[AMD’s faster Ryzen 7 9850X3D CPU arrives on January 29th for $499]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/865648/amd-ryzen-7-9850x3d-price-release-date-announcement" />
			<id>https://www.theverge.com/?p=865648</id>
			<updated>2026-01-22T10:38:37-05:00</updated>
			<published>2026-01-22T10:29:10-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Gaming" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="PC Gaming" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[AMD announced an improved version of its popular Ryzen 7 9800X3D processor at CES earlier this month, and it's now confirming a release date and pricing today. The new Ryzen 7 9850X3D will debut on January 29th, priced at $499. That's $20 more than the initial retail pricing of the 9800X3D, and this faster 9850X3D [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2026/01/amdryzen.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">AMD announced an improved version of its popular <a href="https://www.theverge.com/2024/11/6/24288948/amd-ryzen-7-9800x3d-review-cpu-processor-benchmark-test">Ryzen 7 9800X3D processor</a> at CES earlier this month, and it's now confirming a release date and pricing today. The new Ryzen 7 9850X3D will debut on January 29th, priced at $499.</p>
<p class="has-text-align-none">That's $20 more than the initial retail pricing of the 9800X3D, and this faster 9850X3D will feature the same 8-core / 16-thread CPU running at faster boost clocks. The boost clock on the 9850X3D is 400MHz higher than the standard 9800X3D, and it maintains the same 120-watt TDP. It's essentially a better-binned version of the 9800X3D.</p>
<img src="https://platform.theverge.com/wp-content/uploads/sites/2/2026/01/9850x3dintelcomparison.jpg?quality=90&amp;strip=all&amp;crop=0,0,100,100" alt="A bar chart showing performance improvements in a variety of popular video games with the AMD Ryzen 7 9850X3D." title="A bar chart showing performance improvements in a variety of popular video games with the AMD Ryzen 7 9850X3D." data-has-syndication-rights="1" data-caption="" data-portal-copyright="Image: AMD">
<p class="has-text-align-none">It's still not clear exactly how much these higher boost clocks will help with PC  …</p>
<p><a href="https://www.theverge.com/news/865648/amd-ryzen-7-9850x3d-price-release-date-announcement">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Sean Hollister</name>
			</author>
			
			<title type="html"><![CDATA[The two things AMD subtly revealed at CES that actually excite me]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/862861/amd-ces-2026-socketed-laptop-chips-strix-halo-price" />
			<id>https://www.theverge.com/?p=862861</id>
			<updated>2026-01-16T18:56:44-05:00</updated>
			<published>2026-01-16T14:25:12-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="CES" /><category scheme="https://www.theverge.com" term="Gaming" /><category scheme="https://www.theverge.com" term="PC Gaming" /><category scheme="https://www.theverge.com" term="Report" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[As we predicted, the world's biggest consumer electronics show was a bit of a bust for gamers this year! CES 2026 brought us several neat gamepads, but barely any handhelds and no new desktop GPUs - not from Nvidia, not from Intel, and not from AMD. But if you dig deep, AMD said two things [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2026/01/AMD_Ryzen_7000_Desktop_CPU_Lineup_low_res_scale_4_00x_Custom-1.webp?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none"><a href="https://www.theverge.com/tech/851165/ces-2026-what-to-expect">As we predicted</a>, the world's biggest consumer electronics show was a bit of a bust for gamers this year! CES 2026 brought us <a href="https://www.theverge.com/tech/848331/8bitdo-ultimate-3e-controller-xbox-wireless-tmr-swappable-joysticks">several</a> <a href="https://www.tiktok.com/@verge/video/7593953104756821262">neat</a> <a href="https://www.theverge.com/tech/860346/force-feedback-steering-wheel-in-a-gamepad-feels-surprisingly-great">gamepads</a>, but <a href="https://www.theverge.com/tech/853587/lenovo-legion-go-2-steam-os-annoucement-price-release-date">barely</a> <a href="https://www.theverge.com/tech/860992/onexplayer-ces-2026-liquid-cooled-strix-halo-apex-super-x-handheld-egpu">any</a> handhelds and no new desktop GPUs - not from Nvidia, not from Intel, and not from AMD. </p>
<p class="has-text-align-none">But if you dig deep, AMD said two things at this year's show that are worthy of attention. Did you catch that the company's about to make socketed <em>mobile</em> chips again? Or that its answer to Intel is to lower the price of its <a href="https://www.theverge.com/games/791460/gpd-win-5-corded-handhelds">monster Strix Halo silicon</a>?</p>
<p class="has-text-align-none">Publicly, AMD barely acknowledged consumers at the Consumer Electronics Show. "AMD failed us," <a href="https://www.youtube.com/watch?v=WsCrKGY9F1o">decried Gamers Nexus</a>, pointing out that the compa …</p>
<p><a href="https://www.theverge.com/tech/862861/amd-ces-2026-socketed-laptop-chips-strix-halo-price">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Tom Warren</name>
			</author>
			
			<title type="html"><![CDATA[AMD’s new Ryzen 7 9850X3D makes the best gaming CPU even faster]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/tech/855188/amd-ryzen-7-9850x3d-processor-boost-clock-announcement" />
			<id>https://www.theverge.com/?p=855188</id>
			<updated>2026-01-05T22:40:16-05:00</updated>
			<published>2026-01-05T22:30:00-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="CES" /><category scheme="https://www.theverge.com" term="Gaming" /><category scheme="https://www.theverge.com" term="PC Gaming" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[AMD is introducing an improved version of its popular Ryzen 7 9800X3D processor today at CES. The new Ryzen 7 9850X3D offers up the same 8-core / 16-thread CPU as the 9800X3D, running at even faster boost clocks. "We've fine-tuned our best gaming processor in the world and have increased the boost clocks to 5.6GHz," [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/02/acastro_180529_1777_amd_0001.0.webp?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">AMD is introducing an improved version of its popular <a href="https://www.theverge.com/2024/11/6/24288948/amd-ryzen-7-9800x3d-review-cpu-processor-benchmark-test">Ryzen 7 9800X3D processor</a> today at CES. The new Ryzen 7 9850X3D offers up the same 8-core / 16-thread CPU as the 9800X3D, running at even faster boost clocks.</p>
<p class="has-text-align-none">"We've fine-tuned our best gaming processor in the world and have increased the boost clocks to 5.6GHz," says Rahul Tikoo, the head of AMD's client CPU business, in a briefing with <em>The Verge</em>. The boost clock on the 9850X3D is 400MHz higher than the standard 9800X3D, and it maintains the same 120-watt TDP. It's essentially a better-binned version of the 9800X3D, and how much faster it is over the existing chip will vary on a game-by …</p>
<p><a href="https://www.theverge.com/tech/855188/amd-ryzen-7-9850x3d-processor-boost-clock-announcement">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Antonio G. Di Benedetto</name>
			</author>
			
			<title type="html"><![CDATA[Windows on Arm had another good year]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/gadgets/850074/2025-windows-arm-laptops-qualcomm-intel-amd-nvidia" />
			<id>https://www.theverge.com/?p=850074</id>
			<updated>2025-12-26T11:47:24-05:00</updated>
			<published>2025-12-29T08:00:00-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Chips" /><category scheme="https://www.theverge.com" term="Gadgets" /><category scheme="https://www.theverge.com" term="Intel" /><category scheme="https://www.theverge.com" term="Laptops" /><category scheme="https://www.theverge.com" term="Microsoft" /><category scheme="https://www.theverge.com" term="Nvidia" /><category scheme="https://www.theverge.com" term="Tech" /><category scheme="https://www.theverge.com" term="Windows" />
							<summary type="html"><![CDATA[In 2024, Qualcomm's Snapdragon X chips finally made Arm-based Windows laptops viable. Unlike previous Arm laptops that struggled to even run Windows well, this new class offered solid performance and the best battery life on Windows, and they impressed us in Microsoft's own Surface Laptop and Surface Pro offerings. But inconsistent app compatibility remained the [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/08/STK109_WINDOWS_A.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-drop-cap has-text-align-none">In 2024, Qualcomm's Snapdragon X chips finally made Arm-based Windows laptops viable. Unlike <a href="https://www.theverge.com/23421326/microsoft-surface-pro-9-arm-qualcomm-sq3-review">previous Arm laptops</a> that struggled to even run Windows well, this new class offered solid performance and the best battery life on Windows, and <a href="https://www.theverge.com/24319497/windows-on-arm-2024-review-laptops">they impressed us</a> in Microsoft's own <a href="https://www.theverge.com/2024/6/25/24185462/microsoft-surface-laptop-7th-edition-review">Surface Laptop</a> and <a href="https://www.theverge.com/24191243/microsoft-surface-pro-11-oled-review">Surface Pro</a> offerings. But inconsistent app compatibility remained the biggest hurdle to running Windows on Arm. (It forced me to use the watered-down Adobe Lightroom app instead of Lightroom Classic, and that's a sin.) And playing games, one of Windows' greatest strengths against the walled garden of Apple's Macs, was basically <a href="https://www.theverge.com/2024/8/8/24215905/microsoft-windows-on-arm-gaming-laptops-notepad">a nonstarter</a>.</p>
<p class="has-text-align-none">Throughou …</p>
<p><a href="https://www.theverge.com/gadgets/850074/2025-windows-arm-laptops-qualcomm-intel-amd-nvidia">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Sean Hollister</name>
			</author>
			
			<title type="html"><![CDATA[AMD FSR Redstone is an exciting and confusing upgrade for Radeon PC gamers]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/games/841342/amd-fsr-redstone-is-an-exciting-and-confusing-upgrade-for-radeon-pc-gamers" />
			<id>https://www.theverge.com/?p=841342</id>
			<updated>2025-12-10T11:27:57-05:00</updated>
			<published>2025-12-10T09:00:00-05:00</published>
			<category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="Gaming" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="PC Gaming" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Today, AMD is soft-launching its latest suite of graphics and performance-enhancing tech, FSR Redstone - and it might take a second to wrap your head around. It certainly did for me. The good news is that in just three months, AMD has more than doubled the number of games that support the flagship machine-learning version [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="An AMD graphics card sitting on a table." data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/03/257587_AMD_RX_9700_and_9700_XT_TWarren_0004.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">Today, AMD is soft-launching its latest suite of graphics and performance-enhancing tech, FSR Redstone - and it might take a second to wrap your head around. It certainly did for me.</p>
<p class="has-text-align-none">The good news is that in <a href="https://gpuopen.com/news/amd-fsr4-over-85-games/">just three months</a>, AMD has more than doubled the number of games that support the flagship machine-learning version of its upscaling tech, FSR4, to over 200 games in all, and it's launching ML-based frame generation (yes, "fake frames") for over 30 titles too. Both techniques can dramatically increase your framerate while preserving image quality better than FSR 1, 2, or 3 allowed. You can <a href="https://www.amd.com/en/products/graphics/technologies/fidelityfx/supported-games.html">find the full game lists here</a> and download the 25 …</p>
<p><a href="https://www.theverge.com/games/841342/amd-fsr-redstone-is-an-exciting-and-confusing-upgrade-for-radeon-pc-gamers">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Stevie Bonifield</name>
			</author>
			
			<title type="html"><![CDATA[AMD, Department of Energy announce $1 billion AI supercomputer partnership]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807483/amd-department-of-energy-announce-1-billion-ai-supercomputer-partnership" />
			<id>https://www.theverge.com/?p=807483</id>
			<updated>2025-10-27T18:16:48-04:00</updated>
			<published>2025-10-27T18:16:48-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[AMD has sealed a $1 billion deal with the US Department of Energy to develop two supercomputers, Lux and Discovery, in collaboration with Oracle and Hewlett Packard Enterprise (HPE). Both supercomputers will live at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee. Lux is slated to come online fairly soon in early 2026, with [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/03/acastro_STK081_amd_02.jpg.webp?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">AMD has sealed a $1 billion deal with the US Department of Energy to develop two supercomputers, Lux and Discovery, in collaboration with Oracle and Hewlett Packard Enterprise (HPE). Both supercomputers will live at Oak Ridge National Laboratory (ORNL) in Oak Ridge, Tennessee. Lux is slated to come online fairly soon in early 2026, with Discovery following in 2029. </p>
<p class="has-text-align-none">Both build on the work that went into <a href="https://www.theverge.com/2019/5/7/18535078/worlds-fastest-exascale-supercomputer-frontier-amd-cray-doe-oak-ridge-national-laboratory">the Frontier supercomputer</a>, which is also housed at ORNL and was the fastest in the world until <a href="https://asc.llnl.gov/exascale/el-capitan">El Capitan</a> came online last year at Lawrence Livermore National Laboratory. AMD also helped develop those supercomputers, so this isn't its first …</p>
<p><a href="https://www.theverge.com/news/807483/amd-department-of-energy-announce-1-billion-ai-supercomputer-partnership">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[AMD teams up with OpenAI to challenge Nvidia&#8217;s AI chip dominance]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/792650/amd-openai-five-year-ai-chip-agreement" />
			<id>https://www.theverge.com/?p=792650</id>
			<updated>2025-10-06T08:24:10-04:00</updated>
			<published>2025-10-06T07:33:05-04:00</published>
			<category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="AMD" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="OpenAI" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[AMD is partnering with OpenAI to provide six gigawatts worth of processors for AI data centers, a move that challenges Nvidia's AI chip market dominance. The five-year agreement aims to help OpenAI bolster its infrastructure to meet growing computational demands for AI applications like ChatGPT, starting with a gigawatt deployment of AMD Instinct MI450 GPUs [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="The AMD logo against a red background." data-caption="" data-portal-copyright="Image: The Verge" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/02/acastro_180529_1777_amd_0001.0.webp?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p class="has-text-align-none">AMD is partnering with OpenAI to provide six gigawatts' worth of processors for AI data centers, a move that challenges Nvidia's AI chip market dominance. The five-year agreement aims to help OpenAI bolster its infrastructure to meet growing computational demands for AI applications like ChatGPT, starting with a gigawatt deployment of AMD Instinct MI450 GPUs in the second half of 2026, according to <a href="https://ir.amd.com/news-events/press-releases/detail/1260/amd-and-openai-announce-strategic-partnership-to-deploy-6-gigawatts-of-amd-gpus">AMD's press release</a>.</p>
<p class="has-text-align-none">AMD said it expects the agreement to deliver "tens of billions of dollars in revenue" without providing specific details. AMD's stock price is up by 24 percent in pre-market trading following the partnership announcement on Mo …</p>
<p><a href="https://www.theverge.com/news/792650/amd-openai-five-year-ai-chip-agreement">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
