<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Adobe Max 2025: all the latest creative tools and AI announcements &#8211; The Verge</title>
	<subtitle type="text">The Verge is about technology and how it makes us feel. Founded in 2011, we offer our audience everything from breaking news to reviews to award-winning features and investigations, on our site, in video, and in podcasts.</subtitle>

	<updated>2025-10-31T17:31:03+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807867/adobe-max-2025-all-the-latest-announcements" />
	<id>https://www.theverge.com/rss/stream/807867</id>
	<link rel="self" type="application/atom+xml" href="https://www.theverge.com/rss/stream/807867" />

	<icon>https://platform.theverge.com/wp-content/uploads/sites/2/2025/01/verge-rss-large_80b47e.png?w=150&amp;h=150&amp;crop=1</icon>
			<entry>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[Adobe’s experimental AI tool can edit entire videos using one frame]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/811602/adobe-max-2025-sneaks-projects" />
			<id>https://www.theverge.com/?p=811602</id>
			<updated>2025-10-31T13:31:03-04:00</updated>
			<published>2025-10-31T13:31:03-04:00</published>
			<category scheme="https://www.theverge.com" term="Adobe" /><category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Creators" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[At its Max conference, Adobe demonstrated some of the experimental AI tools it's working on that provide new ways to intuitively edit photos, videos, and audio. These experiments, called "sneaks," include tools that instantly apply any changes you make to one frame across an entire video, easily manipulate light in images, and correct mispronunciations in [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="Project Frame Forward takes any change you make to the first frame of a video and applies it across the entire footage." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/10/Adobe-Project-Frame-Forward.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Project Frame Forward takes any change you make to the first frame of a video and applies it across the entire footage.	</figcaption>
</figure>
<p class="has-text-align-none">At its Max conference, Adobe demonstrated some of the experimental AI tools it's working on that provide new ways to intuitively edit photos, videos, and audio. These experiments, called "sneaks," include tools that instantly apply any changes you make to one frame across an entire video, easily manipulate light in images, and correct mispronunciations in audio recordings.</p>
<p class="has-text-align-none">Project Frame Forward&#8239;is one of the more visually impressive sneaks, allowing video editors to add or remove anything from footage without using masks, a time-consuming process for selecting objects or people. Instead, Adobe's demonstration shows Frame Forward identifying …</p>
<p><a href="https://www.theverge.com/news/811602/adobe-max-2025-sneaks-projects">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Elissa Welle</name>
			</author>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[Adobe&#8217;s AI social media admin is here with &#8216;Project Moonlight&#8217;]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807457/adobe-ai-agent-project-moonlight" />
			<id>https://www.theverge.com/?p=807457</id>
			<updated>2025-10-28T07:55:02-04:00</updated>
			<published>2025-10-28T08:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Adobe" /><category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Creators" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[As Adobe builds AI assistants into each of its applications, the company is also building an AI agent on its Firefly platform to act as a centralized creative director for social media campaigns. Project Moonlight's chatbot integrates with Adobe's creative software apps and pulls from your existing social media channels to brainstorm and edit content [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="Simulated screenshot of Adobe’s Project Moonlight AI chatbot, creating social media post suggestions from a text prompt." data-caption="Project Moonlight" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/10/ProjectMoonlight.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	Project Moonlight	</figcaption>
</figure>
<p class="has-text-align-none">As Adobe builds AI assistants into each of its applications, the company is also building an AI agent on its Firefly platform to act as a centralized creative director for social media campaigns. Project Moonlight's chatbot integrates with Adobe's creative software apps and pulls from your existing social media channels to brainstorm and edit content that's consistent with your personal style and voice.</p>
<p class="has-text-align-none">As seen in the above sample image, users describe their vision to the bot in text, and the AI assistant processes those ideas through Adobe's other AI-enabled editing tools to create personalized images, videos, and social posts. Adobe says  …</p>
<p><a href="https://www.theverge.com/news/807457/adobe-ai-agent-project-moonlight">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[You can tell Adobe Express’s new AI assistant to edit your designs for you]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807802/adobe-express-ai-assistant-prompt-editing-beta-max-2025" />
			<id>https://www.theverge.com/?p=807802</id>
			<updated>2025-10-28T07:59:02-04:00</updated>
			<published>2025-10-28T08:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Adobe" /><category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Creators" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[A new generative AI experience is coming to Adobe's cloud-based Express design platform, enabling you to transform projects by vaguely describing what changes to make. Adobe describes the "AI Assistant in Adobe Express" launching in public beta today as a conversational creative agent that "empowers people of every skill level" to quickly create visual content, [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="An illustration showcasing Adobe’s new AI Assistant for Express." data-caption="The AI Assistant is designed to be used without needing professional design experience." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/10/Express-AI-Assistant.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	The AI Assistant is designed to be used without needing professional design experience.	</figcaption>
</figure>
<p class="has-text-align-none">A new generative AI experience is coming to Adobe's cloud-based Express design platform, enabling you to transform projects by vaguely describing what changes to make. Adobe describes the "AI Assistant in Adobe Express" launching in public beta today as a conversational creative agent that "empowers people of every skill level" to quickly create visual content, without having to understand specific design terms or creative tools.</p>
<p class="has-text-align-none">The feature is available as a toggle in the top-left corner of the Adobe Express web app. When activated, the usual homepage interface and tool options will be replaced by a chatbot-style text box, with options to  …</p>
<p><a href="https://www.theverge.com/news/807802/adobe-express-ai-assistant-prompt-editing-beta-max-2025">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[Adobe’s new AI audio tools can add soundtracks and voice-overs to videos]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807809/adobe-firefly-ai-audio-generate-soundtrack-speech" />
			<id>https://www.theverge.com/?p=807809</id>
			<updated>2025-10-28T07:20:59-04:00</updated>
			<published>2025-10-28T08:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Adobe" /><category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Creators" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Adobe is giving filmmakers new generative AI audio tools that can quickly add thematically appropriate backing tracks and narration to videos. Generate Soundtrack and Generate Speech are being introduced to a redesigned Adobe Firefly AI app, while Adobe is also developing a new web-based video production tool that combines multiple AI features with a simple [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="An illustration of Adobe Firefly’s Generate SoundTrack tool." data-caption="Generate Soundtrack is like Mad Libs for backing instrumentals." data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/10/GenerateSoundtrack.jpg?quality=90&#038;strip=all&#038;crop=0,0,88,100" />
	<figcaption>
	Generate Soundtrack is like Mad Libs for backing instrumentals.	</figcaption>
</figure>
<p class="has-text-align-none">Adobe is giving filmmakers new generative AI audio tools that can quickly add thematically appropriate backing tracks and narration to videos. Generate Soundtrack and Generate Speech are being introduced to a redesigned Adobe Firefly AI app, while Adobe is also developing a new web-based video production tool that combines multiple AI features with a simple editing timeline.</p>
<p class="has-text-align-none">The Generate Soundtrack tool is launching in public beta in the Firefly app, and works by assessing an uploaded video and then generating a selection of instrumental audio clips that automatically synchronize to the footage. Users can direct the style of the music by se …</p>
<p><a href="https://www.theverge.com/news/807809/adobe-firefly-ai-audio-generate-soundtrack-speech">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jess Weatherbed</name>
			</author>
			
			<title type="html"><![CDATA[Photoshop and Premiere Pro’s new AI tools can instantly edit more of your work]]></title>
			<link rel="alternate" type="text/html" href="https://www.theverge.com/news/807811/adobe-photoshop-lightroom-premiere-pro-ai-max-2025" />
			<id>https://www.theverge.com/?p=807811</id>
			<updated>2025-10-28T07:58:49-04:00</updated>
			<published>2025-10-28T08:00:00-04:00</published>
			<category scheme="https://www.theverge.com" term="Adobe" /><category scheme="https://www.theverge.com" term="AI" /><category scheme="https://www.theverge.com" term="Creators" /><category scheme="https://www.theverge.com" term="News" /><category scheme="https://www.theverge.com" term="Tech" />
							<summary type="html"><![CDATA[Adobe has kicked off its annual Max event, giving us a first look at new and upcoming generative AI tools launching for the company's Photoshop, Premiere Pro, and Lightroom Creative Cloud apps. These include updates to Photoshop's Generative Fill feature that aim to give creators more control over adding, removing, or modifying content, and tools [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="Photoshop’s Generative Fill tool using Google’s Gemini 2.5 Flash." data-caption="Photoshop’s Generative Fill tool can now use third-party AI models. | Image: Adobe" data-portal-copyright="Image: Adobe" data-has-syndication-rights="1" src="https://platform.theverge.com/wp-content/uploads/sites/2/2025/10/GenFill.jpg?quality=90&#038;strip=all&#038;crop=0,0,85.866662597656,100" />
	<figcaption>
	Photoshop’s Generative Fill tool can now use third-party AI models. | Image: Adobe	</figcaption>
</figure>
<p class="has-text-align-none">Adobe has kicked off its annual Max event, giving us a first look at new and upcoming generative AI tools launching for the company's Photoshop, Premiere Pro, and Lightroom Creative Cloud apps. These include updates to Photoshop's Generative Fill feature that aim to give creators more control over adding, removing, or modifying content, and tools that can automate some of the more time-consuming elements of editing photos and videos.</p>
<p class="has-text-align-none">To start, Adobe is allowing Photoshop users to power Generative Fill capabilities using Google and Black Forest Labs' third-party AI models. After selecting their image and giving Generative Fill a prompt - such …</p>
<p><a href="https://www.theverge.com/news/807811/adobe-photoshop-lightroom-premiere-pro-ai-max-2025">Read the full story at The Verge.</a></p>
						]]>
									</content>
			
					</entry>
	</feed>
