
Meta is incorrectly marking real photos as ‘Made with AI’

Many photographers say Meta added the label to photos they made or enhanced with editing tools.


Meta logo on blue background
Illustration by Alex Castro / The Verge
Sheena Vasani

Multiple photographers are complaining that Meta is wrongly adding its “Made with AI” label to real photos they’ve taken, TechCrunch reports.

Several photographers have shared examples over the past few months, with Meta recently marking a photo former White House photographer Pete Souza took of a basketball game as AI-generated. In another recent example, Meta incorrectly added the label to an Instagram photo of the Kolkata Knight Riders winning the Indian Premier League cricket tournament. Interestingly, as with Souza’s photo, the label only shows up when viewing the image on mobile, not on the web.

Souza says he tried to uncheck the label but was unable to. He theorizes that using Adobe’s cropping tool and flattening images before saving them as JPEGs may be triggering Meta’s algorithm.


However, Meta has also marked real photos as AI when photographers use generative AI tools like Adobe’s Generative Fill to remove even the smallest of objects, PetaPixel reports. The publication tested this out for itself using Photoshop’s Generative Fill tool to remove a speck from an image, which Meta then marked as AI-generated on Instagram. Strangely, though, Meta didn’t add the “Made with AI” label when PetaPixel uploaded the file back into Photoshop and then saved it after copying and pasting it into a blank document.

Multiple photographers have voiced their frustrations that such minor edits are unfairly being labeled as AI-generated.

“If ‘retouched’ photos are ‘Made with AI’ then that term effectively has no meaning,” photographer Noah Kalina wrote on Threads. “They might as well auto tag every photograph ‘Not a True Representation of Reality’ if they are serious about protecting people.”

In a statement to The Verge, Meta spokesperson Kate McLaughlin said the company is “taking into account recent feedback” and is evaluating its approach “so that [its] labels reflect the amount of AI used in an image.”

“We rely on industry standard indicators that other companies include in content from their tools, so we’re actively working with these companies to improve the process so our labeling approach matches our intent,” added McLaughlin.

In February, Meta announced it would start adding “Made with AI” labels to photos uploaded across Facebook, Instagram, and Threads ahead of this year’s election season. Specifically, the company said it would add the label to AI-generated photos made with tools from Google, OpenAI, Microsoft, Adobe, Midjourney, and Shutterstock.

Meta hasn’t disclosed what exactly triggers the “Made with AI” label, but all of these companies have added, or are working on adding, metadata to image files to signify the use of AI tools, which is one way Meta identifies AI-generated photos. Adobe, for example, started embedding information about an image’s origins in its metadata with the release of its Content Credentials system last year.
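To illustrate the general idea, here is a minimal sketch of metadata-based detection. It simply scans a file’s raw bytes for strings that provenance standards use to flag AI involvement: the “c2pa” label used by Content Credentials manifests, and IPTC digital-source-type values for AI-generated and AI-assisted media. The marker list and function name are illustrative assumptions, not Meta’s actual detection logic, which the company hasn’t disclosed.

```python
# Illustrative sketch only: look for provenance-metadata strings that
# standards like C2PA and IPTC embed in image files. Real parsers walk
# the file's metadata structures; a byte scan is a crude approximation.

AI_PROVENANCE_MARKERS = [
    b"c2pa",                                   # C2PA / Content Credentials manifest label
    b"trainedAlgorithmicMedia",                # IPTC source type: fully AI-generated media
    b"compositeWithTrainedAlgorithmicMedia",   # IPTC source type: AI-assisted composite
]

def find_ai_markers(path):
    """Return the provenance marker strings found in the file's raw bytes."""
    with open(path, "rb") as f:
        data = f.read()
    return [m.decode() for m in AI_PROVENANCE_MARKERS if m in data]
```

A check this blunt shows why small edits can trip a label: a single Generative Fill operation can cause an editor to write one of these markers into the file, even though the photo itself is real.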

Correction, June 25th: An earlier version of this story did not properly attribute the quote from Meta’s spokesperson.
