Meta is incorrectly marking real photos as ‘AI generated’

A number of photographers have shared examples over the past few months. In one case, a photo of a basketball game taken by former White House photographer Pete Souza was labeled as generated by AI. In another recent example, Meta incorrectly added the label to an Instagram photo of the Kolkata Knight Riders winning the Indian Premier League cricket tournament. Interestingly, in both cases, the label is only visible when viewing the images on mobile, not on the web.

Souza says he tried unchecking the label, but he couldn’t. He believes that using Adobe’s cropping tools and flattening images before saving them as JPEGs may be triggering Meta’s algorithm.

However, Meta has also falsely flagged real photos as AI when photographers remove even the smallest objects using generative AI tools such as Adobe’s Generative Fill, PetaPixel reports. The publication tested this for itself by using Photoshop’s Generative Fill tool to remove a blemish from an image, which Meta then marked as AI-generated on Instagram. Strangely, however, Meta did not add a “Made with AI” label when PetaPixel copied and pasted the image into a blank document in Photoshop, saved it, and uploaded that file instead.

Many photographers have expressed frustration that minor edits like these are being unfairly attributed to AI.

“If ‘retouched’ photos are ‘AI-created,’ then that term no longer means anything,” photographer Noah Kalina wrote on Threads. “If they’re serious about protecting people, they should auto-tag every photo with ‘not an accurate representation of reality.’”

In a statement to The Verge, Meta spokesperson Kate McLaughlin said the company is aware of the issue and is evaluating its approach “so that [its] labels reflect the amount of AI used in an image.”

“We rely on industry-standard indicators that other companies include in content from their tools, so we are actively working with these companies to improve the process so that our labeling approach matches our intent,” McLaughlin said.

In February, Meta announced it would begin adding a “Made with AI” label to photos uploaded to Facebook, Instagram and Threads ahead of this year’s election season. Specifically, the company said it would add the label to AI-generated photos created with tools from Google, OpenAI, Microsoft, Adobe, Midjourney and Shutterstock.

Meta hasn’t revealed exactly what triggers the “Made with AI” label, but all of these companies have added, or are working on adding, metadata to image files to indicate the use of AI tools, which is one way Meta identifies AI-generated photos. Adobe, for example, started embedding information about a photo’s origin in its metadata with the release of its Content Credentials system last year.
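One such industry-standard indicator is the IPTC digital source type, whose value `trainedAlgorithmicMedia` marks content produced or altered by generative AI and is embedded in a file’s XMP metadata. As a rough illustration of how a platform might check for that marker, here is a minimal sketch; the `looks_ai_generated` helper is hypothetical, and a real pipeline would parse the XMP packet properly rather than scan raw bytes:

```python
# Illustrative sketch only: scan a file's raw bytes for the IPTC
# digital-source-type URI that signals AI-generated content. This is
# NOT Meta's actual detection logic, which has not been disclosed.
AI_SOURCE_TYPE = b"http://cv.iptc.org/newscodes/digitalsourcetype/trainedAlgorithmicMedia"

def looks_ai_generated(image_bytes: bytes) -> bool:
    """Return True if the embedded metadata appears to carry the
    IPTC 'trainedAlgorithmicMedia' digital source type marker."""
    return AI_SOURCE_TYPE in image_bytes

# Example: a file whose XMP packet declares the AI source type is flagged,
# while a file without any such marker is not.
with_marker = b"...<xmp>" + AI_SOURCE_TYPE + b"</xmp>..."
without_marker = b"...plain jpeg data, no provenance metadata..."

print(looks_ai_generated(with_marker))    # True
print(looks_ai_generated(without_marker)) # False
```

A byte-level scan like this would also explain why edits that strip or rewrite metadata (such as pasting an image into a blank document before saving) can make the same picture escape the label.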

