Take, for example, a photo of Catherine, the Princess of Wales, issued by Kensington Palace in March: News agencies retracted it after experts pointed out obvious signs of manipulation. Some also questioned the authenticity of an image taken during the assassination attempt on former President Donald Trump.
Here are some expert suggestions for the next time you come across an image that leaves you wondering:
Zoom in
It may sound basic, but a study by Sophie Nightingale, a researcher at Lancaster University in the UK, found that people of all ages who took the time to zoom in on a photo and carefully examine different parts of it were better at spotting altered images.
Try this the next time a photo seems off, but be careful not to fixate on the wrong things. For reference, I created this (slightly exaggerated) example image, which highlights common signs of image manipulation.
Rather than dwelling on shadows and lighting, Nightingale suggests looking for “photometric” clues: blurry object edges, noticeable pixelation in parts of the image, or color mismatches that suggest an element was pasted in later.
Let’s think about this parrot: First, who would bring a parrot to a polling booth?
Look closely at the wings and note the contrast between the fuzzy edges of the tip feathers and the hard, rounded cutouts closer to the body – a telltale sign of an amateurish Photoshop job.
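If you want to go a step beyond eyeballing, one rough programmatic aid is a simple error level analysis: re-save the photo as a JPEG at a fixed quality and look at where it differs most from the original, since regions pasted in or recompressed separately often stand out. This is a minimal sketch, not a technique the experts above describe, and the file name and quality setting are illustrative assumptions.

```python
# A minimal error-level-analysis sketch with Pillow -- an illustrative aid for
# spotting regions that may have been edited separately, not a reliable
# detector. The file name and JPEG quality below are assumptions.
import io
from PIL import Image, ImageChops

def error_level_map(path, quality=90):
    """Re-save the image as a JPEG and return an amplified difference image."""
    original = Image.open(path).convert("RGB")

    # Round-trip through JPEG at a known quality.
    buffer = io.BytesIO()
    original.save(buffer, format="JPEG", quality=quality)
    buffer.seek(0)
    resaved = Image.open(buffer)

    # Pixel-wise difference, scaled up so faint discrepancies become visible.
    diff = ImageChops.difference(original, resaved)
    max_diff = max(channel_max for _, channel_max in diff.getextrema()) or 1
    scale = 255.0 / max_diff
    return diff.point(lambda px: min(255, int(px * scale)))

if __name__ == "__main__":
    error_level_map("suspect_photo.jpg").save("suspect_photo_ela.png")
```

Bright patches in the output are not proof of tampering – JPEG artifacts, sharpening and text overlays light up too – so treat the result as one more place to zoom in, not a verdict.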
Find funky geometry
Editing small details seamlessly into an image is one of the hardest things to do and often leads to mistakes – it's easy to notice when regular repeating patterns are disrupted or distorted.
In the image below, notice how the bricks in the wall behind the partition are misshapen and crushed – something fishy is going on here.
Consider the now-infamous photo of Catherine, the Princess of Wales.
The princess appeared with her arms around her children, and online sleuths were quick to point out inconsistencies, including floor tiles that appeared to overlap and molding that looked slightly out of place.
In the polling booth example, did you notice that this person has an extra finger? It's possible they were born with polydactyly, a condition that produces an extra finger or toe, but that's unlikely – an extra digit is often a sign that the image was generated or altered with AI.
It's not just sloppy Photoshop work that botches the details: AI is notoriously shaky when it comes to rendering fine detail.
So far, this has been especially true for structures like the human hand, but AI is getting more and more accurate. Still, it's not uncommon for AI-generated or edited images to have the wrong number of fingers.
Consider the context
One way to judge the authenticity of an image is to take a step back and consider its surroundings. The context in which an image appears can tell you a lot about the intent behind sharing it. Consider these mock social media posts built around an altered image:
Ask yourself: Do you know anything about the person who shared the photo? Is the photo attached to a post designed to provoke an emotional response? If there is a caption, what does it say?
Photoshopped images – or even real images placed in a false context – are designed to appeal to our “intuitive, visceral thinking,” says Peter Adams, senior vice president of research and design at the News Literacy Project, a nonprofit that promotes critical evaluation of media. Such edits can be used to artificially generate support or sympathy for a particular cause.
Nightingale suggests that if you come across an image that makes you angry, ask yourself a few questions: “Why would someone post this? Is there some ulterior motive that suggests this may be fake?”
Adams added that often the comments and replies to a photo will reveal it's fake.
Here's one example culled from X: An AI-generated image of Trump surrounded by six young Black men first appeared in October 2023, then resurfaced in January attached to a post claiming the former president had stopped his motorcade for an impromptu meeting with them.
However, it wasn't long before commenters began pointing out inconsistencies, such as the fact that Trump appears to have only three large fingers on his right hand.
Go to the source
Sometimes an authentic image is so surprising that you find yourself wondering whether it's real. Tracking down where it came from can settle the question.
Earlier this year, science educator Bill Nye appeared on the cover of Time Out New York in an outfit far more stylish than the baby blue lab coat many remember him for. Some wondered whether the image was AI-generated, but following the credit trail to the photographer's Instagram account confirmed that The Science Guy really had worn the edgy, youthful clothing.
For images that purport to come from actual news events, it's also worth checking wire services like The Associated Press and Reuters, as well as agencies like Getty Images, all of which offer public previews of the images they distribute.
If you can trace a photo back to one of those sources, that's a strong sign it's the real thing.
Try a reverse image search
If an image seems out of character, overtly slanted, or just doesn't fit the overall vibe, try a reverse image search tool such as TinEye or Google Images to find the original. Even if you can't find it, these tools may surface valuable context about the image.
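If you prefer to script the lookup, here is a minimal sketch that opens a suspect image in reverse-image-search services from the command line, assuming the image is publicly hosted at a URL. The TinEye and Google Lens query-URL patterns below are assumptions about how those sites currently accept an image address, not official APIs, and they may change.

```python
# A minimal sketch: open a publicly hosted image in reverse-image-search
# services. The query-URL patterns are assumptions, not official APIs.
import sys
import webbrowser
from urllib.parse import quote

SEARCH_URLS = {
    "TinEye": "https://tineye.com/search?url={img}",
    "Google Lens": "https://lens.google.com/uploadbyurl?url={img}",
}

def reverse_search(image_url: str) -> None:
    """Open each reverse-image-search service for the given image URL."""
    for name, template in SEARCH_URLS.items():
        url = template.format(img=quote(image_url, safe=""))
        print(f"Opening {name}: {url}")
        webbrowser.open(url)

if __name__ == "__main__":
    if len(sys.argv) != 2:
        sys.exit("usage: python reverse_search.py <image_url>")
    reverse_search(sys.argv[1])
```

Even when no exact match turns up, the visually similar results can show where and when other versions of the image circulated.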
One recent example: Shortly after a 20-year-old gunman attempted to assassinate former President Trump, an image of Secret Service agents smiling as they shielded him circulated on Threads, the social media platform owned by Meta, and was used to bolster unfounded claims that the shooting had been staged.
There is not a single smile in the original photo.
Even armed with these tips, you're unlikely to distinguish real images from doctored ones 100 percent of the time. But that doesn't mean you shouldn't keep honing your skepticism. Remembering that factual truth still exists, even in times of division and confusion, is work we all have to do from time to time.
Losing sight of that, Nightingale says, just gives bad actors the opportunity to “ignore the whole thing.”
“That's where society is really at risk,” she said.
Editing by Karly Domb Sadof and Yun-Hee Kim.