When can a photo be trusted?


World Press Photo Contest: The Importance of Restraint

This week, a controversy regarding this year’s winner of the World Press Photo contest came to a head. Since the winner was announced earlier this year, there have been concerns that the photo looked too unreal, even cinematic, to be authentic. Those concerns reached a new level this week when the Hacker Factor blog posted evidence which it claimed showed that the award-winning photo was in fact a composite of multiple images. After the claim gained further traction on the Extreme Tech website, Fourandsix was enlisted by World Press Photo to help settle the issue. We provided our own analysis refuting that evidence and demonstrating that the photo is not a composite, but rather a single photo to which adjustments have been selectively applied.

This situation clearly illustrates the importance of showing restraint in any forensic image analysis. When applying any forensic test, it is critically important to be mindful of exactly what that test can reveal, and not to draw any broader conclusions. For example, our FourMatch product can tell you that a file has remained unaltered since it was captured by a camera, but we are quite explicit in stating that failing the FourMatch test does not prove that an image has been modified. It merely introduces the possibility of editing; further testing using other techniques is required to prove that specific editing has occurred.

One of the key pieces of evidence cited in the initial article criticizing the photo is a block of Photoshop metadata which was said to indicate that multiple files had been opened in Photoshop and combined. This claim immediately raised my suspicions, because I know from my 15 years working on the Photoshop team that tracking metadata from multiple, composited photos is a challenge that the team has never really tackled. Typically, when one photo is pasted into another, all of the metadata from the pasted photo is discarded. As expected, when I examined the metadata in question, I discovered that it indicated nothing more damning than a file that had been adjusted several times in the Adobe Photoshop Camera Raw dialog prior to being opened in the main Photoshop application and saved out as a JPEG. To verify this, I succeeded in creating the same pattern of metadata in one of my own files by doing just that.
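The history that Photoshop does record lives in the file’s XMP packet as a list of xmpMM:History events. As a minimal sketch of what reading such a history looks like, here is a short Python example using a hypothetical fragment shaped like the entries Camera Raw and Photoshop write (the exact attributes vary by version, and real files wrap these entries in a full XMP packet):

```python
import re

# Hypothetical fragment shaped like the xmpMM:History entries that
# Camera Raw and Photoshop write into a file's XMP packet; real files
# embed these inside an <rdf:Seq> under an <x:xmpmeta> wrapper.
SAMPLE_XMP = '''
<rdf:li stEvt:action="saved" stEvt:softwareAgent="Adobe Photoshop Camera Raw 7.3"/>
<rdf:li stEvt:action="saved" stEvt:softwareAgent="Adobe Photoshop Camera Raw 7.3"/>
<rdf:li stEvt:action="converted" stEvt:parameters="from image/x-nikon-nef to image/jpeg"/>
<rdf:li stEvt:action="saved" stEvt:softwareAgent="Adobe Photoshop CS6 (Macintosh)"/>
'''

def summarize_history(xmp: str):
    """Return the ordered list of history actions and the set of
    software agents that produced them."""
    actions = re.findall(r'stEvt:action="([^"]+)"', xmp)
    agents = set(re.findall(r'stEvt:softwareAgent="([^"]+)"', xmp))
    return actions, agents

actions, agents = summarize_history(SAMPLE_XMP)
# Repeated Camera Raw saves followed by a Photoshop save is the
# timeline of a single file adjusted several times, not evidence of a
# composite: pasting one photo into another discards the pasted
# photo's metadata entirely.
```

The point is that such a history describes one file’s editing timeline; nothing in it records the existence of a second source image.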

I can understand why someone might think that the original metadata was an indication of multiple photos being combined, but thinking is not knowing. Making any claim about what this metadata represents requires a thorough investigation of how the metadata is generated.

Another pillar of the original article criticizing the photo was a shadow analysis. Readers of this blog will know that we often rely on shadow analysis to determine whether the shadows in an image all consistently point to the same light source; shadows that fail to match up indicate either that there were multiple light sources in the scene or that multiple photos were combined. In fact, the shadows in this photo were in alignment, but the author of the article incorrectly used the technique to deduce the location of the sun in the sky. This, again, showed insufficient restraint in drawing a conclusion. The shadow analysis technique simply cannot tell you the actual location of the light source in the scene; it can only tell you whether the lighting is consistent.
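To make the distinction concrete, here is a minimal 2D sketch of the consistency check, under the standard assumption that lines drawn through a point on each object and the corresponding point on its cast shadow must all pass (approximately) through one common image point. The coordinates and tolerance here are illustrative, not taken from any analysis of this photo:

```python
import numpy as np

def shadow_consistency(pairs):
    """pairs: list of ((ox, oy), (sx, sy)) image coordinates pairing a
    point on an object with the corresponding point on its shadow.
    Returns the least-squares common point of the object-shadow lines
    and the mean residual distance from that point to each line."""
    A = np.zeros((2, 2))
    b = np.zeros(2)
    lines = []
    for obj, sh in pairs:
        o = np.asarray(obj, float)
        d = np.asarray(sh, float) - o
        d /= np.linalg.norm(d)
        P = np.eye(2) - np.outer(d, d)  # projects onto the line's normal
        A += P
        b += P @ o
        lines.append((o, P))
    x = np.linalg.solve(A, b)  # point minimizing summed squared distance
    resid = float(np.mean([np.linalg.norm(P @ (x - o)) for o, P in lines]))
    return x, resid

# Synthetic example: three object points whose shadows all project
# away from a common point L, so the lines are consistent.
L = np.array([5.0, -3.0])
objs = [np.array(p, float) for p in [(0, 0), (10, 4), (-6, 7)]]
pairs = [(o, o + 0.7 * (o - L)) for o in objs]
x, resid = shadow_consistency(pairs)  # small resid => consistent lighting
```

Note that the recovered intersection is only a point in the image plane. A small residual tells you the lighting is consistent; it does not, by itself, locate the sun in the sky, which is exactly the overreach described above.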

Lastly, Hacker Factor relied on its favored technique of Error Level Analysis (ELA) to flag areas of the image that may have been modified. The author himself has acknowledged that the technique can produce both false positives and false negatives. For this reason, we don’t believe that ELA meets our requirement of showing restraint in drawing conclusions. If both the negative and the positive results of the test are open to error, then what can you conclude? That maybe the photo was edited? Isn’t that uncertainty the reason you’re performing a forensic test in the first place? We find that ELA raises more questions than it answers.
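For reference, the core of ELA is simple: resave the image as a JPEG at a known quality and amplify the per-pixel difference, on the theory that regions with a different compression history respond differently. A sketch using the Pillow library follows; the quality and amplification values are arbitrary choices on my part, and the fact that different settings highlight different regions is part of the problem:

```python
from io import BytesIO
from PIL import Image, ImageChops  # Pillow

def ela(img: Image.Image, quality: int = 90, scale: int = 15) -> Image.Image:
    """Resave the image as a JPEG at the given quality, then return the
    amplified per-pixel difference from the original."""
    buf = BytesIO()
    img.convert("RGB").save(buf, "JPEG", quality=quality)
    buf.seek(0)
    resaved = Image.open(buf)
    diff = ImageChops.difference(img.convert("RGB"), resaved)
    # Amplify the (usually tiny) differences so they become visible.
    return diff.point(lambda p: min(255, p * scale))
```

Interpreting the resulting image is entirely up to the analyst, which is why, in our view, it invites conclusions the test itself cannot support.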

In short, a lack of restraint in drawing forensic conclusions led to this award-winning photo being attacked to a degree that was unwarranted. It should be noted, however, that Hacker Factor didn’t initiate the controversy surrounding this photo; it merely escalated it. The photo was already controversial because of the dramatic lighting which appears on simple visual inspection to be unrealistic. Sure enough, when we compared the original raw capture of this image to the contest-winning JPEG, we found that the photographer had dramatically lightened certain areas, such as faces, while darkening others. This is essentially what’s known as dodging and burning, which has been accepted within photojournalism since the darkroom days. The extent of these adjustments was fairly pronounced, and though there is nothing inherently dishonest about these edits, I suspect they would be outside the comfort zone of some news organizations.

Driven both by the increasing power of image editing applications and by the recent popularity of image “looks” created by consumer apps like Instagram, photojournalism has shown an increasing tendency toward more stylized images. Every media organization sets its own standards, and those standards may shift over time. Clearly the edits on this image still fall within the guidelines demanded by World Press Photo, and there’s nothing inherently dishonest about them. In a competition, a photographer certainly wants his images to stand out, so it’s not unexpected that he might aim to pump up the drama while still adhering to the competition’s guidelines.

On the other hand, the current online environment is one in which public distrust of the media runs high, and readers are quick to find reasons to question the images and information they see. In such an environment, I believe that journalism organizations would also be wise to show some restraint in their editing guidelines. Though the temptation toward trendier editing styles is understandable, adhering to a more straightforward and limited approach may do more to reassure the public and build trust.


Reader Comments (1)

Thank you for a very sane, knowledgeable and moderate discussion of this 'beat-up'. Of particular interest is the admission of how unlikely it is that tracking metadata will identify a paste-up or collage, and your discussion of the inconclusive 'shadow analysis' by Krawitz. The motives of Hacker Factor beyond self-aggrandisement are difficult to gauge, but suspect.
Best Wishes,

May 16, 2013 | Unregistered CommenterJames McArdle
