Recently I was sitting in the living room with my flatmates. Drinking wine and chatting, I had my laptop resting on my knees, going through pictures taken on a hike in the Loch Lomond area earlier that day and editing some. Curious, one of my flatmates observed how I moved the sliders in Lightroom about, exported two almost identical images that differed only in where I had focused, then imported them into Affinity Photo for focus stacking and edited them further. “This is too much,” she exclaimed as I adjusted the white balance in the sky and changed the exposure, saturation, contrast and clarity of the stones in the foreground, “this isn’t real any more!” Ah, yes. The ‘reality’ complaint. Most commonly raised by non-photographers, but also occasionally uttered by those active in the field, it has become a staple of every discussion on the philosophy and ethics of photography. However, those critics fail to answer, often even to consider, two fundamental questions: Why does a picture need to be ‘real’, and what is ‘real’ anyway?
Let us address the latter question first – what constitutes reality? What makes one image more real than another? How would we set out to create an image that is truly and universally recognised as real? There are two prevalent definitions: one based on average human visual perception, the other on the purely technological aspects of imaging.
Many intuitively adhere to the view that for an image to be classified as real, it needs to represent a scene in a way that is as similar as possible to how it would be perceived by the human eye. While this definition certainly sounds convincing, it has little to do with reality. From a neuropsychological perspective we do not actually perceive the world as it is; rather, our visual reality is constructed along a pathway of several stages that starts in the rods and cones of the retina, follows the optic nerve to the primary visual cortex in the occipital lobe and, after a bunch of further processing stages, ends up somewhere in areas V4 and V5/MT. The final image is a result of both bottom-up and top-down processing, which means that it is in part shaped by higher-order cognitive processes. A great example is the blind spot, the location where the optic nerve connects to the retina – devoid of photoreceptors, it leaves us blind there, but our brain simply fills in the missing information based on the surrounding areas, so we do not actually notice it. What we see in our blind spot is certainly not ‘real’, but it appears so to us. And that is just human visual perception – most other species have a visual system that differs significantly from ours. To quote Morticia Addams from Charles Addams’ The Addams Family: “Normal is an illusion. What is normal for the spider is chaos for the fly.” This certainly makes sense in a visual context too. What we see is radically different from what a fly sees – while the human eye can perceive wavelengths of light ranging from roughly 390 to 700 nm, flies can also see ultraviolet light in the 300 to 400 nm range.

But let’s for a second follow this approach and equate what the average human eye sees with what is ‘real’: the ugly truth is that in the vast majority of cases this is unachievable without any kind of editing. Your digital camera is a technological device, focusing light through a lens onto a sensor that records incoming photons and relays this information to an image processor. From this it should be blatantly obvious that what your camera captures cannot be the same as what your eye, a biological organ, sees. The image recorded by your camera is usually rather dull – lacking in contrast and saturation, with no defined white balance – so it needs to be edited to some extent. This is a crucial step, and one that a lot of people are completely unaware of. Modern cameras are essentially little computers, and every JPEG they produce has already been edited: contrast and saturation increased, noise reduced, some sharpening applied, a white balance set. The problem here is that your camera decides how to edit the pictures, and what it thinks is best for your images might not necessarily produce the result you are after. I actually once came across someone in an online forum proudly declaring himself “too much of a purist” to edit his images, a comment to which I sardonically replied: “Leaving the editing to your camera doesn’t make you a purist, it just makes you lazy.” Even photographers sometimes make the mistake of claiming JPEG OOC (meaning ‘JPEG out of camera’) to be ‘real’, when in fact it is simply the result of leaving the editing to your camera. And often even this editing is unable to match what our eyes see, so careful, targeted adjustments are necessary in order to achieve this. As a matter of fact, the time spent editing an image and the sophistication of the adjustments are not necessarily correlated with how natural the result will look in the end. It takes me a mere 30 seconds or so to crank up a couple of sliders in Lightroom or Photoshop and turn an image of a flower into a crazy surrealist nightmare, but I may have to tweak the same image for several minutes in order to produce a pleasing yet natural result.
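To make that in-camera ‘editing’ concrete, here is a minimal sketch in Python using Pillow – not any manufacturer’s actual JPEG engine, just a toy with made-up filenames and adjustment values – of the kind of generic tweaks that get baked into every out-of-camera JPEG:

```python
# Toy stand-in for a camera's JPEG engine (illustrative only): take a flat,
# neutral rendering and bake in generic contrast, saturation and sharpening.
# White balance and noise reduction are omitted for brevity.
from PIL import Image, ImageEnhance, ImageFilter

def camera_style_jpeg(path: str) -> Image.Image:
    img = Image.open(path).convert("RGB")
    img = ImageEnhance.Contrast(img).enhance(1.15)                   # more contrast
    img = ImageEnhance.Color(img).enhance(1.25)                      # more saturation
    img = img.filter(ImageFilter.UnsharpMask(radius=2, percent=80))  # light sharpening
    return img

if __name__ == "__main__":
    # "flat_frame.tif" is a placeholder for whatever neutral rendering you start from.
    camera_style_jpeg("flat_frame.tif").save("out_of_camera_look.jpg", quality=90)
```

Change the numbers and you get a different ‘out of camera’ look – which is exactly the point: someone, or something, always chooses them.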

In summary, we can conclude that what our eyes see is far from being objectively ‘real’, and if we actually want to reproduce what we see, editing is often necessary rather than optional. But as mentioned in the beginning, there is an alternative definition that declares an image to show reality if it is utterly and completely unedited. As I explained above, JPEGs OOC are edited by the camera, so we need to look at the data before that stage. Most advanced cameras have an option that delivers a file without any editing applied: a RAW file. Anyone who takes photography seriously uses (or should use) these files, rather than JPEGs, for post-processing, as JPEGs are already compressed and stripped of important data, whereas a RAW file is really just that – the raw image data recorded by the sensor. Technically, we could now declare this to be the ‘real’ deal; however, if you have a look at one, you will realise that RAW files usually look rather bland. They have less contrast, saturation and clarity than the world we see with our eyes, so what’s the point? Yes, we could all agree to only use JPEGs converted from completely unedited (except for white balance) RAW files and rejoice in the knowledge that what we get is completely and objectively ‘real’, but this approach would in turn violate the first one: the result would not be what our eyes see, and it would look very boring in general. So maybe not …
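If you want to see for yourself just how bland that ‘objectively real’ data looks, you can develop a RAW file with every automatic adjustment switched off and compare it with a default conversion. Here is a minimal sketch using the third-party rawpy library (the filename is a placeholder; the parameters simply request a linear gamma, no auto-brightening and no camera white balance):

```python
# Develop the same RAW file twice: once as neutrally as rawpy allows, and once
# with its defaults (display gamma curve, auto brightness). The neutral version
# is closest to the raw sensor data – and looks flat and dark next to the other.
import rawpy
from PIL import Image

RAW_FILE = "hike.dng"  # placeholder path to any RAW file

with rawpy.imread(RAW_FILE) as raw:
    neutral = raw.postprocess(gamma=(1, 1), no_auto_bright=True,
                              use_camera_wb=False, output_bps=8)

with rawpy.imread(RAW_FILE) as raw:
    default = raw.postprocess()  # rawpy's default rendering

Image.fromarray(neutral).save("neutral.png")
Image.fromarray(default).save("default.png")
```

Open the two results side by side and, on most files, the ‘real’ one is the less convincing of the pair.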
At the beginning of this article I posed two questions – what makes an image ‘real’, and does it even matter? Having answered the first question, let’s address the second.
In photography, the call for ‘real’ images is a strong and frequent one; it is, however, rarely heard in other fields of art. When was the last time you heard someone complain about Picasso’s works not depicting reality as it is? Have you ever scoffed at van Gogh’s The Starry Night for its wildly inaccurate representation of celestial objects? Turned your back on the Easter Islanders’ famous Mo’ai statues because they grossly violate human anatomy? It is only photography (and videography) that is burdened with the demand for reality, mostly because of its inherent ability to get very close to depicting the visual world as we see it, and because of its relatively young age compared to other forms of art such as painting or sculpture. Painting, for example, has a history that reaches thousands of years into the past, affording the discipline a long time to develop. Over time people got bored with realistic depiction, and a host of different styles and movements emerged, from Romanticism and Realism to Impressionism, Expressionism … Meanwhile, photography only emerged in the early to mid 19th century, was practically applied only in the latter half of that century and did not become really widespread before the early 20th century. The fact that it had significantly less time than other disciplines to develop different styles and movements, combined with its inherent realism, is probably the reason why we are inclined to expect photography and videography to adhere to some sort of constructed standard of realism, whereas we do not expect this from any other form of art.

Some areas of photography, of course, should aspire to create naturalistic images. Photojournalists, whose job it is to show current events ‘as they are’, are well advised to keep their images true to nature. They are not only trying to create something aesthetically pleasing, but to convey a message and to be accurate in their depiction of events and causes, and their images can have great impact on public opinion and politics. The high standards usually expected in this field became particularly evident in 2015, when World Press Photo disqualified 20% of the finalists in its prestigious annual awards for violating the competition’s post-processing rules. Wildlife photography, too, usually tries to keep things rather realistic. But when it comes to landscape, macro, commercial, abstract, fashion, architectural or still life photography, there really is no objective reason why photographers should limit their own creativity in order to please the masses and their misconception of what photography should and should not be.
So next time you see an image that just doesn’t seem ‘real’ to you, instead of automatically shouting “Fake!”, maybe ask yourself “Does it really need to be real?” And then, maybe, just accept it for what it is – an artistic interpretation, rather than a scientific representation.