You know those beautiful photos that emanate from your iPhone. They’re not real. They’re fake. How do I know? My mom told me. This story began innocently enough. My mom and I were at lunch and while waiting for dessert she popped out her iPhone and opened up Instagram.
What she saw, what she said, and what she meant made for an interesting conversation. “Tera, darling. Those photos are just beautiful.” Mom should know. She was a portrait photographer for a few decades (back in the days of film) and still dabbles, albeit with a Canon Rebel something or other.
Mom: “It’s too bad they look so fake.” Me: “What? Fake? How? I just shot them on my iPhone and uploaded to Instagram without any additional filters or processing. Fake?” Ever the mother with a critical eye, my mom proceeded to point out that each of the photos I recently uploaded looked better than the actual scene (Chicago skyline, Lake Michigan, a few downtown scenes) could possibly look.
The photos looked fake because they looked better than the scene itself. That was a difficult pill to swallow, but mom was right. iPhone photos, particularly from the newer iPhone X and XS models with dual cameras and portrait mode, turn mundane shots into rather spectacular images that are a distant, flattering relative of the scene from which they were spawned.
I can’t help myself now. Every photo I take with my iPhone looks, well, fake; at best, fake-like. Am I a good photographer with an eye for composition, lighting, and scenes? It doesn’t matter. Geoffrey A. Fowler on Google’s Pixel 3 camera:
The little camera on this phone has a superpower: It can see things our eyes cannot.
New iPhones don’t have the same Night Sight feature as the Pixel 3 but photos still look better than the actual scene. You see it up close in iPhone’s new Portrait Mode.
All sorts of phonemakers brag about how awesome their photos look, not how realistic they are. The iPhone’s “portrait mode” applies made-up blur to backgrounds and identifies facial features to reduce red-eye.
That depth of field, bokeh blur around the portrait? Fake.
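That “made-up blur” can be sketched as a toy depth-based blend. This is my illustration, not Apple’s actual pipeline: real portrait mode estimates depth from dual cameras and machine learning, while here the depth map, the threshold, and the naive box blur are all stand-in simplifications.

```python
# Toy "portrait mode": keep foreground pixels sharp, blur the background.
# Images and depth maps are plain 2D lists of floats (0.0-1.0 depth,
# where larger means farther away). Purely illustrative.

def box_blur(img, radius=1):
    """Naive box blur on a 2D grayscale image (list of lists of floats)."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Average the pixel with its neighbors inside the radius,
            # clamped at the image borders.
            vals = [img[ny][nx]
                    for ny in range(max(0, y - radius), min(h, y + radius + 1))
                    for nx in range(max(0, x - radius), min(w, x + radius + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out

def synthetic_bokeh(img, depth, threshold=0.5, radius=1):
    """Blend: sharp where depth < threshold (foreground), blurred elsewhere."""
    blurred = box_blur(img, radius)
    h, w = len(img), len(img[0])
    return [[img[y][x] if depth[y][x] < threshold else blurred[y][x]
             for x in range(w)]
            for y in range(h)]

# A 2x2 example: top-left pixel is "near" (depth 0.0), the rest are "far".
img = [[10.0, 0.0],
       [0.0, 0.0]]
depth = [[0.0, 1.0],
         [1.0, 1.0]]
result = synthetic_bokeh(img, depth)
```

The foreground pixel survives untouched while every background pixel is replaced by a neighborhood average, which is the essence of the effect: the blur is computed, not captured through the lens.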
Selfies on phones popular in Asia automatically slim heads, brighten eyes and smooth skin.
Apple was raked over the coals for the camera app in iOS 12 because selfies appeared to be doctored. That was fixed in a later update but, to be honest, selfies still look better than their subjects.
I wouldn’t mind a setting in the iPhone’s camera app that delivered a 10-pound weight loss. And fewer blemishes. Maybe longer eyelashes, too. Yes, there are plenty of filters that deliver that kind of adjustment, but how can we call them real photographs when the end result is better than the scene from which the photo started?
Fake photos are the new norm thanks to computational photography, whereby the lens and the laws of physics (light and aperture) don’t really matter.
Now artificial intelligence and other software advances are democratizing creating beauty. Yes, beauty. Editing photos no longer requires Photoshop skills. Now when presented with a scenic vista or smiling face, phone cameras tap into algorithms trained on what humans like to see and churn out tuned images.
Is there a problem with such fake photography? Expectations vs. reality.
How far will phones remove our photos from reality? What might software train us to think looks normal? What parts of images are we letting computers edit out? In a photo I took of the White House (without Night Sight), I noticed the algorithms in the Pixel 3 trained to smooth out imperfections actually removed architectural details that were still visible in a shot on the iPhone XS.
What is real? You can’t tell from a photo or a video anymore.