Technology is in the midst of disrupting a large portion of our daily lives. When was the last time you bought film for your camera or had film processed as prints? When was the last time you stopped at a gasoline station on a trip and bought a map?
It is not overly simplistic to say the iPhone has had a major impact on our day-to-day lives. Calendar, Contacts, Reminders, Notes, Mail, Camera, and other apps have helped alter how we communicate, stay in touch, and keep up to date. Whole industries have been disrupted thanks to a few billion people on planet Earth using smartphones with cameras.
So, let me do a quick mashup of the iPhone’s camera, the photos we take with it, where they’re stored, and what can be done to enhance those photos with almost no effort. The iPhone is the world’s most popular camera, a remarkable statistic given that the iPhone has only been around for a decade, so it’s safe to say we take more photos than ever before.
After all, photography, good photography, is free. Shoot as many photos or movies as you want because iCloud can take them all with iCloud Photo Library. Shoot on the iPhone, view on the iPad, edit to professional calibre with Photoshop on the Mac. Wait. Disruption dictates that modern technology cut out the middleman. Goodbye, Photoshop on the Mac.
Researchers at Google and MIT’s Computer Science and Artificial Intelligence Laboratory have teamed up to process your camera’s photos almost as quickly as you take them. No need to upload to the cloud, transfer photos to your Mac, or bother with the enhancement tools in Photos or even Photoshop.
Machine learning is all the rage these days, and Google can apply professional-level photo enhancements right on a smartphone, making changes so fast they’re done before you look at the photo. Google and MIT call it Deep Bilateral Learning for Real-Time Image Enhancement. I call it disruption because it cuts out the middleman: in this case, photo enhancement tools. Think of it as computational photography rather than using a camera and an editor.
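To make the idea concrete, here is a toy sketch of the core trick behind that kind of real-time enhancement: rather than computing a whole new image, a model predicts cheap per-pixel affine transforms (a gain and a bias) that are then applied to the full-resolution photo. This is not the actual Google/MIT code; the gain and bias maps below are hand-picked stand-ins for what a trained model would predict.

```python
import numpy as np

def apply_affine_enhancement(image, gain, bias):
    """Apply a per-pixel affine transform (gain * pixel + bias) and clamp
    the result to the valid [0, 1] range. Applying predicted transforms,
    instead of predicting pixels directly, is what makes this style of
    enhancement fast enough to run as the photo is taken."""
    return np.clip(gain * image + bias, 0.0, 1.0)

# A tiny 2x2 grayscale "photo" and a transform that brightens shadows
# more than highlights (the sort of adjustment a model might learn).
img  = np.array([[0.1, 0.4], [0.6, 0.9]])
gain = np.array([[1.5, 1.2], [1.1, 1.0]])
bias = np.array([[0.05, 0.02], [0.0, 0.0]])
enhanced = apply_affine_enhancement(img, gain, bias)
```

The dark pixel at the top left is lifted noticeably while the bright pixel at the bottom right is left alone, which is roughly what a tasteful automatic edit does.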
We see something similar with camera filters and presets, and most visibly in HDR (high dynamic range) photos, where algorithms do the work of enhancing an image, and in the case of the iPhone, almost immediately.
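The HDR idea can be sketched in a few lines too. A minimal version, assuming a stack of bracketed grayscale exposures, weights each frame by how close its pixels are to mid-gray (well exposed) and blends them; real pipelines such as Mertens-style exposure fusion add contrast and saturation terms, but the weighting below captures the gist.

```python
import numpy as np

def fuse_exposures(exposures, sigma=0.2):
    """Toy exposure fusion: weight each bracketed frame per pixel by a
    Gaussian centered on mid-gray (0.5), normalize the weights, and blend.
    Well-exposed pixels dominate the result in each region."""
    stack = np.stack(exposures).astype(np.float64)          # (n, h, w)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)           # per-pixel normalize
    return (weights * stack).sum(axis=0)

# Two tiny frames of the same scene: one underexposed, one overexposed.
under = np.array([[0.05, 0.10], [0.15, 0.20]])
over  = np.array([[0.80, 0.90], [0.95, 0.99]])
fused = fuse_exposures([under, over])
```

Every fused pixel lands between the two frames, pulled toward whichever exposure rendered that region closer to mid-gray.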
What we see in the Google and MIT project is a glimpse of a future that is sure to come to iPhone photography, too. Almost any photo taken with a decent smartphone camera can be enhanced, nearly instantly, into a professional-level photo that could be indistinguishable from anything coming out of Photoshop or taken with an expensive DSLR.
You see the same thing in the iPhone 7 Plus’s new dual camera mode, in Google’s Pixel HDR+, and even in Microsoft’s highly touted Pix app. All are examples of computational photography. This coming disruption doesn’t mean Photoshop won’t have a place or that photo enhancement tools will become passé. It means smartphone camera photos will be even better than we imagined just a few years ago, because our handheld devices, iPhone and camera in one, will have become so powerful and capable that every photo we take could be indistinguishable from perfect, automatically, at the click of a digital button.