I’ve made it a personal project of mine to capture moments in Seattle’s dive bars like Lucky Liquor before they’ve all been replaced by ground-floor-retail hipster bars under condo buildings built for techies. Those techies would, like me, be very interested in the Spectre app; it’s built by the same developers behind Halide, a popular gesture-based iOS camera app that outputs RAW.
Spectre computationally generates a long exposure from Apple’s Live Photos. That’s a simple description for what is, in effect, a short video clip whose interpolated frames are blended to resemble a long exposure. After taking the photo, you can share a still or the short clip.
While some may think the Spectre app is a bit of a gimmick, for $1.99 it’s a quick look into the near future of photography. Artificial intelligence and machine learning are what make real-time tracking AF possible in Sony’s cameras. Post-focus processing is already happening in cameras from Panasonic, and it’s how the iPhone XS camera can change depth of field after the photo is taken.
The Light L16 promised it would take an array of photos and then stitch them together so a photographer could pick an angle or focal length in post: new creative freedom to capture an instant and do more with it later.
It didn’t work so well.
The Spectre app does what it claims, albeit for me at a low resolution.
How Spectre Works
I asked Sebastiaan de With, one of the creators, how the app works. In a text message, he replied,
“Spectre uses AI to make long-exposure photographs. Spectre can remove crowds, turn city streets into rivers of light, make waterfalls look like paintings, and much more.
“Until now, it was difficult to take long exposures with an iPhone, from keeping the camera steady, framing the shot and guessing the amount of light for a perfect exposure — I am sure you’re quite familiar with that. Spectre takes care of all of that.
“Since we use many exposures instead of just one long exposure, you can even get a video instead of just a final still of your exposure.”
To be clear, the app doesn’t have a “remove crowds” button or a “make a waterfall painting” button. Instead, it considers all the pixels captured during the exposure and keeps the constant ones sharp; it’s the non-constant parts of the scene, the moving subjects, that come out blurred.
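That blending idea can be sketched in a few lines. This is a minimal illustration, not Spectre’s actual implementation: averaging each pixel across a stack of frames leaves static pixels untouched while smearing anything that moves. The function name and the toy grayscale frames are my own invention for the example.

```python
def simulate_long_exposure(frames):
    """Average each pixel across a stack of grayscale frames.

    Pixels that stay constant keep their value; pixels that change
    between frames blur toward the mean, mimicking a long exposure.
    """
    height, width = len(frames[0]), len(frames[0][0])
    n = len(frames)
    return [
        [sum(frame[y][x] for frame in frames) / n for x in range(width)]
        for y in range(height)
    ]

# Three 1x3 "frames": the first two pixels are static, the third changes.
frames = [
    [[10, 0, 90]],
    [[10, 0, 30]],
    [[10, 0, 60]],
]
blended = simulate_long_exposure(frames)
# Static pixels stay at 10.0 and 0.0; the changing pixel averages to 60.0.
```

A real implementation would work on full-color camera frames, align them for hand shake first, and likely weight frames rather than averaging them equally, but the core intuition is the same.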
Really neat stuff.
And, because I take demo cameras to bars, I can compare similar shots. Here’s a long exposure taken with a Phase One; not a fair comparison, I know, but it gives you an idea of what a photographer can create with the app. Also, compare a waterfall taken with an a7R III and one taken with Spectre. The Phase One shot took several attempts, most thrown away, and an hour of futzing with the settings to get a usable still.
Spectre automatically detects the scene and blends the frames taken. It also detects the exposure and adjusts itself accordingly. You can turn that off, but why would you? Just let the app do its thing.
Regarding the low-res image: my iPhone is a 7, and pre-iPhone 8 devices get lower-resolution output from Spectre and no AI stabilization. I used a tripod for my shots. If shooting handheld, the app shows a “steady” icon that indicates movement, like Apple’s built-in panorama mode.
The Spectre app is built with the technologies below. The real beauty of it is the clean, easy-to-use interface. The functions glow and resemble dials, and the type and iconography are harmoniously designed, so the app feels tactile, like a physical camera.
- DCI-P3 Wide Color Pipeline
- Live Photos
- Metal Graphics Acceleration
- Tripod Detection
- Siri & Shortcuts
- AI Stabilization
- Machine Learning & CoreML
- Computer Vision
I was able to figure out how to use it quickly and, most importantly, had fun with it.
The best thing about Spectre is it’s built by just two developers in San Francisco, another town that’s losing dive bars inhabited by ghosts to condos for techies.
I should visit with both the Phase One and this app.