Just over two years ago, Google dramatically raised the bar for smartphone image quality with the release of its Pixel range. As the rest of the industry tries to catch up, Google has now released a revolutionary new camera system that continues to impress.
The really impressive part? It doesn’t require any new hardware to work.
Introducing Night Sight
It’s called Night Sight and it effectively lets your phone camera see in the dark.
You’ve probably seen images from the Pixel 3’s camera that have been tweeted by the big Techtubers. What isn’t immediately apparent is just how dark a scene can be while still producing a good image from the magic software.
YouTuber RedskullPro has a pretty in-depth camera comparison (found below) with the same shots both in Night Sight mode and in the normal camera mode. In it, you can see a stark difference that previously would have needed extra lighting and probably some post-production tweaks to achieve, if it was even possible at all.
Don’t be fooled into thinking it’s just a clever long-exposure photo taker. To get a long-exposure image previously you’d need to use a tripod to stabilize your camera long enough to capture several seconds of light information. Google’s system works handheld with a clever trick or two, splitting the long-exposure up into multiple burst images which then get reassembled into one image by the magic algorithms in the app. It’s a vast evolutionary leap of the HDR+ processing pipeline that is used in the main Pixel camera system.
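The noise benefit of merging a burst can be sketched in a few lines. This is a hypothetical illustration, not Google's actual pipeline: it simulates several noisy short exposures of the same dim scene and averages them, which cuts random noise by roughly the square root of the frame count. The real system also aligns frames and rejects motion, which this sketch omits.

```python
import numpy as np

rng = np.random.default_rng(0)

# A dim "scene" of true brightness values (hypothetical example data).
scene = np.full((64, 64), 0.2)        # values in the 0..1 range
num_frames = 15
# Each short exposure picks up random sensor noise.
frames = [scene + rng.normal(0, 0.05, scene.shape) for _ in range(num_frames)]

# Averaging aligned burst frames reduces noise roughly by
# sqrt(num_frames) - the core idea behind burst photography.
merged = np.mean(frames, axis=0)

single_frame_noise = np.std(frames[0] - scene)
merged_noise = np.std(merged - scene)
print(f"single frame noise: {single_frame_noise:.4f}")
print(f"merged noise:       {merged_noise:.4f}")
```

With 15 frames, the merged result ends up several times cleaner than any single exposure, which is why the multi-frame trick works handheld where one long exposure would smear.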
Here’s how it works
Even before you press the shutter to take a picture, Google’s Night Sight camera starts making a series of calculations. Using what Google calls “motion metering,” the Pixel measures its own movement (or lack of it), the movement of any objects in the frame, and the amount of light available to decide how many exposures to take and how long they should be. Night Sight will capture up to 15 frames over as long as six seconds to produce a single image.
If the phone is completely still, like on a tripod, there’s a one-second-per-exposure maximum. If it’s handheld, there’s a maximum of a third of a second per exposure. That works out to six one-second exposures on a tripod, or up to 15 briefer exposures while the phone is handheld.
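The exposure rules above can be sketched as a tiny decision function. This is an assumption-laden toy, not Google's actual code: the function name, the six-second budget, and the 15-frame cap are taken straight from the numbers in the article, while the real motion metering also weighs scene motion and light levels.

```python
def plan_exposures(on_tripod: bool,
                   max_frames: int = 15,
                   total_budget_s: float = 6.0) -> tuple[int, float]:
    """Return (number_of_frames, seconds_per_frame) for a Night Sight-style
    burst, using only the per-frame caps described in the article."""
    # Tripod: up to 1 s per frame. Handheld: up to 1/3 s per frame.
    per_frame_cap = 1.0 if on_tripod else 1.0 / 3.0
    # Use the longest allowed per-frame exposure, then fit as many
    # frames as the total time budget and the frame cap allow.
    frames = min(max_frames, int(total_budget_s / per_frame_cap))
    return frames, per_frame_cap

print(plan_exposures(on_tripod=True))    # → (6, 1.0)
print(plan_exposures(on_tripod=False))   # 15 frames of ~1/3 s each
```

Note how both branches land on roughly the same six-second light budget; the handheld case just slices it into more, shorter pieces that the merge step can realign.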
White balance is always a tricky proposition to automatically judge, so Google is using a more sophisticated learning-based algorithm that’s been trained to discount and discard the tints cast by unnatural light. This trained model will be coming to the normal camera mode sometime next year.
As you can see in the video above, the Night Sight mode doesn’t just brighten the normal Pixel images, but also cleans up noise and brings back color that didn’t appear in the normal images.
The best part? It doesn’t require any user input. If the phone detects that a scene is dark enough, it will pop up a suggestion to switch to Night Sight. The only controls are the usual tap-to-focus and an exposure slider.
Sure, there are some downsides to it
It does have some limitations, however, especially if you’re trying to capture anything in motion. It can account for small movements, but things like cars driving by will get blurred. That probably opens up the door to some interesting artistic shots once more people get their hands on it.
Google’s Night Sight released yesterday as an update to the Pixel camera app for the Pixel 3, last year’s Pixel 2 and the original Pixel from 2016. The oldest model won’t quite get the same level of image quality, as it doesn’t have optical image stabilization. The learning-based white balancer was specifically trained for the Pixel 3, so users of the older models won’t get the absolute best image quality either.
What do you think of the Night Sight mode? Is it enough to make you think about switching to the Pixel? Let us know in the comments.