Google is all set to bring the second generation of its Pixel series, the pairing of the Pixel 2 and Pixel 2 XL, in the coming weeks. The camera setup of both devices has become the talk of the town, as Google has opted for a single-lens shooter coupled with numerous innovations, putting it in direct competition with the dual-camera setups of the iPhone 8 Plus and Galaxy Note 8, and even leading in a performance test.
If there’s any single thing that makes a great phone great, it’s the camera. Last year’s Pixel brought the best camera you could get on a phone for nearly a year, so the Pixel 2 is following the same trend. And if there’s any place where Google is going out on a limb with the Pixel 2, it’s with the choices it made on the camera setup.
What’s New With Pixel 2 Camera Setup
As already mentioned, rather than following the recent trend of a dual lens and a camera bump like Apple, Google has stuck with a single lens on the back and paired it with a number of innovations that seem iterative when taken individually. But taken together and put through the filter of Google’s machine learning, I think they have a chance to be something really special.
Here are some of the hardware changes Google is cramming into its camera stack:
- It’s switching to a dual-pixel sensor on the back, which means that every single pixel is made of two smaller ones.
- It’s adding optical image stabilization for photos and videos, in addition to electronic image stabilization.
- The dual-pixel setup means that the pixels in the sensor are slightly smaller than last year’s Pixel: 1.4μm vs. 1.55μm.
- To compensate for the smaller pixels, the aperture on the lens is opening up to let in more light: f/1.8 compared to last year’s f/2.0.
- Although it gets more advanced phase detection for focus with the dual pixels, it’s keeping laser autofocus, too.
- It’s individually calibrating each phone in the factory to account for the tiny distortions that are inevitable on every camera lens.
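The pixel-size and aperture changes in that list roughly cancel out. Here’s a quick back-of-envelope check (my arithmetic, not a figure from Google), assuming light gathered per pixel scales with the aperture area, i.e. the square of the inverse f-number, and with the pixel’s area:

```python
# Does the wider f/1.8 aperture make up for the smaller 1.4um dual pixels?

def relative_light(f_number, pixel_um):
    """Light per pixel, proportional to aperture area times pixel area."""
    return (1.0 / f_number) ** 2 * pixel_um ** 2

pixel_1 = relative_light(2.0, 1.55)  # last year's Pixel
pixel_2 = relative_light(1.8, 1.4)   # Pixel 2

print(f"Pixel 2 gathers {pixel_2 / pixel_1:.0%} of last year's light per pixel")
# -> about 101%: the wider aperture just about cancels the smaller pixels
```

In other words, the f/1.8 lens lets in about 23 percent more light, almost exactly offsetting the roughly 18 percent smaller pixel area.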
Features of Pixel 2 Camera
Let’s go through the things Google has done to make those iterative hardware changes multiply each other rather than just add up:
It’s not only Apple: Google has also added a portrait mode to the Pixel 2. Seang Chau, VP of engineering at Google, says, “We have trained an algorithm on millions of faces to account for properly blurring around hair.”
The Pixel 2’s rear camera can also create the bokeh effect for any object. Doing that requires a depth map, and the usual way to create one is with two lenses. The Pixel 2 has only one, but it leverages its dual-pixel camera sensor instead. Chau explains the new technique: “We have algorithms for that … we can increase the signal-to-noise ratio using our same HDR algorithm by overlapping multiple shots and get a much better depth map.”
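To make the dual-pixel trick concrete: each pixel’s two halves see the scene from very slightly different viewpoints, so objects are shifted horizontally between the left and right sub-images, and that shift (disparity) encodes depth. Below is a deliberately crude block-matching sketch of the principle in Python with NumPy; it is an illustration, not Google’s algorithm, and the function name and parameters are my own:

```python
import numpy as np

def disparity_map(left, right, max_shift=4, block=8):
    """Crude block matching between the two dual-pixel sub-images.
    The per-block horizontal shift (disparity) is a proxy for depth."""
    h, w = left.shape
    depth = np.zeros((h // block, w // block))
    for by in range(h // block):
        for bx in range(w // block):
            y, x = by * block, bx * block
            patch = left[y:y + block, x:x + block]
            best_cost, best_d = np.inf, 0
            for d in range(-max_shift, max_shift + 1):
                xs = x + d
                if xs < 0 or xs + block > w:
                    continue  # candidate window falls off the sensor
                cost = np.abs(patch - right[y:y + block, xs:xs + block]).sum()
                if cost < best_cost:
                    best_cost, best_d = cost, d
            depth[by, bx] = best_d
    return depth
```

This is also why Chau’s point about the signal-to-noise ratio matters: the real sub-images are noisy, and matching noisy patches gives a bad depth map, so merging an HDR burst first makes the matching far more reliable.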
Taking multiple shots in auto HDR mode is still the key to how Google approaches low-light photography. Even though the Pixel 2 has OIS, Chau says that it won’t leave the shutter open for a longer time when it’s dark. The OIS helps, but Google’s primary strategy is still to take a bunch of shots and let its algorithm jam on all that data to combine them into a single image.
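The core of the burst idea is easy to demonstrate: averaging N aligned noisy exposures cuts random sensor noise by roughly the square root of N. The toy sketch below (a naive mean merge on synthetic data, nothing like the alignment and tone mapping in Google’s real pipeline) shows a nine-shot burst reducing noise to about a third:

```python
import numpy as np

rng = np.random.default_rng(42)
scene = np.full((64, 64), 0.5)                    # "true" dim scene
burst = [scene + rng.normal(0, 0.1, scene.shape)  # nine short, noisy shots
         for _ in range(9)]
merged = np.mean(burst, axis=0)                   # naive multi-frame merge

single_noise = np.std(burst[0] - scene)   # ~0.10
merged_noise = np.std(merged - scene)     # ~0.03, roughly 0.10 / sqrt(9)
print(single_noise, merged_noise)
```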
Google says that video stabilization on phones is usually handled in software only, cropping into the image to remove jitter. The OIS module is often locked in place for video, so it doesn’t float around and make life harder for the algorithm.
On the Pixel 2, the OIS module is free to float around. The phone detects your hand shake through its gyros and the position of the OIS module, then combines those two jittery graphs into a giant pile of messy data that Google’s machine learning algorithms parse into stable video.
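Conceptually, that amounts to estimating the camera’s total apparent motion from the two traces, low-pass filtering it into a smooth path, and warping each frame toward that path. The sketch below is a toy version of the idea, with a simple moving-average filter standing in for Google’s machine-learned stabilizer; all names here are mine:

```python
import numpy as np

def stabilize_path(gyro, ois, window=15):
    """Toy stabilizer: fuse the two jitter traces into one apparent-motion
    path, low-pass it with a moving average, and return the per-frame
    correction (how far to warp each frame toward the smooth path)."""
    raw = np.asarray(gyro) + np.asarray(ois)
    kernel = np.ones(window) / window
    smooth = np.convolve(raw, kernel, mode="same")
    return smooth - raw

# Simulated hand shake: a random walk, as if read from the gyro.
shake = np.cumsum(np.random.default_rng(1).normal(0, 1, 300))
correction = stabilize_path(shake, np.zeros(300))
stabilized = shake + correction   # much smoother frame-to-frame motion
```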
Like Apple’s Live Photos, you can set the Pixel 2’s camera to automatically record a short clip with every photo, and you can share those images on social media straight from the camera. Google appends the moving image to a standard JPEG file, so sharing should be quick. You’ll be able to export to standard formats like MOV and GIF files from the Google Photos app.
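The container trick behind this is worth spelling out: a JPEG file ends with the two-byte end-of-image marker FF D9, and image viewers ignore anything after it, so a short video can simply be appended to the photo file. The sketch below illustrates that with placeholder bytes; note that Google’s actual format also records the clip’s byte offset in the file’s metadata rather than relying on a marker search like this one:

```python
# Placeholder bytes stand in for real image/video data.
jpeg_bytes = bytes.fromhex("ffd8") + b"<image data>" + bytes.fromhex("ffd9")
video_bytes = b"<short mp4 clip>"

combined = jpeg_bytes + video_bytes   # still opens as a normal JPEG

# Recover the clip by splitting after the end-of-image marker.
eoi = combined.rfind(bytes.fromhex("ffd9")) + 2
recovered = combined[eoi:]
```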
Augmented Reality Stickers
Google’s AR Core framework is fully active on the Pixel 2, so Google is building little moving “stickers” that you can stick into your scene in real time. It’s also taking advantage of some partnerships to get custom stickers, starting with some pretty twee little avatars from Stranger Things. Google says that the individual calibration of each phone’s lens is important to AR performance.
Google has put a lot more innovation into the camera features of the new series. By default, the Pixel 2 takes a ton of shots whenever you hit the shutter button, then does a ton of computational work on those shots to create a single image. If Google manages to deliver those natural-looking images, it would be great to have a Pixel 2 in your hand to shoot around with.