
Google's Night Sight in Pixel 3 shows computational photography promise




With several flagship smartphones using two or more lenses in their main cameras, it's great to see Google sticking to its guns with just a single camera on the Pixel 3 and Pixel 3 XL.

On paper, you might wonder whether a single camera can match the performance of multi-camera systems, especially for night scenes.

Google's answer is Night Sight, a feature that arrived after the latest Pixel phones went on sale. After testing it on the Pixel 3 for a while, I can say that the software matters just as much as the hardware.

With computational photography, Google can dramatically improve the images from its Pixel phones. The key is using algorithms to selectively develop different portions of the image.

This means more light where more is needed, and less where it is not. Add other enhancements, and the shot you have just taken turns into a much more pleasing version without the time needed to touch it up in Photoshop.

Turn on Night Sight, hold your hand steady for two to three seconds, and the smartphone does the rest. The result is impressive. What's more, it is done with just a single camera sensor. PHOTO: Wilson Wong

Night Sight takes multiple frames of the same scene at different exposures in a very short time, then combines these frames with machine learning to create an image with the widest possible dynamic range.

In other words, you should see details in both the brightest and darkest parts of the image. This solves one of photography's most common headaches: getting the exposure right across the entire frame.
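To make the idea concrete, here is a minimal sketch of that burst-merge approach using OpenCV's Mertens exposure fusion. This is not Google's actual pipeline, and the burst_*.jpg file names are hypothetical; it only illustrates how several frames of one scene can be combined into a single image that keeps detail in both the shadows and the highlights.

```python
# A rough sketch of the burst-merge idea (not Google's actual algorithm):
# several frames of the same scene are fused into one image that keeps
# detail in both the shadows and the highlights.
import cv2

# Hypothetical file names for a short burst of the same scene.
frames = [cv2.imread(f"burst_{i}.jpg") for i in range(3)]

# Mertens exposure fusion weights each pixel by contrast, saturation and
# how well it is exposed, so the best-exposed detail from every frame
# survives in the merged result.
merge = cv2.createMergeMertens()
fused = merge.process(frames)  # returns a float image roughly in [0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```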

From my various tests, the images taken with the smartphone are mighty impressive. You may no longer need a tripod, since the camera can now detect motion across the frames it captures and compensate for it, instead of producing a shaky image. Going without a tripod has certainly made it easier for me to shoot in low-light conditions.
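Purely as an illustration (the Pixel's own method is not public), aligning the frames of a handheld burst before combining them is one way that shake can be taken out. The sketch below uses OpenCV's median-threshold bitmap alignment and a simple average; again the file names are hypothetical.

```python
# Illustrative only: align a handheld burst before stacking it, so the
# combined shot is sharp rather than blurred by hand shake.
import cv2
import numpy as np

frames = [cv2.imread(f"burst_{i}.jpg") for i in range(3)]  # hypothetical files

# Median-threshold bitmap alignment shifts each frame so coarse image
# structure lines up across the burst.
align = cv2.createAlignMTB()
align.process(frames, frames)  # aligns the list in place

# Averaging the aligned frames reduces noise; without alignment the same
# average would simply look shaky and blurred.
stacked = np.mean([f.astype(np.float32) for f in frames], axis=0)
cv2.imwrite("stacked.jpg", stacked.clip(0, 255).astype(np.uint8))
```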

Even with people milling about and some slight movement, the Pixel 3 takes into account that this is a handheld shot. That is why I was able to capture very bright shots of the exhibits in the dimly lit rooms of the National Museum.

The interior of the museum is quite dark and would usually require a tripod, but Google's Night Sight compensates not only for the low light but also for movement. PHOTO: Wilson Wong

Of course, despite Google's expertise, this is no miracle pill. In some cases, the colors are not rendered correctly when the feature is turned on, especially for scenes lit by artificial lighting.

This handheld shot of the National Museum of Singapore is sharp, but the colors of the lights are off: they should not have come out pink. PHOTO: Wilson Wong

I did the same shot, but this time using the Huawei Mate 20 Pro as a comparison. While computational photography is a big step forward for shooting in difficult conditions, getting the colors right remains a priority. PHOTO: Wilson Wong

Although some claim that computational photography could lead to the loss of "traditional" photography skills, there is no denying that it makes photography easier. Users can focus on capturing life's important moments.

What I've seen so far on the Pixel 3 is impressive. And computational photography will only get better with faster processors on our newer smartphones.

