
Google's Night Sight on the Pixel shows how phone camera photos are fake




The little camera on this phone has a superpower: it can see things our eyes can't.

Over the past few weeks, I've been walking around at night taking photos of dark places using a new mode on the $800 Pixel 3 called Night Sight. Shots in a candlelit bar look as if I'd brought along a lighting crew. Dark streets fill with reds and greens. A midnight city view looks like afternoon. It goes beyond any Instagram filter – you have to see it to believe it.

Night Sight is a big step forward for smartphone photography – and an example of how our photos have evolved into something super fake.

The truth: your photos have never looked exactly like reality. Photography has never been purely about capturing what's in front of the lens, but the latest phones are pushing images deeper into uncharted territory.

For now, Night Sight is a mode that appears only for dark shots on Google's Pixel phones. But Google is hardly alone: all kinds of phone makers boast about how great their pictures look, not how realistic they are. The iPhone's "portrait mode" applies simulated blur to backgrounds and retouches faces to reduce red-eye. Beauty-selfie modes on phones popular in Asia automatically slim faces, enlarge eyes and smooth skin. And the latest phones use a technique called HDR, which combines multiple shots to produce a hyper-vivid version of reality.

When I recently took the same sunset photo with an older iPhone and this year's iPhone XR, I was struck by the difference – the new iPhone's shot looked as if it had been painted in watercolors.

What's happening? Smartphones have democratized photography for 2.5 billion people – taking a great photo used to require special equipment and an instruction manual.

Now artificial intelligence and other software advances are democratizing something else: the creation of beauty. Yes, beauty. Editing photos no longer requires Photoshop skills. When presented with a natural landscape or a smiling face, phone cameras tap algorithms trained on the kinds of images people are known to see and enjoy.

Your phone is really wearing high-tech beer goggles. Think of your camera less as a reflection of reality and more as an AI trying to make you happy. It's fakery.

Taking pictures on a phone has become much more than a lens focusing light onto a sensor. That hardware still matters, of course, and it has evolved over the last decade.

But increasingly it's software, not hardware, that makes our photos better, says Marc Levoy, a retired Stanford computer science professor who now works on computational photography at Google.

Levoy's work starts from the inherent size limits of a smartphone. Phones can't fit the large lenses (and the sensors behind them) of traditional cameras, so makers have to find creative ways to compensate. Enter techniques that substitute software for optics, such as digitally combining multiple shots.

Apple, Samsung and Huawei do this on their newest phones too, Levoy said, "but we bet the farm on software and AI." That has freed Google to explore new ways of creating images.

"In terms of software, Google has an advantage," said Nicolas Touchard, vice president of marketing at DxOMark Image Labs. (Whether any of these helps transform Pixel from Apple & # 39; and whether Samsung is a separate question.)

With Night Sight, Google takes its software to an extreme, capturing up to 15 underexposed shots and merging them to brighten faces, sharpen details and make colors pop. There's no flash – it artificially amplifies the light that is already there.

Anyone who has tried to shoot in low light on a traditional camera knows how hard it is to avoid blurry pictures. With Night Sight, even before you press the button, the phone measures the shake and motion in the scene to determine how many shots to take and how long to keep the shutter open. When you press the shutter button, it warns you to hold still and shoots for up to 6 seconds.

Over the next second or two, Night Sight divides each of those shots into a grid of tiny tiles, aligning and merging the best bits into a single complete image. Finally, AI and other software analyze the image to choose its colors and tones.
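To make that burst-merge idea concrete, here is a minimal sketch – not Google's actual pipeline – written in Python with OpenCV and NumPy. It aligns a handful of underexposed frames using one global translation per frame (Night Sight aligns tiny tiles independently to cope with motion), averages them to suppress noise, and applies a crude brightness and gamma boost where the real feature uses learned tone mapping. The merge_burst function and the file names are invented for illustration.

    import cv2
    import numpy as np

    def merge_burst(paths, gain=4.0):
        # Load the burst as floating-point images in the range [0, 1].
        frames = [cv2.imread(p).astype(np.float32) / 255.0 for p in paths]
        ref = frames[0]
        ref_gray = cv2.cvtColor(ref, cv2.COLOR_BGR2GRAY)

        accum = ref.copy()
        for frame in frames[1:]:
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Estimate a single global translation between this frame and the reference.
            warp = np.eye(2, 3, dtype=np.float32)
            criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 50, 1e-4)
            _, warp = cv2.findTransformECC(ref_gray, gray, warp,
                                           cv2.MOTION_TRANSLATION, criteria)
            aligned = cv2.warpAffine(frame, warp, (ref.shape[1], ref.shape[0]),
                                     flags=cv2.INTER_LINEAR + cv2.WARP_INVERSE_MAP)
            accum += aligned

        merged = accum / len(frames)           # averaging suppresses sensor noise
        bright = np.clip(merged * gain, 0, 1)  # crude exposure boost
        bright = np.power(bright, 1 / 1.8)     # simple gamma lift, not learned tone mapping
        return (bright * 255).astype(np.uint8)

    # Hypothetical usage with six frames saved from a burst.
    result = merge_burst([f"burst_{i:02d}.jpg" for i in range(6)])
    cv2.imwrite("night_merge.jpg", result)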

Night Sight had trouble focusing in scenes with almost no light, and you – and your subject – have to hold still. But most of my test shots were fantastic. Portraits smoothed skin while keeping eyes sharp. Night scenes revealed hidden details and saturated them with color, like Willy Wonka's chocolate factory.

Here's the rub: how should a computer choose the tones and colors of things we experience only in the dark? Should it make the sky look like twilight?

"If we can't see it, we don't know what it looks like," Levoy said. "There are a lot of aesthetic decisions. We made them one way; you could make them a different way. Maybe these phones will eventually need a 'what I see' button and a 'what's really there' button."

So if our phones are choosing colors and lighting for us, does it still count as photography? Or is it art produced by a computer?

Some will argue the latter. "That's what always happens with disruptive technology," Levoy said.

"What does fake even mean?" he asks. Pro photographers have long made adjustments in Photoshop or in the darkroom. Before that, film makers tweaked colors to achieve a particular look. It may sound like an academic concern for what is, for a third of humanity, a hobby – not to mention a keeper of memories.

How far might our phone pictures drift from reality? What might the software teach us to think of as normal? Which parts of our images are we willing to let computers edit? In a photo I took of the White House (without Night Sight), I noticed the Pixel 3's algorithms appeared to smooth away architectural details that remained visible in a shot from the iPhone XS.

At the camera-testing company DxOMark, the question is how to evaluate images once software interprets them with features such as face beautification.

"Sometimes manufacturers push it too far," Touchard said, adding that they shouldn't destroy information: "If you want to be objective, you have to think of a camera as something that captures information."

For another point of view, I called Kenan Aktulun, founder of the annual iPhone Photography Awards, which over the past ten years has received more than a million photographs taken with iPhones.

He said the line between digital art and photography gets "really blurry at some point." Still, he welcomes technological advances that make the process and tools of photography invisible. The appeal of smartphone photography is its accessibility – a single button and you're there. AI, he says, is an evolution of that.

"As the technical quality of images improves, what we're looking for is the emotional connection," he said. "The attention goes not to what's technically perfect, but to photos that capture a person's life or experience."
