HWM (Singapore)

IT’S ALL ABOUT UNDERSTANDING THE IMAGE

Dan Saunders, Director, Google Pixel Business

- By Ng Chong Seng & Charles Chua

Now that we’re into the third Pixel phone, has Google’s definition of the Pixel changed in any way since the original’s debut?

We’ve become clearer in our thinking as we moved along, but the central thought has always been the same: a combination of hardware and software and artificial intelligence. And this applies not just for the Pixel phones, but for all our hardware products.

If you think about it, A.I. is a space that Google has been involved in for many years now, so it really goes to the heart of how Google thinks about organizing the world’s information and making it more accessible to people.

Does that mean Google sees the Pixel more as a vehicle to realize this mission of organizing the world’s information than to make money through hardware sales?

The Pixel is denitely an important piece to achieve that mission, but we also want to achieve the biggest sales volume possible. By having control of the hardware and software and A.I. in a vertically integrated way means we’re able to build the best experience that we know of and deliver it to end users. And of course increasing hardware sales then becomes one way to get these experience­s into people’s hands.

For example, the Pixel 3 is selling in four new countries this time round: Japan and Taiwan in Asia Pacific; and France and Ireland in Europe.

Let’s talk a bit about the Pixel 3. Why is the notch on the Pixel 3 XL in this shape and size?

It’s all about optimizing the layout of the other technologies within the footprint of the device and making use of all the available space. Because we can’t put the camera behind the screen yet, we need the notch to achieve the overall all-screen effect. And for the Pixel 3 XL, within the notch we have not one but two cameras, including a wide-angle selfie camera. The ambient light sensor, far-field microphone, and the top front-firing speaker are also within that space.

Is Google’s approach to photography different from other phone makers? I mean, Pixel 3’s new camera features such as Top Shot and Night Sight are heavily powered by A.I. and software.

Like what we’ve done in the past to organize images and make them searchable in a way that’s useful, it’s really about understanding, visually, what an image is about using the smarts on the phone, be it on-device machine learning or Google Assistant. For instance, with Night Sight we’re able to look at an image and, based on the shapes that we see, make informed choices about how to create color pop to bring out the detail of a photo that’s shot in a low-light setting.

At the end of the day, anyone can put a high-quality camera module on a phone; what makes the difference is the ability to treat and leverage that module with your own software and artificial intelligence, which is what we’re doing.

So am I right to say the single rear camera is a deliberate choice because Google is confident with what it’s doing through software?

Yes. And if you’re wondering why there are then two front-facing cameras, that’s because we’ve identified a problem that requires us to make some hardware choices in order to solve it. And that problem is the limited field of view when taking selfies with a single camera. So for us, it’s really about optimizing for the problems that we see and we make hardware and/or software choices accordingly.

About that “Not Pink” Pixel 3 color: I see it as typical Google humor, which means that it is pink, no?

(Laughs) Yes, I think that’s exactly right.

“Articial intelligen­ce is really at the core of what Google does.”
