iPad & iPhone User

Pixel envy: Why Apple needs to rethink the entire Siri experience

Apple’s catching up with Google on computational photography, but its smart assistant still feels stuck in 2014.

- Jason Cross reports

The tech media has long compared Google’s Pixel phone with the iPhone, despite the incredible disparity in consumer appeal. After all, the Pixel is the only other phone actually made by the company that controls its primary ecosystem. It’s the Android phone by the Android maker.

For the past couple of years, watching the introduction of each new Pixel phone, it was easy to imagine an iPhone user looking at the camera features and results and thinking, “I wish my iPhone did that!” This year, while the Pixel 4’s camera capabilities might be better than the iPhone 11’s, Apple has at least caught up enough for it not to be the envy of an iPhone user’s eye.

This year, the thing that makes iPhone users say, “I wish my phone did that,” is Google Assistant. It’s past time for Apple to step up Siri in a big way.

Siri’s squandered lead

Siri was first released as an iPhone app in early 2010. Apple knows something groundbreaking when it sees it, and it snapped up the company that originally created Siri before Android and BlackBerry (remember BlackBerry?) versions could be released. The following year, it debuted as a beta feature of the iPhone 4s.

It proved wildly popular. So popular that the Siri back-end infrastructure couldn’t keep up with demand. No other phone had an assistant like Siri. Apple had a several-year head start on what would become a core feature of all smartphones and, eventually, smart home devices. As it sometimes seems to do, Apple failed to recognize that its advantage was tenuous and had to be vigorously defended. It didn’t invest nearly enough in its assistant technology, allowing Google – and some would say Amazon – to catch up and eventually pass it by. Now, Google Assistant on the Pixel 4 looks like the future, and Siri just feels like a more polished version of what we’ve been using for years.

We need a next-gen Siri, not just a better Siri

Apple has gotten serious about machine learning and its virtual assistant in the last couple of years, going on a huge hiring and acquisition spree to bolster its R&D efforts. But as a customer, I don’t feel like Siri is next-level. I feel like I’m fundamentally using the same Siri I have been for the last seven years.

Siri is dramatically better than it used to be, but it still works in essentially the same way, and does essentially the same things. Say “Hey, Siri” or press and hold the side/home button, and it takes over the entire screen, giving you hit-or-miss answers to certain classes of questions or performing carefully prescribed functions. It is an island unto itself, siloed into its own full-screen interface, and yet requires an Internet connection (despite Apple’s stance on privacy and performing operations entirely on your iPhone).

Google’s demonstration of its new voice recorder feature, which does real-time transcription, was a dramatic display of its ability to understand speech, but more impressive is that it operated in airplane mode. In fact, many Google Assistant features will run entirely on-device. This seems like the kind of thing Apple should have demonstrated when it overhauled the Voice Memos app in iOS 12, doesn’t it?

With all of Apple’s talk about privacy and security, why can’t Siri do on-device, real-time voice transcription of our Voice Memos? Turn on airplane mode and you can’t even invoke Siri at all. You get a big fat error stating that you have to be connected to the Internet.
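For what it’s worth, the building blocks already appear to exist outside of Siri: iOS 13’s Speech framework can run recognition entirely on-device on supported hardware. Here is a minimal sketch – not anything Apple ships in Voice Memos; the transcribeOnDevice function name, the en-US locale and the bare-bones error handling are my own assumptions – of roughly what an offline transcription feature could look like to a third-party developer:

import Speech

// Hypothetical sketch: transcribe an audio file entirely on-device using the
// Speech framework's iOS 13 on-device mode. The app would also need the
// NSSpeechRecognitionUsageDescription key in its Info.plist.
func transcribeOnDevice(_ fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.supportsOnDeviceRecognition else {
            print("On-device speech recognition isn't available here")
            return
        }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        // Keep everything local: no audio leaves the phone, airplane mode or not.
        request.requiresOnDeviceRecognition = true

        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result = result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error = error {
                print("Transcription failed: \(error.localizedDescription)")
            }
        }
    }
}

The point isn’t the specific code; it’s that nothing about the hardware stops this from working with the radios switched off.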

Why? Why can’t I tell Siri to launch an app, or convert pounds into ounces, or roll dice, or tell me about any of the information that’s already on my phone (like calendar events or reminders)? Siri should only need to connect to the Internet when the answer to a question has to come from there, like stock prices or sports scores. “Remind me when I get home to call Jon” should be able to set the proper reminder without any network connection. There’s no need to be online for, “Show me photos of mum,” or “Set an alarm for 7.30am tomorrow.”

Perhaps worse than its Internet connection requirement is the way Siri still feels like a separate entity, rather than a holistic part of everything I do on my iPhone.

Invoking Siri takes over the entire display. Why? In iOS 13, Apple made Siri a simple overlay along the bottom of the screen in CarPlay, but on your iPhone it still takes over your whole device. It’s a visual distinction that sends a clear message – Siri isn’t a part of what you’re doing, it’s something you stop what you’re doing to use.

It’s also blissfully unaware of the context of what you’re doing at the time. I should be able to have any web page open and ask Siri to, for example, “Translate this page into Spanish.” Or select a word in any app, in any language, and ask, “What does this word mean?” to get a definition. If I have the Calendar app open to a specific day, I should be able to tell Siri, “Make an event for 6pm to get drinks with Susie,” and it will know by context to put it on that day of my calendar, rather than today.

Siri should understand the context of whatever is on my screen when I give it queries and commands. If I’m watching a movie trailer on YouTube, I should be able to say, “Buy tickets to this,” and get nearby movie ticket results for that particular film.

This works in a very limited capacity today. For example, if I’m looking at an iMessage conversation with my wife, I can say, “Where is she?” and my phone will open her results in Find My Friends, because I have her added there. A true next-gen Siri should seek to draw proper context from anything on my iPhone or iPad’s display, in addition to ambient sound and location – the full suite of sensor data.

The Siri experience needs a dramatic overhaul

Google may not sell as many Pixel phones as Apple does iPhones. The future of Apple may be services, not just hardware. But Apple would be well-advised to look at Google’s latest phone and feel a sense of paranoia. No advantage sticks around forever, and no ecosystem has a moat too big to cross. Apple should make beating Google Assistant as big a priority as it must have been to beat the Pixel’s camera.

The iPhone of the future needs more than an incredible camera, 5G connectivity, and a super-fast processor. Apple keeps reminding us that machine learning is used in all aspects of the operating system. It’s time to tie all that intelligence together and surface it, instead of continuing to iteratively improve this iPhone 4s-era interaction model. It’s time for a whole new Siri experience that fully integrates with everything we do with our phones and runs entirely on-device wherever possible.

If Siri continues to evolve in the ways it has recently – adding a domain here or there, improving its voice, delivering slightly better results to specific types of queries – it will be left hopelessly in the dust by Google Assistant on Android phones and Alexa on, well, everything else. It’s already way behind, and pretty soon, consumers are going to really start to notice.

It’s 2019, Apple is apparently The Privacy Tech Company, and yet Siri won’t even work without an Internet connection

The iPhone of the future needs more than an incredible camera
