Imagining a next-generation Apple Maps future

Combined with new high-precision GPS and AR, the company could change mobile navigation forever, writes Jason Cross

iPad & iPhone User

Apple is improving Maps in a big way. More precise and accurate data, better design and readability, faster data updates… it’s a huge project that should make Maps ten times better than it is today, and it will start rolling out this autumn.

But the new maps are just the tip of the iceberg. Reading the tea leaves and projecting just a little into the future, it’s easy to imagine a mapping and location experience that makes today’s mobile mapping look like MapQuest circa 2002. When we look at recent advancements in mapping data, GNSS (Global Navigation Satellite System) technology, and Augmented Reality (AR), a picture starts to form of mobile navigation unlike anything we have today.

GPS supercharged

A key component of the future of navigation is ultra-precise GNSS (a catch-all term for satellite navigation systems like GPS, GLONASS, Galileo, and BeiDou). Today’s phones support multiple GNSS systems, but they’re only accurate to within 5 metres or so. If you’re walking down a city street, your phone may combine GNSS data with a catalogue of known Wi-Fi hotspots and other markers to improve accuracy, but that only goes so far.

It’s not uncommon today for your phone to think you’re actually in the shop next door, or on the wrong side of your office floor. And forget about actually knowing which side of the street you’re on, or which lane you’re driving down on the highway.

Advanced GNSS chips offer far greater precision; we’re talking about one or two feet. Last year, Broadcom announced its BCM47755 chip, which uses less power than current GNSS chips and can receive and compare two frequencies simultaneously from each of GPS, Galileo (the European system), and QZSS (the Japanese system). By combining and comparing two frequencies at once, Broadcom says it can achieve “centimetre accuracy” in a device that is efficient enough even for fitness wearables.
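A rough sketch of why comparing two frequencies helps: the ionosphere delays a GNSS signal in proportion to 1/f², so a standard linear combination of the ranges measured on two frequencies cancels that delay almost entirely. The frequencies below are the real GPS L1 and L5 carriers; the delay constant is an invented illustrative value.

```python
# Dual-frequency GNSS: the ionosphere-free combination.
# Ionospheric delay scales as 1/f^2, so combining pseudoranges measured
# on two carriers cancels it, leaving (nearly) the true geometric range.

F_L1 = 1575.42e6  # Hz, GPS L1 carrier
F_L5 = 1176.45e6  # Hz, GPS L5 carrier

def ionosphere_free_range(p1: float, p2: float,
                          f1: float = F_L1, f2: float = F_L5) -> float:
    """Standard ionosphere-free combination of two pseudoranges (metres)."""
    return (f1**2 * p1 - f2**2 * p2) / (f1**2 - f2**2)

# Simulate a true range plus a frequency-dependent ionospheric error.
true_range = 20_200_000.0        # metres to the satellite (typical GPS orbit)
k = 1.0e19                       # illustrative ionospheric delay constant
p_l1 = true_range + k / F_L1**2  # roughly 4 m of delay on L1
p_l5 = true_range + k / F_L5**2  # roughly 7 m of delay on L5

corrected = ionosphere_free_range(p_l1, p_l5)
# The metres-scale single-frequency error all but vanishes in `corrected`.
```

In practice multipath, clock error, and receiver noise remain, which is why real-world results won’t match the press release, but this is the core trick behind the chip’s dual-frequency claim.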

Of course, one can hardly expect real-world accuracy to match what is claimed in a press release, but even if it’s 30 times worse – 30 centimetres – you’re looking at a precise location to within one foot.

Hey, you know who happens to use Broadcom GPS chips? Apple. Broadcom said the BCM47755 chip would start appearing in phones in 2018, but it has only shipped in one phone so far: the Xiaomi Mi 8. And it’s not entirely clear whether the firmware and software APIs in that phone make this high-precision positioning data available to developers yet.

Imagine if the GPS in your phone could reliably know which lane you’re in on the motorway. Which aisle you’re walking down in the supermarket. The exact row in which you’re seated in a stadium.

Combined with super-precise maps, that kind of resolution changes everything.

Building Maps for the future

Apple is about to completely overhaul its mapping data. Beginning with the San Francisco Bay Area in the next iOS 12 beta, expanding to all of Northern California by the autumn, and rolling out to other regions over the coming year, Apple Maps will start to use data from a years-long project to take complete control over the mapping and location data stack.

Until now, Apple Maps has relied on a giant mishmash of third-party data, and it sometimes takes forever for those companies to make changes or corrections to that data. Starting this autumn, Apple will replace it with its own data, gathered by fancy camera- and LIDAR-equipped trucks, satellite imagery, and, for the first time ever, safely anonymized data from millions of iPhone users.

It’s not just going to be vastly more accurate, more reliable, and more rapidly updated. It’s also going to be extremely precise. That will greatly improve the current Maps experience, but when combined with a hypothetical future iPhone that has location data accurate to less than one metre, it’s easy to see how Apple is building (and for the first time, fully controlling) a set of location data for the future, not just for the way we use our phones today.

Imagine driving down the motorway and being told not just what your next turn is or which lane you need to be in, but getting individualized guidance based on knowing which lane you’re currently in. Maps’ driving directions could tell you to “move two lanes to the right”. Or, if you’re in the correct lane already, say nothing at all, or just prompt you to “stay in this lane”. It could help you avoid slow traffic by knowing that, five miles ahead, the left lane is moving freely while the right two lanes are very slow, and that you should move over now. With millions of iPhones anonymously feeding sub-metre location data to Apple’s servers, Maps could tell you not just where traffic is slow, but how traffic is slow.
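The lane-guidance idea above reduces to simple arithmetic once a phone can resolve its lateral position on the road. A toy sketch, assuming lanes are numbered from the left edge and a standard lane width (both assumptions are mine, not Apple’s):

```python
def current_lane(offset_from_left_edge_m: float, lane_width_m: float = 3.7) -> int:
    """Lane index counted from the road's left edge (0 = leftmost lane)."""
    return int(offset_from_left_edge_m // lane_width_m)

def lane_instruction(current: int, target: int) -> str:
    """Turn a lane delta into the kind of prompt the article imagines."""
    delta = target - current
    if delta == 0:
        return "stay in this lane"
    direction = "right" if delta > 0 else "left"
    n = abs(delta)
    return f"move {n} {'lane' if n == 1 else 'lanes'} to the {direction}"

# A fix 5.6 m from the left edge puts you in lane 1; if the exit needs
# lane 3, the prompt is "move 2 lanes to the right".
```

The hard part isn’t the arithmetic, of course; it’s a position fix good to well under the 3.7 m lane width, which is exactly what sub-metre GNSS would provide.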

Can’t find the sprinklers in your local B&Q? After a few frustrating minutes of looking around, pull out your phone and ask, “Hey Siri, where are the sprinklers?” It not only knows which store you’re in, but has location data precise enough to see you’re walking down aisle three. It tells you to turn right, then go down four more aisles to aisle seven, where the sprinklers will be on your right.

Imagine getting walking directions that can tell you when you need to cross the street because it knows you’re on the opposite side from the store you’re looking for. It knows the entrance is around the corner and down the alley, and gives you step-by-step directions that guide you right to it.

You arrive at the football match and don’t know how to get to section C, row 37, seat 14. Fortunately, your iPhone already knows what seat you’re in (thanks to the ticketing app or email confirmation), and as soon as you arrive at the stadium, Siri prompts you to ask if you want directions. With precise plans of arenas and event venues, combined with a phone that knows its location to within a couple of feet, your phone could give you step-by-step walking directions that take you to your exact seat. No need to stare at your phone while in a big crowd, either. Your Apple Watch will tap your wrist and give you directions each step of the way.
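Under the hood, routing someone to a seat is classic shortest-path search over a venue floor plan. A minimal sketch, with an invented grid where '.' is walkable concourse and '#' is a seating block:

```python
from collections import deque

def shortest_walk(grid, start, goal):
    """Breadth-first search over a grid floor plan ('.' walkable, '#' blocked).
    Returns the number of steps on the shortest path, or -1 if unreachable."""
    rows, cols = len(grid), len(grid[0])
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        (r, c), dist = queue.popleft()
        if (r, c) == goal:
            return dist
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and grid[nr][nc] == "." and (nr, nc) not in seen):
                seen.add((nr, nc))
                queue.append(((nr, nc), dist + 1))
    return -1

# Tiny invented floor plan: outer aisles surround a block of seats.
plan = [
    "....",
    ".##.",
    ".##.",
    "....",
]
```

The interesting part in the stadium scenario isn’t the search, it’s the inputs: a precise venue plan (Apple’s new map data) and a start position good to a couple of feet (next-generation GNSS).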

These scenarios don’t need to be years in the future. The fundamental technologies are either already here or imminent. With the right priorities and structures in place, Apple could give us all these capabilities and more when iOS 13 ships next year.

Just add Augmented Reality

As amazing as mapping and directions at sub-metre scale could be, it’s nothing compared to what you get when you layer Augmented Reality on top. Tim Cook has, on multiple occasions, said that AR is going to be “profound” and “change everything”. It’s not just about measuring objects or playing multiplayer games.

Take all the advanced mapping scenarios I mentioned before, and overlay directions, location icons, arrows, and more onto the real world. Pull up your phone and point it at the rows of seats in the football ground and see a bright beacon of light shining down on section C, row 37, seat 14.

You ask your phone for recommendations for a good place to grab lunch nearby, and when you hold up your phone, restaurant rating cards are superimposed over the real world, each one precisely located right where the restaurant is. Tap the one you want and a helpful guide arrow leads you around the corner, across the crossing, up the street, and then down the alley. It keeps you on the pavement and sounds an alert through your AirPods if you step off the kerb onto the street because you weren’t paying attention. You never would have found Nando’s without it.

Getting directions has long been heralded as an obvious use case for augmented reality – following directions is inherently easier when they’re superimposed onto the real world – but to make it truly work, you need precision that just isn’t available today. You need mapping data that knows exactly where things are, and to know where the user is, to within a few feet.
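A back-of-the-envelope way to see why that precision matters (the numbers are illustrative, not from Apple or Google): a position fix that is off by e metres shifts an AR label drawn on an object d metres away by roughly arctan(e/d).

```python
import math

def overlay_error_deg(position_error_m: float, distance_m: float) -> float:
    """Approximate angular misplacement of an AR overlay, in degrees,
    caused by a given device-position error at a given target distance."""
    return math.degrees(math.atan2(position_error_m, distance_m))

# With today's ~5 m GNSS error, a restaurant card on a building 20 m away
# can land about 14 degrees off target -- easily a whole storefront over.
# At 30 cm error, the same card is under 1 degree off: visually spot-on.
today = overlay_error_deg(5.0, 20.0)
future = overlay_error_deg(0.3, 20.0)
```

That gap is the whole argument: sub-metre positioning is what turns AR labels from floating vaguely near a street into pins stuck on the right door.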

Combine future iPhones that use Broadcom’s super-precise GNSS with ARKit and advanced new mapping data built from incredibly precise source data, human editors, and information pulled from hundreds of millions of iPhones, and you’ve got a recipe for a truly next-generation mapping experience. Because Apple has such reach and controls both the hardware and software, maintains its own maps platform, has the most advanced phone-based AR platform, and is amassing incredibly precise location data, it is uniquely positioned to bring these experiences to the mass market before anyone else.

Apple has been gathering precise map data for years, and is finally about to begin a phased rollout of it

Apple will use a variety of techniques, including segmentation, to pull location data from hundreds of millions of iPhones without violating your privacy

Google Maps is getting an AR mode, but it lacks the precision of data and device location to fully realize the concept
