Popular Mechanics (South Africa)

Technology in education: getting to grips with the new-age classroom.

SPARK: Science and technology school of note

WORDS: LINDSEY SCHUTTERS · PICTURES: WALDO SWIEGERS

The question of how technology can best enable, or integrate into, our education systems isn’t one we can answer today. We need to think much further ahead. We almost need to anticipate what tertiary education will look like in 20 years’ time and prepare our pre-schoolers for that life. And we need to do all this future gazing and preparation while preserving childhood and producing well-adjusted human beings. Oh, and we’re outsourcing that job to schools.

Well, teachers, to be exact. But those teachers are supported by the school systems that they teach in.

Tech in education is about more than giving a child a tablet to replace a teacher. There needs to be more skills transfer than that.

THE SPARK MANTRA of “I’m going to university” carries with it a certain expectation, but it also highlights the school’s mission. That mission is to equip scholars with the foundation skills they’ll need to succeed at that level. Technology is used extensively, but judiciously. Computers are used as a shortcut to maths and language literacy. Tasks are intrinsically tied to how technology is used in a professional capacity.

In the blended classrooms the role of the computer changes to that of research assistant. If a lesson is to be done online, it can be viewed as a text document or as a video, depending on the child’s preference. But there are still stocked library shelves where scholars can seek out answers. This mirrors the university student piecing together a thesis: consulting different sources of information and knowing which resources will yield better results.

It’s a remarkable consideration to have baked not only into the curriculum but, through social conditioning, into a mantra that all scholars not only buy into, but get passionately excited about.

For me, at least, it ranks alongside Volvo building the XC90 as a safe, autonomous car from the outset. Designing with a clear end goal in mind helps the designers and engineers better anticipate possible pitfalls.

As taxpayers who subsidise underprivileged university students, all we can reasonably expect is that the schools producing those students are effectively preparing them for success at tertiary and professional level.

“At Spark we believe that technology can never replace a teacher,” says Ryan Harrison, who co-founded the school network alongside partner Stacey Brewer. The two were pursuing their MBAs at the Gordon Institute when they decided to revolutionise schooling.

I’m speaking to Harrison on a typically frigid and sunny early-spring Highveld morning at the Spark school in Bramley, opposite the highway to Melrose Arch. The oldest child at this school is in Grade 5 and there are plans to occupy the building next door to expand to Grade 7, and maybe pilot a junior high school project.

“The tech you’ll see is meant to remove the teacher’s mundane tasks. In our Learning Lab for the foundation phase you’ll see children doing tasks where the teacher would normally need to sit with the child one on one. We remove those tasks from the classroom so that when the children are in the classroom, they’re focusing on deeper-level thinking and interaction,” he continues.

The mundane tasks he is referring to are the familiar repetition that typically accompanies multiplication tables.

Those passive learning practices have remained relatively unchanged since the Victorian era, but gamified on a Chromebook they transform into engaging, active participation on the child’s side and reinforce computer literacy skills.

“When the children are in there [Learning Lab] they are actually creating data points for themselves that the teachers use to inform their instruction,” Harrison explains. “There are teachers and tutors per grade, and each grade has two literacy and one maths teacher, and they work in cycles where children move in and out. In the Learning Lab we have tutors who are the bridge between the technology and the teachers.”

WHILE THE LEARNING LAB is the hallmark of the Spark schools’ blended learning system, it’s also the programme’s most divisive attribute. My daughter, for instance, will spend 40 minutes in front of a Chromebook screen every day. And she’ll be wearing headphones. I’m excited that she’ll be participating in a data feedback loop, but concerned that this exposure to an LCD panel will put further pressure on her eyesight, especially because my wife and I both wear glasses.

Though Spark conforms with the 20/20 rule, which encourages a minimum 20-second break from screens every 20 minutes (Grade R scholars spend 40 minutes per day in the Learning Lab, broken into two 20-minute sessions), this best practice isn’t informed by new studies that are revealing the relationship between eye maturity and blue light. Tony Williams, head of technology at Curro, is completing his PhD on tablets in education and tells me, “We should be careful with tablet screens and children under 8 years old; the whites of their eyes haven’t matured enough to counteract the glare from blue light.”

Going back to the data generation: the Learning Lab data is fed back to teachers every two weeks. “We also have something called Data Day, which happens five times a year,” says Harrison. “And we have a benchmarking day once a year where all our teachers come together.”

Foundation phase data is currently gathered from the two educational programs the children use in the Learning Lab: one for maths and one for literacy. Intermediate-phase classes spend up to 80 minutes in the Learning Lab and use more software packages, but because the blended learning concept is still new, the data is captured on the back end of each individual program and must be collated.

The data in question is driven by system feedback. The average five-year-old, for instance, can remember a six-character sequence. Children need to memorise a 12-character password to log into the system. How long they take to memorise that password creates a data point. The maths program has a penguin as the protagonist and the scholars help him solve problems. All this interaction generates data.

“We’ve taken inspiration from a wide range of programmes. In some instances we’ve made it our own and in others we’ve adopted it wholesale,” he continues. “On our behavioural side we follow something like the Doug Lemov strategy and work from Julie Jackson of Uncommon Schools, and in our foundation phase we were heavily influenced by Rocketship Education and their lab rotation model. In our intermediate phase we again lean on Summit schools. All of these schools help us shape who we are, but we have also shaped our own identity.”

IT WOULD BE NAÏVE to overlook the harsh criticism Rocketship Education has received in the USA. The results-driven nature of the charter school spawned a community campaign in San Jose, #StopRocketship, dedicated to preventing it from introducing its “drill and kill” ideals in the California area. These issues are rooted in a score-based system that is symptomatic of the data-driven mindset, and are well articulated in The Hechinger Report.

Where Spark has the “Going to University” creed to inspire greatness in its scholars, Rocketship has “Crush the CST” to encourage academic excellence. Both approaches are sensitive to the fact that many of their scholars come from previously (and currently) disadvantaged households where academic excellence isn’t culturally institutionalised. The heavy use of technology is also intended to bridge the gap between them and their more affluent peers. On top of this, both programmes encourage family participation. Spark requires 30 volunteer hours from parents and Rocketship has Saturday classes, which parents are expected to attend.

The similarities are quite obvious and it shouldn’t surprise that the school’s director of operations is a Rocketship export. “One great version of the lab rotation model of blended learning was pioneered by Rocketship,” explains Bailey Thomson, director of operations at Spark schools, of the foundation phase blended learning strategy. “Teachers in our Learning Labs can identify children who are struggling in the classroom and use small-group tutoring or change the objectives and outcome goals.”

“In our intermediate phase, we use a hybrid of a flex and personal rotation model because we’ve discovered that Grades four and above are capable of driving their own learning, and a flex model really honours the capabilities of the student.”

Where there is a clear distinction between classroom and Learning Lab at foundation phase – almost no technology present in the classroom – the Flex classroom is equipped with Chromebooks, a library, large group tables, small group tables and floor space. Ryan explained that children cycle through the classroom areas according to a schedule, but also have the freedom to decide which medium would be best suited to achieving the classroom goals. The school uses Chromebooks because of software compatibility, but also because they are less resource-intensive and more future-proof. Acer is the sole supplier in South Africa at the moment.

“I think of technology as a tool that can be used for good or for evil,” says Thomson. “At Spark we believe that technology magnifies teaching, whatever that teaching is. So when you implement technology in a school, the teachers can use it as a tool, even when not using technology in the classroom. If the teachers have a good understanding of it, they can give the children a huge advantage in the Learning Lab and that advantage will come back to serve them in the classroom.”

TECHNOLOGY ETIQUETTE is an important issue in schools that offer the opportunity to freely interact with connected digital devices, and more so when the children don’t necessarily have access to technology in their homes. One example is when Samsung installed an IT lab at Masibambisane High School in Delft on the Cape Flats. A group of volunteers from the head office in Korea spent a week with the children, equipping them with skills and enriching their lives. There were workshops covering smartphone repair and assembly, e-commerce and Microsoft Office.

I was initially sceptical, especially since these were children absolutely pushed to the brink in a society that isn’t kind to those under the breadline. And they were being taught to disassemble, reassemble and repair popular smartphone models. But then I saw the hope that this outreach programme provided. Most of them will realistically never have the opportunity to climb on board an aeroplane or even leave the province, but interacting with technology and the global Samsung brand allowed them to dream far outside their reality.

Samsung also pledged not to leave the school behind and to include the talented pupils in the company’s academy and other training programmes so those skills can filter back and uplift that community. It’s an important move for a company trying to compete as a school technology resource supplier.

In June, Samsung released the School-in-a-Box (SIB) bundle, which serves as the consumer-facing marketing strategy for its school offering. “This concept is Samsung’s latest innovation designed to support parents, schools and learners while making the selection and combination of the correct technology simple and enjoyable,” says Paulo Ferreira, director of Enterprise Mobility at Samsung Electronics South Africa. “Samsung has been investing in and supporting individual schools and learning initiatives for many years. The Solar Powered Internet Schools is a Samsung innovation that enables and empowers learners to get the best education possible. It is through this experience that Samsung was able to develop the SIB in order to provide the most effective tools to help bridge the digital education divide,” continues Ferreira.

It’s a prime example of shoving digital tools into the hands of unprepared learners, but the company can hardly be blamed for exploiting the commercial opportunity. Samsung makes up for it with a more comprehensive offering, though. Classroom in a Box is sold in the US as a complete solution and combines 30 Samsung Chromebook 3 units with 30 one-year subscriptions to a choice of either maths or English language curricula from McGraw-Hill Education and 30 Google Device Management Console licences. The service is pitched at Grades 3 to 8, or the intermediate phase, and is rumoured to be heading to South Africa as soon as the local office can secure a curriculum partner.

WHILE THE SPARK and Rocketship blended-education models are seemingly becoming the preferred method of integrating technology and education, schools are still experimenting. My son, for instance, will have a vastly different experience from his sister. And she’s only four years his senior. Just like my schooling was dramatically different from my sister’s, and the age gaps are similar.

The future I see, however, is one that completely replaces the traditional teacher. Yes, you still need them for foundation and intermediate phase schooling, but senior high school and tertiary instruction should be more democratic. You can get your basic degree, then specialise with a combination of on-the-job training and online tutorials. I learnt basic coding by watching MIT lectures on YouTube.

Online institutions like Khan Academy pioneered digital on-demand education and now YouTube channels like CrashCourse, developed by John and Hank Green (aka the Vlogbrothers), are releasing full curricula (thecrashcourse.com).

Another YouTuber, Derek Muller (Veritasium), proposed in his PhD thesis, Designing Effective Multimedia for Physics Education – and in the YouTube video This Will Revolutionize Education – that the most important part of learning is what happens in the student’s mind. He cites research on the effectiveness of moving animations versus static pictures in conveying a deep understanding, where the findings showed no difference.

It feeds into the idea that the medium and the message are independent of each other and heavily dependent on how they are received. My daughter could thrive in the blended learning system, or we might find that it isn’t for her and move her to a school with a more hands-on approach, like Curro. Or, alternatively, a legacy school with a more traditional education system.

Technology progresses too quickly to estimate what the next education revolution will be, but we must be mindful of the future challenges we are creating. That’s the part we have control over. And right now it seems like blended learning best conveys the idea that technology is a powerful research tool and capable of making the more mundane parts of learning more engaging. PM


THE FIRST TIME I SAW 3D graphics was on my dad’s computer when I was a boy. Dad was a programmer for the phone company, back before they called programmers “coders” and before being a coder was cool. He didn’t make apps; he wrote routines for a brute of a mainframe that lived somewhere in the bowels of Pacific Bell. He stayed up all hours of the night telecommuting before it was a perk of every job. Sometimes, my younger brother and I would wander in to see what he was up to and, if we were lucky, he’d set us on his lap and boot up a game: Wolfenstein 3D, a first-person shooter. You were a GI trying to escape a Nazi castle. I never thought of the graphics as looking real, but they were undeniably effective. The three of us would sit in the darkened office, soldiers running down corridors, anxious and scared at the possibility the next turn might lead to Gestapo lying in wait. In the tensest moments, Dad would physically lean his body, my brother and me with it, to try to peer around corners.

Did the rudimentary machine we were playing on intuit my father’s movements and respond? Did we become part of the game, in anything more real than our imaginations? Dad was good, but not that good. Besides, it was the Dark Ages – the mid-90s – what do you want?

The protagonis­t of this story is a machine.

A wearable, holographic computer that leaves your virtual, surpasses your augmented and just gives you the reality. You might call it a second-sight machine, giving you the cognitive powers of the machine you were born with, but freed from the tyranny of physics. It is a set of magic lenses through which humans see, with unprecedented clarity, their relationship to the world.

The applications for this machine are limitless: from planning tricky surgeries, to designing other intricate machines with your partner who is in another city but right there with you at the same time, to just having the time of your life. With this machine, the Nazis wouldn’t have known what hit them.

WHAT GOD CREATED THIS particular universe?

In 2007, Alex Kipman – today a technical fellow at Microsoft, then a leader of the Windows team – had just overseen the release of Windows Vista. It debuted to something less than acclaim; so much less, in fact, that eight years later it made the cut for a Silicon Valley joke about the worst high-profile tech products of recent memory. Kipman was disappointed in himself: six years into a career at Microsoft – the only place he’d worked since earning his bachelor’s degree – he realised he had no point of view of his own. He’d simply been following the dictates of other people. Of the industry. So he took a sabbatical to his native Brazil to find a purpose.

He repaired to the Atlantic Forest, the vast tropical cover of Brazil’s eastern coast, to an off-the-grid farm. He walked with a notebook in a place teeming with life and devoid of technology and he did not stop until his mind had conceived a machine, one that would take an unknowable number of years to create. It was a machine with one animating impulse: to understand the world.

SUBCONSCIOUSLY, AT incalculable speed, the human brain is always reasoning: commanding senses to take in information, gain context and deduce what is happening in the world around it. When Alex Kipman left the Atlantic Forest, he couldn’t yet build the machine he dreamed of. But he had an idea, a guiding principle. It was a way of organising the world as a machine might, were it attempting to understand the world like a human brain. The things in the world on one axis and the ways of interacting with them on the other. When Kipman returned from Brazil, he began building.

He started with the simplest machine he could imagine. Microsoft’s naming conventions dictate that developing projects be named after cities, so he called it Project Natal, after a city in Brazil whose name means “birth”. A depth-sensing camera that sat on a television, it was a machine that could see and respond to the movements of the human body. In 2010, it was released to the public as Kinect, an Xbox peripheral that allowed gamers to control games with the movements of their bodies. It sold faster than any piece of consumer electronics in history.

So Kipman moved on to a machine that could not only see a person and his environment, but could also make him see things.

Kipman called his new project Baraboo, this time named after a peculiar town in Wisconsin, once the headquarters of the Ringling Brothers Circus, a town that Kipman says is home to the only clown cemetery in the United States. He’d been pitching a strange idea around Microsoft: mixed reality, a headset that showed its wearer three-dimensional holograms, accurately rendered in the space around them. Might as well name the damn thing after the place I’ll end up when it fails, Kipman figured.

In March of this year, after years of development with partners such as Volvo and NASA’s Jet Propulsion Laboratory, Project Baraboo became available to select developers willing to pay R43 000 for a developer’s kit. As of August, anyone can buy it. Kipman had dodged the clown cemetery. The machine was called Hololens.

THERE ARE MANY ways to describe what Hololens is: a mixed-reality device, a holographic computer, an expensive escapist technology. But what it is, most notably, is the gift of sight. It can see like no machine before it. Hololens floats over a person’s head like a smoke-ring halo. A padded inner grey circlet rests on the crown of the skull and cinches tight in the back. The glasses – actually a clear glass shield with a second set of trapezoidal lenses underneath – float around the inner ring on a tilting axis. They are adjusted to hover in front of the eyes. And the bulk of the machine is above the lenses, in a crescent moon of plastic and silicon that rests against the forehead: a bundle of sensors.

These sensors – a variety of cameras and motion detectors – all send their data, terabytes per second, to a control centre called the holographic processing unit. The result is a co-ordinate system that tells Hololens what the room looks like, where the wearer is and what is within his field of vision. Then what Hololens does is learn, with the help of a calibration program, the particular quirks of the wearer’s eyes. And because Hololens understands the environment around the person and where he is looking, when its two tiny projectors shine holograms into his retinas, those holograms – aside from their odd, shimmery essence – are truly in the scene. No longer holograms, they are real. They can be half-hidden behind a couch, sit on a kitchen counter, or come crashing through a wall.

The device is controlled with a “cursor” that follows the wearer’s gaze and is activated with hand gestures (or voice commands). There are essentially only three gestures. The most ubiquitous, the air tap, is done by holding the index finger straight up in the air, then bringing it down and back up, as if it’s being pricked by a needle. It is the equivalent of a mouse click or a tap on a smartphone. Scrolling is accomplished with an “air tap and hold”: keep the index finger at the bottom of the tap, then scrub the hand up or down (also used for zooming and resizing). Finally, the virtual back button is the bloom: hand out, palm up, fingers together, you raise your hand and open the fingers, like a flower to the Sun.
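Three gestures make for a tiny input vocabulary, and the whole interaction model boils down to routing each one to a UI action. The sketch below makes that mapping concrete; the handler callbacks and gesture strings are invented for illustration and are not Microsoft's actual gesture API.

```python
# Toy dispatcher for the three gestures described above. Gesture names
# follow the article; the callback interface is hypothetical.

def make_dispatcher(on_click, on_scroll, on_back):
    """Build a gesture dispatcher from three action callbacks."""
    handlers = {
        "air_tap": on_click,        # equivalent of a mouse click or tap
        "tap_and_hold": on_scroll,  # scrolling, zooming, resizing
        "bloom": on_back,           # the virtual back button
    }

    def dispatch(gesture):
        handler = handlers.get(gesture)
        if handler is None:
            raise ValueError(f"unknown gesture: {gesture!r}")
        return handler()

    return dispatch
```

The point of the table-driven shape is that the vocabulary is closed: anything outside the three gestures is rejected rather than guessed at.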

To observe a person using Hololens is to regard something that looks like a religious experience. A strange play of light comes over his eyes, greens rolling across the lenses like the aurora borealis. He gestures as if having a conversation in sign language, but with no one and using only three words. He sees things that others, the unwearing, cannot see.

AS MICROSOFT HAS slowly introduced Hololens to the world, it has set up demo rooms around the country. These are tiny rooms, each decorated and well appointed, each different. One looks like an urban apartment’s living room. Another like the sales floor of a Volvo dealership. A den with a chandelier and a busted dimmer switch; a design studio. Inside each room, somewhere – on a table, on a desk – is a Hololens. In the living room there are holographic adornments hanging on the walls and scattered across the floor like a child’s things. The sales floor has a holographic demo S90 that zooms across the room and pops its chassis the way the real thing pops its bonnet. These experiences are technically impressive, but ultimately facile, too designed and sleek to awe. The more compelling way to see what Hololens can do is to see what people do while wearing it.

Aviad Almagor is the director of the mixed-reality programme at Trimble, a company that – among other things – works with clients to create digital models of buildings. Almagor has worked extensively on developing ways to use Hololens for collaboration, which often involves gathering multiple Hololens wearers in different corners of the planet around a single 3D model. Each individual’s Hololens shows him the same model, as well as avatars of his collaborators. Everyone can move freely around the model as they discuss it.

“When you bring out a virtual model,” Almagor says, “people tend to put it on a table. There is no reason for this – the model can float in midair. But for some reason we need a solid surface to place the model on. And people will always walk around it. They will not cross it. It’s the same with avatars – people will not get too close to an avatar. They keep some kind of distance, like in real life.”

This is an important lesson, because humans do not treat computers like reality. They treat computers like machines.

The Jet Propulsion Laboratory used Hololens to develop an application called Onsight, which uses existing photos of Mars’s Gale Crater to create a fully immersive Martian environment. Earthbound geologists explore as if they were in the field, walking around Mars and examining it with the same facility they have on their home planet.

But in designing Onsight, JPL hadn’t fully accounted for reality. It had designed it as if it were any other graphically intensive application, using a philosophy common to game design: if there’s an object like a hill that blocks a user’s view, why waste computing power rendering what’s behind it? Users can’t see it, anyway. But when JPL gave it to testers, one of the first Martian environments the geologists got to see had the rover, Curiosity, stationed in front of a small hill. The first thing most of them did was dash to the top of it – and what they saw behind it was low-res and ugly.
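The design lesson can be reduced to a simple culling decision: skipping geometry hidden behind a hill only saves work if the viewer can never gain a vantage point over it. The toy function below is entirely hypothetical, a sketch of the trade-off rather than anything resembling JPL's Onsight renderer.

```python
# Toy illustration of the occlusion-culling trade-off: objects flagged
# as hidden behind a hill are skipped only while the hill actually
# blocks the view. Purely illustrative; not JPL code.

def objects_to_render(objects, viewer_height, hill_height,
                      assume_grounded=True):
    """Return the objects worth rendering for this viewer position.

    objects: list of dicts with a boolean 'behind_hill' flag.
    If the viewer might climb (or fly) above the hill, nothing can be
    safely culled.
    """
    can_see_over = viewer_height > hill_height
    if not assume_grounded or can_see_over:
        return list(objects)  # render everything, at full quality
    return [o for o in objects if not o["behind_hill"]]
```

The bug JPL hit corresponds to leaving `assume_grounded=True` for users who could, in effect, teleport to the hilltop.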

“We asked them why they did that,” says Parker Abercrombie, a software engineer at JPL. “And they said, ‘Well, if I was out in the field, the first thing I would do is I’d go to the top of the tallest point and get the lie of the land.’ ”

In 2014, shortly after he’d started at JPL and got to try out Onsight himself, Abercrombie went camping in the California High Desert. Not too far outside Los Angeles, bound up in the larger province of the Mojave, the landscape of the high desert is a monochrome of low brush, dirt and rock. Ringed by the tall, dry landscape – the San Bernardino Mountains, the San Jacintos, the Granite and Providence ranges off in the distance – Abercrombie took in the distant ridgelines with a pang of familiarity. This feels so much like Gale Crater, he thought. Never mind that he had never been there: because of Onsight and a grey headset, out in the California desert Abercrombie wasn’t recognising Mars the way you recognise a place you’ve seen only in pictures. He was remembering it.

“EVER SINCE WE invented machines – even before computers – we’ve bent over backwards to be able to speak and communicate and give them instructions,” says Dav Rauch. “We’ve learnt the languages of the machines that we have created. We’ve been wrapped around their finger.”

Rauch is a creative resident and senior design lead at IDEO, the global design firm. He got his start in movies, designing interfaces for futuristic technology. If you saw scientists standing at their computers in Avatar or Tony Stark looking through his suit’s headset display in Iron Man, you’ve seen his work. His point is that typing on a keyboard is about as far from natural human communication as you can get – but we had to develop it to utilise computers effectively. The complexity of our interactions with computers grew for most of their history – punch cards, then a keyboard and monitor, then a keyboard, monitor and mouse – but is now dissolving into gestures, voice and gaze. The latter three, of course, are how we communicate with each other.

“I think the point we’re finally getting to – starting with gestures, and we’re going to see it more completely with virtual and augmented-reality environments – is where user interfaces disappear,” says Rauch.

Kipman, for his part, believes the future is easy to see. “If you create technology that removes the interface from technology,” Kipman claims, “the species will evolve.”

The early days of augmented reality suggest this is true. In 1992, a doctoral candidate from Stanford University named Louis Rosenberg noticed its evolutionary potential when he was working on a US Air Force–funded project to help surgeons perform surgeries remotely, with robotic arms. Rosenberg came up with the idea for what he called virtual fixtures, digital aids that could help surgeons make more accurate incisions. If a needle needed to be injected into a patient in a precise location, Rosenberg’s system could make a virtual cone out of visual and vibrational feedback that would funnel the needle tip to the right spot. Or suppose a surgeon had to make a cut that would be lifesaving if one centimetre deep, but nick an artery if even one millimetre deeper. It would be helpful to build a depth stop, like you would for a table saw. Except it’s hard to build a depth stop for a cut inside a human body. But what if the depth stop were virtual?
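The depth-stop idea can be sketched in a few lines: clamp the tool's commanded depth at a hard limit, and ramp up a feedback signal as it approaches. This is a minimal, hypothetical sketch of the concept, not Rosenberg's actual system; the limit, warning band and feedback scale are invented numbers.

```python
# Sketch of a "virtual fixture" depth stop in the spirit of Rosenberg's
# idea: the commanded depth is clamped at the safe limit, and a
# haptic-feedback level ramps from 0 to 1 inside a warning band just
# before the limit. All values are illustrative.

def depth_stop(commanded_depth_mm, limit_mm=10.0, warning_band_mm=1.0):
    """Return (allowed_depth, resistance) for a commanded tool depth.

    resistance is a 0..1 feedback level: 0 far from the limit, rising
    linearly inside the warning band, 1 at the (clamped) limit.
    """
    allowed = min(commanded_depth_mm, limit_mm)
    if allowed <= limit_mm - warning_band_mm:
        resistance = 0.0
    else:
        resistance = (allowed - (limit_mm - warning_band_mm)) / warning_band_mm
    return allowed, resistance
```

The clamp is what makes the fixture feel like a table-saw depth stop: however hard the surgeon pushes, the virtual layer simply refuses to pass the limit.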

“You could get to a level of suspension of disbelief where you didn’t know what was the real information and what was the virtual information,” Rosenberg says. “And, in fact, when people were working in the virtual fixtures system, if there was a virtual cone that they could feel, they were going to rely on that cone as if it were real. They were basically, in their mind, merging their perception of those two spaces, a merger of real and virtual information; the boundaries between those two really didn’t matter to them.”

When Rosenberg ran standard performance tests on people using virtual fixtures, he found their performance increased by 70 per cent. He’d given them superpowers, simply by harnessing their tendency to treat the virtual as the real.

The trouble was that virtual fixtures required massive amounts of hardware. Rosenberg had built a room-size rig of robot controls, goggles and monitors. Hololens fits atop a human head. On Mars, the Onsight team came to realise they could do better than simply rendering the landscape behind the hills, so people could run up them with abandon. They’re giving the geologists the ability to fly. What better way to get the lie of the land? And having seen geologists constantly struggle to orient themselves, they’ve already added another new feature. In Onsight’s virtual re-creation of Mars, any time a geologist looks up into the Halloween-coloured sky, he’ll see the numbers of an azimuth ring, floating, pointing the way north.

“MAN, NO, I’M completely not happy with it,” says the man who invented Hololens.

Hololens is the most beguiling machine thus far produced, on the cutting edge of sensory processing, image technology and sheer audacity of vision. But it is large. It is heavy and wearies the neck in just a short period of time. The batteries die too fast and the field of view is limited and the holograms themselves lack the acuity of the sharpest 3D graphics, not to mention of the real world.

“It’s the only fully untethered holographic computer and it’s a jewel and achievement of computing, but it’s incomplete,” Kipman says. What Hololens is is a harbinger, an inflection point. “It’s the equivalent of throwing a blanket over the real world, which is our spatial map. It creates a mesh of the real world that doesn’t know the difference between a human, a couch, the floor, the ceiling, anything like that,” Kipman says. “Which is epic, because it does it in real time at frame rate and nobody’s anywhere near that kind of technology. But it’s the beginning of a journey.”

When Kipman thinks back to Brazil, he can see Hololens is but a few steps down the road towards the ultimate goal.

“In Kinect, the machinery exists in the environment,” Kipman says. “It’s plugged in and tethered underneath your TV. You don’t wear anything. And yet it understands humans. In Hololens, the machine is on the human and the human is ambulatory, walking around. We didn’t talk at all about what happens when you put the machinery on objects. The same mixed-reality understanding, when applied to an object, gets you robots and self-driving cars. If you imagine over time the proliferation of this machinery – existing everywhere, becoming ubiquitous – then to some extent the only time you need the machinery on the human is in an environment that doesn’t contain the machinery.”

Kipman suggests holograms emanating from the home, the office, the bus. Who needs a holographic computer on his head when holographic computers are everywhere? In some not-distant future, humankind will be a race of supermen, bone machines enabled on all sides by digital machines that understand us. And it will not stop there.

Jason Alan Snyder, a futurist whose patents shaped aspects of Google Glass – a forerunner, in a way, of Hololens – is one of many developers working on experiences for Hololens. He works for a marketing agency, focusing on what he calls digital sampling, the ability to test products and experiences virtually before buying. But he sees past that. He thinks technology is at a place where we can begin to transcend language. He described an experiment in which subjects from different cultures greeted each other by thinking of a greeting in their own language. They wore EEG helmets to read their brain waves, and when a computer detected a thought forming – say, a greeting – in one person’s brain, it triggered a phosphene – a brain-produced optical artefact that looks like a bright light seen in peripheral vision – in another.

“By thinking that greeting, a person on the other side of the world would see that greeting,” Snyder says. “If we could direct that into one of these AR [augmented reality] devices, like Hololens, that would be tremendous.”

Mind-to-mind communication sounds like science fiction. But then, so do holograms.

IN THE BASEMENT of Building 92 on Microsoft’s Redmond, Washington, campus, in a room dressed as an office at NASA’s Jet Propulsion Laboratory, I got to try Onsight myself. I put on a Hololens and, with the click of a button, a Microsoft guide made the landscape of Mars, the red planet, appear all around me in remarkable 3D. I paced forward. Mars. Looked down at my feet. Mars. I panned in a circle, scanning the entire landscape. As I turned to look back over my left shoulder, a giant shape loomed over me and I jumped back, startled. It was the Curiosity rover.

The guide called me over to the desk in the office. The monitor showed the pictures – actually taken by Curiosity – that had been stitched together to create the landscape. I could click on a rock in a picture and a flag would appear staked in the ground of the room, where I could walk up to it and take a closer look, in three dimensions.

One of the flags I planted was on a rock that, in 2D, seemed to bulge over the landscape. It begged to be explored. “Good choice,” my guide told me. “That’s a pretty interesting rock. We’ve got a lot of pictures of it,” she said. “Come look.” I walked over to the rock. It did indeed bulge over the landscape. I could almost feel it next to my leg. Its overhang cast a shadow on the rocks below it. “We even know what the bottom of it looks like,” my guide said, subtly beckoning.

I thought of those late nights on Dad’s lap, straining to peer around corners. And with that on my mind, I got down on my hands and knees in the red Martian dust, to have a closer look at the underside of a rock that was a hundred million kilometres away.

With Hololens, NASA geologists can explore the surface of Mars from their offices on Earth.
SPARK is an acronym for the school’s core values of service, persistence, achievement and responsibility. The school was founded by Ryan Harrison (pictured right) and Stacey Brewer in 2013 with Spark Ferndale. Bailey Thomson (pictured above) oversees...
