Today’s interactive screens may seem like magic, but as Lee Constable explains, the tech is decades old and based on surprisingly simple principles.


Tap for an app, pinch to zoom, slide for more sound: LEE CONSTABLE explains how life as we know it came to be.

LET’S BE HONEST WITH OURSELVES: we were already on phones far too often before COVID-19 swept into our lives. We’re constantly clicking and scrolling, zooming and swiping with our fingertips in a way that has become completely automatic as we seek information, entertainment and connection. Reading and note-taking on phones have become so second nature that I’ve caught myself more than once attempting to scroll down a physical paper page by running my finger along it.

Hard as it is to believe, there was once a pre-touchscreen world. And although Gen Z, who have had no experience of that world, can be forgiven for imagining dinosaurs may have roamed the Earth at the time of the first touchscreen, feebly swatting at it with their tiny T. rex arms, the rest of us (yes, even millennials like me) can dimly recall a time when swiping across a screen was something you only did to clean it.

Touchscreen tech has brought with it a whole new vocabulary – apps, swiping, pinch-to-zoom – and new gestures and behaviours, as we manipulate virtual objects with unremarked-upon ease. So right now, as you put your phone aside to engage with entertainment and information through the millennia-spanning medium of ink on paper, allow me to remind you of the excitement as we take a journey through time and the evolution of touchscreen tech.

THE MOST COMMONLY USED touchscreens today – capacitive screens – are almost as old as Baby Boomers: invented in 1966 for air traffic control radar screens at the UK’s Royal Radar Establishment.

This first iteration worked by running a small current through a transparent sheet of material across the screen, which created a static field. The interference of a fingertip pushing on the screen was all it took to charge the plate below, changing the electric charge – capacitance – at the screen’s corners. Just as each key on a keyboard completes a corresponding circuit when your finger presses down, these early screens relied on the pressure of a finger causing the two layers of material to come into contact with each other, completing a circuit at the point of contact.

As this circuit could be completed with a touch at any location across the screen, the next challenge this early tech overcame was locating that point of contact. The electrical charge at each corner of the screen would vary, depending on the proximity of the circuit closure – so the central computer measured the difference in the charge at each corner of the screen and then calculated the location of the touch.
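For the curious, here’s a toy sketch in Python of how that corner trick might work. The simple linear charge model, the function name and the normalised screen size are all my own illustrative assumptions – not the Royal Radar Establishment’s actual circuitry – but the idea is the same: compare the four corner readings and infer where the finger must be.

```python
# Toy model: locate a touch from four corner charge readings.
# Assumption (illustrative only): the charge drawn at each corner
# grows linearly the closer the touch is to that corner.

def locate_touch(top_left, top_right, bottom_left, bottom_right,
                 width=1.0, height=1.0):
    """Estimate the (x, y) touch position from corner charges."""
    total = top_left + top_right + bottom_left + bottom_right
    # More charge on the right-hand corners => touch further right.
    x = (top_right + bottom_right) / total * width
    # More charge on the bottom corners => touch further down.
    y = (bottom_left + bottom_right) / total * height
    return x, y

# A touch dead centre draws equal charge at every corner:
print(locate_touch(1.0, 1.0, 1.0, 1.0))  # → (0.5, 0.5)
```

The real 1960s hardware did this comparison in analogue electronics rather than code, which goes some way to explaining both its bulk and its sluggishness.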

I can only begin to imagine the excitement that air traffic controllers must have felt when the first radar touchscreen was installed in their workplace. I suspect that excitement may have turned to frustration, however, when they discovered its shortcomings.

It was slow and highly inaccurate, occasionally miscalculating the user’s fingertip location, requiring a second or third prod, or just taking its own sweet time to respond – not ideal features when managing the flow of fast-moving aircraft. It was also bulky. Nevertheless, the principles behind this early touchscreen design form the basis of today’s sleek, smart-device screens.

In 1977 this clunky design was improved upon by the resistive screen. These are the screens that found their way into the interfaces of ATMs. Some may remember mashing their fingertips firmly against these cash-machine screens to make a selection, and while they may have required a bit more pressure than we’re accustomed to now, they were more accurate than their predecessor, which was still being used in radar screens (much to the continued annoyance, I assume, of their users).

A company called Elographics – which is still creating screens today under the name Elo Touch – achieved this feat by incorporating not one but two sheets layered on top of each other, hence the extra pushing needed. Conductive lines were etched into each sheet, horizontally on one and vertically on the other, to create a grid. Each line gave a unique voltage, and as your fingertip prodded the word “Withdrawal” it would also cause the two sheets to press together and their horizontal and vertical lines to touch at the point of impact. The resulting voltage told the central computer which two lines had touched – a kind of digital Battleship game, more mathematical than magical in reality.
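That game of digital Battleship can be sketched in a few lines of Python. The voltage values and labels below are made-up illustrations, not Elographics’ real numbers: the point is simply that each line carries a unique voltage, so the pair of voltages read at a press identifies exactly one row and one column.

```python
# Toy model of a resistive grid: each etched line carries its own
# voltage (values here are illustrative, not from any real ATM).
ROW_VOLTS = {0.1: "row 0", 0.2: "row 1", 0.3: "row 2"}
COL_VOLTS = {1.1: "col 0", 1.2: "col 1", 1.3: "col 2"}

def decode_press(row_voltage, col_voltage):
    """Look up which horizontal and vertical lines just touched."""
    return ROW_VOLTS[row_voltage], COL_VOLTS[col_voltage]

# Pressing the screen shorts one row line against one column line;
# the two voltages that appear tell the computer where you pressed:
print(decode_press(0.2, 1.3))  # → ('row 1', 'col 2')
```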

IN THE 1990S, as mobile phone technology advanced, touchscreen tech also found its way into hand-held devices. In 1993, Apple introduced the Newton, a touchscreen device incorporating a calculator, address book, notebook and calendar all in one. A few years later a young me was enthusiastically stabbing away at the resistive (a little too resistant to the prod of my tiny fingers perhaps?) screen of a school computer and excitedly showing my awe-inspired (or at least humouring me by pretending to be so) parents the wonders of this miraculous technology.

The Newton and other devices like it would eventually merge with the increasingly compact mobile phone. The IBM Simon Personal Communicator, released in the mid ’90s, was a touchscreen device incorporating features similar to those of the Newton but – amazingly – including the capability to text and call. The device is often retrospectively called the first smartphone (although the term was coined much later), but with its chunky dimensions and half-kilogram weight it could also be dubbed the first technological brick.

This might explain why so many of us will remember with fondness our Nokia 3315s and subsequent models of near-indestructible mobile phone. Although they allowed for texts, calls and iconic games like Snake, touchscreen was still a few years off for the fingertips of 2000s teens on a budget (me). Instead, the first adopters of the highly expensive pocket-sized smartphone technology were business executives, poking at their BlackBerrys and similar devices with a pen-like stylus as they responded to urgent emails in the back of taxis, or (tsk tsk) at home in their beds or on their couches.

Fast forward to 2007, when Steve Jobs changed the future of phones by introducing the iPhone to a live audience. It was a touchscreen device that allowed, for the first time, multiple touchpoints to be detected and responded to simultaneously.

In fact, less than a month earlier, the LG Prada had been announced as the very first capacitive touchscreen mobile phone – but the multi-touch capability of the iPhone eclipsed any gadget that came before it. From the stage, Jobs demonstrated to audible gasps of amazement something that has become second nature to toddlers as they reach for screens on their parents’ laps: the two-finger pinch-to-zoom, which I admit I have also absent-mindedly attempted to do on computer screens, resulting only in smudged screens and disappointment.

Although to many the iPhone seemed like it emerged from nowhere, its revolutionary screen actually had its basis in the previous century. iPhone screens – and most other smartphone screens today – are a return to the capacitive principles of that original air traffic control screen, but with a new take on the grid design of early ATM screens.

As Jobs’ thumb and index finger made contact with the iPhone’s screen to execute the world’s first public pinch-to-zoom, here is what the excited onlookers couldn’t see: the surface of the phone was covered in conductive material with an invisible interlocking diamond-shaped matrix mapped out across it.

Instead of sensing the change in capacitance at the corners of an entire screen, each tiny diamond in this charge-sensing matrix is able to act independently, like many four- to eight-millimetre versions of the old radar screen. The diamonds register alterations in charge at the coordinates of thumb and forefinger at the same time and then track their movement across the matrix, signalling the changes in the underlying LCD display as the zoom command is deciphered.
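One more toy sketch makes clear why that matrix was such a leap. Because every cell senses charge on its own, two fingertips simply show up as two separate hot spots in the same frame – no clever corner arithmetic required. The grid size and threshold below are my own illustrative assumptions, not Apple’s actual sensor parameters.

```python
# Toy model of a multi-touch capacitive matrix: every cell reports
# its own change in charge, so several touches register at once.

def find_touches(matrix, threshold=0.5):
    """Return the (row, col) of every cell whose charge change
    exceeds the threshold - i.e. every fingertip on the screen."""
    return [(r, c)
            for r, row in enumerate(matrix)
            for c, charge in enumerate(row)
            if charge > threshold]

frame = [
    [0.0, 0.9, 0.0],   # index finger near cell (0, 1)
    [0.0, 0.0, 0.0],
    [0.0, 0.0, 0.8],   # thumb near cell (2, 2)
]
print(find_touches(frame))  # → [(0, 1), (2, 2)]
```

Run this over successive frames and the distance between the two touch points grows or shrinks as the fingers move – which is all a pinch-to-zoom gesture really is.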

THROUGHOUT THE LATE 2000S AND 2010S, as the sensitivity and resolution of touchscreens have improved, movements like the pinch-to-zoom have become intuitive, and we use smartphones that have similar or identical conventions regardless of brand, allowing us to treat the display on the screen and even our app icons as physical objects that respond to our touch by behaving in what have become predictable ways.

Despite a deluge of upgrades and technological advances, the only major screen change was introduced with the iPhone 5 in 2012, incorporating the multitouch-sensing capacitive layer into the LCD display to make the phone thinner, lighter and cheaper to produce.

So what could possibly come next in this evolutionary tale? Despite the introduction of Gorilla Glass – which promises hardier, smash-resistant screens that are still touch-responsive – almost all of us have experienced the grief of a cracked screen. Flexible, they are not. However, Samsung and Motorola, among others, have recently been trialling a new screen to roll out, almost literally.

The foldable touchscreen is already here, and I wonder how long it will be until we can just scrunch up our smartphones and stuff them into our back pockets like a hastily received receipt or forgotten tissue. I can only hope that advance also comes with all-encompassing waterproofness, given how often that story ends in the laundry.

Despite doing much of the research for this article by touch in self-isolation, lying on my couch with a smartphone screen beaming a glow of information into my face as my fingers prodded, pressed and swiped, I have to admit my drafting process continues to run far more smoothly if I pick up a pen and apply it to paper. And while the paper doesn’t respond to my touch in the way my phone screen does (despite my best instinctive attempts), the touch of pen on paper does something to my central computer – my brain – that no screen seems to match.

Lee Constable hosted the science TV show Scope and has recently started a YouTube channel. She was part of the largest all-female expedition to Antarctica in 2018 and is the author of How to Save the Whole Stinkin’ Planet: a garbological adventure.

The first touch screen was for air traffic control radar in 1966.

Steve Jobs changes the world with a phone and a fingertip at the iPhone’s launch in 2007.

Now called the first smartphone, 1994’s IBM Simon had a touchscreen, PA functions and a battery life of an hour.
