More scientists who changed the tech world
IN THIS THIRD INSTALMENT, TECHLIFE LOOKS AT THE SCIENTISTS BEHIND THE INVENTIONS THAT CHANGED THE TECH WORLD, AND GAVE YOU THE GADGETS WE NOW TAKE FOR GRANTED.
If you’ve coded your own apps, you’ll have enjoyed the fruits of Grace Hopper’s pioneering work in program compilation (she was included in our first ten scientists in TechLife #57), turning high-level English-like code into processor-specific machine language. Since the early days of computing, however, programmers and engineers have looked for ways to improve computer performance. Increasing CPU speed is always one (expensive) option, but another is what’s known as ‘compiler optimisation’, a branch of computer science that aims to make compiled machine code more efficient and faster to execute. Frances Allen, a mathematician and computer scientist, was a pioneer in compiler optimisation research, developing techniques with collaborator John Cocke that are still in use today. She was recognised for her work in 2006, becoming the first woman to receive the prestigious Turing Award. The Association for Computing Machinery (ACM) noted in its citation for Allen that her inspiration for becoming a mathematician was her high-school maths teacher. Given the current global shortage of maths teachers, you can’t help but wonder how this is affecting the choices many students are making about future careers. If you’re a high-school maths teacher, keep up the good fight!
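To get a feel for the kind of transformation an optimising compiler performs, here’s a minimal Python sketch of one classic technique, loop-invariant code motion, done by hand. The function names and figures are illustrative only; a real compiler applies this automatically to the machine code it generates.

```python
# Two functionally identical routines. An optimising compiler would
# transform the first into something like the second, hoisting the
# loop-invariant calculation (1 + tax_rate) out of the loop so it is
# computed once instead of once per iteration.

def total_naive(prices, tax_rate):
    total = 0.0
    for p in prices:
        # (1 + tax_rate) is recomputed on every pass through the loop
        total += p * (1 + tax_rate)
    return total

def total_optimised(prices, tax_rate):
    factor = 1 + tax_rate  # hoisted: computed exactly once
    total = 0.0
    for p in prices:
        total += p * factor
    return total

prices = [10.0, 20.0, 30.0]
assert total_naive(prices, 0.1) == total_optimised(prices, 0.1)
```

The result is identical either way; the optimised version simply does less work per loop iteration, which is exactly the kind of saving Allen and Cocke’s techniques hunt for at machine-code level.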
Reynold Johnson was a high-school science teacher when the court system landed him with two youths convicted of stealing a radio from Johnson’s school. Turning what could have been a negative experience into a positive one, Johnson had them work on his project to build the world’s first automated test-scoring machine, which read cards marked with standardised multiple-choice answer options. IBM picked up the idea and hired Johnson in 1934, and the IBM 805 Test Scoring Machine appeared in 1937. By the 1950s, he was running IBM’s San Jose lab and, according to Johnson himself, was given just two guidelines by his IBM bosses: keep the head count to around 50, and work on things no one else in IBM is working on. His decision to look into random-access storage proved a masterstroke when he came up with the idea of using laminated steel disks to store data, rather than the tape or wire in use at the time. The IBM 305 RAMAC computer system was released in September
1956 and featured the IBM 350 Disk Storage unit, the world’s first hard disk drive, capable of storing roughly 3.5MB. Later, while seconded to Japanese giant Sony in 1971, he invented the videocassette, cutting the standard one-inch (2.54cm) videotape strip in half and fitting it into an easier-to-handle ‘cassette’ cartridge. The half-inch strip size stuck and later became the standard for Betamax and VHS cassettes. By the time of his death in 1998, Johnson had amassed over 90 patents. Being able to write your own computer code is fast becoming a prerequisite for a whole range of careers, including some you’d probably never imagine: the University of Technology, Sydney (UTS), for example, now offers tech design courses to its law students. One of the most popular coding concepts or ‘paradigms’ today is object-oriented programming (OOP), where the primary focus is on objects as distinct entities. For example, you can consider ‘fruit’ as an object ‘type’ that has a series of features or ‘attributes’ such as size, colour, taste and so on.
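The ‘fruit’ object type described above can be sketched in a few lines of Python. The attribute names and values here are illustrative, not part of any real program:

```python
# A minimal object 'type' with a few 'attributes', in the OOP sense
# described above. Each Fruit instance is a distinct entity carrying
# its own data.

class Fruit:
    def __init__(self, name, colour, taste):
        self.name = name      # attribute: what the fruit is called
        self.colour = colour  # attribute: its colour
        self.taste = taste    # attribute: how it tastes

    def describe(self):
        return f"{self.name}: {self.colour}, {self.taste}"

apple = Fruit("apple", "red", "sweet")
print(apple.describe())  # apple: red, sweet
```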
Barbara Liskov completed a degree in mathematics at the University of California, Berkeley, in 1961 and later became one of the first women to receive a PhD in computer science from Stanford University. In 1987, she presented a research paper on ‘Data Abstraction and Hierarchy’ that detailed one of the now-cornerstone tenets of OOP, called ‘subtyping’. Put simply, if computer code contains an object of a particular type, say ‘fruit’, that object should be able to be substituted with any derived ‘subtype’, such as ‘apple’, ‘lemon’ and so on, without causing programming errors. The concept has since been dubbed the ‘Liskov Substitution Principle’. For this and other important work in the area of distributed programming, Liskov was awarded the prestigious Turing Award by the ACM in 2008.
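The substitution idea is easy to see in a short Python sketch. The class and method names below are invented for illustration; the point is that code written against the base type keeps working when handed any subtype:

```python
# Liskov substitution in miniature: total_juice() is written purely in
# terms of the base type Fruit and never checks which concrete subtype
# it has been given, so Apple and Lemon can stand in for Fruit freely.

class Fruit:
    def juice_yield_ml(self):
        return 50

class Apple(Fruit):
    def juice_yield_ml(self):
        return 80

class Lemon(Fruit):
    def juice_yield_ml(self):
        return 40

def total_juice(fruits):
    # Works for any mix of Fruit and its subtypes.
    return sum(f.juice_yield_ml() for f in fruits)

print(total_juice([Fruit(), Apple(), Lemon()]))  # 170
```

Each subtype honours the base type’s contract (it returns a juice yield in millilitres), which is what keeps the substitution safe; a subtype that, say, returned a string instead would break callers and violate the principle.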
Next time you make a call from your mobile phone, you might want to spare a thought for Martin Cooper. Before 1983, the only way you could make a genuinely ‘mobile’ call was to use a car-phone, which at the time was as much a means of carrying the sizeable amount of electronics required as it was a semi-portable device (the first car phone in 1948 lugged around 36kg of electronics). Cooper was head of the communications systems division at US electronics giant Motorola when he developed the idea of a hand-held phone, demonstrating it and making the first call on a prototype in 1973.
It took a further 10 years of development before Motorola launched the first commercial cell phone, the DynaTAC 8000X, onto the market in late 1983. Why so long? The tech simply didn’t exist. Today’s phones are built using ‘System on a Chip’ (SoC) processors, but in 1973, none of the integrated circuits (ICs) we take for granted today existed (Intel had only invented the 4004 CPU two years earlier) and everything had to be developed from scratch. As it was, the DynaTAC 8000X reportedly cost US$3,995 when it launched in 1983, took 10 hours to charge and gave you 30 minutes of talk-time. While it’s pretty amazing just how far mobile phones have come since then, the first one of anything is always the toughest to make.
By all reports, all who saw it were gobsmacked, so much so that it is still referred to today as the ‘Mother of All Demos’. Douglas Engelbart’s demonstration of his oN-Line System (NLS) at the ACM’s Fall Joint Computer Conference on 9 December 1968 showed his dream of how computers could revolutionise the way we use information. While it included the first example of teleconferencing and shared editing, it also featured a hand-held control device that enabled him to manipulate and highlight text on a screen. What was so impressive about Engelbart’s demo wasn’t just the fact that he’d developed the first computer mouse, but that he’d worked out how to make such excellent use of it. The device, with a push-button switch and an X-Y arrangement of two wheels underneath to track the cursor position, seemed simple enough, yet the overwhelming majority of the thousand or so people present knew they had just seen the future: they reportedly gave him a standing ovation at the demo’s end.
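The X-Y wheel scheme boils down to something very simple: each wheel reports how far it has rolled along one axis, and the cursor position is the running sum of those movements, kept within the screen. This toy Python sketch (screen size and movement figures are purely illustrative) captures the idea:

```python
# A toy model of mouse-driven cursor movement: each reported (dx, dy)
# pair represents how far the two wheels have rolled along the X and Y
# axes, and the cursor accumulates those deltas, clamped to the screen.

def move_cursor(pos, dx, dy, width=1024, height=768):
    x = min(max(pos[0] + dx, 0), width - 1)
    y = min(max(pos[1] + dy, 0), height - 1)
    return (x, y)

pos = (100, 100)
for dx, dy in [(30, 0), (0, -50), (15, 15)]:  # reported wheel movements
    pos = move_cursor(pos, *[dx, dy][0:2])
    pos = pos  # cursor updated after each movement
print(pos)
```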
Engelbart, with the help of associate Bill English, built the first mouse back in 1964 at the Stanford Research Institute and received the patent for the invention in 1970. Tech company Xerox would release the Alto, the first computer
to come with a mouse, in 1973, but it wouldn’t be until the mid-1980s that the mouse achieved widespread adoption, as Apple launched the Lisa personal computer and Microsoft added mouse support to its Word word-processor.
One of the chief reasons mobile phones have been able to evolve into the personal computers we have today is an invention by Japanese electrical engineer Fujio Masuoka. As the computer market grew during the 1970s, the need for the components that made computers work grew in parallel. One of those components was dynamic random-access memory or ‘DRAM’, but the problem with DRAM is that it only retains data while it’s powered up; turn off the power and the data is lost. Masuoka’s invention in 1980 of ‘flash’ memory, which retains data even when powered off, would eventually revolutionise computer storage, heralding the arrival of ‘flash cards’, USB flash drives and solid-state drives (SSDs). What’s just as intriguing about this story is that Masuoka’s then-employer, Toshiba, apparently wasn’t particularly interested in his invention. In fact, it’s reported that his presentation of flash memory at the IEEE International Electron Devices Meeting in the US in 1984 prompted chipmaker Intel to kickstart its own flash storage efforts, launching its first product, primarily for cars, in 1988. Toshiba eventually released flash storage based on Masuoka’s invention in 1989, although you could argue the horse had bolted by that stage. Nevertheless, digital cameras, personal digital assistants (PDAs) and MP3 players drove flash memory into the mainstream in the late 1990s, and today the global flash memory market is valued at over US$50 billion a year.
Growing up in the 1970s and ’80s, almost every kid had a Kodak Hawkeye camera, with either a 126 or 110 film cartridge. They were the original ‘point and shoot’ cameras. The problem was that film was an ongoing cost: each cartridge could hold no more than 24 shots, and developing it meant a further outlay. Still, that didn’t really matter in those days; Kodak owned the market and there was no real alternative to film. However, it was during this era that the seeds of film’s demise were sown. In fact, it was at Kodak itself that the world’s first digital camera was developed, by then-25-year-old electrical engineer Steven Sasson, in 1975. The prototype unit was fashioned from various parts, including a new ‘charge-coupled device’ (CCD), a lens from a Super-8 movie camera and multiple circuit boards. It weighed 3.6kg, stored black-and-white images on cassette tape, and the image size was all of 100 x 100 pixels. Nevertheless, it proved the concept of digital photography, and it would only be a matter of time before the technology evolved into a practical and commercially viable product. Sasson was awarded a patent for his design in 1978, but was reportedly forbidden to talk about or show the camera to the outside world. The patent eventually earned Kodak millions, but, being so wedded to film, the company failed to fully profit from the lucrative digital camera market and filed for bankruptcy in 2012. It emerged from bankruptcy in 2013, focusing on commercial photography, yet in January 2018 Kodak surprised many by announcing the launch of its own cryptocurrency, called KodakCoin.
JOHN RANDALL AND HARRY BOOT
Head off overseas and you join the throng of commercial aircraft flying around the globe, tracked using one of the many forms of radio detection and ranging, or ‘radar’. However, radar owes its development to the looming threat of war, this time in the mid-1930s. British scientists Robert Watson-Watt and Arnold Wilkins proved in 1935 that pulses of transmitted high-frequency radio waves were reflected back from an aircraft in their path. The concept later became the secret weapon known as ‘Chain Home’, a series of early-warning radar stations dotted around the south-east of England, which played a major role in foiling the German Luftwaffe during the Battle of Britain in late 1940. Yet even before Chain Home’s successful exploits, two British scientists, Sir John Randall and Dr Harry Boot, had earlier that year developed what is still considered one of the most important inventions of World War II. At the time, the radio waves used for radar were still low in power and frequency, which meant both radar range and detection precision were limited. The device created by Randall and Boot, called the ‘cavity magnetron’, was forged from a solid block of copper, making it all but indestructible, and it produced pulses at much higher frequencies and at ten times the power of anything available at the time. Engineers, including Alan Blumlein (included in our first ten scientists), would later develop new radar systems using the cavity magnetron that helped win the war. The story of the magnetron is one of simultaneous secret research in a number of countries during the 1930s, but it’s generally accepted that Randall and Boot’s efforts resulted in the most significant advances and the first easily reproducible model. In peacetime, the cavity magnetron powered commercial radar systems for decades, yet for sheer numbers, its greatest success has been finding its way into kitchens since the 1970s, re-heating last night’s pizza in the humble microwave oven.
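The ranging half of radar, the idea Watson-Watt and Wilkins demonstrated, comes down to one simple calculation: the pulse travels to the target and back at the speed of light, so the distance to the target is half the round-trip time multiplied by c. A quick Python sketch (the echo delay used is purely illustrative) shows the arithmetic:

```python
# Radar ranging in one line of maths: a pulse makes the round trip to
# the target at the speed of light, so the one-way range is c * t / 2.

C = 299_792_458  # speed of light in a vacuum, metres per second

def range_from_echo(delay_s):
    """Distance to the target, in metres, given the echo delay."""
    return C * delay_s / 2

# An echo arriving one millisecond after the pulse was transmitted
# puts the aircraft roughly 150km away.
print(round(range_from_echo(1e-3) / 1000, 1))  # 149.9 (km)
```

Higher-power, higher-frequency pulses, which is exactly what the cavity magnetron delivered, let stations detect fainter echoes at greater distances and pin down position more precisely.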
INSPIRED TO INNOVATE
It’s often said that necessity is the mother of invention, but the process can start much earlier, with inspiration. For Frances Allen, the path to compiler optimisation began with a high-school maths teacher. Reynold Johnson was himself a high-school science teacher who took in two wayward kids and had them learn by helping to build the first automated test-scoring machine in the early 1930s.
That teachers are often the first to inspire students towards particular career paths makes Australia’s current shortage of qualified maths and science teachers all the more concerning. In June 2017, a parliamentary committee in Canberra heard that in 2011, only one in five Year 4 students was taught maths by someone trained in the area, and just one in six when it came to science (tinyurl.com/ycwcjbpo).
There’s no doubt teaching these days is a tough gig, but the positive influence a good maths or science teacher can have as a role model means governments urgently need to address the shortage of trained STEM teachers. Australia’s future innovation potential depends on it.