TechLife Australia

More scientists who changed the tech world




If you’ve coded your own apps, you’ll have enjoyed the fruits of Grace Hopper’s pioneering work in program compilation (she was included in our first ten scientists in TechLife #57), turning high-level English-like code into processor-specific machine language. Since the early days of computing, however, programmers and engineers have looked for ways to improve computer performance. Increasing CPU speed is always one (expensive) option, but another is ‘compiler optimisation’, a branch of computer science that looks to make compiled machine code more efficient and faster to execute. Frances Allen, a mathematician and computer scientist, was a pioneer in compiler optimisation research, developing techniques with collaborator John Cocke that are still in use today. Her work was recognised in 2006, when she became the first woman to receive the prestigious Turing Award. In its citation for Allen, the Association for Computing Machinery (ACM) reported that her inspiration for becoming a mathematician was her high-school maths teacher. Given the current global shortage of maths teachers, you can’t help but wonder how this is affecting the choices many students are making about future careers. If you’re a high-school maths teacher, keep up the good fight!
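Allen’s optimisations operated on compiled machine code, but the basic idea is easy to sketch at the source level. Here’s a minimal Python illustration (function names are our own, purely for illustration) of one classic technique, loop-invariant code motion, where a calculation whose result never changes inside a loop is hoisted out of it:

```python
import math

# Unoptimised: math.sqrt(limit) is recomputed on every iteration,
# even though its value never changes inside the loop.
def scale_naive(values, limit):
    return [v / math.sqrt(limit) for v in values]

# Hand-optimised: the loop-invariant expression is hoisted out of
# the loop, so it is evaluated only once. An optimising compiler
# performs this kind of transformation automatically.
def scale_hoisted(values, limit):
    factor = math.sqrt(limit)  # computed once, before the loop
    return [v / factor for v in values]
```

Both functions produce identical results; the second simply does less repeated work, which is exactly the kind of saving compiler optimisation delivers without the programmer lifting a finger.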


Reynold Johnson was a high-school science teacher when the court system landed him with two youths convicted of stealing a radio from his school. Turning what could have been a negative experience into a positive one, Johnson had them work on his project to build the world’s first automated test-scoring machine, which read cards marked with standardised multiple-choice answer options. IBM picked up the idea and hired Johnson in 1934, and the IBM 805 Test Scoring Machine appeared in 1937. By the 1950s, he was running IBM’s San Jose lab where, according to Johnson himself, his IBM bosses gave him just two guidelines: keep the head count to around 50 and work on things no one else in IBM was working on. His decision to look into random-access storage proved a masterstroke when he came up with the idea of using laminated steel disks to store data, rather than the tape or wire in use at the time. The IBM 305 RAMAC computer system was released in September 1956 and featured the IBM 350 Disk Storage unit, the world’s first hard disk drive, capable of storing roughly 3.5MB. Later, while seconded to Japanese giant Sony in 1971, he invented the videocassette, cutting the standard one-inch (2.54cm) videotape strip in half and fitting it into an easier-to-handle ‘cassette’ cartridge. The half-inch strip size stuck and later became the standard for Betamax and VHS cassettes. By the time of his death in 1998, Johnson had amassed over 90 patents.

Being able to write your own computer code is fast becoming a prerequisite for a whole range of careers, including some you’d probably never imagine; for example, the University of Technology, Sydney (UTS) now offers tech design courses to its law students. One of the most popular coding concepts or ‘paradigms’ today is object-oriented programming (OOP), where the primary focus is on objects as distinct entities. For example, you can consider ‘fruit’ as an object ‘type’ that has a series of features or ‘attributes’ such as size, colour, taste and so on.

Barbara Liskov completed a degree in mathematics at the University of California, Berkeley in 1961 and later became one of the first women to receive a PhD in computer science, from Stanford University. In 1987, she presented a research paper on ‘Data Abstraction and Hierarchy’ that detailed one of the now-cornerstone tenets of OOP, called ‘subtyping’. Put simply, if computer code contains an object of a particular type, say ‘fruit’, that object should be able to be substituted with any derived ‘subtype’, such as ‘apple’ or ‘lemon’, without causing programming errors. The concept has since been dubbed the ‘Liskov Substitution Principle’. For this and other important work in the area of distributed programming, Liskov was awarded the prestigious Turing Award by the ACM in 2008.
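The fruit example above translates directly into code. Here’s a minimal Python sketch of the Liskov Substitution Principle (the class and method names are our own, for illustration): any function written against the base type should work unchanged when handed a subtype.

```python
# Base type: code elsewhere is written against this interface.
class Fruit:
    def __init__(self, colour: str):
        self.colour = colour

    def describe(self) -> str:
        return f"a {self.colour} fruit"

# Subtypes: each can stand in wherever a Fruit is expected.
class Apple(Fruit):
    def describe(self) -> str:
        return f"a {self.colour} apple"

class Lemon(Fruit):
    def describe(self) -> str:
        return f"a {self.colour} lemon"

# This function only knows about Fruit, yet any subtype can be
# substituted without causing errors: Liskov's principle in action.
def label(fruit: Fruit) -> str:
    return fruit.describe()
```

Calling `label(Apple("red"))` or `label(Lemon("yellow"))` works exactly as `label(Fruit("green"))` does, which is the substitutability Liskov formalised.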


Next time you make a call from your mobile phone, spare a thought for Martin Cooper. Before 1983, the only way to make a genuinely ‘mobile’ call was to use a car-phone which, at the time, seemed as much a means of carrying the sizeable amount of electronics required as a semi-portable device (the first car phone, in 1948, lugged around 36kg of electronics). Cooper was head of the communications systems division at US electronics giant Motorola when he developed the idea of a hand-held phone, demonstrating it and making the first call on a prototype in 1973.

It took a further 10 years of development before Motorola launched the first commercial cell phone, the DynaTAC 8000X, in late 1983. Why so long? The tech simply didn’t exist. Today’s phones are built around ‘System on a Chip’ (SoC) processors, but in 1973, none of the integrated circuits (ICs) we take for granted today existed (Intel had launched the 4004 CPU only two years earlier) and everything had to be developed from scratch. As it was, the DynaTAC 8000X reportedly cost US$3,995 at launch, took 10 hours to charge and gave you 30 minutes of talk-time. It’s pretty amazing just how far mobile phones have come since then, but the first of anything is always the toughest to make.


By all reports, everyone who saw it was gobsmacked, so much so that it’s still referred to today as the ‘Mother of All Demos’. Douglas Engelbart’s demonstration of his oN-Line System (NLS) at the Fall Joint Computer Conference on 9 December 1968 showed his dream of how computers could revolutionise the way we use information. It included the first example of teleconferencing and shared editing, but it also featured a hand-held control device that let him manipulate and highlight text on a screen. What was so impressive about Engelbart’s demo wasn’t just the fact that he’d developed the first computer mouse, but that he’d worked out how to make such excellent use of it. The device, with a push-button switch and an X-Y arrangement of two wheels underneath to set the cursor position, seemed simple enough, yet the overwhelming majority of the thousand or so people present knew they had just seen the future: they reportedly gave him a standing ovation at the demo’s end.

Engelbart, with the help of associate Bill English, built the first mouse back in 1964 at the Stanford Research Institute and received the patent for the invention in 1970. Tech company Xerox would release the Alto, the first computer to come with a mouse, in 1973, but it wasn’t until the 1980s that the mouse achieved wide public acclaim, as Apple launched the Lisa personal computer and Microsoft added mouse support to its Word word-processor.


One of the chief reasons mobile phones have been able to evolve into the personal computers we carry today is an invention by Japanese electrical engineer Fujio Masuoka. As the computer market grew during the 1970s, the need for the components that made computers work grew in parallel. One of those components was dynamic random-access memory or ‘DRAM’, but the problem with DRAM is that it only retains data while it’s powered up: turn off the power and the data is lost. Masuoka’s 1980 invention of ‘flash’ memory, which retains data even when powered off, would eventually revolutionise computer storage, heralding the arrival of ‘flash cards’, USB flash drives and solid-state drives (SSDs). What’s just as intriguing about this story is that Masuoka’s then-employer, Toshiba, apparently wasn’t particularly interested in his invention. In fact, it’s reported that his presentation of flash memory at the IEEE International Electron Devices Meeting in the US in 1984 prompted chipmaker Intel to kickstart its own flash storage efforts, launching its first product, aimed primarily at cars, in 1988. Toshiba eventually released flash storage based on Masuoka’s invention in 1989, although you could argue the horse had bolted by that stage. Nevertheless, digital cameras, personal digital assistants (PDAs) and MP3 players drove flash memory into the mainstream in the late 1990s, and today the global flash memory market is valued at over US$50 billion a year.


Growing up in the 1970s and ’80s, almost every kid had a Kodak Hawkeye camera, loaded with either a 126 or 110 film cartridge. They were the original ‘point and shoot’ cameras. The problem was that film was an ongoing cost: each cartridge could hold no more than 24 shots, and developing it meant yet another outlay. Still, that didn’t really matter in those days, as Kodak owned the market and there was no real alternative to film. However, it was during this era that the seeds of film’s demise were sown. In fact, it was at Kodak itself that the world’s first digital camera was developed, by then-25-year-old electrical engineer Steven Sasson, in 1975. The prototype was fashioned from various parts, including a new ‘charge-coupled device’ (CCD), a lens from a Super-8 movie camera and multiple circuit boards. It weighed 3.6kg, stored black-and-white images on cassette tape and the image size was all of 100 x 100 pixels. Nevertheless, it proved the concept of digital photography, and it would only be a matter of time before the technology evolved into a practical and commercially viable product. Sasson was awarded a patent for his design in 1978, but was reportedly forbidden from talking about or showing the camera to the outside world. The patent eventually earned Kodak millions, but being so wedded to film, the company failed to fully profit from the lucrative digital camera market and filed for bankruptcy in 2012. It emerged from bankruptcy in 2013, focusing on commercial photography, yet in January 2018, Kodak surprised many by announcing the launch of its own cryptocurrency, KodakCoin.


Head off overseas and you join the throng of commercial aircraft flying around the globe, tracked using one of the many forms of radio detection and ranging, or ‘radar’. Radar owes its development to the looming threat of war in the mid-1930s. British scientists Robert Watson-Watt and Arnold Wilkins proved in 1935 that pulses of transmitted high-frequency radio waves were reflected back by aircraft in their path. The concept became the secret weapon known as ‘Chain Home’, a string of early-warning radar stations dotted around the south-east of England, which played a major role in foiling the German Luftwaffe during the Battle of Britain in late 1940. Yet even before Chain Home’s successful exploits, two other British scientists, Sir John Randall and Dr Harry Boot, had developed earlier that year what is still considered one of the most important inventions of World War II. At the time, the radio waves used for radar were still low in power and frequency, which limited both radar range and detection precision. The device created by Randall and Boot, the ‘cavity magnetron’, was forged from a solid block of copper, making it all but indestructible, and it instantly produced much higher-frequency pulses at ten times the power of anything then available. Engineers including Alan Blumlein (also in our first 10 scientists) would later develop new radar systems using the cavity magnetron that helped win the war. The story of the magnetron is one of simultaneous secret research in a number of countries during the 1930s, but it’s generally accepted that Randall and Boot’s efforts produced the most significant advances and the first easily reproducible model. In peacetime, the cavity magnetron powered commercial radar systems for decades, yet for sheer numbers, its greatest success has been finding its way into kitchens since the 1970s, re-heating last night’s pizza in the humble microwave oven.


It’s often said that necessity is the mother of invention, but invention can start much earlier, with inspiration. For Frances Allen, the path to compiler optimisation began with a high-school maths teacher. Reynold Johnson was himself a high-school science teacher who took in two wayward kids and had them learn while building the first automated test-scoring machine in the early 1930s.

That teachers are often the first to inspire students towards particular career paths makes Australia’s current shortage of qualified maths and science teachers all the more concerning. In June 2017, a parliamentary committee in Canberra heard that in 2011, only one in five Year 4 students was taught maths by someone trained in the area, and just one in six when it came to science.

There’s no doubt teaching these days is a tough gig, but the positive influence a good maths or science teacher can have as a role model for students means governments urgently need to address the shortage of trained STEM teachers. Australia’s future innovation potential depends on it.


Engelbart’s 1964 computer mouse. (Source: Michael Hicks, CC-BY 2.0)
Cell phone inventor Martin Cooper. (Source: Rico Shen, CC-BY-SA 3.0)
A cavity magnetron block from 1940. (Source: John Cummings, CC-BY-SA 2.0)
A magnetron from a microwave oven (waves come from the top anode).
The IBM 350 Disk Storage system, invented by Reynold Johnson.
Block diagram of Sasson’s ‘Electronic Still Camera’. (Source: Google Patents)
Computer scientist Frances Allen. (Source: Rama, CC-BY-SA 2.0 FR)
Reynold Johnson invented the videocassette that later evolved into the VCR.
