MRI and CAT scans
The digital signal processing technology NASA developed to enhance pictures of the Moon during the Apollo program lives on in MRI and CAT scanners. It also survives in ACTIS, the Advanced Computed Tomography Inspection System, which scans rocket engines and metal castings to identify flaws, and it has even been used to examine T. rex fossils without removing them from the rock they're embedded in.
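NASA's actual pipeline was far more sophisticated, but the core move in this kind of enhancement, remapping raw sensor values so faint detail becomes visible, can be sketched in a few lines of Python (a toy illustration, not the real algorithm):

```python
def contrast_stretch(pixels, out_max=255):
    """Linearly rescale intensities so the darkest pixel maps to 0
    and the brightest to out_max -- the simplest enhancement step."""
    lo, hi = min(pixels), max(pixels)
    if hi == lo:
        return [0] * len(pixels)
    return [round((p - lo) * out_max / (hi - lo)) for p in pixels]

# A murky, low-contrast scan line: values bunched between 100 and 140.
murky = [100, 110, 120, 115, 125, 130, 120, 135, 140]
enhanced = contrast_stretch(murky)
print(enhanced)  # the same data, stretched across the full 0..255 range
```

Real medical and industrial scanners layer filtering, deconvolution, and tomographic reconstruction on top, but the principle is the same: the detail is in the data, and processing pulls it out.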
Mylar
Coat some plastic film (biaxially-oriented polyethylene terephthalate, to be precise) with a thin layer of aluminum, gold, or another metal, and you’ve got Mylar, a thin sheet that reflects 99 percent of light, including much of the infrared part of the spectrum. Add another layer of polyethylene, and you’ve got something that’s puncture-resistant and impermeable to gases.
Its uses are myriad. It’s in speakers, headphones, and mics. It’s in keyboards. It’s in floppy disks and cassettes, if you’ve got any hanging around. From pinball machines to the capacitors on your motherboard to the silver blankets emergency services use, it’s there. If you play banjo, it’s there, too.
You’ll also know it as the silver coating of the descent stage of the Apollo lunar module, or the wrapping for the Hubble Space Telescope. Its ability to insulate while being thin and light makes it ideal for spacecraft, which is why NASA developed it.
Actually, its history goes back to the chemical companies DuPont, ICI, and Hoechst in the 1950s. They developed the BoPET film, and Kodak went on to use it as a backing for the 6,000-foot reels of film carried on long-range U-2 reconnaissance flights. NASA's contribution was the metallic coating: in 1960, it launched Project Echo, a passive communications satellite that was really a large balloon. The Mylar spheres left the atmosphere on top of a rocket, then inflated to 100 feet in diameter. They were passive, serving only to reflect the microwave transmissions beamed at them.
There was one problem with them: because they were so large and light, they were measurably pushed off course by the pressure of sunlight itself. That's something NASA has been keen to take advantage of with its experimental technology demonstrator Sunjammer, and other space agencies are in on the idea, too, such as Japan with IKAROS: huge reflective sheets that harvest the momentum of photons streaming from the Sun as a means of propulsion.
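The physics behind that propulsion is easy to put numbers on. A back-of-the-envelope sketch, assuming a perfectly reflecting flat sail at Earth's distance from the Sun and an IKAROS-scale area of roughly 196 square metres (round figures, not official specs):

```python
SOLAR_CONSTANT = 1361.0   # W/m^2, sunlight intensity at 1 AU
C = 299_792_458.0         # m/s, speed of light

def sail_thrust(area_m2, reflectivity=1.0):
    """Radiation-pressure force on a sail facing the Sun.
    A perfectly reflecting sail receives twice the photon momentum flux."""
    return (1 + reflectivity) * SOLAR_CONSTANT * area_m2 / C

# IKAROS's square sail is roughly 14 m on a side (~196 m^2).
thrust = sail_thrust(196)
print(f"{thrust * 1000:.2f} mN")  # on the order of a couple of millinewtons
```

A couple of millinewtons sounds feeble, but it costs no propellant and never switches off, so it adds up over months of flight.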
Dustbuster
Every time you suck the dust out of your PC’s case with a handheld vacuum cleaner, you’re following in the footsteps of astronauts who collected samples from the Moon.
It all comes down to a cordless power drill designed for use on the Moon, which had to bore into rock while consuming as little power as possible. While most of the samples astronauts collected were picked up from the surface, subsurface soil was also dug up from depths of up to 10 feet. Since carrying huge numbers of batteries around wasn't an option, the drill needed its own power source and had to be as small, light, and powerful as possible.
Black & Decker worked with NASA on the problem, eventually coming up with a computer program that optimized the design of a magnetized motor for minimal power consumption. All the R&D carried out was folded back into Black & Decker's product line, resulting in a range of cordless appliances, and leading to the launch of the Dustbuster in 1979. It sold a million in its first year, and more than 100 million since then.
Dust is a problem in space, so the tech came full circle when NASA put out a call to universities in 2019 asking for ideas for keeping abrasive Moon dust off surfaces and spacesuits. A team from the University of Colorado Boulder came up with an electron beam which, when applied to dust in a vacuum, charges the grains so that they leap away from the surface they're sticking to. So, in 20 years, we'll be cleaning our PCs with electron beams.
Virtual reality
The 1980s were a good time for virtual reality. There was investment from games companies and the military, and a dream of a space station with a virtual environment, in which astronauts lay on couches and used VR to control robots to do all the work.
This didn’t happen. The ISS isn’t a gleaming white sci-fi utopia, but a cramped and occasionally leaky conglomeration of modules, nationalities, and technologies. The tech of the 1980s proved to be unsuited to the task – the helmets were too heavy, early touch interfaces didn’t work, and computers were too slow to create what was needed.
Following a re-think, equipment such as the AERCam Sprint – a free-flying camera powered by nitrogen thrusters and carrying enough battery for seven hours' flight – was steered using laptop screens rather than through a VR headset. But NASA's attempts to create a robot to work alongside astronauts have seen it collaborating with Sony: PlayStation VR was the platform for Mighty Morphenaut, a training game designed to help humans operate the Robonaut 2 robots, which look like a terrifying amalgam of Bishop from Aliens and a Cylon. There have been teething problems, however, especially due to lag.

"The hope is that by putting people in an environment where they can look around and move in ways that are much more intuitive than with a mouse and keyboard, it would require less training to understand how to operate the robot and enable quicker, more direct control of the motion," said Garrett Johnson, a software engineer at NASA's JPL, in a 2016 interview. "[With Mighty Morphenaut] we were able to explore a possible solution and … demonstrate the problems of operating with delayed communication. With time delay, anticipating the motion of a floating object makes it nearly impossible to interact, so further research might include ways to help users predict that kind of motion."
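Mighty Morphenaut's internals aren't public, but the simplest form of the prediction Johnson hints at is dead reckoning: assume a free-floating object keeps its last observed velocity and extrapolate across the communication delay. A minimal sketch, with made-up telemetry values:

```python
def predict_position(pos, vel, delay_s):
    """Dead-reckoning guess at where a free-floating object is *now*,
    given its last telemetry (position, velocity) observed delay_s
    seconds ago. In microgravity, with no contact forces, constant
    velocity is a reasonable short-horizon model."""
    return tuple(p + v * delay_s for p, v in zip(pos, vel))

# Last telemetry: a tool drifting at 2 cm/s along x, seen 3 s ago.
last_pos = (1.00, 0.50, 0.20)   # metres
last_vel = (0.02, 0.00, 0.00)   # metres per second
print(predict_position(last_pos, last_vel, 3.0))  # roughly (1.06, 0.5, 0.2)
```

The hard part, as the quote suggests, is everything this model ignores: tumbling, collisions, and the operator's own delayed commands, which is exactly why it remains a research problem.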