THIRTY YEARS LATER...
A robot just made me french fries. Delicious ones: they cooked for four minutes less than the instructions dictated. One minute less, they'd've been soggy. A few more, burnt. An eagle-eyed artificially intelligent oven made the timing and temperature calls. My contributions: Arrange fries on tray, slide tray into oven, acquire ketchup. (Invent ketchup drone?)
That smart cooker, which we’ve nicknamed the Millennial Oven, perfectly follows the trajectory of innovation over the past 30 years. It’s built around bedrock technologies—convection cooking, image recognition, microprocessors, compact cameras, wireless radios—but elevated by the addition of an anyone-proof interface.
We’ve seen this story time and again since 1988, the year Popular Science editors first anointed 100 products as the Best of What’s New. The cultural shift over those years is remarkable.
Thirty years ago, science and tech were the domains of enthusiasts: audiophiles, mechanics, and IT gals MacGyvering together the components for makeshift local networks. Today, specialized ideas—like printing wirelessly or blasting into space—have rocketed into the mainstream. Because of that shift, by today’s standards, many of the first BOWN winners are just plain wonky.
Two classes of product dominated those early years. First, you have the stuff that makes the other stuff work, the underlying technology: proof-of-concept wireless internet, early neural-network computers. Then you have devices defined by nuanced improvement or a single attention-grabbing feature, which, if we’re honest, most folks neither wanted nor understood.
Consider Panasonic’s PV-4826 VCR from 1988. A combination video-cassette recorder and answering machine, the $470 (that’s $938.30 in 2017 money) deck let owners call in to program recordings using touch-tone key codes. Useful? Yes. Cool? Sure. Kludgy as hell? Most definitely.
Fast-forward (sorry) 30 years, though, and the ability to remotely cue recordings is still impressive, a feature you’ll find only on more-advanced set-top boxes. These days, we access our DVRs over cable, DSL, or fiber-optic hookups instead of phone lines; we set programs via app instead of touch-tone, and record Game of Thrones onto hard drives instead of magnetic tapes. Even cord-cutters tap the same back-end technologies to stream or download a binge-bender’s worth of episodes from Amazon, Netflix, or YouTube.
For the person doing the watching, the difference between then and now lies in the smoothness of the process. It would take until the late '90s for technology to finally work well—and to do so for everyone. Palm Pilots and iMacs and Motorola StarTACs weren't objects people put up with in their offices or homes because they had to; they were things folks wanted to and could use, free of excruciating early-adopter nonsense. Technology was subculture-turned-zeitgeist, with nowhere to go but everywhere.
Think of this tidal shift as the ascension of the user experience, or the democratization of innovation. But the sum total of the past three decades is the same: It's up to us to perfect products, or to decide when perfection is achieved. That truth is borne out again and again, no matter the field of endeavor. Where NASA once dominated space, we now have private enterprises like SpaceX, Bigelow Aerospace, Virgin Galactic, and Blue Origin. Where AIs once did their thinking only in university labs, we now interact with the simple interfaces of voice-recognizing Google assistants and face-detecting security cameras as casually as we'd chat with a coworker. And we sprinkle this once-rarefied gear throughout our homes and offices as freely as radios and lamps.
Technology, once impenetrable, has become the wrecking ball that breaks down barriers. We can down malicious drones. We can teach our kids to build robots. We can blast ourselves to Mars. And, while we’re at it, we can make the capsule pretty damn comfy.