The Press

Worldwide transformation: From steam engines to AI Rabbits

- Mike O’Donnell is a professional director, writer and strategy adviser, and a regular opinion contributor.

This morning, I started the day with a cup of coffee and a rough edit of a corporate video that had been put together with a generative AI tool.

Then I checked out the morning news on Stuff and discovered that it is now using an AI tool to translate stories into te reo Māori, allowing it to automatically become a bilingual news provider.

Then I drove my old 1275 Mini down to the airport, where the barrier arm programmatically recognised my number plate, both capturing my arrival time and allowing me to check out tonight without having to put any dockets into any machines. What’s more, it’s taught itself the weird number plate location on my old car, so the recognition and processing time is a quarter of what it was a year ago.

Then tonight I looked at the sad pile of food in my refrigerator and asked Google Home to suggest a recipe for my dinner.

Turns out you can make something tasty out of old bagels, capsicums, ageing snarlers and some coconut yoghurt.

Barely a blip ago, OpenAI rolled the dice on ChatGPT, thrusting what was essentially a glorified beta into the limelight. Amid a sea of competing intellects, this move wasn’t just bold, it was like betting the farm on a three-legged horse. Yet here we are, tapping into generative AI like it’s going out of style.

While there were plenty of other large language model platforms out there, this courageous call enabled OpenAI to leapfrog the lot.

It also supercharged the process of moving generative AI from the fringe to the mainstream. As a result, we touch it multiple times every day.

Like the printing press of the 16th century and the steam engine of the 19th century, generative AI is a general-purpose technology. That is, it’s a technology that can radically reshape and transform the social, economic, and physical world we live in. Just as the printing press powered the Reformation and the steam engine powered the Industrial Revolution, generative AI is powering the next worldwide transformation.

So it was no surprise that leading futurist Amy Webb made this the focus of her annual presentation last week at the South by Southwest conference in Austin, Texas.

Webb is kind of like the Ziggy Stardust of the futurist set, half rock star, half soothsayer and 100% shadow hugger. I’ve been lucky enough to see her live a few times as she teeters between struggling housewife and manic genius.

After a year of research, she last week outlined how generative AI and two other general-purpose technologies – biotechnology and the connected ecosystem of things – are now converging, resulting in an economic supercycle.

Supercycles are extended periods of booming demand that elevate prices and assets to unprecedented heights.

If you’ve taken a look at your KiwiSaver recently, you’ll probably see you’ve had some pretty stellar results over the last 20 months. The S&P 500 is at record highs, and companies with big exposure to these three general-purpose technologies – companies like Nvidia, Viking Therapeutics and Super Micro Computer – have returned more than 100% in the last six months.

Closer to home, it’s hard to ignore what’s happening with the likes of Rakon, Fisher & Paykel Healthcare and Serko.

But for me, the really interesting part of Webb’s 1000-page report was the move from large language models (which power the likes of ChatGPT, Bard and Llama) to large action models.

While large language models deliver text, images or video based on input prompts, large action models instead focus on understanding actions and orchestrating sequences of actions to accomplish goals or stated objectives.

Think less chit-chat and more action-hero moves, orchestrating tasks like a boss.

Webb gave the example of a large action model built into a cat door. The door not only has facial recognition to let only the family moggy in, but also stops it coming in if it’s got a dead mouse in its mouth. Pretty neat.

More broadly, large action models will likely transform customer support, process automation and retail customer service.

Virtual customer support agents will be able not just to understand a customer’s needs, but to carry out tasks on their behalf. Having got to know a customer’s shopping history, an agent could provide personalised shopping lists and customised tips, like a favourite brand of coffee on sale, or even do a virtual shop and have it delivered at a time it knows you will be home.

Webb predicted that the main generative AI models will have consumed the world’s data some time this year. So the next stage is new data in real time, which is one of the areas where the connected ecosystem of things and biotechnology will mesh.

She predicts a near future where we will be wearing connectables – most likely vision-based – that will enable ubiquitous real-time data collection and AI training. And before you say this is fantasy, jump online and check out Rabbit’s R1 device.

A large action model device about half the size of an iPhone, it’s a vision- and voice-response tool that sucks in data from your day and turns commands into actions via your mobile phone, in real time. The device is only three months old, and its first 10,000 units sold out in a few weeks.

So, what's the takeaway from Webb’s whirlwind tour of tomorrow? Today’s technology is the worst it will ever be. It will only improve. Better hang on.
