Rotman Management Magazine

The Algorithmic Leader

A new breed of leader is emerging with an ability to use machine intelligence to transform organizations — and reinvent the world. Futurist Mike Walsh explains.

- By Karen Christensen


You have said that in today’s environment, “every company is an algorithmic company, whether it knows it or not.” Please explain.

Mike Walsh: We often assume that only purely digital companies like Google or Netflix can be called ‘algorithmic’ because their technology and infrastructure are all based on data and algorithms. But the reality is that every type of organization, at every scale, will soon live and breathe by its capacity to leverage data, automation and algorithms to be more effective and create better customer experiences. Whether you run a big factory making automotive parts or a small dry cleaner in Brooklyn, your future is likely to depend more on how well you leverage the data and information generated by your activities than on how well you manage the traditional levers of a business.

Because of this, you say leaders need to become ‘algorithmic leaders’. How do you define that term?

An algorithmic leader is someone who has successfully adapted their decision making, management style and creative output to the complexities of the machine age. Algorithms are here to stay. The secret lies in knowing how to lead organizations that use and depend on them. The leaders we’ve looked up to and tried to emulate in recent years were all born and bred for a very different age. They are essentially products of an analog age, where you could predict with a greater degree of accuracy how your business was going to unfold. Imagine: You could actually think about putting together a five-year plan!

Algorithmic leaders are different in three key ways: attitude, hierarchy and tool kit. The attitude of an analog-era leader was about being right out in front, making all the big decisions and taking responsibility. Conversely, the algorithmic leader realizes that the real work is often to design processes around you that enable others to make the right decisions from moment to moment.

In terms of hierarchy, the analog leader sat at the top of a pyramid-style organization; but the algorithmic leader is part of a large network of relationships where hierarchy isn’t as important. These leaders operate within what is more like an organic ecosystem.

Finally, their tool kits are very different. The way an algorithmic leader approaches problems and comes up with ideas is profoundly different, given that we are operating in an age characterized by automation and real-time data. Knowledge in an algorithmic organization lives everywhere — not just where the corporate directory says it belongs. The next great idea that will transform your business could be hidden in your server logs or in the field notes taken by an engineer. That’s why you have to enable your teams to self-manage and let go of the idea that you need to make all the important decisions yourself.

You have said that one of the best examples of an algorithmic leader is Netflix CEO Reed Hastings. Talk a bit about how he personifies this leadership style.

It’s so interesting, because Reed wasn’t always an algorithmic leader. People often forget that Netflix started out by shipping DVDs in the mail. They were actually much closer to a mail-order business than a 21st century media company. But to his credit, Reed has always managed to be at the frontier of experimenting with new business models and new ways of making decisions. Early on, he fully recognized the power of data to inform every aspect of the way his company frames complex problems.

To me, Netflix’s greatest accomplishment is not that it put television onto a streaming platform; it’s that it took the data from that streaming platform and used it to make decisions about which products to offer and what content to produce. That is the real difference between it and more traditional media organizations. Every aspect of how it thinks about audiences and how to plan its global expansion is driven by data and algorithms. When you are capable of knowing precisely what any of your millions of global customers desires at any point in time, how can you not see the world differently? And how can you not want to leverage machine learning and automation to fulfill those needs in a highly personalized way?

Like Reed Hastings, most of us started out as analog leaders. But if we haven’t already, we need to make a conscious decision to adapt and evolve — and recognize that the availability of data and algorithms must change our mindset about creating value for our customers.

While algorithms won’t replace humans, you believe they do increase the responsibility placed on us. How so?

To be clear, an algorithm will never take the place of true leadership. We still need real-life humans who can interpret what the machines are telling us, decide whether those conclusions are appropriate and ethical, and orchestrate the capabilities of machines. The fact is, the most valuable thing a leader can bring to the table in the 21st century is their humanity: their principles, ethics and values.

Sometimes there is an assumption that automation means we won’t need people anymore. But what it really means is that we won’t need people to do things that are easily defined and repetitive. We actually need humans more than ever to deal with more nuanced, complex issues. There are real risks of algorithmic bias and discrimination, and a potential for data to not be truly objective or represent people’s interests. The potential for bias — and even outright evil — is extraordinary right now, which is why, in a way, automation is creating the potential for more complex work for humans to do.

Talk a bit more about the downside of our increasing dependence on algorithms.

Unlike a human being, an algorithm will come to the same conclusion every single time, whether it’s Monday morning or Friday afternoon, before or after lunch, or after the algorithm has handled thousands of similar cases. However, that doesn’t make algorithms impartial judges. Quite the contrary. Algorithms are trained on data that is collected by and about humans. We choose where the data comes from, what success criteria are used and what ‘truth’ looks like. And in doing so, we embed our algorithms with our views, prejudices and biases. Ultimately, they are an expression of us. So, while we may end up making fewer decisions in the future, leaders will need to spend more of their time designing, refining and validating the algorithms that will make those decisions instead.

What does it mean for a company to ‘work backward from the future’?

Last year, I was totally inspired by a visit to Tokyo where I met with representatives from SoftBank. This company is doing some extraordinary things right now. Its founder, Masayoshi Son, has a long track record of having a very strong personal vision of where human life is going to be in 50 years. He asks himself, ‘What kind of technology, platforms and business models will need to exist in order for this future to take place?’ And then he works backwards from there. If he can’t find these things to invest in today, he sets out to create them.

There is often a tendency to be very focused on your current customers. We still talk about the customer being king or queen; but I would argue that the customer who isn’t even around yet is the one you should be focused on. If you don’t set yourself up properly, by the time those customers come into the equation, you will have built a business that is not relevant to them.

You believe that going forward, the greatest business value is likely to come from ‘new algorithmic experiences’. What do these look like?

If we look into a crystal ball at the year 2030, in many ways the physical infrastructure of the world will not be that different. Arguably, cars will still look like cars, and people will probably still be staring at some kind of screen for hours on end every day. Clothing might look a bit different, but essentially, the building blocks of human life will be pretty much the same. However, something will be profoundly different, and that is your experience of living in the world.

In the next 10 years we’re going to see an acceleration of technologies that integrate our preferences to curate experiences and moments that are highly personalized throughout our day. It won’t be just about Netflix and Amazon being able to predict what you want to watch and read; I’m talking about your healthcare provider, your financial services provider, your insurance provider, your utilities provider and the way your home operates. Rather than just being responsive to us, these services will anticipate things before we’ve even had to ask for them. That’s what I mean by an algorithmic experience. It’s where your entire experience with the world is essentially dictated by algorithms.

Describe how your ‘Wheel of Algorithmic Experience’ works.

A useful way to start designing algorithmic experiences is to think about the relationships between three things: intentions, interactions and identity. Intentions are the often unarticulated needs or desires of a user or customer, which can be deduced from their behaviour. Interactions are the method or manner by which you use a platform, product or service. And identity is the cognitive or emotional impact of the experience and the degree to which it has become integrated into a participant’s sense of self.

All three elements are connected and self-reinforcing. Anticipating a user’s intentions allows you to create more natural interactions, such that the system itself becomes an extension of their identity. And the more an algorithm influences someone’s behaviour, the more it can anticipate their future intentions, making interactions more effortless, and so on.

What are the dangers within algorithmic experiences?

Like video games or slot machines, algorithmic experiences can be designed to manipulate human behaviour by ‘weaponizing’ the reward loops in our brains that lead to addiction. Any time you are dealing with systems that learn to be progressively better at influencing behaviour, the risk for abuse is high.

For most organizations, their customer of 2030 is alive today. What are the implications of this?

Worrying about Millennials has become synonymous with trying to engage with the future. But I find that funny, because by 2030, Millennials are going to be as old and miserable as the rest of us. In fact, the generation we should be thinking about is the one that essentially grew up in an algorithmic world: Anyone born after 2007. They’ll be adults in 2030, joining the workforce and becoming consumers.

What makes this the generation that will drive 2030 and beyond is not that they grew up with mobile phones; it’s that they grew up with all of their experiences being driven by data and algorithms. This generation grew up not watching television, but watching Netflix. They didn’t listen to the radio; they listened to Spotify. They didn’t hang out with their friends in

