The Democratization of Judgment

In an age of Big Data and Artificial Intelligence, the exercise of good judgment by employees throughout an organization has never been more important.

Rotman Management Magazine - From the Editor - by Alessandro Di Fiore

HARDLY A DAY GOES BY WITHOUT the announcement of a promising new frontier for Artificial Intelligence (AI). From fintech to edtech, what was once fantastically improbable is becoming a commercial reality. At the same time, corporate investments in Big Data and the dividends they yield in terms of consumer insights are trumpeted on a daily basis.

Oddly, we don’t hear much about the demand created by this rising ‘supply’: In a world of Big Data and AI, the demand for sound and distributed judgment is increasing. ‘Qualitative judgment’ — the ability to make a decision based on a personal interpretation of the context and available facts — has never been more important. In this article I will explain why judgment has become so important, and how to go about enabling it throughout your organization.

The Rising Demand for Judgment

A basic decision process can be deconstructed into four logical steps: collect and organize available data; analyze it for patterns and insights; predict the best possible courses of action; and use judgment to make a final decision. This last step is more important than ever, and there are three main reasons for this.
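The four steps above can be sketched as a simple pipeline. This is an illustrative sketch only — all names and data are hypothetical — with the final "judge" step standing in for the human, qualitative component that the rest of this article argues cannot be automated:

```python
# Toy sketch of the four-step decision process; names and data are hypothetical.

def collect(records):
    """Step 1: collect and organize available data (here: drop missing records)."""
    return [r for r in records if r is not None]

def analyze(data):
    """Step 2: analyze the data for patterns and insights."""
    avg = sum(d["score"] for d in data) / len(data)
    return {"average_score": avg}

def predict(insights):
    """Step 3: predict the best possible course of action (machine-friendly)."""
    return "approve" if insights["average_score"] > 0.5 else "review"

def judge(prediction, context):
    """Step 4: a human applies qualitative judgment, factoring in
    context the machine cannot see."""
    if context.get("emotional_or_political_factors"):
        return "escalate to human review"
    return prediction

records = [{"score": 0.9}, None, {"score": 0.4}]
decision = judge(predict(analyze(collect(records))),
                 {"emotional_or_political_factors": False})
print(decision)  # -> approve
```

The point of the sketch is structural: the first three steps are mechanizable transformations, while the last is a deliberate human override point.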

1. Qualitative judgment is the last preserve of humanity in making decisions.

There is no question that Big Data and AI offer important advances in the realm of management. Already, they are helping organizations analyze their markets and consumers more effectively and make informed predictions. But certain types of decisions — particularly those around innovation and those relating to consumers — will always entail a component of qualitative judgment.

For example, in healthcare, AI is having a huge impact. But, even if AI can support a doctor in making a diagnosis and suggesting medical treatments for a particular cancer patient, only the doctor herself is able to factor in the overall condition (physical and mental) of the patient and his emotional context (and that of his family) to decide whether to proceed with surgery or chemotherapy. It is not possible for a machine to factor in the emotional and political context of any situation; yet few would deny that both contexts are critical for most decisions in business — and elsewhere.

Some of the best management decisions in business history have been made based on qualitative judgment rather than data alone. Consider the story of Nespresso by Nestlé, which has become the leading global brand of premium-portioned coffee. Nespresso machines brew espresso from aluminium capsules — preapportioned single-use containers of various high-quality coffees and flavourings. Those familiar with the Nespresso story know that the brand only took off when it stopped targeting workplaces and started marketing itself to households.

Quantitative evidence had suggested that individual consumers’ intentions to purchase did not meet the threshold requirements set by Nestlé’s product-launch procedure. However, Jean-Paul Gaillard, the young marketing head of Nespresso, believed strongly in the idea, and thanks to his skillful interpretation of the data and his willingness to go against Nestlé’s previous innovation ‘rules’, he convinced the company to take the risk. If he had only listened to the data, the concept would never have gotten off the ground.

Business history is full of similar stories, where people have willfully complemented data with their qualitative judgment and reaped great rewards. Creativity, emotional understanding and pure imagination are things that humans excel at, and the availability of a huge amount of additional data will not change this fact of life.

2. As the cost of prediction goes down, the demand for judgment will increase.

In their November 2016 article for Harvard Business Review, “The Simple Economics of Machine Intelligence”, [Rotman School of Management Professors] Ajay Agrawal, Joshua Gans and Avi Goldfarb framed the trade-offs between Artificial Intelligence and judgment. I would like to elaborate on this brilliant article, stressing the authors’ analogy to Production Theory — i.e., the economic process of converting inputs into outputs.

As Prof. Agrawal et al. indicate in the article, technological revolutions impact the cost and value of important input factors. In our case, thanks to the advent of Big Data, the cost of finding and organizing data and running analyses has fallen sharply. As the authors indicate, AI is a prediction technology, so the cost of prediction will also fall over time.

When the cost of any input factor falls, certain microeconomic rules can be applied — and not only to production, but also to the decision-making process. First, we will substitute other input factors (human skills) with the low-cost (and better) technology to collect data and develop predictions; and second, the value of and demand for complementary factors will rise.
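The complementarity effect can be made concrete with a toy calculation (the numbers are entirely hypothetical): as the unit cost of prediction falls, a fixed analytics budget buys more predictions, and each prediction acted on calls for a human judgment.

```python
# Toy illustration of complements: cheaper predictions -> more predictions
# made -> more judgment calls demanded. All figures are made up.

def predictions_affordable(budget, cost_per_prediction):
    """How many predictions a fixed budget buys at a given unit cost."""
    return budget // cost_per_prediction

budget = 1000
for cost in (100, 10, 1):  # prediction getting cheaper over time
    n = predictions_affordable(budget, cost)
    print(f"unit cost {cost}: {n} predictions -> ~{n} judgment calls needed")
```

The falling input (prediction) substitutes for human analysis, while the complementary input (judgment) sees its demand scale up with the volume of predictions.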

For example, when data and prediction are cheap, companies can generate more frequent customer insights, which creates the need for more-frequent decisions regarding customer support, promotions, product customization and new product development. This, in turn, will lead to greater demand for the application of judgment and emotional understanding — provided by humans — to make decisions. This is exactly what happened at Unilever, after it introduced a number of data-driven systems accessible to all of its global marketers: The availability of real-time, frequent, data-driven consumer insights generated ever-greater demand for judgment and decisions by the company’s marketers.

3. As data-prediction technologies are distributed more widely, so must judgment be.

Big Data and AI will provide managers and employees at all levels with accurate data and predictions at their fingertips. Using distributed IT architectures, these tools can allow employees throughout an organization to make the right decision for a particular context in a timely manner. As a result, the smartest companies will ensure the distribution of judgment-based decision powers.

Recognizing the power of data-based distributed decision making, Affinity, the Minnesota-based credit union, issued a framework to guide its employees in making decisions regarding loans. Its ‘MOE’ system (Member, Organization, Employee) operates like a ‘constitution’ to free up the judgment powers of employees and provide a ‘North Star’ to guide them when applying these powers. Employees have full latitude with respect to rates and can override the bank’s policies based on their judgment of ‘what is right for the customer in that context’, supported by customer analytics. The MOE Constitution states:

“No employee will ever get in trouble for doing what is right for the customer. There is only one operating policy or guideline you ever need: Trust your feelings. If it feels right and makes sense, do it on behalf of the customer. Do not consider the system capability, policy, or procedure; err on the side of doing whatever is necessary for the customer and allow your manager or supervisor to take care of the rest. Finally, be prepared to defend your decision! If your intention is to do what is right for the customer, you will have the support of management and your co-workers.”

Every Affinity employee can now decide, on the spot, whether or not to provide a loan to a particular customer, and if so, at which rate, by using a blend of customer analytics and personal judgment. When an employee deviates from the bank’s policies, she is required to justify her decision and post the rationale in Affinity’s Touche system, which stores all data and electronic records of members/clients for all to see, as well as a full history of employee explanations for lending decisions. The result: When Affinity employees started to make judgment-based decisions in large numbers, charge-off rates for higher-risk clients dropped by almost 50 per cent — from 1.9 to 1 per cent.

Implications for Organizations

The three factors discussed above indicate that now, and in the future, companies will require more rather than less human judgment for their innovation- and customer-related decisions.

To get there, judgment must be democratized across the organization. Most companies cannot rely on a lone individual like Jean-Paul Gaillard to override the existing culture and procedures, and that is why every organization needs to create its own Judgment Protocol. Much like Affinity’s MOE, this is a system that legitimizes the exercising of judgment within your organization across all levels — and one that will change the century-old ‘command and control’ philosophy that many companies still use to make decisions.

Following are four guiding principles for leaders who are eager to embrace this new imperative.

1. Democratize Judgment Power

Companies tend to believe that innovation and market-related decisions are the responsibility of a few, highly-positioned people. There is a widespread autocratic view that only the ‘elected ones’ are entitled to make decisions that affect customers. By way of contrast, consider the credo that Toyota embraced in its Toyota Production System (TPS). In the TPS, everybody is responsible for searching out and implementing ideas to improve operational performance. Responsibility is pushed down to the very lowest level of the organization. In the TPS, two worlds — manufacturing and market innovation, which appear so remote from each other — share the same philosophy for success.

2. Foster Qualitative Judgment Skills

As soon as we push down the responsibility to identify issues and make decisions, we will want to increase the probability that our employees will choose the right course of action and execute on it properly. The second core principle of the TPS is to train everyone in the workforce in quality and lean/six sigma tools and techniques. Widespread training on standardized tools increases the probability that people will come up with the right insight, decision and execution to impact performance.

Other organizations should apply this same principle and standardize tools, methods and techniques to improve their employees’ skills in generating insights — and applying judgment. Doing so will require a shift in perspective, to a mindset that views judgment as a key organizational capability worthy of investment.

For example, Unilever encourages every one of its employees to engage with consumers to gain insights about their needs, providing allotted time during the workday for this activity on a regular basis. To raise the effectiveness of the time and freedom provided, Unilever trains its employees in consumer observation and probing methods, as well as in how to use some of its newly developed Big Data marketing tools, like the People Data Centre, which combines social media and business analytics, capturing conversations in 40 languages.

As an example of this approach in practice, consider the Knorr brand’s ‘Love at First Taste’ campaign. Data suggested that ‘people are attracted to others who like the same flavours as they do’; Knorr marketers decided to act on this finding by setting up people with the same tastes on blind tests and videotaping the results. The video reached 100 million views in a few weeks: Data plus insight and judgment spawned a marketing hit.

3. Provide Data Access to All

Data access will raise the effectiveness of employees in using their judgment. Of course, some companies are better than others at transforming data into actionable insights. Prior research has tended to emphasize the role of data scientists, who have the skills to analyze data; this implies that companies with more data scientists have better chances of generating value. My own experience as a consultant, supported by academic research, indicates a different view: Firms that hire an army of data scientists do not always generate more value. Rather, it is the process of data management — and particularly, the democratization of access to and use of data among managers and employees — that creates tangible value.

Consider internet platform companies, where data is at the core of the business model. Airbnb has taken a step ahead in the democratization of data: Its entire workforce, including human resources, has access to its data science tools to make timely decisions related to requests from both users and providers of homes, as well as to act swiftly on innovation opportunities.

However, Airbnb also understands that fully-inclusive data access is not enough: Its employees are also trained in how to use data tools and extract insights to make informed decisions. Data University is Airbnb’s attempt to make its entire workforce — not just its engineers — more data literate. It has designed 101-level courses on data-informed decision making which are available to all employees. The result: Since launching the program in late 2016, Airbnb has seen the weekly active users of its internal data tools rise from 30 to 45 per cent.

4. Loosen the Reins of Control

Organizations tend to be uncomfortable at the prospect of decision-making authority being pushed down the hierarchy. For many, loss of control is synonymous with risk, and this has been a major barrier to the true empowerment of the workforce. The solution lies in shifting from a traditional ‘Prevention-Control Model’ to a ‘Post-Detection Model’.

For example, in a bank, if an exception to a loan policy is being requested, the Prevention-Control Model would require authorization signatures several levels up. Even when a loan applicant has a perfect credit score and fits with the bank’s policy, most likely the loan will need to be signed by the employee and her supervisor before being approved. Prevention-Control Models are the greatest barrier to true empowerment.

Let’s return to Affinity as an example of a Post-Detection Model: When an employee decides to offer a loan to a customer because it ‘feels right’ (per the MOE Constitution) but the loan is an exception to the bank’s policy, the employee must write up a rationale for the decision taken and post it on the client data system. As such, the rationale is transparent to colleagues and supervisors, generating a form of social control that reacts only in extreme instances.
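The mechanics of a post-detection control can be sketched in a few lines. This is a hypothetical illustration loosely modeled on the Affinity example, not its actual system: exceptions are never blocked up front; instead, each one must carry a rationale that is logged where colleagues can review it afterwards.

```python
# Hypothetical sketch of a post-detection control: approve on the spot,
# but log every policy exception with a written rationale for later review.

audit_log = []  # transparent record, visible to colleagues and supervisors

def decide_loan(employee, customer, within_policy, rationale=None):
    """Approve immediately; a policy exception only requires a posted rationale."""
    if not within_policy:
        if not rationale:
            raise ValueError("policy exceptions must include a written rationale")
        audit_log.append({"employee": employee, "customer": customer,
                          "rationale": rationale})
    return "approved"

decide_loan("ana", "c-101", within_policy=True)
decide_loan("ben", "c-202", within_policy=False,
            rationale="long-standing member; temporary income gap")
print(len(audit_log))  # -> 1
```

The contrast with a prevention-control model is that no signature gate sits in front of the decision; the only hard requirement is after-the-fact transparency.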

In Closing

The time has come to walk the talk with respect to democratizing decision-making authority. Low-cost data-prediction technologies, coupled with an official, company-specific judgment protocol, can help to free employees from the shackles of hierarchy and create truly agile and customer-centric organizations that are able to adapt quickly to market signals. And, if the companies that have embraced this approach are any indication, profitable growth is sure to follow.


Alessandro Di Fiore is the Founder and CEO of the European Centre for Strategic Innovation (ECSI) and ECSI Consulting, based in Boston and Milan. He is the founder and former Chairman of Harvard Business Review Italia.
