Beyond the Blame Game
Automation and AI need not be viewed as forces that obviate the need for human expertise, opines Rajesh Maurya, Fortinet.
In an article in The Guardian, Stephen Hawking wrote, “The automation of factories has already decimated jobs in traditional manufacturing, and the rise of artificial intelligence is likely to extend this job destruction deep into the middle classes, with only the most caring, creative, or supervisory roles remaining.” Here, the author holds a slightly contrarian view, explaining how AI will create more jobs than it eliminates, and why human insight will remain crucial to success.
Rajesh Maurya is Regional Vice President, India & SAARC at Fortinet.

Adapting to the new digital economy requires organizations to retool not just their networks but, in many cases, core business processes as well. The creation, exchange, and analysis of data—about customers, products, and their usage—enables organizations to gain the insights they need to improve operational efficiency, business agility, and customer experience. The three pillars of digital business are automation, agility, and analytics. As the speed of business accelerates, critical processes need to occur at digital speeds, which means that human beings, and human error, need to be removed from many of the basic operations that support the organization. The first steps towards AI and automation are, indeed, within reach. If organizations quickly prioritize the strategic enablers—speed, integration, advanced analytics, risk-based decision engines—they are primed to create a highly efficient business model that utilizes both human and machine resources for what each does best, and does so with extraordinary agility.

Our best talent must be focused on the most critical decisions, while automated systems handle lower-order decisions and processing. That means we need to develop and deploy risk-based decision-making engines that take humans out of the loop and instead put them above the loop. After fast, specialized analysis and integration, risk engines are the third major step toward AI. They will execute the ‘OODA loop’ (observe, orient, decide, and act) for the vast majority of situations. Pre-planned Courses of Action (COAs) will free up valuable cybersecurity experts to concentrate on the more difficult decisions, where human cognition and intervention are most required. The most sophisticated of such engines will actually suggest COAs rather than rely only on predefined ones.

So much for the case for AI and automation pervading every sphere of business and taking over routine human decision-making. On the flip side, there are arguments that automation is set to replace human beings and will usher in an era of increased job loss and unemployment. Over the past two years this debate has grown increasingly heated, with the automation argument often introduced as a counter to some of the more abrasive stances in the immigration debate. At first glance, it may have appeared to be a fact-based response to the animosity and divisiveness that defined that debate. For those of us with deeper, first-hand knowledge, though, it was just as fear-based and misinformed. The argument was most succinctly summed up by an op-ed in the Los Angeles Times, ‘Robots, not immigrants, are taking American jobs’. It states, “A White House report released in December says 83 per cent of US jobs in which people make less than $20 per hour are now, or soon will be, subject to automation… and warns Americans to get ready for an era of 60 per cent unemployment.” But here is the counterargument, and the truth.
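The risk-engine OODA loop described earlier can be sketched in a few lines: routine, low-risk events trigger a pre-planned COA automatically, while anything above a risk threshold is escalated to a human analyst sitting above the loop. This is purely an illustrative toy; the event types, playbook entries, and threshold are all invented for this sketch and do not represent any Fortinet product or API.

```python
# Illustrative sketch of a risk-based decision engine executing an OODA loop.
# All names and thresholds are hypothetical.

from dataclasses import dataclass

# Pre-planned Courses of Action (COAs), keyed by event type.
PLAYBOOK = {
    "port_scan": "block_source_ip",
    "malware_signature": "quarantine_host",
    "failed_logins": "lock_account",
}

ESCALATION_THRESHOLD = 0.8  # above this, a human analyst decides


@dataclass
class Event:
    kind: str          # observe: what happened
    risk_score: float  # orient: 0.0 (benign) .. 1.0 (critical)


def decide(event: Event) -> str:
    """Decide: execute a pre-planned COA, or escalate to a human."""
    if event.risk_score >= ESCALATION_THRESHOLD:
        return "escalate_to_analyst"                     # human above the loop
    return PLAYBOOK.get(event.kind, "log_and_monitor")   # act automatically


# Act: routine events are handled without human intervention.
print(decide(Event("port_scan", 0.3)))           # -> block_source_ip
print(decide(Event("malware_signature", 0.95)))  # -> escalate_to_analyst
```

The design choice worth noting is that the engine never blocks on a human for routine traffic, yet every high-risk event still reaches one—exactly the "out of the loop, above the loop" division of labour.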
At Fortinet, we have been investing in AI for years. It is an incredible technology that presents extraordinary opportunities for how to protect networks and, ultimately, the internet. As AI becomes more common and more sophisticated, it consistently clarifies an important truth: the value, power, and efficiency of AI do not arise from its ability to replace human beings. In fact, AI does just the opposite. Both automation and AI underscore how central and critical human insight and expertise are to success. Arguments and headlines that cast technology as an encroaching threat, widening social divides, and limiting opportunities may provoke stronger reactions (and more clicks), but in general, innovation is not additive or subtractive—it is multiplicative. It creates exponentially more opportunities for more people in more ways than even those most directly impacted by it can often imagine at first. Has email replaced the post office? While the US Postal Service's number of career employees declined from 2007 to 2016, it is now just a bit more than it was in 1965. The volume of marketing mail and first-class mail has decreased, but the total volume of packages shipped has increased from 3.3 billion to 5.2 billion. Delivery points have increased from 148 million to 156 million, and there are also thousands of additional delivery trucks on the roads.
Did ATMs replace banks? No—by lowering the cost of opening a branch, ATMs helped increase the number of banks by more than 40 per cent. In fact, they did not even replace bank tellers, whose ranks increased to meet the demand of more branches. Over the decades, we have seen that where one avenue in the jobs market closes, others open. An example given by Deloitte economists hits the nail on the head. They believe that rising incomes from the adoption of technology have allowed consumers to spend more on personal services, such as grooming. That in turn has driven employment of hairdressers. So, while in 1871 there was one hairdresser or barber for every 1,793 citizens of England and Wales, today there is one for every 287 people. It is especially important to recognize these facts in light of the particularly callous argument that the only jobs AI kills are the ones nobody would want. All of us value the job that provides for our families and lives. If anything, the rise and spread of AI force us to take a closer look at how confining employees to the only kinds of tasks that AI is good at—repetitive, precise, controlled tasks that require no reasoning, higher-order thinking, or even common sense—represents an outdated, divisive management style. It is hard to imagine an industry more heavily reliant on digital technology than cybersecurity. As of Q3 2017, our cybersecurity tools and technologies were responsible for neutralizing 91,000 malware programs, blocking access to 150,000 malicious websites, and resisting 4.4 million network intrusion attempts—per minute.
In a digitally driven world that is teeming with threat actors—from malicious pranksters to criminals, ideologically motivated sects to state-sponsored cyber terrorists, threatening everything from our individual identities to the critical infrastructure of our society—there is no way to protect data without self-learning AI and automation. For effective cybersecurity, we must utilize AI for time-consuming tasks, such as data mining and parsing data logs, while allowing cybersecurity teams to focus on the much higher-order tasks of threat identification and elimination. And yet, one of the gravest challenges our industry faces is a shortage of talent. Our industry’s unemployment rate stands at 0 per cent. In 2016, one million new cybersecurity jobs were created, and estimates project an increase of five or six million over the next few years. In 2015, there was a 74 per cent increase in cybersecurity job postings, half of which went unfilled. Across industries, 45 per cent of organizations claim to experience a problematic shortage of cybersecurity skills. As a result, cybersecurity teams must race from one crisis or breach to the next, with little time for strategic planning or continued learning to keep up with threat sophistication. These are certainly business challenges, and increasingly costly ones at that. The demand itself is driving an expensive bidding war for talent, and the cost of cybercrime is estimated to reach $2.1 trillion globally by next year. These are also national and global security risks, with everything from financial systems to healthcare to critical infrastructure in the crosshairs. Automation and AI are not eliminating jobs. They are creating them—high-paying, high-level, and secure ones at that—at an unprecedented rate. As the levels of data continue to grow, that will create even greater demand.
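The kind of log parsing mentioned above—machines sifting millions of entries so analysts only review genuine anomalies—can be illustrated with a minimal sketch. The log format, field names, and failure threshold here are all invented assumptions for the example, not a real product's schema.

```python
# Hypothetical sketch: automate the tedious log-parsing step so human
# analysts only see flagged anomalies. Format and threshold are invented.

from collections import Counter

FAILED_LOGIN_THRESHOLD = 3  # flag source IPs with this many failures


def triage(log_lines):
    """Return source IPs with enough failed logins to warrant human review."""
    failures = Counter()
    for line in log_lines:
        # assumed format: "<timestamp> <status> <ip>"
        _, status, ip = line.split()
        if status == "FAILED":
            failures[ip] += 1
    return sorted(ip for ip, count in failures.items()
                  if count >= FAILED_LOGIN_THRESHOLD)


logs = [
    "09:01 FAILED 10.0.0.5",
    "09:02 FAILED 10.0.0.5",
    "09:03 OK     10.0.0.7",
    "09:04 FAILED 10.0.0.5",
    "09:05 FAILED 10.0.0.9",
]
print(triage(logs))  # -> ['10.0.0.5']
```

Five raw lines shrink to a single item needing human judgment; at the scale of millions of events per minute, that filtering ratio is precisely what frees scarce experts for threat identification and elimination.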
We will never be able to fill these jobs without greater awareness of the need for them, early training in middle school and high school, and more outreach to veterans and college students—particularly women, who presently comprise just 14 per cent of the cybersecurity workforce. There are, of course, real causes of wage stagnation and job loss—from who disproportionately benefits from economic gains, to the impact of wealth creation through capital management rather than goods or services, to the funding and priorities of our educational system, to an increasingly volatile financial system, to the impact of rapid globalization. Those are all extraordinary challenges, and it is far easier to scapegoat technology than to address them head-on. Blaming innovation will not solve real problems or prevent crises. It will only drive misunderstandings and clicks in an increasingly unsafe digital landscape.