Dive into data
Take a closer look at how data analytics is shaping the future of society.
IT’S time to set the record straight: Your personal data really isn’t as valuable as you think it is. “Each of us, when we look at our personal data, we think it’s worth a lot. But if I’m a (data) aggregation company, your personal data is worth next to nothing,” says Brian Prentice, research vice president at Gartner Inc. “It’s only worth something when it’s put together with millions of other people’s data.”
But that is not all. He points out that from 2014 onwards, many of us may find ourselves willing to volunteer data about ourselves to companies because of the incentives offered in return.
For instance, Prentice cites the example of shoppers who enjoy exclusive offers at stores within a particular mall whenever they are logged in to its free WiFi network.
At the same time, by connecting to such a network, these shoppers are agreeing to allow the operators of the mall to collect personal details about themselves and to track their movements around the mall.
Even though they are aware that their data is being recorded by the mall operators, he feels that most shoppers would still choose to log on to the WiFi network, as doing so earns them special discounts and promotions.
“By 2017, 80% of consumers will collect, track and barter their personal data for cost savings, convenience and customisation,” Prentice says.
According to Gartner, three sectors that will be particularly affected by this trend are online retail businesses, communications service providers and financial services. It also said that companies could shield themselves from potential consumer backlash by seeking explicit consent before collecting any personal data.
Another scenario where it may be beneficial to share data about yourself with a third party is public policymaking. The United Nations’ (UN) Global Pulse initiative is an example of this.
Established in 2009, it explores how digital data and real-time analytics tools can be used hand in hand to help policymakers efficiently track and deal with socio-economic issues such as poverty, hunger and the spread of disease.
“While long-term developments (within a nation) still rely largely on trackers such as household surveys and national census data, authorities need access to reliable, real-time information on a big-picture level that would aid them in responding quickly to any problems that arise,” explains I-Sah Hsieh, global manager of international development at analytics software development company, SAS Institute.
Through the analysis of publicly available data such as social media conversations, telecommunications call records and more, he says governments would be able to predict upcoming economic trends. This would allow them to devise suitable policies to safeguard a nation’s interests ahead of events such as an impending economic downturn.
“In the past, this (analysis of big data) simply wasn’t possible due to the nature of conventional information collection and analysis methods, which are both laborious and time-consuming,” Hsieh says.
“Using SAS Social Media Analytics and SAS Text Miner... enables researchers to accurately and efficiently track all the chatter taking place in the online space as it happens.
Once captured, researchers can then analyse the documents and materials obtained to quantify feelings, moods, concerns and strategies. They can also use this data to correlate with officially available statistics to see how it matches up.”
Desensitised to data
Besides this, Gartner also expects governments around the world to loosen up about what has previously been regarded as classified information in 2014 and beyond.
By 2020, it foresees that enterprises and governments will fail to protect 75% of sensitive data, leading them to declassify that data and grant broad public access to it.
“There’s a huge benefit in opening up stuff that really isn’t that secure in the first place,” says Prentice.
“So we see governments around the world looking at open data initiatives. And when you do that, people can start to use this information in ways that are useful to the government.”
For instance, he says that if data about the diseases a government is researching were released to the public, parties with the relevant expertise could contribute to those efforts.
The same principles on data classification can be applied to companies as well, in Prentice’s opinion.
“The more we try to lock stuff down, the more it runs counter to driving value back for the business,” he says.
Labouring the point
Besides data and privacy issues, Prentice says that ongoing global digitalisation trends will result in a “labour reduction effect” which would, by the year 2020, “cause social unrest in a quest for new economic models in several mature economies”.
“The labour intensity in jobs will be diluted by technology,” he says. “It’s creating new opportunities, but it is also destroying a lot of jobs. I am hard pressed to find any industry right now that is not being impacted by it. I think the only debate here is the level of digitalisation that’s going on, not whether it’s happening or not.”
Notable economic sectors expected to be affected include the construction industry, which relies heavily on manual labour.
However, knowledge workers are in no way exempt from this wave of change either.
“By 2020, the majority of knowledge worker career paths will be disrupted by smart machines in both positive and negative ways,” says Prentice.
“The impact on the knowledge worker is that if you’re the type whose job is to provide answers to people, or your value lies in having certain information or knowledge that others don’t, then your value becomes marginalised.”
This is because of the anticipated increase in dependence on artificial intelligence systems such as IBM’s Watson which uses natural language processing and analytics to process information and help in decision making.
“Crédit Agricole predicts that Watson-derived systems will account for 12% of IBM’s total revenue by 2018,” Prentice says.
Since its television debut in 2011, Watson has become 24 times faster and 90% smaller.
The cognitive computing system can now run on a single IBM Power 750 server using Linux, shrinking from its original installation, which was the size of a master bedroom, to the equivalent of four pizza boxes. In addition, its services are now deliverable via cloud offerings and online chat sessions.
Watson is now being deployed across a wide range of industries including healthcare, banking and telecommunications via the company’s Customer Early Engagement Program, which was launched in the first quarter of 2013.
Organisations across the globe including local telecommunications company, Celcom Axiata, Singapore-based DBS Bank, ANZ Banking Group, IHS, Nielsen and the Royal Bank of Canada are among those participating in pilot tests using Watson’s technology.
Through these trials, Watson’s abilities will be fine-tuned to help companies better understand and cater to their customers’ needs: understanding the nuances of human language, improving its performance by continuously learning from user behaviour, processing complex questions, and searching vast amounts of big data to produce relevant, evidence-based responses.
In a report entitled Gartner Top Predictions 2014: Plan for a Disruptive, but Constructive Future, Gartner predicts that Watson will account for at least 1.5% of IBM’s revenue by the end of 2015, rising to 10% by the end of 2018.