Cracking the operational analytics nut
Analytics managers and businesspeople interested in analytics often say that performing the analytics itself is not their primary problem; the hard part is getting the analytics integrated with the business process and the systems that support it. This issue, sometimes called “operational analytics,” is the most important factor in delivering business value from analytics. It’s also critical to delivering value from cognitive technologies, which are essentially an extension of analytics anyway.
Three things make operational analytics tough. First, to make it work, you have to integrate it with transactional or workflow systems. Second, you often have to pull data from a variety of difficult places. And third, embedding analytics within operational processes means you have to change the behaviour of the people who perform that process.
If you are successful, you eventually will run into a fourth problem: The embedded analytical models will have to be monitored over time to make sure they remain correct.
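That fourth problem, monitoring embedded models over time, is often done by comparing the distribution of current model scores against the distribution seen when the model was deployed. The sketch below uses the population stability index (PSI), a common drift measure; the bin edges and the 0.2 alert threshold are conventional rules of thumb, not figures from this article.

```python
import math
from bisect import bisect_right

def bucket_fractions(scores, bins):
    """Fraction of scores falling in each bin; a floor of one
    observation per bucket avoids log(0) in the PSI formula."""
    counts = [0] * (len(bins) - 1)
    for s in scores:
        i = min(max(bisect_right(bins, s) - 1, 0), len(counts) - 1)
        counts[i] += 1
    total = len(scores)
    return [max(c, 1) / total for c in counts]

def psi(baseline, current, bins=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Population stability index between the scores seen at deployment
    (baseline) and the scores seen now (current). Values above roughly
    0.2 are commonly treated as a sign the model needs review."""
    b = bucket_fractions(baseline, bins)
    c = bucket_fractions(current, bins)
    return sum((ci - bi) * math.log(ci / bi) for bi, ci in zip(b, c))
```

A monitoring job would compute this on a schedule and raise an alert when the index crosses the chosen threshold.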
To succeed with operational analytics, a company has to combine transaction systems, workflow systems, analytical systems, databases, and display/user experience tools—no easy task. Integrating with transactional systems takes a good deal of effort, although modern systems architectures make it a bit easier. Most transactional systems these days (including SAP and Oracle ERP systems) allow API-based connections. But there is usually a fair amount of effort involved in integrating with an operational system: extracting the data you need, doing the analytics somewhere (in the cloud, or with in-database processing), and embedding the result into an interface for the frontline user.
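The extract-analyse-embed flow just described can be sketched as three small steps. Everything here is illustrative: the record fields, the scoring rule, and the result format are invented for the example, and in a real system the record would arrive via an ERP system’s API and the scoring step would call an actual model.

```python
def extract(order_record):
    """Pull the fields the model needs from a transactional record
    (in practice, fetched from an ERP system via its API)."""
    return {
        "amount": order_record["total"],
        "items": len(order_record["lines"]),
        "repeat_customer": order_record["customer"]["orders_ytd"] > 1,
    }

def score(features):
    """Stand-in for the analytics step - a toy scoring rule, not a real model."""
    s = 0.2
    if features["repeat_customer"]:
        s += 0.4
    if features["amount"] > 500:
        s += 0.2
    return min(s, 1.0)

def embed(features, s):
    """Package the result for the frontline user's interface."""
    return {
        "upsell_score": round(s, 2),
        "display": f"Upsell likelihood: {round(s * 100)}%",
    }

record = {"total": 640.0, "customer": {"orders_ytd": 3},
          "lines": [{"sku": "A"}, {"sku": "B"}]}
features = extract(record)
result = embed(features, score(features))
```

The point is less the code than the shape: three separable stages, each of which can live in a different system, which is exactly why the integration work is substantial.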
You might be able to accomplish much of the integration with a workflow-oriented overlay tool like case management, business process automation (BPA), or robotic process automation, although those types of systems generally don’t do any analytics. That means human labour - from your organisation or an external services provider - will be required to combine workflow and analytics. For example, a Boston-based BPA company, Pegasystems, partners with professional services firms to combine analytics-based recommendation engines with Pega’s multichannel marketing automation capabilities.
Various data sources
Problem two is getting all the needed data. That can be handled fairly easily if the data is in an information system in some sort of accessible format. But in many cases, the data is in a variety of formats - including paper reports, PDF files, unstructured articles, medical records, and more. To get that kind of data into your operational analytics system, you need more than analytics - you need artificial intelligence (AI).
One of the few vendors that combines AI capabilities with BPA is RAGE Frameworks, headed by a former professor, Venkat Srinivasan, who holds a doctorate in computational linguistics. The AI capabilities allow RAGE applications in, for example, financial asset management to extract and classify relevant content from analyst reports and drive investment recommendations. RAGE also has worked with audit firms to extract data from paper and PDF files for account reconciliations. You simply can’t automate such processes if you can’t automate the “data ingestion” process. In addition, RAGE employs a variety of other “engines” - 21 in total, including a computational linguistics engine, a decision tree engine, and a business-rules engine - to rapidly develop intelligent applications. This multiplicity of microservices is a way to quickly create operational systems that can analyse and think.
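To make the “business-rules engine” idea concrete, here is a minimal sketch of the pattern: rules as named predicate/action pairs evaluated in order against a record, in the spirit of the account-reconciliation work described above. This is only an illustration of the concept, not RAGE’s implementation; the thresholds and actions are made up.

```python
# A toy business-rules engine: each rule is (name, predicate, action).
# Hypothetical reconciliation rules, evaluated in priority order.
RULES = [
    ("large_discrepancy",
     lambda r: abs(r["ledger"] - r["statement"]) > 100,
     "escalate to reviewer"),
    ("small_discrepancy",
     lambda r: 0 < abs(r["ledger"] - r["statement"]) <= 100,
     "auto-adjust"),
    ("matched",
     lambda r: r["ledger"] == r["statement"],
     "close item"),
]

def evaluate(record, rules=RULES):
    """Return the name and action of the first rule that matches."""
    for name, predicate, action in rules:
        if predicate(record):
            return name, action
    return None, "route to manual queue"
```

Packaging each such engine as a separate component is what makes the microservices approach mentioned above quick to assemble into new applications.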
Finally, there is the need to persuade frontline users to base their decisions and actions on operational analytics. A “next-best offer” system for bank tellers, for example, has to persuade the teller to actually use the recommendations when working with customers. Tellers won’t employ analytical recommendations if they don’t trust them.
To build such trust, transparency of analytical recommendations is essential. If the reason for the recommended product or action can’t be described in understandable language, the user won’t be able to assess whether it makes sense. That requires some sort of natural-language generation capability to describe the decision logic. This requirement doesn’t favour many machine-learning approaches to analytics because, most of the time, there is simply no way to describe or interpret why a particular model prevails in a machine-learning process.
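One simple way to get that transparency is to accumulate a plain-language reason alongside each scoring rule, so the recommendation arrives with its own explanation. The customer fields, thresholds, and wording below are invented for illustration, and a production system would use richer natural-language generation than string joining.

```python
def recommend(customer):
    """Score a next-best offer with interpretable rules, collecting a
    human-readable reason for each rule that fires, so the teller can
    see why the offer was made. All fields and thresholds are made up."""
    reasons, score = [], 0.0
    if customer["savings_balance"] > 10_000:
        score += 0.5
        reasons.append("savings balance above $10,000 suggests a CD may pay more interest")
    if customer["has_mortgage"] and not customer["has_insurance"]:
        score += 0.3
        reasons.append("mortgage holder without home insurance")
    if reasons:
        explanation = "Recommended because: " + "; ".join(reasons)
    else:
        explanation = "No recommendation"
    return score, explanation
```

Because every point of score is tied to a sentence a teller can read aloud, the user can judge for themselves whether the recommendation makes sense - which is exactly what opaque models make impossible.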
Organisations embarking on operational analytics are learning that the analytics itself is the easy part. There is no shortage of analytical algorithms available, from both proprietary vendors and open-source projects. But building an operational analytics system means integrating and changing existing architectures and behaviours, and that’s always the hard part. It’s well worth the trouble, however, to build applications in which analytics and smart decision-making are embedded in a company’s systems and processes.
For more information, please visit http://www.deloitte.com/mt/technology