Relevance of Master Data Management
MDM is a vital process in data governance:
Data management professionals and senior business managers today are increasingly aware of the importance of data governance, both because of external pressures such as compliance, regulation and shareholder rights and privileges, and because of internal business pressures, says C.S. Sathish Jain, a strategic and collaborative management technology consultant based in ASEAN and MD of ASVA Consulting.
Jain, who deals primarily with banks and financial services institutions in Master Data Management, with a focus on new-age frontiers such as blockchain, IoT and Big Data that are disrupting the very fabric of economics, says it must be made mandatory for professionals and managers to understand the mechanics, virtues and ongoing operations of instituting data governance within an organization.
“The objective of data governance is predicated on the desire to assess and manage the risks that lie hidden within the enterprise information portfolio,” says Jain, maintaining: “One of the critical values obtained via data governance is to ensure there exists only one version of the ‘truth’ for customer, product or any critical data domain, and that this is the common point of reference across departments, systems and regulators, thus achieving a 360-degree view of the customer data dimension.”
Hence, ‘value in the form of data’ flows across the enterprise without contamination or loss of information, he says, adding that a Master Data Management (MDM) program as part of the data governance portfolio will ensure consistent data is published across lines of business.
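The “one version of the truth” idea can be illustrated with a small sketch: consolidating duplicate customer records from different source systems into a single golden record. The field names, source systems and the “latest non-empty value wins” survivorship rule below are illustrative assumptions for this sketch, not Jain’s methodology or a prescribed MDM standard.

```python
from datetime import date

# Hypothetical duplicate customer records from two source systems.
records = [
    {"source": "crm",  "updated": date(2021, 3, 1),
     "name": "A. Kumar", "email": "", "phone": "+65-1111"},
    {"source": "core", "updated": date(2022, 7, 9),
     "name": "Arun Kumar", "email": "arun@example.com", "phone": ""},
]

def golden_record(dupes):
    """Merge duplicates: for each field, keep the most recently
    updated non-empty value (a simple survivorship rule)."""
    merged = {}
    for rec in sorted(dupes, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field in ("source", "updated"):
                continue
            if value:  # later non-empty values overwrite earlier ones
                merged[field] = value
    return merged

print(golden_record(records))
```

The merged record takes the newer name and email from the core-banking entry while retaining the phone number only the CRM held, giving every downstream system the same point of reference.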
DATA IS THE NEW OIL
He says the success of MDM goes hand in hand with a well-defined data governance practice implementation. He also maintains that organizations have, in the past decade, understood the importance of data as an asset and hence no longer categorize data management programs as just another cost-center initiative. Rather, they are being prudent in giving these initiatives the utmost attention and investment.
“Data (customer and master data) quality issues, which led to the 2008 financial crisis, have forced regulators to issue mandates via accord changes from the BASEL committee for the global banking and financial sector. Even today, organizations are struggling to address the above,” he says.
A simple depiction of data governance leading to MDM as part of a data strategy journey, according to him, comprises three stages.

Data Strategy means:
• Aligning with the business strategy and ensuring business users, owners and stakeholders are engaged right from the beginning of the journey
• Aligning with corporate objectives, ensuring allocation of resources, gaining CXO buy-in and setting guiding principles for achieving quality data as a business asset
• Since not all data is equal, it is very important to assess exactly how, and which, data will be used and form the specific scope of the data governance journey
Data Policy & Stewardship encapsulates:
• Data policy, definitions, controls, metadata and ownership via a steward are pre-requisites
• There can be more than one data steward for an attribute or set of attributes, with clear roles and responsibilities
• Data profiling will be the core deliverable in this exercise, wherein factors such as business rules, values, exceptions, audit rules, owners, authority, amendment frequency, regulatory and in-country requirements, etc., are defined and agreed

Data Quality Maintenance (DQM) & Framework comprises:
• Monitoring, the post-event stage of data lifecycle management, will include system, business-process and quality validations. In spite of stringent checks, data quality can take a hit due to various other unforeseen reasons or human errors
• DQM is an evolving layer: over a period of one to two years on average, the data quality framework matures, and this is when the ROI of the entire data governance effort will surface
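The profiling and quality validations described above can be sketched as a small rule-driven check. The customer fields and the three rules below are illustrative assumptions; a real DQM framework would add lineage, steward sign-off, audit trails and scheduled monitoring.

```python
import re

# Hypothetical data quality rules: each field maps to a validation check.
rules = {
    "customer_id": lambda v: bool(v),                 # mandatory field
    "email":       lambda v: re.match(r"[^@\s]+@[^@\s]+\.[^@\s]+$", v or "") is not None,
    "country":     lambda v: v in {"SG", "MY", "IN"},  # in-country requirement
}

def profile(rows):
    """Count rule violations per field and flag failing rows for stewards."""
    violations = {field: 0 for field in rules}
    failing_rows = []
    for i, row in enumerate(rows):
        bad = [f for f, check in rules.items() if not check(row.get(f))]
        for f in bad:
            violations[f] += 1
        if bad:
            failing_rows.append((i, bad))
    return violations, failing_rows

rows = [
    {"customer_id": "C001", "email": "a@bank.com", "country": "SG"},
    {"customer_id": "",     "email": "not-an-email", "country": "US"},
]
violations, failing = profile(rows)
```

The per-field violation counts feed the monitoring layer, while the flagged rows go back to the responsible data steward for correction, which is the feedback loop Jain argues was missing before 2008.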
CEMENT THAT BONDS
Jain explains that MDM is the cement that bonds the enterprise-wide business intelligence, workflow and analytical systems to the core operational business systems and processes. He lists the causes of the 2008 crisis as:

• Failure of managements to implement a stringent data strategy or governance, and the lack of any feedback loop to validate data quality issues
• Regulators’ failure to enforce data governance at enterprise scale as a mandate, even as the banking sector was launching very complex structured products without the underlying clean data in place
• Presence of multiple versions of the truth of customer data, which led to sub-standard reporting of financials
Besides, regulators could not nail these loopholes or pitfalls due to a lack of stringent tools and feedback mechanisms to alert the institutions.
Jain believes that new-age technologies can help alleviate the data quality pains. “However, technology can only do what it has been told to. Hence, a watertight data governance framework will only help the organizations achieve near-pristine data quality. For example, blockchain technology can possibly help alleviate the doubts of insider fudging or false reporting or, for that matter, bypassing of regulations. MDM on a blockchain can ensure that the entire spectrum of data quality checks is in place if the data strategy, policy, stewards and quality framework are done with absolute diligence,” he says.

Jain also discusses the impact of not implementing data governance in an organization. The data lifecycle has key stages where data collation and consumption happen, and at every stage the cost of error correction multiplies five to ten-fold (the numbers are for a global organization).

Stage 1 is data correction after initial data input. At this stage, the data is not consumed by any major systems and the cost of correction is minimal, but identifying these errors is near impossible without data governance in place.

Stage 2 is data correction after the data has been consumed by systems. Data gets into the enterprise’s internal systems used by the business community to support business operations. This ingested data, issues and all, gets processed and used in many business-supporting systems, models and calculations, often as part of decision-making and across lines of business.

Stage 3 is data correction after the data has been consumed enterprise-wide and published externally (to regulators, etc). Here the data has hit the point of no return: the incorrect data has been used across the enterprise and also published to external entities. This leads to serious issues such as incorrect accounting statements, data errors impacting revenue predictions and cost allocation, and an impact on capital adequacy that could end in a greater capital injection.

CXO LEVEL INTERVENTION
Jain says the final take-home is that, given the advent of Big Data, fintech, blockchain and AI, data governance and Master Data Management should be mandatory across business domains. He explains: “These have become the core building blocks, enabling an organization’s pursuit to innovate, survive and grow. Ultimately, the success of the above is solely dependent on the level of CXO involvement and commitment: either it can lead the organization to achieve near-pristine data, or it can end up as a tactical silo implementation. Datadios!!!”