
Risk management starts with data quality

All businesses experience operational risk; it’s a normal part of doing business. Since risk can never be eliminated, the challenge becomes finding an optimal risk balance. However, business risks change constantly and outcomes are difficult to predict, which is why gaining insight through data analysis is so important. It’s like a telescope that lets you see further into the future, or through the fog. The better your telescope, especially compared to those of your competitors, the greater your advantage.

Yet would you choose to use a telescope with a scratched lens or some other fault? Wouldn’t you want the scope properly cared for and kept in the best possible condition? When it comes to analysis and insight, data accuracy is the lens through which you see the future, and poor data quality leads to fuzzy vision. So how should you ensure the best data quality possible?

Join this how-to webinar session on solving operational risk challenges and hear a comprehensive discussion of data quality, lineage, and governance, and why they’re key to optimal risk management. Our speakers are Mark Spanos, Director, Accenture; Patrick Egan, Enterprise Data Strategist & Partner, Data3Sixty; and Jasmine Cosgrove, Experian Pandora Partner Manager, Experian Data Quality.

More specifically we’ll be discussing:

  • How data quality has evolved with the development of new tools and capabilities
  • The strong relationship between data value and management involvement
  • Why data quality sits at the core of best-in-class master data management, data integration, data governance, and business intelligence and reporting
  • How to manage the data quality lifecycle

Risk management starts with data quality, lineage, and governance

JASMINE

As we look toward 2017, we can identify several major trends in the financial services industry that will require greater use of data resources to improve efficiency, deliver greater customer satisfaction, and help reduce, or at least better understand and manage, operational risk. Compliance with an increasingly complex regulatory regime is clearly at the top of the list. Examples include the EU’s updated privacy rules in the GDPR (General Data Protection Regulation), MiFID and MiFIR for financial instruments, the Federal Reserve’s CCAR on capital adequacy, and many others. The risks associated with non-compliance are significant.

 

Beyond compliance, the financial services industry is changing. New business models are slowly emerging and customers are beginning to unbundle their needs. Greater use of the Internet and of digitized services and solutions can be expected to drive further change. Consequently, the competitive environment is shifting, and understanding and addressing your customers’ changing needs and expectations has never been more important. Getting it wrong increases not only financial risk but also reputational risk, which can hurt you over the longer term. Of course, there will always be risk associated with business, but taking a proactive approach to risk management can result in better and more profitable outcomes.

 

A growing component of risk in today’s business environment is the risk associated with poor-quality data. Addressing the challenges I mentioned in a timely fashion will likely require a degree of agility that financial institutions are less familiar with. Many of the financial systems in use today have taken years to develop and will be hard to migrate or integrate with new capabilities, at least quickly, for a variety of reasons, not least risk and compliance. This makes projects like developing a single customer view, improving compliance and risk management, and integrating big data to gain greater business insight hard to do. Compound that with the fact that the data you have and the data you collect are error prone. Depending on what the data is, how it’s collected, and by whom, it frequently incorporates inaccuracies or variations from the outset. Some data ages and becomes less accurate over time. Other problems arise when migrations or integrations occur; for example, metadata definitions can differ considerably between systems. Then there are data governance issues to deal with: Where did the data originate? From what sources? Who gets access? Who can change the data? What are the parameters around change?

 

And so excellence in data management is becoming an increasingly important prerequisite for financial services organizations as we move into 2017 and beyond. It’s also a potential source of competitive advantage.

Today, we have experts from a variety of data quality & governance roles.

Introduce Mark Spanos, Jeremy Hurwitz unavailable

Mark Spanos specializes in investment management operations and technology. With 20 years of experience in the financial services industry, he brings an extensive background in investment operations, software development, and software implementation. In his current role in Accenture’s Asset Management Consulting practice, Mr. Spanos helps clients with operating model development, business process redesign, system implementation, and strategic sourcing initiatives.

JASMINE

Every year, Experian Data Quality conducts a global survey of 1,400 data management professionals. We call it the “Global Data Management Benchmark Report.” In our 2016 study, we identified that 96% of financial institutions face external risks related to data management. Those risks come primarily from data security and governance (46%), as well as from factors such as data collection and profiling.

JASMINE

Our study indicates that 79 percent of the organizations we surveyed say data clearly ties into their business objectives. Yet over the years, the risk that bad data represents to the business has certainly grown.

JASMINE

In fact, 83% of organizations say that poor data quality hurts these same initiatives, a scenario that only undermines confidence in the organization’s data. To that end, our study showed that only 2% of businesses have complete trust in their data.

Mark

“New Data”: Data now comes in all shapes and sizes. Alongside traditional structured data, we have Big Data, unstructured data (social media), and semi-structured data (web services).

Transparency: Driven by greater regulatory oversight in many industries, including BCBS 239, Solvency II, MiFID II, and GDPR.

Life Cycle: From inception through elimination and every step along the way

Data Governance: Accountability, business meaning, lineage, impacts, and a collaboration platform to discuss and resolve data quality issues.

Mark

Reactive: After-the-fact resolution, much of it with large manual overhead.

Technical solutions: Complex tools that required IT development and support.

Tactical: Single-function focus, applied on a system-by-system basis.

Back-room operation: Little to no transparency to the business.

Data as a commodity: Needed only to perform a business function.

Mark

Proactive: Identify patterns, frequencies and resolve close to source.

Business Facing: Newer tools for profiling, identification and remediation.

Collaboration: Responsibility for Shared Data across business lines.

Business led: Data Governance initiatives to define, understand, and trust data.

Data as an Asset: Valuable insights when acceptable quality and trust exists.

Mark

Tactical: Fixes performed using primitive tools or manual remediation

  • Checks may be carried out in spreadsheets, Access databases, or stored procedures
  • No realistic means of error reporting or case management
  • Issues are resolved and closed, usually without identifying root cause or documenting remediation steps

Operational: Data Quality processes are in place, typically automated, to meet one or more business functions

  • Start to see case management
  • Quality checks may still be duplicated across various systems
  • ‘Tech footprint’ still remains to the fore

Departmental: Data Quality is understood, managed and applied within a single department or business function

  • Identification of shared use of data across systems
  • Collaboration within a group to apply quality checks and remediation within the department
  • Specialist Data Quality tools introduced
  • Emergence of some basic dashboarding and metrics


Inter-Departmental: Quality checks are performed on shared data sets as close to source as possible

  • Emergence of Data Quality as a shared capability across departments
  • Active business involvement in the discovery, development, management, and remediation of data quality issues
  • Dashboards mature and provide more comprehensive, aggregated metrics
  • Pockets of Data Governance emerge, including documentation of business terms, rules, and responsibilities

Organizational:

  • Holistic view of data quality across departments, systems, and business functions
  • Top-down support of larger quality initiatives such as Master Data Management (MDM) and Data Governance
  • Operating model to proactively manage data quality at an organizational level using dashboards, KPIs, and SLAs

 

Mark

Some companies are doing interesting work with proactive data quality, and one of them is Experian Data Quality.

Jasmine, can you take us through the philosophy behind Experian’s approach and how it aligns with industry standards?

JASMINE

At Experian, we believe that data is at the heart of every organization -- and the quality of that data is critical to sustained business success. While most organizations indicate that data supports their business plans, on average, organizations believe a third of their data is inaccurate, which can undermine their ability to make strategic decisions.

Actions taken by employees or by customers create a wealth of information that organizations can collect. Savvy organizations are the ones that are able to turn these insights into action through initiatives like business intelligence, workforce optimization, predictive analytics, or targeted marketing. Data affects a lot of different areas of the business.

JASMINE

Standardization

Automatically detect and eliminate any formatting errors within your database. Restructure your contact data to fit standards that are meaningful to you and your business. Our data standardization solution allows you to set and update customized data structuring rules for all data types. 
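To make the idea of customized structardization rules concrete, here is a minimal sketch in Python. The rule table, field names, and regexes are illustrative assumptions, not Experian’s product API: each field maps to an ordered list of pattern/replacement rules.

```python
import re

# Illustrative rule table (assumed, not a real product API): each field name
# maps to an ordered list of (pattern, replacement) standardization rules.
STANDARDIZATION_RULES = {
    "phone": [(re.compile(r"\D"), "")],                          # strip non-digits
    "state": [(re.compile(r"^mass(achusetts)?$", re.I), "MA")],  # normalize to code
}

def standardize(field: str, value: str) -> str:
    """Apply every rule registered for a field, in order."""
    for pattern, replacement in STANDARDIZATION_RULES.get(field, []):
        value = pattern.sub(replacement, value)
    return value.strip()

print(standardize("phone", "(617) 555-0123"))  # → 6175550123
print(standardize("state", "Massachusetts"))   # → MA
```

Keeping the rules in a table rather than in code is what makes them “customizable”: business users can add or update rules without touching the pipeline.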

Cleansing

Connect more effectively with your customers through accurate and clean data. Clean incorrect contact data in real time or in bulk capacity. We support address, email, mobile and name validation.
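As a rough sketch of what cleansing a single contact field involves, the snippet below normalizes and syntactically checks an email address. This is only an offline, illustrative check; real validation services also verify the domain and mailbox deliverability, which plain code cannot do.

```python
import re
from typing import Optional

# Simplified syntactic pattern for illustration only; it does not attempt to
# cover the full email address grammar.
EMAIL_RE = re.compile(r"^[\w.+-]+@[\w-]+(\.[\w-]+)+$")

def clean_email(raw: str) -> Optional[str]:
    """Trim whitespace, lowercase, and return None when the address is invalid."""
    candidate = raw.strip().lower()
    return candidate if EMAIL_RE.fullmatch(candidate) else None

print(clean_email("  Jane.Doe@Example.COM "))  # → jane.doe@example.com
print(clean_email("not-an-email"))             # → None
```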

Matching & linkage

Gain a holistic view of your customers by connecting data across all channels. By discovering intelligent links among your customer records, our data matching software finds connections between data elements and enables you to quickly remove duplicates from your database.
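A toy illustration of the matching idea, assuming name similarity as the linking key (production matching uses far richer signals): records whose names score above a threshold are treated as the same customer, and duplicates are dropped.

```python
from difflib import SequenceMatcher

def is_match(a: str, b: str, threshold: float = 0.85) -> bool:
    """Treat two names as the same entity when their similarity clears a threshold."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold

records = ["Jane Doe", "JANE DOE", "Jane  Doe", "John Smith"]
unique = []
for rec in records:
    # Keep a record only if it doesn't match anything already kept.
    if not any(is_match(rec, kept) for kept in unique):
        unique.append(rec)

print(unique)  # → ['Jane Doe', 'John Smith']
```

The threshold is the key design choice: too low and distinct customers merge, too high and duplicates survive, which is why real tools expose it for tuning.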

Profiling

Our data profiling and data discovery tool allows you to easily analyze your data. Investigate and assess data content, structure, relationships and quality. Obtain statistics that provide data-driven insights to fuel your business decisions.
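The kind of per-column statistics a profiling tool surfaces can be sketched in a few lines. The data and metric names here are made up for the example.

```python
from collections import Counter

# Toy records standing in for a real table.
rows = [
    {"email": "a@x.com", "state": "MA"},
    {"email": None,      "state": "MA"},
    {"email": "b@y.com", "state": "ny"},
]

def profile(rows, column):
    """Compute basic profile statistics for one column."""
    values = [r[column] for r in rows]
    filled = [v for v in values if v]
    return {
        "completeness": len(filled) / len(values),  # share of non-empty values
        "distinct": len(set(filled)),               # number of distinct values
        "top": Counter(filled).most_common(1),      # most frequent value
    }

print(profile(rows, "state"))
# → {'completeness': 1.0, 'distinct': 2, 'top': [('MA', 2)]}
```

Even this crude profile already flags an issue the cleansing and standardization steps would fix: “MA” and “ny” use inconsistent casing.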

Monitoring

Control your data over time through a business-user friendly environment. Our data monitoring tool corrects and reports on your data quality automatically and ensures that your data conforms to business rules.
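Conceptually, monitoring boils down to running named business rules against records and reporting failures. The rules below are hypothetical examples, not a real rule catalog.

```python
# Each rule returns True when a record conforms; rule names are illustrative.
RULES = {
    "state_is_two_letters": lambda r: len(r.get("state", "")) == 2,
    "email_present":        lambda r: bool(r.get("email")),
}

def monitor(records):
    """Return (record index, rule name) pairs for every failed check."""
    failures = []
    for i, record in enumerate(records):
        for name, rule in RULES.items():
            if not rule(record):
                failures.append((i, name))
    return failures

print(monitor([{"state": "MA", "email": "a@x.com"},
               {"state": "Mass", "email": ""}]))
# → [(1, 'state_is_two_letters'), (1, 'email_present')]
```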

Enrichment

Create more targeted, personalized experiences for customers. By appending additional information from external sources, our real-time data enrichment gives you better insight into your customers, allowing you to improve analysis, segmentation and communication. 
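The mechanics of enrichment are simple to sketch: look up a record’s key in an external reference source and append what is found. The reference table here is invented for the example; real enrichment draws on licensed external data.

```python
# Hypothetical external reference data, keyed by postal code.
REFERENCE = {"02110": {"city": "Boston", "region": "Northeast"}}

def enrich(record):
    """Append any reference attributes found for the record's postal code."""
    extra = REFERENCE.get(record.get("zip"), {})
    return {**record, **extra}

print(enrich({"name": "Jane", "zip": "02110"}))
# → {'name': 'Jane', 'zip': '02110', 'city': 'Boston', 'region': 'Northeast'}
```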

JASMINE

At Experian Data Quality, we talk a great deal about the ability to Analyze, Improve, and Control your data in an ongoing data quality cycle that creates and maintains the highest levels of data quality within your environment.

Analyze – This phase lets you find, catalog, and prioritize the scope of your data. We uncover unexpected anomalies and outliers and start to quantify the monetary value and risk of data quality issues.

Improve – The improvement phase lets you drill down to root causes, prioritize and justify actions, and design and validate (or prototype) data improvement rules that continually improve your data quality. These improvement rules create a continual cycle of cleansing, improvement, and enrichment.

Control – The control phase focuses on automatically monitoring and qualifying data over time, with assessments that show version-over-version results of how data quality is being maintained in your environment. We also allow for continual ad-hoc investigation.

This entire cycle allows users to improve and refine data quality rules and processes, keeping data quality a priority in your environment for all levels of users.
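The Analyze, Improve, and Control phases described above can be sketched as a simple pipeline. The function names mirror the three phases but are illustrative assumptions, not Experian’s implementation; the quality check used here (email completeness) is just an example metric.

```python
def analyze(records):
    """Analyze: find records failing the quality check (here, missing email)."""
    return [r for r in records if not r.get("email")]

def improve(records, fixes):
    """Improve: apply remediation rules (here, fill emails from known fixes)."""
    return [{**r, "email": r.get("email") or fixes.get(r["id"])} for r in records]

def control(records):
    """Control: report the quality metric to track version over version."""
    return sum(1 for r in records if r.get("email")) / len(records)

data = [{"id": 1, "email": "a@x.com"}, {"id": 2, "email": None}]
issues = analyze(data)                # quantify the problem
data = improve(data, {2: "b@y.com"})  # remediate close to source
print(control(data))                  # → 1.0, the metric after remediation
```

Running the loop again on new data is what turns a one-off cleanup into the ongoing cycle the speakers describe.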

