Following the global financial crisis, the business cases and mandates many data practitioners had previously only dreamt of arrived in the form of a regulatory tsunami. In response to demands for transparency and for reduced complexity and systemic risk, a plethora of regulations emerged from most G20 regulatory authorities.
Financial Services organisations jumped on the bandwagon created by Silicon Valley and the dot-coms and began appointing Chief Data Officers (CDOs) of their own. Disciplines such as Data Quality Management, Data Modelling, Data Governance and Data Architecture suddenly received the remit and funding they had long craved. Many of the regulatory stipulations, however, focused on data storage, data provision and transparency – trade repositories, reporting standards and other related requirements. The focus was on the data itself rather than on the underlying processes, architectures and people responsible for managing it.
The focus shifted from the ‘what’ to the ‘how’ in January 2013, however, when the Basel Committee on Banking Supervision published its Principles for effective risk data aggregation and risk reporting. Known variously as PERDARR, RDA and, in Europe most commonly, as BCBS 239, the regulation rests on the Committee’s central assertion that “improving banks’ ability to aggregate risk data will improve their resolvability”.
In response the Basel Committee set out eleven principles for banks, covering a range of technical, cultural and organisational requirements to be implemented by January 2016. Many data practitioners rejoiced when the principles were first published, as they reinforced sentiments long held within the data community: defining data ownership and stewardship, treating data as an asset and understanding the lineage, or provenance, of key data elements. The business case of a lifetime had landed.
Initially, banks responded by forming project teams and new organisational units such as CDO offices, and by hiring specialist staff. Many extended and refocused existing Data Architecture and Data Management departments that were already implementing Data Governance programmes and other strategic data initiatives.
It is now safe to say, however, that the dream has become a nightmare for many. In a survey of banks by Ernst & Young, every respondent expected at least 25% of their BCBS 239-related change programmes to remain undelivered by the deadline in January next year [1].
A commonly cited reason banks are struggling to meet the deadline is that the scope of BCBS 239 is substantial and the implementation timeframes ambitious. Many of the requirements are not new within the data community, and organisations with highly mature data strategies and data architectures certainly exist. Few, however, would claim to have implemented those architectures, working practices and cultural changes in less than three years. Usually the investment, and the continuity of staff and approach, are sustained over many more years, and in less complex environments.
One area generating a great deal of buzz and excitement is Agile and Lean Information Management. These approaches use frequent iterations, a Build-Measure-Learn cycle, multi-skilled practitioners, cross-functional teams and extensive prototyping to minimise the waste inherent in traditional Data Management approaches. Traditional sequential, waterfall approaches tend to rely on specialists, numerous touch points and hand-offs, and strictly segregated responsibilities. This works well when requirements are well defined and relatively static, and when the interdependencies between disciplines are clear and minimal. For large data change projects in uncharted territory such as BCBS 239, however, a more pragmatic approach often yields greater value – particularly given the highly interconnected nature of the requirements.
Lean Information Management works by unifying many of the above touch points and roles through Virtual Teams, Communities of Practice or Competency Centres. Chiefly this is because there are significant efficiencies in using common tools and in working closely through challenges such as reverse engineering, profiling, defect measurement and report prototyping simultaneously. BCBS 239 presents the perfect use case for this, given that many of the tasks in the diagram are required to thoroughly understand the underlying quality and governance issues, transformations, aggregations, lineage and architecture of any single risk report. A cutting-edge toolset and approach can enable these tasks to be carried out far more rapidly and effectively.
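To make the profiling and defect measurement steps concrete, here is a minimal, purely illustrative sketch. The field names, records and quality rule are all hypothetical, not drawn from any particular bank's data; the point is simply the kind of lightweight, repeatable check a cross-functional team might automate early in a Build-Measure-Learn cycle.

```python
from collections import Counter

# Hypothetical trade records for illustration only.
records = [
    {"trade_id": "T1", "counterparty": "ACME", "notional": 1_000_000},
    {"trade_id": "T2", "counterparty": None,   "notional": 250_000},
    {"trade_id": "T3", "counterparty": "ACME", "notional": -50},  # suspect value
]

def profile(records, field):
    """Return completeness and distinct-value counts for one field."""
    values = [r.get(field) for r in records]
    populated = [v for v in values if v is not None]
    return {
        "completeness": len(populated) / len(values),
        "distinct": Counter(populated),
    }

def defect_rate(records, rule):
    """Share of records failing a data-quality rule (a plain predicate)."""
    failures = sum(1 for r in records if not rule(r))
    return failures / len(records)

cp_profile = profile(records, "counterparty")
rate = defect_rate(
    records,
    lambda r: r["notional"] is not None and r["notional"] > 0,
)
print(cp_profile["completeness"])  # 2 of 3 counterparty fields populated
print(rate)                        # 1 of 3 records fails the notional rule
```

Running checks like these on every iteration turns data quality from a one-off audit into a measured, trended property of the programme.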
Technology can act as an accelerator to this lean approach and can yield significant results. Take the example of demonstrating a deep, empirical understanding of a complex risk report. Traditionally this would involve many touch points across teams and many different tools, with a final polished product only available at the end of the project, following extensive testing. A data management platform instead allows you to establish a Build-Measure-Learn cycle in which cross-functional data professionals tackle these tasks far more efficiently. Once a firm understanding, or hypothesis, has been formed using prototyping functionality, the same analysts can simulate what a final risk report would look like, allowing Subject Matter Experts to provide feedback far earlier in the process than a traditional waterfall approach would enable.
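The report-simulation step described above can be sketched in a few lines. This is an illustrative toy, not any platform's actual API: the positions, desk names and exposure figures are invented, and a real aggregation would of course carry far more dimensions and controls.

```python
from collections import defaultdict

# Hypothetical position data for illustration only.
positions = [
    {"desk": "Rates",  "counterparty": "ACME",   "exposure": 120.0},
    {"desk": "Rates",  "counterparty": "Globex", "exposure": 80.0},
    {"desk": "Credit", "counterparty": "ACME",   "exposure": 45.5},
]

def aggregate_exposure(positions, dimension):
    """Roll exposures up along one reporting dimension (e.g. counterparty)."""
    totals = defaultdict(float)
    for p in positions:
        totals[p[dimension]] += p["exposure"]
    return dict(totals)

# Prototype "risk report" views an SME could review and challenge early.
by_counterparty = aggregate_exposure(positions, "counterparty")
by_desk = aggregate_exposure(positions, "desk")
print(by_counterparty)
print(by_desk)
```

Even a mock-up this simple gives Subject Matter Experts something tangible to react to: wrong aggregation dimensions, missing netting rules or misclassified desks surface in days rather than at final testing.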
This living, breathing prototype can then be used as a clearly understandable blueprint for delivering sustainable solutions and adapting to regulatory requirements as they clarify and evolve.
It’s never too late to launch your rescue package and trial a new approach, particularly when the regulatory drivers present a paradigm change for how your organisation manages its data assets.
If you are interested in discussing your regulatory data challenges further then we shall be hosting a roundtable with Experian Data Quality on 8th September at Eight Members Club, Moorgate. Click here to register.
You can find out more information from the Data to Value website.