Why Data Governance Programs Are Easy to Envision, Difficult to Sustain

I am often asked: with all the investment in data management and infrastructure over the last 50 years, why are we still not great at governing data? To put it simply and directly, it's hard! Data governance programs are easy to envision conceptually, difficult to implement and, without proper care, impossible to sustain. Data governance investments often have a long time to value, which makes showing incremental progress, and sustaining the investment, especially important. That sets a high bar for any organization investing in data governance. Understanding why things have failed in the past is just as important as designing for the future, so let's cover some of the common pitfalls.

Three-fold challenges

Conceptually, governing data is just like creating controls for any other asset: knowing how data is created or acquired, who owns it, where it is stored at any given time, who has access to it and how it is being used. That is why these programs are easy to envision conceptually; it is in the implementation that the simplicity quickly disappears. Given the sheer number of data sources and processes churning out data, even knowing the true origin of a piece of data is difficult. Most organizations also hold redundant or duplicate copies of data, so identifying the best or most trusted source is difficult as well.

Another execution challenge comes with the technologies for sharing and leveraging data. Over the past 15 years, business intelligence has shifted rapidly from centralized, IT-controlled reporting to end-user-driven reporting with desktop tools such as Microsoft Power BI and Tableau. That shift, while empowering end-user consumption of data, has made it exponentially more difficult for data governance professionals to control how data assets are being used at any given time. This end-user-driven data revolution has also strained other safeguards, such as ensuring that data is used in the proper context (e.g., using a contract's effective date rather than its signature date for a given need) and that the data is of the right quality for its use, also known as being 'fit for purpose.'

That brings us to the third challenge noted, which is sustainment. Even for organizations that have successfully navigated designing and implementing data governance, failure often materializes in the maintenance of the programs. A data governance program is not unlike a home at the beach – it’s an incredibly valuable asset but without proper maintenance, it will very quickly fall apart!

To address these challenges, data governance professionals have several established approaches: creating metadata repositories or data dictionaries to give proper context to the data, establishing data catalogs to track the assets, and defining data quality rules to determine whether the assets are truly fit for a given purpose. All of these approaches can be resource-intensive to implement, hence the earlier statement about the difficulty of realizing data governance.
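To make the idea of a 'fit for purpose' data quality rule concrete, here is a minimal sketch in Python. The field names, rules and records are hypothetical illustrations (they echo the contract effective-date vs. signature-date example above), not the API of any particular governance tool:

```python
from dataclasses import dataclass
from datetime import date
from typing import Callable

@dataclass
class Rule:
    """A named data quality check; `check` returns True when a record passes."""
    name: str
    check: Callable[[dict], bool]

# Hypothetical rules for contract records: the effective date must be
# populated, and must not precede the signature date.
rules = [
    Rule("effective_date_present",
         lambda r: r.get("effective_date") is not None),
    Rule("effective_on_or_after_signature",
         lambda r: r.get("effective_date") is None
                or r.get("signature_date") is None
                or r["effective_date"] >= r["signature_date"]),
]

def evaluate(records: list[dict], rules: list[Rule]) -> dict:
    """Count rule failures across a dataset -- a simple fitness score."""
    failures = {rule.name: 0 for rule in rules}
    for record in records:
        for rule in rules:
            if not rule.check(record):
                failures[rule.name] += 1
    return failures

contracts = [
    {"id": 1, "signature_date": date(2024, 1, 5), "effective_date": date(2024, 2, 1)},
    {"id": 2, "signature_date": date(2024, 3, 1), "effective_date": None},
]

print(evaluate(contracts, rules))
# → {'effective_date_present': 1, 'effective_on_or_after_signature': 0}
```

Real-world tools express such rules declaratively and at far greater scale, but the principle is the same: encode the conditions under which data is trustworthy for a purpose, then measure against them continuously.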

Build sustainability into the governance model

To maintain data governance, sustainability must be built directly into the model. Data governance isn't a project; it needs to become an embedded process across the organization. That means building repeatable processes for maintaining data catalogs, data dictionaries and other data controls. It also means that supporting roles, such as data stewards and data decisioning councils, must be consistently maintained, with periodic retraining, performance monitoring and proper rewards for delivering quality outcomes.

The good news on the sustainability front is that technology is catching up with the needs. Tools are available that can periodically scan for new data or changes to reflect in the data catalog, along with more advanced AI/ML tools for maintenance of data dictionaries as well as identification of data quality issues. Many organizations are also becoming much more data-centric in their culture, which means that sustainability of stewardship and other key roles is becoming ingrained in everyday activities and properly rewarded. Organizations have continued to invest in C-level executives, such as chief data officers, to help further drive the messaging around data being a core asset and to support data-centric environments.
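The periodic scanning these tools perform can be sketched simply: detect new or changed data assets and refresh the catalog entry for each. The sketch below is a hypothetical illustration using file content hashes, not the behavior of any specific catalog product:

```python
import hashlib
from pathlib import Path

def scan(data_dir: str, catalog: dict) -> dict:
    """Scan a directory of data files and refresh a catalog of content
    hashes, returning {path: 'new' | 'changed' | 'unchanged'}."""
    statuses = {}
    for path in sorted(Path(data_dir).glob("*.csv")):
        digest = hashlib.sha256(path.read_bytes()).hexdigest()
        key = str(path)
        if key not in catalog:
            statuses[key] = "new"          # asset not yet cataloged
        elif catalog[key] != digest:
            statuses[key] = "changed"      # contents drifted since last scan
        else:
            statuses[key] = "unchanged"
        catalog[key] = digest              # register or refresh the entry
    return statuses
```

Run on a schedule, a scan like this keeps the catalog from drifting out of date without relying on stewards to remember to register every new dataset by hand, which is exactly the kind of repeatable maintenance process sustainability depends on.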

Innovation and agility

As more organizations tackle data governance projects, we continue to see more innovation in approaches and tools for sustainability. Years ago, it was not uncommon to see monolithic data governance programs trying to address all data at once; today, we see more targeted approaches based on the value or risk of specific data. Adopting more agile approaches for scoping and executing data governance programs has led to fewer failures and more rapid time to value for the programs. Embedding newer technologies directly in line with data creation and ingestion points has also helped with sustainability.

I'll end where I started: data governance programs are easy to envision conceptually, difficult to implement and, without proper care, impossible to sustain. Implementation is difficult, and sustainment even more so, but neither hurdle is impossible to overcome. Data governance, much like any other worthwhile business investment, requires proper planning and continued operational excellence for longer-term success.

Read more in our whitepaper: Building Sustainable Governance Programs With Agile Concepts

To learn more about our data governance solutions, contact us.

Matt McGivern

Managing Director
Data and Analytics
