A practical look at a Data Quality Framework with Johan du Pisanie

Executive Summary

This webinar examines the critical aspects of Data Quality management and governance within organisations. Johan du Pisanie addresses the prevalent challenges faced in managing data effectively and emphasises the significant impact Data Quality issues have on organisational performance and compliance. The webinar covers the importance of treating data as a vital asset, the role of Data Governance in the era of AI, and the significance of promoting data ownership and literacy. Furthermore, it advocates for automation and effective data stewardship to drive growth, while providing guidelines for implementing robust Data Quality solutions. Ultimately, Johan highlights the crucial interplay between Data Quality, problem management, and cost efficiency in optimising overall business operations.

Webinar Details

Title: A practical look at a Data Quality Framework with Johan du Pisanie
Date: 17 September 2024
Presenter: Johan du Pisanie
Meetup Group: DAMA SA User Group Meeting
Write-up Author: Howard Diesel

Data Quality and Consulting with Johan du Pisanie

Howard Diesel opened the webinar and introduced Johan du Pisanie, a South African who has been living in the Netherlands for four years after relocating for work. He recently completed his PhD in statistics at North-West University and works as a consultant, a role that shapes his perspective on Data Quality and accuracy.

Understanding Data Quality from a business perspective is essential for effectively addressing the challenges and leveraging the benefits that data presents. This framework moves beyond technical assessments, focusing instead on practical strategies to identify and resolve Data Quality issues within a business context.

Johan shared that he is a consultant at View Wave, a consultancy operating in the Netherlands and South Africa, and highlighted the importance of recognising both the potential and the pitfalls of data. By exploring these concepts, businesses can better navigate Data Quality challenges and strengthen their data-driven decision-making.

Figure 1 Data Quality Framework

The Challenges of Data Management in Organisations

Data is an essential asset for organisations, offering numerous benefits; however, it can also lead to significant problems. In both large and small organisations, individuals can often identify issues with their data practices, indicating that these challenges are prevalent. A common point of concern, particularly within South African banks, is the existence of siloed teams, which are often blamed for data dysfunction. While siloed teams can contribute to these issues, attributing all data problems to this factor may overlook other underlying causes. Thus, it’s crucial to recognise that data dysfunction can arise from a variety of sources, and addressing it requires a comprehensive understanding of the organisation’s data landscape.

Figure 2 The Roots of Data Dysfunction

Organisational Challenges and Data Quality Management

Organisational cultures often grapple with the challenge of prioritising speed over accuracy, a concern raised by Deming more than 50 years ago. Despite technological advancements like AI, many organisations hastily complete tasks without fully considering the long-term impacts or the quality of data that informs their decisions.

This urgency often results in inadequate foundational practices, particularly in the financial sector, where processes are executed repeatedly every month. Therefore, organisations need to adopt a more thoughtful approach that emphasises accuracy and Data Quality to ensure sustainable success.

The introduction of a matrix organisation has altered the traditional supervisory structure, resulting in challenges in maintaining quality control, as multiple managers demand speed and accuracy without a central supervisor to ensure oversight. This change, driven by financial considerations, highlights the need to prioritise proper execution over budget constraints. Additionally, the rise of artificial intelligence emphasises the importance of Data Quality, as poor data management can severely impact outcomes. There is a pressing need to reconcile efficiency with effective practices to ensure sustainable success.

Organisations often prioritise speed over accuracy in decision-making, a tendency that can lead to significant challenges. Leadership plays a crucial role in shaping how data is perceived and valued within the organisation, influencing team motivation and engagement. While the potential benefits of correct processes can inspire teams, it is often the fear of adverse outcomes that propels action.

This dynamic underscores the importance for leaders to view data as a strategic asset; without such recognition, initiatives can falter, leading to inefficiencies and disengagement. The consequences of mistakes can be grouped into distinct operational blocks, which makes clear why organisations must strike a balance between speed and accuracy in their practices.

The Impact of Data Quality Issues on Organisational Performance and Compliance

The challenges posed by poor Data Quality can significantly impact an organisation’s operational efficiency and strategic decision-making. When data integrity is compromised, organisations often resort to manual workarounds and face ongoing verification issues, leading to inflated costs and employee burnout.

This inefficiency becomes evident when compared to competitors who manage to streamline their processes, often resulting in faster reporting times and improved responses to customer requests. Ultimately, a lack of reliable data erodes the quality of decisions made within the organisation, which further diminishes trust among stakeholders.

Constantly making poor decisions due to incorrect information can erode trust in an organisation, leading to significant financial repercussions. When Data Quality falters, organisations often face fines and other remediation costs, highlighting the immediate impacts of data mismanagement.

The top three blocks affecting decision-making are typically clear and measurable, while the bottom three represent longer-term consequences, such as reputational damage. For large banks and organisations, even a single instance of providing incorrect information can spark negative media attention, creating a lasting impact on their image. Ultimately, in the financial sector, compliance is a crucial driver for improving Data Quality to mitigate these risks.

Figure 3 The Cost of Getting it Wrong

Data as an Asset in the Organisation

To effectively improve Data Quality within an organisation, it is essential to establish a practical framework that addresses three key elements: people, processes, and technology. Businesses should begin by deciding which data challenges to tackle first, rank the remaining priorities, and then progress methodically.

Each of these components—having the right people in place, implementing effective processes, and utilising appropriate technology—forms the foundation for successful data management. Recognising data as a valuable asset further emphasises the importance of integrating these elements to enhance overall Data Quality.

Data must be recognised as a valuable asset for companies to build an effective framework for their operations. If organisations view data merely as a necessary component to facilitate processes, they will struggle to advance. Most companies today, in reality, operate as data-driven entities, regardless of their primary products or marketing strategies. By acknowledging the importance of data, companies can ensure that other critical elements receive the recognition they deserve, allowing for a more cohesive and successful strategy. Without this perspective, progress will be limited.

Figure 4 A Practical Framework

The Value of Data Governance in the Context of AI and Business

Data is often referred to as a valuable asset, yet many organisations fail to recognise its true worth or convert it into tangible economic value. While the notion of data as an asset is widely discussed, few successfully implement it, leading to a disconnect between recognition and practical application. Questions arise regarding whether data serves as an asset or a liability; can it be demonstrated that data is generating revenue, or is it contributing to poor decision-making? This ambiguity highlights the need for a more profound understanding of data’s impact within organisations.

The increasing significance of data as a vital asset is becoming clear as organisations adopt artificial intelligence (AI) at an unprecedented rate. Companies are recognising that to remain competitive in a rapidly changing landscape, they must prioritise the strategic use of data. However, many organisations still struggle to integrate data into their core strategies effectively, and the true impact of these efforts remains to be determined. Moving forward, embracing data as a key component of their AI initiatives will be essential for achieving long-term success.

Many companies are realising that robust Data Governance is essential for effective AI implementation. Recently, during a training session, Data Governance professionals raised concerns about a prevailing misconception among AI teams that Data Governance is no longer necessary. As businesses rush to advance their AI models, there is a tendency to overlook the importance of high-quality data, which can lead to AI failures. While some believe that AI can autonomously clean and manage data, the reality is more complex and requires careful attention to data integrity. It is crucial to acknowledge that effective data underpins successful AI initiatives.

The effectiveness of AI is fundamentally dependent on the quality of the data it relies upon, as AI functions purely as a decision-making tool. This principle is commonly overlooked, a misunderstanding that previously contributed to the failure of numerous dot-com companies with inadequate data management. As we move forward, this issue persists, and companies risk repeating past mistakes if they fail to secure proper data. Historically, effective data custodianship has been crucial in ensuring data integrity, which is essential for the successful deployment of AI technologies.

The key issue within organisations is the failure of leadership to recognise data as a strategic asset, which obstructs necessary changes and improvements. Poor Data Quality significantly hampers operational efficiency and decision-making, often leading to costly workarounds and diminished stakeholder trust.

Ineffective data management can result in significant financial penalties and reputational damage, particularly in highly regulated industries such as finance, where compliance is crucial. A framework addressing people, processes, and technology is crucial for improving Data Governance and quality. Furthermore, as businesses increasingly adopt AI, understanding and leveraging data as a valuable asset is vital for maintaining competitiveness and driving future success.

Data Ownership, Data Literacy, and Data Governance in an Organisation

To achieve effective Data Governance, organisations must cultivate ownership and data literacy among their teams, particularly for data stewards and custodians. These individuals must understand their responsibilities, as many appointed to these roles may lack interest or knowledge, often coming from unrelated fields like sales or technical backgrounds.

Establishing strong Data Governance requires comprehensive policies and standardised definitions to ensure proper data management. While it may be possible to address Data Quality issues temporarily, sustainable improvements can only be achieved when the necessary governance processes and structures are in place.

Figure 5 Trust by Design

The Importance of Automation and Data Quality Management in Organisational Growth

In one engagement at a bank in Africa, a team member demonstrated a strong desire for personal and professional growth; however, his role primarily involved generating monthly reports, which left him little room to enhance Data Quality or take on additional responsibilities. To address this, the team brought in external assistance to automate two-thirds of his reporting tasks, cutting the time required from a week to a day. This freed him to focus on more meaningful analysis and improvements, allowing him to engage with more impactful aspects of his work.

Integrating automation is essential for enhancing organisational maturity and improving Data Quality. However, it is vital to ensure that foundational elements such as Data Quality, process effectiveness, and data literacy are adequately developed before rushing into automation. When organisations prioritise automation without addressing these core components, they risk overlooking existing issues that may only be visible through manual processes. Consequently, a balanced approach that focuses on refining both automation and foundational elements is critical for achieving superior Data Quality and making well-informed decisions.

To ensure effective progress in an organisation’s maturity levels, it’s essential to advance teams through levels 1, 2, and 3 in unison. While data scientists contribute valuable insights and models, they may lack the software development skills necessary for robust automation. True automation involves defining parameters that allow qualified software developers to create reliable systems, rather than merely running scripts written by data scientists. Failing to implement proper automation can lead to inefficiencies, as it simply accelerates the potential for errors rather than creating a well-functioning solution.

To enhance Data Quality within organisations, it is essential to establish a framework focusing on three key elements: people, processes, and technology. Companies must prioritise and address data challenges methodically, recognising data as a valuable asset rather than merely a functional component. This perspective ensures that Data Governance is prioritised, particularly in the context of AI adoption, where the effectiveness of AI models depends on high-quality data.

Organisations must cultivate data ownership and literacy, particularly among data stewards, to promote accountability and enhance governance. Without a clear understanding of data’s impact and effective management practices, operational efficiency and decision-making may suffer, leading to costly outcomes and diminished stakeholder trust. Overall, a robust framework for Data Governance is crucial for achieving a competitive edge and ensuring long-term success.

Figure 6 Data at the Centre

The Role of Data in the Organisation

Organisations must recognise data as a valuable asset, rather than merely an expense, emphasising the need for significant investment in data management at all levels, particularly at the executive and board levels. A past project revealed that poor Data Quality led to discrepancies costing the company over $100 million annually due to errors and inefficiencies.

The investigation into these issues revealed various underlying causes of the data mismatches, underscoring the critical importance of effective data management practices and the need to leverage technology for substantial improvement. Ultimately, viewing data as a vital resource can drive operational efficiency and enhance financial performance.

Maintaining Data Quality is crucial for the effective operation of call centres, where agents frequently input or modify essential data fields. To achieve this, organisations must structure their processes and conduct thorough reviews of front-end systems to ensure data integrity. While addressing existing Data Quality issues is necessary, the challenge of sustaining high Data Quality over time remains. Additionally, organisations need to develop data practices that not only uphold Data Quality but also foster business growth by providing clear visibility of data. This is especially important in larger organisations that may have entrenched beliefs about data management.

Organisations often adhere to established norms and practices without questioning their validity. Whether you’re an executive or a team member, it’s essential to challenge these conventions and align actions with reliable data. Effective decision-making requires evaluating the quality of the data to determine whether it supports or contradicts prevailing beliefs. By integrating meaningful data into discussions, businesses can drive progress and foster a culture that values evidence-based decision-making.

Understanding the Importance of Execution and Value-Added in Business

An effective execution plan is crucial for a company’s operations, especially in the context of consulting. While identifying and diagnosing issues is often a primary focus for external consultants, this approach may offer limited value when working internally within an organisation. Instead of spending significant amounts of time—such as six weeks or even three months—on pinpointing problems, companies should adopt more efficient strategies to facilitate problem-solving and drive actionable outcomes. Ultimately, focusing on timely solutions can enhance a company’s operational effectiveness and foster a more responsive work environment.

Some clients have a clear understanding of their problems, while others struggle with fragmented issues that remain largely unknown. It is common for everyone involved to possess some knowledge of these challenges, yet simply identifying them does not equate to solving them or adding value for the client. This initial step, while important, does not by itself produce meaningful solutions or improve the client's situation.

Figure 7 Execution Plan

Utilising AI and Stakeholder Interviews for Data Quality Management

In the process of discovering and diagnosing data-related issues, it is essential to conduct interviews across various departments, including business units, users, and IT leadership, to understand their trust in data and the perceived costs associated with poor Data Quality. Engaging with long-serving employees can provide valuable insights into their perspectives on data, which may be accurate despite not being formally documented or recorded.

Conducting a data audit and profiling is crucial, and modern tools significantly streamline this process. Additionally, building a data ecosystem map, which may already exist in some form within the organisation, helps visualise data interactions. Ultimately, conducting a readiness assessment that encompasses people, processes, and technology is crucial for a comprehensive understanding of the data landscape.
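To make the profiling step concrete, a minimal sketch of the kind of column-level summary a data audit begins with: completeness and distinct-value counts per field. The sample records and field names here are illustrative, not taken from the webinar.

```python
# A minimal column-profiling pass: completeness and distinct counts per
# field, the kind of summary a data audit starts from.

def profile(records):
    """Return {field: {"complete": fraction, "distinct": n}} for a list of dicts."""
    fields = {f for row in records for f in row}
    total = len(records)
    report = {}
    for f in sorted(fields):
        values = [row.get(f) for row in records]
        non_null = [v for v in values if v not in (None, "")]
        report[f] = {
            "complete": round(len(non_null) / total, 2) if total else 0.0,
            "distinct": len(set(non_null)),
        }
    return report

# Illustrative sample: one account has a missing balance.
sample = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": "A2", "balance": None},
    {"account_id": "A1", "balance": 250.5},
]
print(profile(sample))
```

Modern profiling tools produce far richer output (patterns, ranges, outliers), but the underlying pass is this shape: scan every field, count what is present, and surface gaps before any rules are written.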

An attendee shared that their organisation has implemented an AI-driven platform to streamline the process of analysing stakeholder interviews and generating reports. This platform compiles interview questions, aligns them with project objectives, and uses AI to draft preliminary reports. What previously took a consultant up to a week and a half can now be completed in a few minutes, significantly reducing turnaround time. While the AI-generated reports require further refinement before being sent to clients, they provide valuable initial insights and save considerable time during data audits and profiling.

The project utilises simple tools like Python and R to streamline data handling and analysis. By leveraging a platform for data loading, we can expedite the process without needing to rewrite code or source it externally. Additionally, we employ AI to draft initial SQL scripts based on Data Quality rules, which are then refined by a consultant to ensure accuracy and relevance.
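As an illustration of the kind of SQL-expressed Data Quality rule described above — the table, rule, and values are hypothetical, not from the webinar — a sketch run against an in-memory SQLite database:

```python
import sqlite3

# Sketch of a drafted Data Quality rule expressed as SQL and executed
# against an in-memory SQLite table. Table, rule, and data are illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (account_id TEXT, balance REAL)")
conn.executemany(
    "INSERT INTO accounts VALUES (?, ?)",
    [("A1", 100.0), ("A2", -50.0), ("A3", None)],
)

# Rule: every account must have a non-null, non-negative balance.
violations = conn.execute(
    "SELECT account_id FROM accounts WHERE balance IS NULL OR balance < 0"
).fetchall()

print([row[0] for row in violations])  # accounts failing the rule
```

In practice an AI-drafted script of this kind is only a starting point; as Johan notes, a consultant still refines the rule so that it reflects the business definition of the field it checks.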

The primary goal is to minimise client time waste by efficiently progressing through the initial project steps, enabling us to add value in subsequent phases. In Step 2, we focus on stabilising the data structure, delivering visible wins to build trust, and establishing governance to ensure compliance and accountability.

To enhance Data Quality within our organisation, we are currently assessing our capabilities and have identified our performance at a low level of 2 out of 10, largely due to insufficient support from our people, processes, and technology. To improve this rating to a level 3 or 4, we are prioritising high-impact areas where quick wins can be achieved by clearly defining ownership and roles related to data stewardship.

Individuals must understand their responsibilities for maintaining Data Quality, as accountability will promote collaboration among teams to clarify practical expectations. Furthermore, we intend to implement initial Data Quality checks, beginning with basic evaluations and gradually transitioning to automated processes to improve efficiency. Overall, these strategic actions will lay the groundwork for sustainable improvements in our Data Quality management.
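The progression from basic evaluations to automated processes can be sketched as a set of named, repeatable checks that run over records and report which rows fail. The rules and data below are illustrative assumptions, not the webinar's actual checks.

```python
# Basic evaluations framed as named, repeatable checks: the first step
# before wiring them into an automated pipeline. Rules are illustrative.
CHECKS = [
    ("balance_present", lambda row: row.get("balance") is not None),
    ("balance_non_negative", lambda row: (row.get("balance") or 0) >= 0),
    ("account_id_format", lambda row: str(row.get("account_id", "")).startswith("A")),
]

def run_checks(records):
    """Return {check_name: [indexes of failing rows]}."""
    failures = {name: [] for name, _ in CHECKS}
    for i, row in enumerate(records):
        for name, predicate in CHECKS:
            if not predicate(row):
                failures[name].append(i)
    return failures

rows = [
    {"account_id": "A1", "balance": 10.0},
    {"account_id": "B9", "balance": None},
]
print(run_checks(rows))
```

Because each check is named and independent, the same list can later be scheduled, logged, and reported on automatically — the gradual transition to automation the text describes.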

The demo showcases a simple tool designed to enhance Data Quality by empowering those who truly understand the data, rather than relying solely on centralised efforts. By enabling frontline employees to identify Data Quality issues and create rules in response, the process becomes more accessible and effective.

This approach fosters a culture of proactive data management, enabling teams to address issues as they arise more easily and effectively. Additionally, it emphasises the importance of documenting data definitions, particularly in complex environments like banks, where terms such as “balance” can have multiple interpretations. Understanding the context of data is crucial to ensuring clarity and consistency across the organisation.

Figure 8 Execution Plan pt.2

The Importance of Data Stewardship in Enterprise Management

In managing Data Quality within an enterprise, it is essential to establish a clear definition of key terms, such as “balance,” tailored to specific business units while considering a broader organisational perspective. This approach facilitates effective collaboration and understanding among data stewards.

A common pitfall arises when Data Quality rules are implemented in isolation, failing to account for the needs of downstream areas that rely on specific data formats. Therefore, it is crucial to ensure that any changes to Data Quality rules are made with a comprehensive understanding of data lineage and impact, involving all relevant stakeholders to maintain alignment and integrity across Data Governance efforts.

Effective financial data management is crucial in the banking sector, particularly in relation to provisioning and payment provisions. As data is aggregated from various sources within the bank, it significantly influences the profit and loss statement, making it vital for both upstream and downstream teams to comprehend this information for effective future planning. However, discrepancies often arise when the final aggregated data fails to meet the format requirements of downstream teams, which can lead to miscommunication and misunderstandings. Therefore, ensuring a clear understanding of the data’s origin and its intended use is essential, as it fosters collaboration among all relevant stakeholders and enhances decision-making processes.

Data Quality Improvement and Data Governance

The final stage in Data Quality improvement is the embedded scale, where the organisation begins to see noticeable enhancements in Data Quality, potentially reaching a level of 6. At this point, effective consultants focus on integrating these improvements within the organisation by ensuring that the right insights and strategies are in place.

This involves training and empowering teams, enhancing data literacy, and providing the necessary tools to utilise data effectively. The goal is to establish a sustainable process where Data Quality checks are incorporated into existing workflows, allowing the organisation to manage data autonomously with some initial guidance.

To ensure effective Data Quality management, it is essential to integrate quality practices into the overall process, rather than addressing them sporadically only when issues arise. Establishing strong governance is crucial for facilitating communication between upstream and downstream processes while maintaining a manageable and lightweight governance framework. Overly burdensome governance can lead to resistance, hinder change, and promote the creation of undesirable workarounds, such as individual departments developing their own databases. It’s essential to adopt a broader perspective while maintaining governance efficiency. Additionally, organisations should measure Data Quality improvements (e.g., from a rating of 3 to 6 out of 10) and effectively communicate these results to higher management to reinforce awareness and support.

To achieve a rating of 7 out of 10, significant improvements are essential; without these changes, we risk reverting to a score of 4 or 5 out of 10. Establishing clear measurements and effective communication about data is crucial in emphasising its importance. It ensures that everyone understands that data is a vital asset, actively tracked, and that we are managing its quality. Even if the data isn’t perfect at this stage, having a strategic approach enables us to identify the necessary next steps for improvement.

Figure 9 Execution Plan pt.3

Guidelines for Implementing Data Quality Solutions in Large Organisations

When addressing Data Quality issues in large organisations, it’s essential to start with small, visible projects that are directly linked to business impact. Proposing a comprehensive, long-term solution often lacks appeal, especially given impending reporting deadlines. Initiatives should prioritise databases or processes that stakeholders actively engage with, ensuring buy-in from the business. Additionally, involving non-data personnel can be invaluable, as they often possess crucial insights into the data’s relevance and application within the business context. Collaborating with these team members fosters a more effective and informed approach to enhancing Data Quality.

Establishing a solid foundation is crucial for any project or initiative. This foundation does not need to be flawless, but it should be logical and robust enough to support the additional layers of complexity built upon it, preventing potential issues later and promoting sustainable growth. At this point in the presentation, Johan paused to invite questions, comments, and thoughts before moving to the next slide.

Establishing Authority for Data Modelling in Organisations

Establishing an authoritative source for Master Data presents significant challenges, particularly in environments where data duplication occurs across various applications. This issue is particularly pertinent for organisations such as banks, where independent teams frequently develop their own models using disparate datasets. Although the ideal approach would involve centralising Master Data from the beginning, practical limitations often necessitate compromises.

A reliable Reference Data foundation is essential for achieving high-quality data outcomes within organisations. Without effectively addressing Master Data concerns, organisations face significant challenges in maintaining data integrity, which can ultimately undermine decision-making processes. Therefore, prioritising the resolution of these Master Data issues is crucial for enhancing overall Data Quality and supporting sound organisational decisions.

From a business perspective, it is essential to first develop individual data models for each business unit, ensuring a clear understanding of the data inputs and limitations. This foundational work allows for the identification of authoritative data sources, which is crucial for effective data integration. The approach to establishing these sources will vary by organisation, and leveraging existing applications to minimise changes can be beneficial. Ultimately, solid data stewardship and clear authority designation are critical to successful data management.

Maturity in data understanding is crucial for effective participation in meetings focused on agreeing on data sources. Individuals who do not fully grasp the implications of their data should refrain from such discussions, as varying interpretations of the same terminology across different business units can lead to confusion. This highlights the necessity for clear communication through modelling and collaborative discussions.

To address these challenges, a well-defined architectural roadmap for Master Data Management (MDM) is essential, proposing a structured plan over two to three years to resolve these issues gradually. Ultimately, this approach will foster a unified understanding and better utilisation of data throughout the organisation.

Many organisations struggle with unclear data strategies, often relying on vendors who benefit from this lack of direction. To avoid this, enterprise architecture must provide a clear roadmap for data management, encompassing elements such as data warehousing and business intelligence.

This approach helps demystify the process for stakeholders by establishing a structured plan. By starting with a foundational understanding of key definitions and concepts, organisations can effectively communicate their data strategy and guide stakeholders toward achieving their goals.

In addressing complex issues within a business context, it is essential to use one's own models and embrace an iterative approach, particularly during Step 2, stabilising the structure. This process begins by identifying the most pressing problems and creating a clear roadmap to guide iterative progress. Moreover, solutions must be closely aligned with business needs, as overly disruptive changes can obstruct buy-in and hinder the integration of data as a central focus within the organisation. Ultimately, a balanced and practical approach fosters effective problem-solving and supports sustainable growth.

Managing Organisational Costs and Quality Issues

Understanding the cost implications of issues within large organisations is crucial for gaining buy-in from stakeholders. It is essential to quantify the expenses associated with rework and challenges, as this information helps align business objectives. While individual contributors may recognise problems, convincing higher-ups often requires a clear understanding of potential costs. Although quantifying these costs can be challenging and sometimes hypothetical—such as the risk of losing a banking license—providing a comprehensive view of these issues can foster greater support for necessary interventions.

The alarming estimated cost associated with quality issues sparked intense scepticism among the attendees, leading them to question the credibility of the consultants’ claims. An attendee pointed out that the projected losses exceeded the company’s actual revenue, raising doubts about the reliability of the assessment. This scepticism not only challenged the findings but also undermined the entire evaluation process, rendering the consultants’ efforts seemingly pointless.

Asset and Data Quality Management

To gain buy-in for asset management, one effective approach is to establish an ITIL service desk specifically for data asset management, similar to existing processes for hardware and software. By allowing users to log issues related to data assets, organisations can quantify these problems in a standardised manner, fostering trust and engagement among users who are already familiar with the asset management of their devices. This practical method helps bridge the gap in understanding and encourages faster adoption of new processes.

An attendee inquired about the best approach to calculating the cost of Data Quality issues, expressing that although he had frequently encountered the topic, he had struggled to find concrete references or examples. Johan and Howard acknowledged the complexity of this question, highlighting that there is no single formula to determine the cost. Johan then suggested that it is important to consider multiple aspects related to Data Quality and its impact, focusing on whether the issues affect a single area or multiple areas within the organisation.

Quantifying the cost associated with poor Data Quality poses significant challenges for organisations, impacting various critical aspects, including operational processes, strategic finance, reputation, and compliance. Although a framework has been established to assess the significance of errors based on their operational impact, translating these assessments into a clear monetary value remains elusive.

The situation becomes more manageable when data seamlessly integrates into a model that generates straightforward outputs. Yet, the overall financial implications of Data Quality issues remain complex and difficult to evaluate. Ultimately, addressing these complexities is essential for organisations to fully understand and mitigate the repercussions of poor Data Quality.

Organisations, particularly in the banking sector, face significant challenges in managing data flow and decision-making processes, especially when Data Quality issues can directly impact capital requirements. Tariq emphasises insights from Tom Redman, known as the “Data Doc,” who suggests that a useful rule of thumb for understanding the financial implications of Data Quality problems is to anticipate a potential cost of approximately ten per cent.

This perspective not only fosters transparency regarding the uncertainties surrounding data assessments but also underscores the importance of a commitment to continuous improvement. Ultimately, adopting this estimate can serve as a foundational understanding for organisations as they navigate the complexities of Data Quality and its financial repercussions.
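
As a back-of-the-envelope calculation, the rule of thumb above can be applied to whatever base figure the organisation considers relevant. The sketch below is an assumption-laden illustration, not a prescribed method; the base amount and the ten per cent fraction are starting points to be refined as real measurements accumulate.

```python
# Hedged sketch: applying the "approximately ten per cent" rule of thumb
# as a first-order estimate, before any detailed assessment exists.
RULE_OF_THUMB = 0.10  # assumed fraction attributable to data quality problems

def first_order_dq_cost(base_amount):
    """First-pass estimate of data-quality-related cost for a base figure."""
    return base_amount * RULE_OF_THUMB

# Hypothetical example: a EUR 5m operating budget.
estimate = first_order_dq_cost(5_000_000)
print(f"First-order Data Quality cost estimate: EUR {estimate:,.0f}")
```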

Organisations often adhere to established norms without questioning their relevance, which can hinder effective decision-making. Both executives and team members must challenge these conventions and base their actions on reliable data. By rigorously evaluating the quality of available data, organisations can better determine whether their prevailing beliefs are supported or contradicted. Engaging in conversations that integrate meaningful data not only drives progress but also cultivates a culture of evidence-based decision-making. Ultimately, fostering an environment that values data-driven insights can lead to more effective solutions and improved outcomes for the organisation.

Problem Management and Data Quality in Business Operations

The incorporation of a cultural element into the analysis of upstream quality issues significantly enhances the understanding of their impact. By quantifying rework resulting from poor quality, organisations can gain insight into the downstream effects, such as increased resolution time and costs, strategic missteps, financial ramifications, and potential reputational damage due to compliance failures.

This granular approach not only highlights the relevance of quality assessment across various areas of problem management, including incident and request management, but also underscores the necessity of a comprehensive strategy in addressing these challenges. Ultimately, this methodical analysis paves the way for more effective quality management and organisational resilience.

In problem management, conducting a root cause analysis is crucial for preventing recurring issues in service management. It is essential to establish processes that address these root causes, resulting in measurable outcomes and financial benefits, even though they may serve as lagging indicators. For instance, Johan shared that a client experienced challenges migrating from an on-premise ERP to a cloud-based ERP due to poor Data Quality, resulting in both immediate costs and extensive wasted time. This situation highlights the difficulty in quantifying costs attributed to poor Data Quality and the lack of awareness regarding its impact; had they recognised the data issues beforehand, they might have resolved them before attempting migration.

When considering the implementation of Master Data Management (MDM) tools, it’s crucial to take a holistic view of the organisation’s Data Quality. Many companies find that their data is insufficient for MDM, leading to the realisation that investments in tools may be wasted if the underlying data is poor. One case highlighted the challenges faced by a company that, after implementing an MDM solution with vendor support, discovered it needed to consolidate Master Data from 34 different businesses. This resulted in significant delays and costs, as consultants were engaged for nearly a year without having the necessary quality data in place, underscoring the importance of addressing Data Quality before adopting new tools.

If you would like to join the discussion, please visit our community platform, the Data Professional Expedition.

Additionally, if you would like to watch the edited video on our YouTube channel, please click here.

If you would like to be a guest speaker on a future webinar, kindly contact Debbie (social@modelwaresystems.com)

Don’t forget to join our exciting LinkedIn and Meetup data communities so you don’t miss out!
