Data Quality Improvement

An effective framework for improving data quality is shown in Table 10.6. It helps identify impactful personas and use cases, projects, and the root causes of poor data quality, and it helps prioritize projects for solution.

TABLE 10.6

Data Governance Roles

Executive Steering Committee

  • Governance prioritization and resource allocation
  • Policy changes
  • Organization changes
  • Roles and responsibility changes
  • Funding approval

Working Governance Council

  • Investment recommendations to improve data quality
  • Approve core data standards, policies, and business rules
  • Coordinate data quality improvement projects
  • Set roles and responsibilities for who can create and change core metadata

Business Data Stewards

  • Review core data standards, policies, and business rules
  • Understand data models and lineage mapping across systems
  • Recommend data clean-up and remediation

Subject Matter Experts (Analytics)

  • Advanced analytics, methods, and tools
  • Maintenance to ensure data conforms to data quality rules across its dimensions

Business data stewards use a sequential methodology. In data governance, the improvement methodology is modified to the following steps: Develop, Onboard, Profile, Implement, and Control (DOPIC). We will discuss the DOPIC methodology and how it is used to improve data quality, in contrast to the Define-Measure-Analyze-Improve-Control (DMAIC) improvement strategy.

In the develop phase, an opportunity assessment is made from the business owner and stakeholder perspectives to define impactful personas and use cases, key metrics, core metadata, and its quality dimensions relative to the business metrics that need to be improved (e.g., on-time delivery, reduced returns, low sales, missed schedules, and other use cases). These are prioritized by business owners, stakeholders, data quality stewards, and the information governance team. Prioritized projects are added to governance roadmaps to secure resources and funding. It is essential to show the return on investment for the larger initiative. The time frame for this work is between three and six months.

If these are initial projects, a data governance foundation is built as described by the maturity model. Business owners, stakeholders, and subject matter experts are onboarded through the initial projects, or brought into the larger governance initiative if it is mature. A team is built around the project’s use case on the basis of the metadata as well as its source and consuming systems. This initial analysis identifies the business and IT owners needed to support the project. Once onboarded, these roles help define and align the metadata to the project’s relevant business metrics and benefits. The project team documents data lineage from source to consuming systems, definitions, data quality rules, and other information in the collaborative platform. If this work is foundational, based on initial projects, the time required to complete it is between three and six months; if maturity exists, the project time is often between one and two months.
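As an illustration of how such lineage and rule documentation might be captured in a structured form, the following Python sketch records one data element’s lineage and quality rules; the field names, example values, and the idea of exporting JSON to a catalog are assumptions made only for this example, not a prescribed schema.

    # A minimal sketch of recording data lineage and quality rules as a structured
    # record that could be loaded into a collaborative governance platform.
    # Field names and values are illustrative assumptions, not a standard schema.
    from dataclasses import dataclass, asdict
    import json

    @dataclass
    class LineageRecord:
        data_element: str               # core metadata element, e.g., "customer_id"
        definition: str                 # agreed business definition
        source_system: str              # system of record
        consuming_systems: list[str]    # downstream systems that read this element
        quality_rules: list[str]        # plain-language data quality rules
        business_owner: str
        it_owner: str

    record = LineageRecord(
        data_element="customer_id",
        definition="Unique identifier assigned to a customer at account creation",
        source_system="CRM",
        consuming_systems=["Billing", "Analytics Warehouse"],
        quality_rules=["not null", "unique", "matches agreed ID format"],
        business_owner="Sales Operations",
        it_owner="CRM Platform Team",
    )

    # Serialize for loading into a data catalog or shared governance workspace.
    print(json.dumps(asdict(record), indent=2))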

In the profile phase, user stories are created from an Agile project management perspective and used to start data quality profiling. Profiling answers questions about how well the data conforms to its quality dimensions. Baselines are established for the metadata and business metrics and tracked to targets using a reporting dashboard, which shows data quality and business metric performance. Models are also built to include leading, lagging, and coincident indicators, as well as independent variables that impact the quality of metadata and the associated business and process metrics. Stakeholders are engaged to provide feedback according to Agile methods to integrate the analysis with respect to systems, metadata, metrics, and process. The time to complete the project work is between one and two months if maturity exists, or between three and six months otherwise.
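To make the profiling step concrete, the short sketch below computes baseline completeness, uniqueness, and validity measures for a single column using pandas; the column name, validity pattern, and target thresholds are assumptions chosen only for illustration.

    # A minimal data-profiling sketch, assuming a pandas DataFrame loaded from a
    # source system; column name, format rule, and targets are illustrative only.
    import pandas as pd

    df = pd.DataFrame(
        {"customer_id": ["C00000001", "C00000002", None, "C00000002", "BAD-ID"]}
    )
    col = df["customer_id"]

    profile = {
        "completeness": col.notna().mean(),                       # non-null share
        "uniqueness": col.nunique() / col.notna().sum(),          # distinct / non-null
        "validity": col.dropna().str.match(r"^C\d{8}$").mean(),   # matches format rule
    }
    targets = {"completeness": 0.98, "uniqueness": 1.0, "validity": 0.99}

    # Baseline versus target, as it might feed a reporting dashboard.
    for dimension, value in profile.items():
        status = "meets target" if value >= targets[dimension] else "below target"
        print(f"{dimension}: {value:.2%} (target {targets[dimension]:.0%}) - {status}")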

In the implement phase, data policies and standards are either formalized (if these are initial projects) or updated through the working and leadership councils. Metadata lineage is documented with the help of supporting applications that map data lineage from source to consuming systems. Solutions for data quality gaps are piloted to improve business performance, and the reporting dashboards are updated. Remediation activities (e.g., data clean-up) are also initiated using the updated definitions, standards, and algorithms. Ideally these solutions are implemented at an enterprise level and are permanent, so that the data issues do not recur. This phase typically requires between one and three months for projects that are part of a mature governance framework, or several months, depending on funding for data clean-up or capital investment.
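The sketch below suggests how a formalized rule might drive such remediation: records that fail the rule are routed to a clean-up queue rather than silently dropped. The rule, field names, and sample data are assumptions for illustration only.

    # A minimal remediation sketch: apply an agreed data quality rule and route
    # failing records to a clean-up queue. Rule and field names are assumed.
    import pandas as pd

    orders = pd.DataFrame({
        "order_id": [101, 102, 103],
        "ship_date": ["2024-05-01", None, "2024-13-40"],  # one missing, one invalid
    })

    # Formalized rule: ship_date must be present and a valid calendar date.
    parsed = pd.to_datetime(orders["ship_date"], errors="coerce")
    passes_rule = parsed.notna()

    clean = orders[passes_rule].assign(ship_date=parsed[passes_rule])
    remediation_queue = orders[~passes_rule]   # returned to data stewards for clean-up

    print(f"{len(clean)} records conform; {len(remediation_queue)} flagged for remediation")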

In the control phase, governance is optimized for source and consuming systems. Policies and standards are used to identify impactful projects and use cases to improve data quality. New use cases bring more data domains, systems, and business owners into the governance community, and working councils are established for these domains. Advanced analytics form the basis for improving standards, policies, and business rules, building data models, and understanding data lineage across source and consuming systems. The organization moves from remediation planning and data clean-up activities to proactive infrastructure improvements that prevent data quality issues from occurring. The governance leadership council provides ongoing funding approval, prioritization, and resource allocation to improve business performance. Improvement projects are aligned to this strategy and its capital investment roadmap. Data governance roles and responsibilities are incorporated into policies.
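As an example of what proactive control might look like in practice, the sketch below re-runs a small set of assumed quality rules and raises an alert when a metric falls below its threshold; the metrics, thresholds, and alert hook are illustrative assumptions, not part of the source framework.

    # A minimal control-phase sketch: recheck agreed quality rules on a schedule
    # and alert when a metric falls below threshold. Rules and hooks are assumed.
    import pandas as pd

    THRESHOLDS = {"ship_date_completeness": 0.99, "order_id_uniqueness": 1.0}

    def check_rules(df: pd.DataFrame) -> dict[str, float]:
        # Illustrative quality metrics for the monitored dataset.
        return {
            "ship_date_completeness": df["ship_date"].notna().mean(),
            "order_id_uniqueness": df["order_id"].nunique() / len(df),
        }

    def alert(metric: str, value: float, threshold: float) -> None:
        # Placeholder for a real notification channel (email, ticket, dashboard).
        print(f"ALERT: {metric} = {value:.2%}, below threshold {threshold:.0%}")

    orders = pd.DataFrame({
        "order_id": [101, 102, 102],
        "ship_date": ["2024-05-01", None, "2024-05-03"],
    })

    for metric, value in check_rules(orders).items():
        if value < THRESHOLDS[metric]:
            alert(metric, value, THRESHOLDS[metric])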

 