Data Quality Index

Enabling data quality monitoring for your critical data assets

Contact us for a demo

Grid modernisation and digitisation are introducing a wealth of data into the utility industry. But too often, poor data quality means utilities aren’t getting the benefits they expected from investing in advanced metering, asset sensors, customer experience platforms, distributed energy resources and electric vehicle supply equipment.

Solving this data quality challenge begins with a simple concept: treat data as you would treat any other asset. Data quality management must be end-to-end, covering all critical datasets, focusing on everything from initial assessment to ongoing remediation, and involving diverse stakeholders.

Our Data Quality Index (DQI) measures, visualises and enhances the quality of your data. It enables end-to-end monitoring of critical data assets across the enterprise and fosters collaboration between IT and wider stakeholders through a four-step journey.

A simple approach to calculating the data quality of any data asset

  • We use a robust and customisable data quality scoring calculation to determine the DQI for each of your data assets. The DQI is a weighted average of the execution results of a series of business rules defined across seven key dimensions of data quality (a worked sketch of this calculation follows this list):

    1. Accuracy: Data correctly describes the ‘real world’ object or event
    2. Completeness: All required occurrences of the data are populated
    3. Consistency: A unique piece of data holds the same value and format across all data sets
    4. Uniqueness: All distinct values of a data element appear only once
    5. Validity: All data conforms to allowed values and format
    6. Integrity: All data conforms to defined data relationship rules (e.g. primary/foreign keys)
    7. Timeliness: Data is delivered on time

  • Our customisable data quality visualiser presents the results of the data quality assessment in easily digestible dashboards, letting you analyse data quality performance, refine calculations, or drill down into the metrics to identify real problem areas. For example, with the visualiser, utility asset managers can view the quality of their data assets and compare their site and asset scores to those of other sites across the enterprise, diagnosing specific areas for improvement.

  • As data quality issues are quantified, our tool presents failing data records along with the associated issues and enables customisable workflows to open, suspend and resolve them. For example, asset managers can see the outstanding data quality issues with their assets and track remediation progress over time.

  • Data quality management isn’t a one-off process. It’s an ongoing one that significantly improves and sustains the data quality of critical data assets. Our solution enables continuous monitoring through periodic assessment of the data, automatic resolution of issues in the source system, and tracking of DQI changes over time.
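
To make the scoring concrete, here is a minimal sketch of the kind of weighted-average calculation described in the first step above. The rule model, dimension weights, field names and example records are illustrative assumptions, not the product’s actual implementation, which is customisable per dataset.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical rule model: each business rule checks one record and
# belongs to one of the seven data quality dimensions.
@dataclass
class Rule:
    name: str
    dimension: str                  # e.g. "Completeness", "Validity"
    check: Callable[[dict], bool]   # True if the record passes the rule

# Illustrative dimension weights (customisable per dataset).
WEIGHTS = {
    "Accuracy": 0.20, "Completeness": 0.20, "Consistency": 0.15,
    "Uniqueness": 0.10, "Validity": 0.15, "Integrity": 0.10,
    "Timeliness": 0.10,
}

def dqi(records: list, rules: list) -> float:
    """DQI = weighted average of per-dimension rule pass rates (0-100)."""
    weighted_sum, weight_total = 0.0, 0.0
    for dimension, weight in WEIGHTS.items():
        dim_rules = [r for r in rules if r.dimension == dimension]
        if not dim_rules or not records:
            continue  # only weight dimensions that have rules defined
        results = [r.check(rec) for r in dim_rules for rec in records]
        weighted_sum += weight * (sum(results) / len(results))
        weight_total += weight
    return round(100 * weighted_sum / weight_total, 1) if weight_total else 0.0

# Two hypothetical rules over meter records: one record fails each rule,
# so both dimensions score 2/3 and the overall DQI is 66.7.
rules = [
    Rule("meter_id populated", "Completeness",
         lambda rec: bool(rec.get("meter_id"))),
    Rule("voltage within allowed range", "Validity",
         lambda rec: 0 < rec.get("voltage", -1) <= 500),
]
records = [
    {"meter_id": "M-001", "voltage": 240},
    {"meter_id": "",      "voltage": 230},
    {"meter_id": "M-003", "voltage": 9000},
]
print(dqi(records, rules))  # 66.7
```

Failing records identified during rule execution can then feed the remediation workflows described above.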

Why implement data quality management?

Better decision-making

Data is becoming increasingly integral to everything utilities do. Poor data quality is often pegged as the reason for inaccurate analytics and ill-conceived business strategies. Ensuring high data quality helps to build greater confidence in the data, enabling better decision-making across the organisation.

Reducing risk

Decisions made from bad data increase operational, financial and reputational risk. In fact, Gartner research from 2018 showed that the average organisation loses $15 million per year to poor data quality.

Comprehensive data quality performance analysis

As our DQI works at a record/row level, it’s easy to aggregate it to compute the overall data quality at any higher level. For example, when measuring data quality for a customer accounts dataset, you can show aggregated DQI by account type, account executive or segment. This empowers leaders to make informed budgeting decisions.
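
As an illustration of that roll-up, the sketch below aggregates hypothetical record-level DQI scores by account type using pandas; the column names and scores are assumptions, not a fixed schema.

```python
import pandas as pd

# Hypothetical record-level scores: one DQI value per customer account row.
records = pd.DataFrame({
    "account_id":   ["A1", "A2", "A3", "A4", "A5"],
    "account_type": ["residential", "residential", "commercial",
                     "commercial", "industrial"],
    "dqi":          [92.0, 78.5, 88.0, 95.5, 61.0],
})

# Because the DQI is computed per record, rolling it up to any higher level
# is a simple aggregation: here, mean DQI and record count per account type.
summary = (records.groupby("account_type")["dqi"]
                  .agg(mean_dqi="mean", records="count")
                  .sort_values("mean_dqi"))
print(summary)
```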

Baselining and tracking

Many businesses have only a cursory, qualitative understanding of the quality of their data assets. Our approach establishes a baseline for current data quality and shows how targeted actions improve it. Organisations can also set enterprise-wide data quality targets based on the criticality of data, so data owners can make meaningful, targeted investments.
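
For example, a simple baseline-versus-target comparison might look like the sketch below; the criticality tiers, target values and dataset names are illustrative assumptions.

```python
# Illustrative DQI targets tiered by data criticality (assumed values).
TARGETS = {"critical": 95.0, "important": 85.0, "standard": 70.0}

# Hypothetical baseline and current DQI scores per dataset.
datasets = [
    {"name": "meter_reads",       "criticality": "critical",  "baseline": 81.2, "current": 93.4},
    {"name": "asset_registry",    "criticality": "important", "baseline": 74.0, "current": 88.1},
    {"name": "customer_contacts", "criticality": "standard",  "baseline": 66.5, "current": 69.0},
]

for ds in datasets:
    target = TARGETS[ds["criticality"]]
    trend = ds["current"] - ds["baseline"]   # improvement since baseline
    gap = ds["current"] - target             # distance to target
    status = "on target" if gap >= 0 else f"{-gap:.1f} points below target"
    print(f'{ds["name"]}: {ds["current"]:.1f} ({trend:+.1f} vs baseline, {status})')
```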

Enabling greater trust

Having consistent, accurate and high-quality data is important to building a trusted relationship with your data consumers – both internal business stakeholders and external customers.

Improving productivity for data-intensive initiatives

Accommodating bad data is both expensive and time-consuming. If an organisation’s information isn’t complete or consistent, it must spend a significant amount of time fixing that data to make it usable.

Why PA?

Typical data quality solutions take months to procure, onboard, train and implement. Our unique solution can be up and running within days. And our fast-track onboarding engagement will define and implement specific business rules, customise dashboards to your needs and operationalise specific remediation workflows for the pilot dataset within six to eight weeks.

San Diego Gas & Electric

It’s never been more important to minimise power supply disruptions. Find out how SDG&E made its distribution network even more reliable with iPredict™, a system that uses big data and machine learning to predict faults before they happen.

Read the client story

Contact us

To find out more, get in touch with one of our experts today.

Ross Smith

PA energy and utilities expert

Gregg Edeson

PA energy and utilities expert