Grid modernisation and digitisation are introducing a wealth of data into the utility industry. But too often, poor data quality means utilities aren’t getting the benefits they hoped for from investing in advanced metering, asset sensors, customer experience platforms, distributed energy resources and electric vehicle supply equipment.
Solving this data quality challenge begins with a simple concept: treat data as you would treat any other asset. Data quality management must be end-to-end, covering all critical datasets, focusing on everything from initial assessment to ongoing remediation, and involving diverse stakeholders.
Our Data Quality Index (DQI) measures, visualises and enhances the quality of your data, supporting end-to-end monitoring of critical data assets across the enterprise and fostering collaboration between IT and wider stakeholders through a four-step journey.
We use a robust and customisable data quality scoring calculation to determine the DQI for each of your data assets. The DQI is a weighted average taken from the execution results of a series of business rules defined across seven key dimensions of data quality.
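The weighted-average calculation described above can be sketched as follows. This is a minimal illustration, not the product’s actual scoring engine: the dimension names, weights and figures are all assumed for the example, and a real deployment would draw rule results from the assessed datasets.

```python
from dataclasses import dataclass

@dataclass
class RuleResult:
    dimension: str   # data quality dimension the rule belongs to (illustrative names below)
    passed: int      # records that passed the rule
    total: int       # records evaluated by the rule

# Hypothetical per-dimension weights; a real configuration would cover
# all seven dimensions and be tuned to the organisation's priorities.
WEIGHTS = {"completeness": 0.3, "accuracy": 0.4, "consistency": 0.3}

def dqi(results: list[RuleResult], weights: dict[str, float]) -> float:
    """Weighted average of per-dimension pass rates, scaled to 0-100."""
    score = 0.0
    total_weight = 0.0
    for r in results:
        w = weights.get(r.dimension, 0.0)
        score += w * (r.passed / r.total)
        total_weight += w
    return round(100 * score / total_weight, 1) if total_weight else 0.0

results = [
    RuleResult("completeness", 950, 1000),
    RuleResult("accuracy", 900, 1000),
    RuleResult("consistency", 990, 1000),
]
print(dqi(results, WEIGHTS))  # → 94.2
```

Because each rule reports a simple pass count against a total, new rules and dimensions can be added without changing the scoring logic, only the configuration of weights.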
Our customisable data quality visualiser presents the results of the data quality assessment in easily digestible dashboards, letting you analyse data quality performance, refine calculations, or drill down into the metrics to identify real problem areas. For example, with the visualiser, utility asset managers can view the quality of their data assets and compare their site and asset scores to those of other sites across the enterprise, diagnosing specific areas for improvement.
As data quality issues are quantified, our tool presents failing data records along with the associated issues and enables customisable workflows to open, suspend and resolve them. For example, asset managers can see the outstanding data quality issues with their assets and track remediation progress over time.
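A remediation workflow with the three states mentioned above (open, suspended, resolved) can be modelled as a small state machine. This is a hedged sketch only: the state names, transitions and class shape are assumptions for illustration, not the tool’s actual API.

```python
# Allowed transitions between issue states; "resolved" is terminal.
ALLOWED = {
    "open": {"suspended", "resolved"},
    "suspended": {"open", "resolved"},
    "resolved": set(),
}

class DataQualityIssue:
    """One failing data record and the rule it violated (illustrative)."""

    def __init__(self, record_id: str, rule: str):
        self.record_id = record_id
        self.rule = rule
        self.state = "open"          # issues start open
        self.history = ["open"]      # audit trail for remediation tracking

    def transition(self, new_state: str) -> None:
        if new_state not in ALLOWED[self.state]:
            raise ValueError(f"cannot move from {self.state} to {new_state}")
        self.state = new_state
        self.history.append(new_state)

issue = DataQualityIssue("meter-0042", "reading_not_null")  # hypothetical record and rule
issue.transition("suspended")
issue.transition("resolved")
print(issue.history)  # → ['open', 'suspended', 'resolved']
```

Keeping a per-issue history like this is what lets asset managers track remediation progress over time rather than seeing only a current snapshot.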
Data quality management isn’t a one-off process. It’s an ongoing one that significantly improves and sustains the data quality of critical data assets. Our solution enables continuous monitoring through periodic assessment of the data, automatic resolution of issues in the source system, and tracking of DQI changes over time.
Data is becoming increasingly integral to everything utilities do. Poor data quality is often pegged as the reason for inaccurate analytics and ill-conceived business strategies. Ensuring high data quality helps to build greater confidence in the data, enabling better decision-making across the organisation.
Decisions made from bad data increase operational, financial and reputational risk. In fact, Gartner research from 2018 showed the average organisation loses $15 million per year to poor data.
As our DQI works at a record/row level, it’s easy to aggregate it to compute the overall data quality at any higher level. For example, when measuring data quality for a customer accounts dataset, you can show aggregated DQI by account type, account executive or segment. This empowers leaders to make informed budgeting decisions.
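The roll-up described above amounts to grouping record-level scores and averaging them. A minimal stdlib sketch, with hypothetical account types and scores standing in for real records:

```python
from collections import defaultdict

# Hypothetical record-level DQI scores: (account_type, dqi) pairs.
records = [
    ("residential", 92.0), ("residential", 88.0),
    ("commercial", 75.0), ("commercial", 81.0), ("commercial", 90.0),
]

def aggregate_dqi(rows):
    """Mean DQI per group, computed from record-level scores."""
    sums = defaultdict(lambda: [0.0, 0])
    for group, score in rows:
        sums[group][0] += score
        sums[group][1] += 1
    return {g: round(s / n, 1) for g, (s, n) in sums.items()}

print(aggregate_dqi(records))  # → {'residential': 90.0, 'commercial': 82.0}
```

The same grouping key could just as easily be account executive or segment, which is what makes the record-level score flexible for reporting at any level.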
Many businesses have only a cursory, qualitative understanding of the quality of their data assets. Our approach establishes a baseline for current data quality and shows how targeted actions improve it. Organisations can also set enterprise-wide data quality targets based on the criticality of data, so data owners can make meaningful, targeted investments.
Having consistent, accurate and high-quality data is important to building a trusted relationship with your data consumers – both internal business stakeholders and external customers.
Accommodating bad data is both expensive and time-consuming. If an organisation’s information isn’t complete or consistent, it must spend significant time fixing the data to make it usable.
Typical data quality solutions take months to procure, onboard, train and implement. Our unique solution can be up and running within days. And our fast-track onboarding engagement will define and implement specific business rules, customise dashboards to your needs and operationalise specific remediation workflows for the pilot dataset within six to eight weeks.
It’s never been more important to minimise power supply disruptions. Find out how SDG&E made its distribution network even more reliable with iPredict™, a system that uses Big Data and Machine Learning to predict faults before they happen.