According to PA’s business leader survey, 70 per cent of banks are not effectively using data assets to gain competitive advantage. Every year, almost every part of the banking sector gains access to newer, richer and more voluminous data, for example from mobile payment applications. This bigger data pool brings the challenge of accessing it effectively to gain the right intelligence, at the right speed, in the right order of priority and with the right means of extraction.
To ensure this improved data quality and quantity translates into better decisions, banks need to look beyond traditional approaches to data mining and report development.
The exponential increase in the amount of data leaves behind a large and complex mix of structured and unstructured data trails. Advances in processing power and analytical tools make it just about possible to keep up, but keeping up is not enough to maximise the latent value of these data assets. As a result, few organisations are positioned to respond incisively to, or anticipate, changing requirements.
Most firms still approach data and reporting solution development along traditional lines: defining a set of requirements and then delivering a robust, compliant solution to meet them. However, few organisations have a broad enough understanding of the data they are likely to need. Equally, with 85 per cent of banks stating that their main focus is ensuring system security, any proposed change carries a high overhead (e.g. heavy compliance and testing requirements). This greatly inhibits the system's ability to learn.
On those rare occasions when the business seizes control, questions about sustainability, security and, very possibly, basic technical viability tend to lead to a spend-and-fix decision that is a lose-lose for both the CIO and the business itself.
The approach taken by the oil industry offers a solution. Exploration for oil is speculative, with more failures than successes, but the rewards for success mean exploration receives heavy investment. When a strike is made, the oil company simply notes and plugs the location and moves on; depending on the size of the discovery, it then determines when to bring in extraction infrastructure.
This provides a model for banks. Giving the business some free rein to explore new data sources and ask incidental, relatively unstructured questions (e.g. how do we match what makes our customers tick with what we have?) quickly yields greater agility and insight, and ensures the "right" discussions are held. Although this has to be tempered by limiting the lifespan of any experiment, the approach has been shown to take its exponents far further, more quickly. PA has helped a leading Dutch retail bank achieve 15 per cent cost savings from data and full learning capability (e.g. shortcuts to "sufficient" satisfaction). We also worked with the UK Meteorological Office to achieve vastly increased access to, and leverage of, their unique data, and with a major insurance broker to revolutionise their marketing strategy (e.g. it afforded a group-wide view of KPIs, providing key input for developing a marketing strategy driven by value to the customer).
Admittedly this approach has yet to achieve 100 per cent value extraction, but it is proving far more effective at containing costs and separating likely winners from losers. Moreover, because the industrialisation of data extraction only commences once real, viable business needs are articulated and prioritised, investment in infrastructure and change is demand-led, and therefore less expensive and more effective.
To find out more about how PA has helped companies extract the value from their data more effectively, contact us now.