To enable a dashboard ecosystem to exist at all, we need legislation. We need primary and secondary legislation to define exactly what the compulsion will be and how long we will have to comply. The delays to that legislation give us the perfect window to get ready. Waiting until the legislation is passed will compress the timescales available to sort out the problems all schemes and providers will have, and will increase the demand on the finite project resources needed to deliver the changes. An unexpected outcome of the current pandemic is that we have all come to understand what “flattening the curve” means. In this case, though, we get the opportunity to do it ahead of the peak, not in response to it.
I think it’s important to see this as a change project, not a cleanse project.
Cleansing is temporary. Change is permanent. The industry has to change how it treats data, and for some, those changes will be significant. Data doesn’t sit still. It’s part of a flow of information. We will need to look at the data we receive, the data we calculate and the data we store. We will need to look at how and when we do all that, not just what.
Allow me to illustrate…
In setting its suggested scope, MaPS have tried to remain consistent with TPR record keeping guidance and the typical values you would expect in benefit statements. “Quite reasonable” you would think. It implies no greater onus on a scheme than it already has to deal with under current legislation. But let’s break down where the challenges might lie:
In reality, most schemes will assess their data quality scores annually, and periodically carry out tracing and mortality screening across selected groups of members. This is traditionally how most of us have tackled data: get it right at a point in time. This won’t necessarily fit in an “always on” world of dashboard visibility. Also, is TPR data check enough? The challenge everyone will have in the future is that the regulator’s data quality checks can only really check for the presence of data, not its accuracy. After all, that is why we need to periodically trace deferred members – they move and don’t tell us. Members get married or divorced, or otherwise change names and don’t tell us. Data quality erodes, no matter how good your systems are.
If you do not have 100% accuracy on a member’s name, address, NI Number and date of birth you are not going to be able to do even the first part of a dashboard request: you’re not going to be able to match a request with a record in your system.
The implications of a false match are clear: it would be a data breach and all the associated nastiness that goes along with that. BUT a false negative can also be damaging. i.e. if you do not return a match when you should have, the member will be making financial decisions on inaccurate data, or your market reputation will be damaged. Even worse, the Regulators will need to be refining or developing their record-keeping requirements in response to dashboards. You will be in their crosshairs, much as those who have failed to provide their data quality scores in their scheme returns have been this year.
There are other solutions though. One approach could be to compare the data you hold to trusted third party data sets. A bit like doing a credit reference check, but for the currency of your data. This is just one example of how we are thinking of changing the mindset, and tooling, of data stewardship.
Whilst MaPS’ first cut of data items suggests these are the items already required in a benefit statement, that isn’t true in practice for everyone. Plenty of schemes do not routinely produce all of these values in their benefit statements. Nor will automation of these results ever achieve 100% coverage. Even where scheme rules have been fully automated, the transient nature of data means some calculations will fail. And some schemes may never have automated benefit statements for deferred members at all, preferring to service the low level of demand with manual or semi-automated work-arounds.
DC doesn’t get an easy ride either. Many providers in the market cannot produce an SMPI calculation on demand and will need to rely on presenting the last calculation they performed, rather than a current projection.
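That fallback pattern, serve the live figure where you can, and otherwise the last stored calculation flagged with its date, can be sketched as follows. All names and the storage shape are hypothetical:

```python
# Hypothetical sketch of a projection fallback: prefer an on-demand
# calculation, and fall back to the last stored result, dated, when the
# live routine is unavailable or fails. Names are illustrative only.
from datetime import date


def projection_for(member_id, calculate_live, stored):
    """Return (value, as_at_date) for a member's projection.

    calculate_live: callable taking member_id, may raise if the
        calculation is not automated for this member.
    stored: dict mapping member_id -> (value, as_at_date) of the
        last successful calculation.
    """
    try:
        return calculate_live(member_id), date.today()
    except Exception:
        # Fall back to the stored result, keeping its original date so
        # the dashboard can show how current the figure really is.
        return stored[member_id]


def failing_calc(member_id):
    raise RuntimeError("no automated routine for this member")


stored = {"M001": (12_500.0, date(2023, 4, 5))}
print(projection_for("M001", failing_calc, stored))
```

The important design point is that the stored value carries its own as-at date, so a dashboard can be honest with the member about how fresh the figure is.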
Leading on from calculations, how good are your processes? If you do produce benefit statements manually for any of your members, do you store the results so that they could be shared with a dashboard?
Do you need to carry out some of your periodic data quality processes a bit more frequently?
We cannot and must not divert our attention away from looking after our people and delivering our critical services right now. But things will quieten down on that front. With recovery comes the opportunity to reassess priorities, and to do that we must look early at what’s next.
Now is the time to engage with your administrators and your software providers. Now is the time to take stock of how ready you really are and assess the size of your challenges. You do not need to wait for the requirements to be nailed down. We already know enough to progress our readiness. If we wait for certainty, we increase the certainty of being late.
Whilst MaPS are not consulting on this just yet, I’d be interested to hear your thoughts on dashboards and any ideas you might have. In turn, I can talk about some of the things we are doing to plan for dashboard readiness.