GMP Equalisation: The Increasing Importance Of Data


05 August 2020

A recent Professional Pensions (PP) webinar looked at the data challenges of GMP equalisation

The latest PP webinar – held in association with EQ (Equiniti) – looks at guidance from the cross-industry GMP Equalisation Working Group on the importance of data – asking what data schemes need for GMP equalisation; the data demands, advantages and disadvantages of various calculation solutions; and assesses the challenges relating to historic data. 

The write-up below features some of the questions raised during the discussion – to listen to the webinar in full, visit: bit.ly/3hZLHIR

Participants

  • Steve Nicholson Operations technical director, EQ. Steve has spent the last five years focusing on GMP-related projects and has overseen more than 100 reconciliations and equalisation projects.
  • Geraldine Brassett Chair of the GMP Equalisation Working Group, PASA

You have spoken about a shorter and a longer road to equalisation – to what extent is it acceptable to make assumptions and simplifications to get the job done?

Steve Nicholson: Actually when you start looking hard at the GMP equalisation adjustment and calculate it in different ways – through using either full reconstruction or some shortcuts to sidestep the data issue – for the vast majority of cases, you get the same answer within a few pounds. As long as you’ve segmented your population and identified the cases where it might not hold, for example in GMP-only cases, actually these aren’t really assumptions and mathematically you get the same answer for the vast majority. It’s not as big a deal as people make out.

Are there any time limits for completion of these exercises?

Geraldine Brassett: There are no time limits prescribed for the completion of GMP equalisation exercises. Saying that, when you look at the number of schemes that will be going through GMP equalisation, doing the planning and effectively booking a slot might be quite important to start thinking about, because there will be demand on the industry and obviously there are only finite resources to meet that demand. Indeed, one of the reasons we are increasingly seeing schemes thinking about whether they want to do equalisation and rectification as one exercise or, because there are still some unresolved issues around GMP conversion, whether they want to split them, is because there is no timescale for when they have to be finished.

How should schemes involve the employer with regard to GMP equalisation?

Geraldine Brassett: The choice of methodology dictates, in certain circumstances, that you need to involve the employer anyway. From my experience, these types of project work better when the employer is involved and understands the decisions being made.

We’ve focused much of our discussion in this webinar on pensioners, but equalisation will also impact deferred members and, for a lot of defined benefit (DB) schemes, those deferred members might still be employees. As such, making the employer aware and, where necessary, an integral part of the project is really helpful, even if in certain circumstances there isn’t a requirement to do so.

What about missing data? And what happens if previous administrators aren’t able to provide the data needed?

Steve Nicholson: I would be surprised if you didn’t have enough data to undertake an accurate GMP equalisation calculation for most of your cases. It may be that you are considering approaching a previous administrator to help reconstruct the data for a small group of people, but I suspect such reconstruction work will be a thankless task – there are real capacity issues in the industry, so you’re better off thinking about a convenient assumption to close off that group. After all, we’re dealing with a small adjustment and it’s perfectly valid to use assumptions for small groups of people.

Should additional data work be done and budgeted for as part of GMP reconciliation to ensure schemes are in the best possible position for GMP equalisation?

Steve Nicholson: The populations are different. Only people with a GMP accrual after 1990 are in scope for equalisation; for rectification it’s only the people who have a change in their GMP.
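The population split Steve describes can be sketched as a simple filter. This is a minimal illustration only – the record fields, figures and tolerance are invented assumptions, not scheme data; the equalisation window shown is the 17 May 1990 to 5 April 1997 period during which unequal GMPs accrued:

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical member record; field names are illustrative only.
@dataclass
class Member:
    id: str
    gmp_accrual_start: date
    gmp_accrual_end: date
    recorded_gmp: float      # GMP currently held on the administration system
    reconciled_gmp: float    # GMP agreed with HMRC after reconciliation

# GMP equalisation concerns GMP accrued between 17 May 1990 and
# 5 April 1997, when GMP accrual ceased.
EQ_START, EQ_END = date(1990, 5, 17), date(1997, 4, 5)

def in_equalisation_scope(m: Member) -> bool:
    # In scope if the member's GMP accrual overlaps the equalisation window
    return m.gmp_accrual_start <= EQ_END and m.gmp_accrual_end >= EQ_START

def in_rectification_scope(m: Member, tolerance: float = 0.005) -> bool:
    # Rectification only affects members whose reconciled GMP differs
    # from the amount currently held on record
    return abs(m.reconciled_gmp - m.recorded_gmp) > tolerance

members = [
    Member("A1", date(1985, 1, 1), date(1989, 12, 31), 20.00, 20.00),  # neither
    Member("B2", date(1991, 6, 1), date(1996, 3, 31), 15.00, 15.00),   # equalisation only
    Member("C3", date(1988, 1, 1), date(1995, 12, 31), 18.00, 19.25),  # both
]

eq_scope = [m.id for m in members if in_equalisation_scope(m)]
rect_scope = [m.id for m in members if in_rectification_scope(m)]
print(eq_scope)    # ['B2', 'C3']
print(rect_scope)  # ['C3']
```

The two scopes overlap but are not the same, which is why data work budgeted for rectification does not automatically cover equalisation.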

If you’re doing data work, you’ve got to be really clear what the purpose is, because, as we’ve explained, you can actually get accurate adjustments using your current payroll for the majority. Having said that, the GMP reconciliation is the key data cleanse activity – so you start with the right GMP amounts. One thing we have found really useful when doing GMP reconciliation work is collecting and verifying the underlying contracting-out data that makes up the GMP being agreed (earnings, dates etc). This helps ensure some of the essential data items are consistent, such as termination date and transfer data (where post-1988). It also gives you the option of calculating post-1990 GMPs from first principles. Capturing other data, such as missing first-life links and whether the GMP is an “intended mismatch”, is also useful. So yes, the goalposts do move a bit for your GMP reconciliation if you look ahead to your calculation solution for GMP equalisation.

What about the administrative cost of undertaking GMP equalisation? Is it likely this will be high and disproportionate to the value of corrected pensions?

Geraldine Brassett: From my perspective, it’s about asking administrators for transparency over the budget. For the work we’ve been doing, we’ve segmented that cost so our schemes can see quite clearly how much of the price relates to the upfront work, how much relates to the calculations, what data is being used where and how that’s costed, and then how that flows through to the payments and the communications. So transparency of budget is very important, but so is being realistic that, at the outset, there are certain things you can’t cost.

Is the longer-road approach more suited to certain sorts of schemes, such as those close to buyout?

Steve Nicholson: The long road is all about full reconstruction – having to reproduce those pensioner and dependant calculations and then running two cashflows comparing what members actually got with what they should have got. That is expensive and it takes a lot longer than you think.
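The dual-cashflow comparison at the heart of the long road can be sketched very simply. The figures below are invented, and real methods (for example, the approaches set out in the Lloyds judgment) differ in how and when the comparison is struck, so treat this as illustrative only:

```python
def equalisation_shortfall(paid, comparator):
    """Year-by-year shortfall of actual payments against the comparator cashflow.

    'paid' is what the member actually received each year; 'comparator' is
    what an opposite-sex comparator would have received over the same years.
    This simple version sums each year's positive shortfall.
    """
    return sum(max(0.0, c - p) for p, c in zip(paid, comparator))

paid       = [1000.0, 1030.0, 1060.0]   # pension actually paid each year
comparator = [1000.0, 1045.0, 1080.0]   # reconstructed comparator cashflow

print(round(equalisation_shortfall(paid, comparator), 2))  # 35.0
```

The expensive part is not this comparison itself but reconstructing the two cashflows accurately, which is why the long road demands so much historic data.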

It is, however, possible there is a saving in doing it. If you’re looking at buying out, for instance, the data risk premium in a large deal can amount to a quite substantial sum of money. So if you compare the data risk premium with the cost of a full benefit audit – which is very much the long road – it will take you a while and it will have significant costs, but it might save money on the overall liability premium, because the insurer then knows there’s no comeback and they’ll get a sign-off from the relevant advisers, or from themselves. So it can make a lot of sense – the long road is absolutely a good place to be in some circumstances.

Is there any indication about the likely outcome of the second Lloyds judgment?

Geraldine Brassett: Unfortunately not, but that doesn’t surprise me. Obviously we have already started to consider a response, depending on the outcome and whether there is a need to revisit past cases. And I did wonder if that was behind the question about the provision of information from previous administrators, because that has the potential to be a significant amount of work and to require more liaison between administration providers in respect of people who have transferred their benefits. But I’m disappointed to say no, nothing to share that would give us any insight into what the outcome will be. If anyone knows, I’d be interested.

The anti-franking issue is hugely complex – is there any way to ignore this issue?

Steve Nicholson: That is perhaps a question for your legal adviser. But I’ve seen a lot of projects that have sidestepped the anti-franking issue because it affects such a small number of people. If you look at the PPF approach, for instance, they came up with an elegant solution that could cost as little as £10 per member to implement, using simplifying assumptions while still retaining good precision for the majority.

You have spoken about members who worked past age 60. What impact will there be on members who left or retired before age 60? 

If you’re looking at buying out, the data risk premium in a large deal can be quite substantial sums of money

— Steve Nicholson, EQ

Steve Nicholson: If a member left pensionable service before age 60, then there will be no later-earnings addition to consider. However, there may still be the step issue to consider – either to ensure the GMP is covered from GMP payment age, or to meet the requirements of the Pension Schemes Act 1993 if the member retired at or after normal retirement age but before GMP payment age. This is the complex part. One conundrum is how to square past house practice, which may vary, with what you are going to do in the GMP equalisation calculations to remove the GMP inequality.

What are your key takeaways following our discussion?

Steve Nicholson: My point on data is there are easy ways and hard ways to get to the answer. Use the easy ways when they’re good to use, which is most of the time. That means you can focus your finite data budget on the cases that matter and get more bang for your buck.

There is also a need to recognise that the project will be specific to your scheme. I hope we’ve illustrated in this webinar that there isn’t a one-size-fits-all solution and that taking input from your scheme advisers on getting your data ready is really important. One last point – please do read the Pensions Administration Standards Association’s guidance when it comes out; we would always welcome any feedback you have on it.

Written and published by Professional Pensions, July/August 2020.
