In depth: Pensions data – transparent standards

Getting data right is a cornerstone of good pensions – that’s why calls are growing for greater transparency around data standards. John Lappin reports

When data goes wrong in pensions, it can go very badly wrong. The case of Now: Pensions is instructive. It had to radically overhaul its administrative systems, was removed from The Pensions Regulator’s master trust assurance list and, in 2019, was sold to Cardano following severe data blunders.

As the Pensions Regulator noted at the time: “By April 2016, the pension contributions of almost one in three of the master trust’s members – an estimated £18 million affecting over 265,000 people – had not been collected.” Now had to move all of its members onto a purpose-built platform and rebuild the data records of more than 350,000 members. The episode also stirred the regulator into further action.

In autumn of last year, TPR announced a data crackdown for trustees, asking the boards of 400 schemes to carry out a data review given their failure to do so in the previous three years.

The trustees were required to report to TPR what proportion of their members they held accurate common and scheme-specific data for. Failure to do so risked an improvement notice for inadequate internal controls, and continued failure carried a fine of up to £5,000 for individuals or up to £50,000 in other cases.

A total of 1,200 schemes were contacted to remind them to carry out data reviews of both common and scheme-specific data every year.

Kristy Cotton, chair of the PASA Data Working Group and Deloitte expert partner for data, says there is a view that DC schemes are more ‘recent’ and thus should have good data.

She lists three big data challenges – the volume of data transferred from employers to DC pension providers, the increased mobility of members of DC schemes and the increased complexity of the options (e.g. income drawdown) available to them.

“Data needs to be actively managed, with an ongoing data management plan that includes periodic reviews and controls around the collection, transfer and retention of data. Poor data scores shouldn’t be something to be embarrassed about, as long as trustees and pension providers are actively taking steps to improve the quality of data held,” she says. “We need transparency across the DC market on the level of data quality and what ‘good data’ looks like, and advisers will need to work together to drive transparency market-wide.”

Matt Dodds, director with pension data specialists ITM, says: “Data shouldn’t be a one-off project. It is not a tick-box exercise that should be completed and forgotten about. Maintaining data accuracy needs to be part of business as usual to build a cycle of continuous improvement. This is the critical combination that impacts DC data quality.”

Experts warn that poor data can lead to members making the wrong choices or being guided on to the wrong path.

Cotton adds: “Complete and accurate data gives members and providers an understanding of the level and type of members’ pensions savings. It also presents information about members’ preferences, circumstances and decision-making criteria.

“Insufficient data can impact the decisions made by them, for example, through inappropriate communications or incorrect understanding of the options available.

“Default strategies, although useful and necessary, are not the most suitable for all members, and poor data can lead to the use of default strategies when other options may be more appropriate.”

Dodds says: “Data is the foundation of every scheme – the enabler of decision making and informing strategic direction. Data quality can impact an endless list of areas of scheme management. Admin is often affected by data – duplicate records that compound the issue of small pots and impact charges. That’s both charges applied to schemes by way of TPR levy and charges applied to members per pot – all of which can be higher than necessary as a result of poor data. But we mustn’t forget about the bearing data has on investments – putting someone on a glidepath at the wrong date as a result of inaccurate data can have costly repercussions.

“At the most fundamental level – the requirement to act in the best interest of members – quality data is needed to inform decisions. Analysis and insight based on inaccurate data will see propositions developed wide of the mark. Accurate data is needed to understand the membership of a scheme and facilitate the development of robust, member-centric propositions, especially around default investment strategies and decumulation.”

TPR beefed up the requirements around data in 2018, adding new questions to scheme returns and requiring schemes to score their data in terms of common and scheme-specific data.

The common data covers 11 pieces of information – a member’s surname and initials, NI number, date of birth, gender, joining date, target retirement date, membership status, date of last status change, address and postcode.
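
By way of illustration, the sketch below shows how a scheme might check the presence of those 11 items across its membership and express the result as a proportion. It is a minimal sketch only: the field names are hypothetical, and TPR’s published record-keeping guidance, not this code, defines how data scores should actually be calculated.

```python
# Illustrative presence check across the 11 common data items listed above.
# Field names are hypothetical; TPR's guidance defines the real scoring rules.
COMMON_DATA_FIELDS = [
    "surname", "initials", "ni_number", "date_of_birth", "gender",
    "joining_date", "target_retirement_date", "membership_status",
    "date_of_last_status_change", "address", "postcode",
]

def common_data_score(members: list[dict]) -> float:
    """Proportion of members for whom every common data item is populated."""
    if not members:
        return 0.0
    complete = sum(
        1 for member in members
        if all(str(member.get(field) or "").strip() for field in COMMON_DATA_FIELDS)
    )
    return complete / len(members)
```

A presence check of this kind says nothing about whether the data held is actually correct, which is the gap Dodds highlights below.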

Cotton says: “The TPR data score has a valid role in the industry, making pension schemes consider the completeness and accuracy of that data. However, scheme-specific data scores are also important, as are data management plans for pension schemes. There is industry-wide inconsistency on how the calculation of the scores is interpreted, and there are many schemes still not reporting data scores. Pension scheme data underpins all of the activity of a scheme, and data must stay high up on trustee agendas.”

Dodds says: “Reporting on the presence of data isn’t enough. Benchmarks should be reporting on accuracy of data, data gaps, improvement plans alongside metrics of improvement made to date, the use of insights to drive in-depth analysis and the use of spot checks on data accuracy. Data standards set and reported on in isolation just don’t work – a holistic approach to data and process improvement is needed to create and maintain quality.”

He adds that the concept behind TPR’s common data scores is sound, but the implementation needs to validate the accuracy of the data and that isn’t always straightforward.

“Let’s take National Insurance numbers – there isn’t an easy way to confirm whether an NI number is accurate or not, but there are key indicators like format (2 letters, 6 numbers, 1 letter), and if it is a temporary number (starts with TN). So reporting on common data scores is only a useful exercise if the report looks at accuracy and not just presence of data. It can be a critical measure to drive improvements and track progress.”
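
A minimal sketch of the kind of format check Dodds describes is below; the function name and return values are assumptions, and passing a format check does not prove an NI number is genuine or belongs to the right member.

```python
import re

# Format-level indicators only, as described above: two letters, six digits,
# one letter, with numbers starting "TN" flagged as temporary. HMRC applies
# further rules (e.g. some prefixes are never issued) that are not checked here.
NI_FORMAT = re.compile(r"^[A-Z]{2}[0-9]{6}[A-Z]$")

def check_ni_number(nino: str) -> str:
    """Classify an NI number as 'ok', 'temporary' or 'bad format'."""
    nino = nino.replace(" ", "").upper()
    if not NI_FORMAT.fullmatch(nino):
        return "bad format"
    if nino.startswith("TN"):
        return "temporary"
    return "ok"
```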

Cotton adds that good quality data can help reduce delays in the transfer market and is a critical step in any transfer deal.

“Ensuring scheme data is complete and accurate in advance can help companies and trustees exploit market opportunities and therefore facilitate a more efficient and fluid transfer market. Data cleansing and management should be on all pension schemes’ agendas, but those approaching or planning towards the transfer market should have data cleansing as a priority in their flight plan,” she says.

Corporate advisers are increasingly focused on the issue. LEBC Group director of public policy Kay Ingram says: “We take protecting client data very seriously and invest heavily in cyber security. We hold the Government-backed Cyber Essentials accreditation and use GCI for continuous internal vulnerability scanning. We also recently had our insurance providers undertake a security audit, and it was pleasing that they gave us a clean bill of health and commented that we had one of the highest scores for a business of our type.

“When dealing with providers and client data transfer we use our Intelliflo CRM, which offers a high level of security and enables us to share documents with clients and providers securely. We do not undertake specific audits of providers’ data protection systems but expect them to operate to the highest standards.”

How the dashboard needs data

The development of the pensions dashboard will, of course, be hugely reliant on data. The Money and Pensions Service has said it will publish a clear timetable by the end of the year.

The DWP’s Pension Schemes Bill impact assessment suggested schemes would have to invest in new software and IT architecture to be able to provide data to the dashboard. But the cost estimates varied significantly.

The DWP estimated one-off implementation costs of £200m to £580m over 10 years, with ongoing costs of £245m to £1.48bn over the same period.

In a recent blogpost, Richard Smith, head of industry liaison on the Pensions Dashboards Programme, considered some of the issues.

He wrote: “There will need to be a continual loop of user testing and refining the data standards: ensuring that users can understand the information that they’re being presented with on initial pensions dashboards, whilst also reflecting (at an appropriate level of detail) the key underlying complexities surrounding their pension entitlements.

“Being able to confidently digitally match individuals to all their pension entitlements is at the heart of the whole pensions dashboards endeavour.

“For a significant proportion of the pension entitlements they hold, many pension providers and schemes think they will be able to successfully match pensions against the individual’s National Insurance Number (NINo) and their date of birth (DOB), plus one other data item (such as the individual’s name).

“But there are known issues with the accuracy of some NINos received from employers or inherited from previous administrators. And when individuals move house, or change name, they can often fail to notify their pension provider to update their data.

“Unless schemes are able to positively match against all these items, they will be reluctant under data protection rules to return pension information for the individual to view on their chosen pensions dashboard.”
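
As an illustration of the matching rule Smith describes, the sketch below treats a scheme record as a match only when NI number, date of birth and at least one further item all line up. The field names and the choice of corroborating items are assumptions; in practice each scheme sets its own matching criteria.

```python
# Illustrative matching rule: NINo plus DOB plus at least one further item.
# Field names and corroborating items are assumptions for illustration only.
def matches(find_request: dict, scheme_record: dict) -> bool:
    """Return True only if NINo, date of birth and one other item all match."""
    if find_request.get("nino") != scheme_record.get("nino"):
        return False
    if find_request.get("date_of_birth") != scheme_record.get("date_of_birth"):
        return False
    for key in ("surname", "postcode"):  # corroborating items
        requested = (find_request.get(key) or "").strip().lower()
        held = (scheme_record.get(key) or "").strip().lower()
        if requested and requested == held:
            return True
    return False
```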

Cotton says: “Members having access to their own data is likely to highlight issues leading to a spike in queries and complaints about pension schemes which haven’t kept on top of their records. The first key step for pension schemes is to understand where on this broad spectrum they sit. They can then take steps to improve data quality as quickly as possible, as data issues can take a long time to fix. The differing circumstances of pension schemes across the industry make issuing guidance tricky.”

Discussing the dashboard, Dodds adds: “Data cleansing requirements will vary from scheme to scheme – for some there will be very little cleansing whilst others face a potentially monumental task. As a rule of thumb, as you’d expect, schemes set up since 2000 are likely to have many digitised processes and inbuilt validations so their data is in significantly better shape. But members aren’t stagnant and are moving house, changing jobs, getting married (or divorced) and changing contribution rates all the time, so data management needs to be an ongoing commitment.”
