(February 14, 2013) — Have you altered how you measure the longevity risk in your pension fund in recent years? Chances are you are now receiving an incorrect measure, leading academics have claimed.
The pension industry has witnessed an explosion in the number of mortality models available in recent years due, in part, to an increased focus on longevity risk by actuaries and governments, Professor David Blake and Andrew Hunt, director and fellow of the Pensions Institute at Cass Business School, have reported in a paper published today.
However, updating and creating new models may not have improved the outcome for the end users, the authors assert in the paper entitled “A General Procedure for Constructing Mortality Models”.
“Despite having more terms than the older models, they still fail to capture a lot of the information present in the data,” the paper asserts. “Lacking a formal procedure for interrogating the data in order to establish what structure remains to be explained, modellers too often add new terms based on theoretical models or assumptions regarding the shape of the mortality curve rather than evidence.”
The authors said misinterpreted data would lead to incorrect and implausible forecasts for the end users – usually pension funds and life insurers. With this in mind, they created a “General Procedure” (GP), driven by forensic examination of the data, that could be used as a basis for building a mortality model from scratch.
“Through an iterative process, the GP identifies every significant demographic feature in the data in a sequence, beginning with the most important. For each demographic feature, we need to apply expert judgement to choose a particular parametric form to represent it. To do this, we need a ‘toolkit’ of suitable functions.”
Blake and Hunt assert that by following the GP, it is possible to construct mortality models with sufficient terms to capture accurately all the significant information present in the age, period, and cohort dimensions of the data.
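As a loose illustration of this kind of iterative, residual-driven model building – a sketch only, not the authors' actual procedure, and using a hypothetical rank-one (Lee-Carter-style) term where the GP would instead apply expert judgement to choose a parametric form from a toolkit – one might extract structure from a log-mortality surface one term at a time:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic log-mortality surface (ages x calendar years), loosely
# Gompertz-shaped in age with a steady period improvement trend.
ages = np.arange(40, 90)
years = np.arange(1980, 2010)
surface = (-9.0 + 0.09 * (ages[:, None] - 40)      # age effect
           - 0.015 * (years[None, :] - 1980))      # period effect
log_m = surface + rng.normal(0, 0.02, surface.shape)  # noise

def extract_next_term(residual):
    """One illustrative iteration: pull out the dominant remaining
    age/period structure as a rank-one bilinear component via SVD."""
    u, s, vt = np.linalg.svd(residual, full_matrices=False)
    term = s[0] * np.outer(u[:, 0], vt[0])
    share = s[0] ** 2 / (s ** 2).sum()  # share of remaining variance
    return term, share

# Begin with the most important structure, then work down the residual.
residual = log_m - log_m.mean()
for i in range(3):
    term, share = extract_next_term(residual)
    residual = residual - term
    print(f"term {i + 1}: explained {share:.1%} of remaining variance")
```

In a real application, each extracted component would be inspected and, as the authors describe, replaced by an appropriate parametric function and checked against biological and social evidence before moving to the next feature.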
The paper shows how each component of the data should and can be examined separately before finally being related to biological and social evidence to produce a result.
“It is not a ‘black box’ algorithm which can be deployed mechanically on various datasets, but rather requires a substantial investment of time to understand the underlying forces driving mortality within the population of interest and how these forces can be represented mathematically,” the authors said.
“Far from this being a disadvantage, we would argue that our approach accords perfectly with good model building practice, which seeks to move beyond a purely algorithmic approach in order to understand better the underlying structure of the data,” they concluded.
To read the entire paper, click here; for Professor Blake’s column on Longevity for aiCIO, click here.
See our feature on the leading academics in institutional investment in the next issue of aiCIO, published at the end of this month.