The Evolution of Actuarial Modelling in Life Insurance

Author: Valerie Du Preez

Our actuarial forefathers made a breakthrough discovery. By studying data, they recognised patterns and determined that the past could be used to predict the future. Today, actuarial modelling is based on a similar concept.

Introducing Actuarial Tables

In the 14th century, the Black Death was believed to have wiped out a third of the European population. In order to measure the risk of future outbreaks, the City of London started recording deaths and produced regular statistics of mortality. In 1611 this responsibility for recording the number of deaths was handed over to the Worshipful Company of Parish Clerks, because churches organised funerals and were therefore able to track deaths.

In 1662, John Graunt published a book titled Natural and Political Observations Made upon the Bills of Mortality. Essentially, he converted raw data from the Bills of Mortality into the Life Table, a cornerstone of actuarial science. The Life Table showed that there were predictable patterns of longevity and mortality in a group of people of similar ages, despite the uncertainty of the date of death of any one individual.

Later, in 1693, Edmond Halley demonstrated how these ‘predictable patterns’ could be used to calculate the amount each person in a group should, on average, contribute to a common fund in order to cover the financial loss caused by a person’s death.

The following analysis process was developed and became familiar to future actuaries:

  1. Collect Data (Bills of Mortality)

At first, only the number of deaths was recorded; then the person’s age was added, and finally the cause of death was included in the bills. As more data became available, more insights were gleaned.

  2. Construct a Model (Life Table)

A basic Life Table was constructed to calculate life expectancy. It consisted mainly of the following columns (a short worked sketch follows this list):

  • Age - input to the model
  • Number of people alive at this age - input to the model
  • Number of people who died at this age - input to the model
  • Probability of dying at this age - calculated based on the number of people who died at this age divided by the number of people alive at this age
  • Number of years the people alive at this age are expected to live - calculated by summing, over all later ages, the probability of still being alive at that age. In some cases, adjustments are made for expected mortality improvements.
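
As a minimal illustration of these columns (a sketch using made-up numbers of lives, not any published table), the probability of dying and the life expectancy at the youngest age can be computed in a few lines of Python:

```python
# Minimal life-table sketch with illustrative (made-up) numbers of lives.
ages   = [70, 71, 72, 73, 74, 75]
lives  = [1000, 940, 870, 790, 700, 600]   # number of people alive at each age
deaths = [60, 70, 80, 90, 100, 600]        # number dying at each age (everyone dies at the last age)

# Probability of dying at each age: deaths divided by lives.
qx = [d / l for d, l in zip(deaths, lives)]

# Probability of surviving each age.
px = [1 - q for q in qx]

# Life expectancy at the first age: sum, over all later ages, of the
# probability of still being alive, i.e. the cumulative products of px.
expectancy = 0.0
still_alive = 1.0
for p in px:
    still_alive *= p
    expectancy += still_alive

print("Probabilities of dying:", [round(q, 3) for q in qx])
print("Life expectancy at age", ages[0], ":", round(expectancy, 2), "years")
```
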
  3. Develop a Product (Life Insurance)

Two essential products based on mortality were developed to protect customers from risk (a simple pricing sketch follows this list):

  • For a given set of premiums, a benefit is paid if the policyholder survives past a certain date in the future.
  • For a given set of premiums, a benefit is paid in the case of death before a certain date.
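
To make the link between the Life Table and these products concrete, here is a hedged sketch (with made-up one-year death probabilities consistent with the table above, an assumed 3% interest rate and a benefit of 100) of the expected present value of each benefit, which is the starting point for setting premiums:

```python
# Illustrative expected present values for the two product types above.
# Made-up one-year death probabilities for five ages, an assumed 3%
# interest rate and a benefit of 100; all figures are for illustration only.
qx = [0.060, 0.074, 0.092, 0.114, 0.143]   # probability of dying in each year
px = [1 - q for q in qx]                   # probability of surviving each year
v = 1 / 1.03                               # discount factor for one year at 3%
benefit = 100
term = len(qx)

# First product: benefit paid only if the life survives the full term.
prob_survive_term = 1.0
for p in px:
    prob_survive_term *= p
survival_benefit_epv = benefit * prob_survive_term * v ** term

# Second product: benefit paid at the end of the year of death,
# if death occurs before the end of the term.
death_benefit_epv = 0.0
in_force = 1.0   # probability of still being alive at the start of each year
for k in range(term):
    death_benefit_epv += benefit * in_force * qx[k] * v ** (k + 1)
    in_force *= px[k]

print("EPV of survival benefit:", round(survival_benefit_epv, 2))
print("EPV of death benefit:", round(death_benefit_epv, 2))
```
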

In time, a fourth step would be added:

  4. Monitor the Experience

To refine future predictions, a check is done to compare actual death or longevity experience with that expected based on the model constructed above.

The primary check is to compare actual results with expected results. It is very unlikely that the amounts will be exactly the same, but if the difference is too great, an investigation needs to be carried out to understand the drivers of the variance. Actuaries use tests of statistical significance to determine whether the difference is caused by random fluctuation or by an underlying cause.
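
As a rough illustration of such an actual-versus-expected check (using made-up figures and a simple normal approximation to a Poisson death count; a real investigation would use more careful methods), the test might look like this:

```python
# Rough actual-vs-expected check on death counts (made-up figures).
# Under a simple Poisson assumption, the variance of the death count
# equals its expected value, so a normal approximation gives a z-score.
import math

expected_deaths = 480.0   # deaths predicted by the life table model
actual_deaths = 531       # deaths actually observed in the period

ae_ratio = actual_deaths / expected_deaths
z = (actual_deaths - expected_deaths) / math.sqrt(expected_deaths)

print(f"A/E ratio: {ae_ratio:.2f}, z-score: {z:.2f}")
if abs(z) > 1.96:   # roughly the 5% two-sided significance level
    print("Difference looks larger than random fluctuation - investigate.")
else:
    print("Difference is consistent with random fluctuation.")
```
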

Reference: M Greenwood, “The First Life Table” 1938

Today, life insurance actuarial modelling follows a similar analysis process.

Embracing Spreadsheets

The early-stage Life Tables were fundamentally analogue spreadsheets. Spreadsheet software was originally developed for accounting tasks, but it did not take long for actuaries to adopt it. Spreadsheets store and analyse data in tabular form.

The main advantage of using the computer spreadsheet software over the paper-based Life Table was speed. Not only did the program do the probability of death and life expectancy calculations in a fraction of the time, but the columns contained formulae so that if the data was changed, the entire table could be automatically updated, instead of having to re-create the entire Life Table by hand. Spreadsheet software quickly advanced and introduced new tools to assist with data analysis and visualisation such as graphing tools, pivot tables and programming of macros.

The actuarial syllabus now requires all new students to be able to work with spreadsheets, and spreadsheets remain a commonly used tool in most actuarial teams.

Reference: D.J Power, “A Brief History of Spreadsheets” 2004

Evolving to Actuarial Modelling Software

Generic spreadsheet software gets the actuarial modelling job done; however, for intensive computational workloads, speed and performance may suffer. In addition, given the control and governance required for certain calculations, additional frameworks have to be applied to ensure quality and accuracy when spreadsheets are used and adjusted.

In response to some of these imperfections, purpose-built financial modelling platforms were created, streamlined to do more calculations in less time. This gives insurers quicker access to results while reducing processing costs.

Actuarial modelling software has more built-in financial functions compared to generic spreadsheet software. The actuarial modelling software also tends to be more structured with dedicated spaces for input, parameters, calculations and output; as opposed to generic spreadsheets that start as blank tables and require time and effort to create a template that mimics that of actuarial modelling software. Some actuarial software also has pre-developed libraries and shortcuts to simulations, thus making the modelling process quicker than if the same model was developed in a spreadsheet.

With the advent of automation and cloud computing, actuarial modelling platforms are constantly being upgraded, and new ones are being released. Actuarial software has also evolved with the added skills of data scientists and IT specialists playing a role in updating actuarial software for the latest trends and solutions.

Embedded audit trails in financial modelling platforms keep a history of changes and assist with diagnosis when a model returns unexpected results. With spreadsheets, there was always a risk that a user might make a structural change to one of the cells by mistake; with a detailed audit trail, a misbehaving model requires only a quick investigation to find the cause.

Modern modelling platforms typically have multiple working environments, including a development environment for editing model functionality, a test environment for release management, and a production environment for running and viewing results. This division is an important risk management procedure, as it reduces the probability of an unknown and unauthorised edit in a live production cycle, which could have a systemic impact on the entire company.

Automatic documentation features allow people other than the model’s creator to work on and understand the model. Previously, documenting the model, its parameters and its validation was a strenuous task that was often neglected when time was tight; today, actuarial software can generate documentation automatically, making it far easier for actuaries to understand models developed by others.

Current Generation Modelling Platforms

The current generation of modelling platforms is also opening the door to integration with legacy systems and with modern operational techniques such as workflow automation and Application Programming Interfaces (APIs), making these platforms the potential backbone of enterprises.

By using APIs, different software packages become building blocks that actuaries can piece together to create models. Data can be stored in databases, for example, then called into the actuarial modelling software for model execution, and the results sent on to spreadsheets or visualisation software for further calculation or graphical analysis respectively.
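
As a hedged sketch of that kind of pipeline (the database file, table and column names below are hypothetical, and a simple made-up premium formula stands in for the actuarial model), the building blocks might be glued together like this:

```python
# Sketch of piecing building blocks together: read policy data from a
# database, run a (stand-in) model calculation, and write the results to a
# CSV file that a spreadsheet or visualisation tool can pick up.
# The database, table and column names here are purely illustrative.
import csv
import sqlite3

conn = sqlite3.connect("policies.db")          # hypothetical database file
rows = conn.execute(
    "SELECT policy_id, age, sum_assured FROM policies"
).fetchall()
conn.close()

def model_premium(age, sum_assured):
    """Stand-in for the actuarial model called via the platform's API."""
    return round(sum_assured * (0.001 + 0.0001 * age), 2)

results = [
    {"policy_id": pid, "premium": model_premium(age, sa)}
    for pid, age, sa in rows
]

with open("model_output.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["policy_id", "premium"])
    writer.writeheader()
    writer.writerows(results)
```
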

These features also open the door to task automation and workflow orchestration to optimise processes.

In addition, the power of cloud computing gives actuaries access to, and control over, multiple powerful offsite machines, so that critical models can be run in a fraction of the time and potentially scheduled at off-peak hours.

Today, actuarial modelling software, together with powerful add-ons, gives actuaries the tools to perform faster, more controlled and more efficient modelling tasks with more data. This applies not only to the traditional areas of pricing, reserving, capital management, policy administration and sales forecasting, but to all areas of a business. Current actuarial modelling software is flexible enough to be used in many different functions and is designed to help actuaries do more with less effort. This is helpful when there is so much pressure to meet additional regulatory, statutory and internal analysis and reporting requirements.

Are we in the era of actuarial data science modelling?


A huge increase in data generation, data capture and data storage, combined with significantly increased computing power, is providing insurers with a unique opportunity to re-evaluate the value that their data can provide and the technologies available to do so.

Enabling actuaries to embrace modern-day data science tools and to work closely with data scientists could give insurers strategic advantages in the further development of actuarial modelling software.

Looking forward, the actuary will continue to evaluate key sources of data and will need to find ways to incorporate data science that uses state-of-the-art machine learning and data technologies together with the actuary’s business insights. We need to refresh our methods and make use of emerging technological advances.

Some are turning to programming languages such as Julia, Python and R, among others. With the rise of open-source execution environments such as computational notebooks, programming is becoming more accessible and easier to use.

Computational notebooks provide a modern-day coding environment in which users can perform data cleaning, data analysis, statistical modelling, numerical simulation and data visualisation. Some of these applications provide a free, open-source, interactive web tool for combining code, text and data analysis resources in a single development location.

This provides an interesting alternative for actuaries to execute large volumes of statistical calculations and view the results with the latest data visualisation techniques.
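
For example, a few notebook cells of Python (using widely available open-source libraries; the flat mortality assumption below is made up purely for illustration) could simulate future lifetimes for a group of lives and plot the results:

```python
# Notebook-style sketch: simulate future lifetimes for a group of lives
# under a made-up flat mortality assumption, then summarise and plot.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=42)
n_lives = 10_000
q = 0.02                 # illustrative flat annual probability of death
max_years = 120

# Number of complete years survived, capped at max_years.
lifetimes = np.minimum(rng.geometric(q, size=n_lives) - 1, max_years)

print(f"Simulated mean future lifetime: {lifetimes.mean():.1f} years")

plt.hist(lifetimes, bins=40)
plt.xlabel("Future lifetime (complete years)")
plt.ylabel("Number of simulated lives")
plt.title("Simulated future lifetimes under a flat mortality assumption")
plt.show()
```
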

In the future, the flexibility of technological tools will give actuaries opportunities to get creative. Machine learning could even advance to such a state that it is not only able to read and understand new regulations but could potentially also be able to translate the old results to the new regulatory requirements.

At the end of the day, the steps in the actuarial process from the original days of the Life Table haven’t changed. We still need to collect data, build models, develop products and monitor the results, within our professional framework. However, with the insurance environment becoming more complicated, the demand is growing for advanced solutions with features that unlock new opportunities and allow actuaries who embrace digital advancements to do more in less time.

Michael Jordan, FASSA, Dupro Advisory, 2019

Valerie du Preez, FIA, Dupro Advisory, 2019
