The least squares Monte Carlo method has become a standard approach in the insurance and financial industries for evaluating a company's exposure to market risk. However, the non-linear regression of simulated responses on risk factors poses a challenge in this procedure. This article presents a novel approach to address this issue, based on an a priori segmentation of responses. Using a K-means algorithm, we identify clusters of responses that are then locally regressed on their corresponding risk factors. The global regression function is obtained by combining the local models with logistic regression. We demonstrate the effectiveness of the proposed local least squares Monte Carlo method through two case studies. The first investigates butterfly and bull trap options within a Heston stochastic volatility model, while the second examines the exposure to risks in a participating life insurance setting.
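As a rough illustration of the procedure, the Python sketch below applies the three steps to synthetic one-factor data. The data, the number of clusters, and the choice of plain linear local models are all illustrative assumptions; the article's basis functions and hyper-parameters are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for simulated responses: one risk factor X and a
# kinked, noisy response Y (a non-linear relationship, as in nested MC).
n = 5000
X = rng.normal(size=(n, 1))
Y = np.maximum(X[:, 0] - 0.5, 0.0) + 0.1 * rng.normal(size=n)

# Step 1: a priori segmentation of the responses with K-means.
k = 3
labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Y.reshape(-1, 1))

# Step 2: local least squares regressions, one model per cluster.
local_models = [LinearRegression().fit(X[labels == j], Y[labels == j])
                for j in range(k)]

# Step 3: combine the local models with logistic regression; the class
# probabilities P(cluster j | x) act as mixture weights.
clf = LogisticRegression(max_iter=1000).fit(X, labels)

def global_proxy(x):
    """Global regression function: sum_j P(j | x) * f_j(x)."""
    probs = clf.predict_proba(x)                              # (m, k)
    preds = np.column_stack([local_models[j].predict(x)
                             for j in clf.classes_])          # (m, k)
    return (probs * preds).sum(axis=1)

print(global_proxy(np.linspace(-2.0, 2.0, 5).reshape(-1, 1)))
```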
This article proposes a continuous-time mortality model based on calendar years. Mortality rates belong to a mean-reverting random field indexed by time and age. To capture the improvement of life expectancies, the reversion level of mortality rates is the product of a deterministic function of age and a decreasing jump-diffusion process that drives the evolution of longevity. We provide a general closed-form expression for survival probabilities and specialize it to the case where the mean-reversion level of mortality rates is proportional to a Gompertz–Makeham law. We develop an econometric estimation method and validate the model on the Belgian population.
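The closed-form survival probabilities are not reproduced here, but the flavour of the dynamics can be sketched by simulation. In the toy Python script below, the reversion level is alpha(x) * Y_t, with alpha(x) = a + b * c**x a Gompertz–Makeham law and Y_t a decreasing jump-diffusion; the SDE specification and all parameter values are illustrative assumptions, not the article's estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (hypothetical, not the fitted Belgian values).
a_gm, b_gm, c_gm = 1e-4, 2e-5, 1.09    # Gompertz-Makeham alpha(x) = a + b*c**x
kappa, sigma_mu = 0.5, 0.005           # mean-reversion speed / vol of mu
drift_Y, sigma_Y = -0.01, 0.005        # decreasing drift / vol of longevity factor
lam, jump_mean = 0.1, -0.05            # jump intensity and (negative) jump size

def survival_prob(x0=65.0, T=20.0, n_steps=800, n_paths=10_000):
    """Monte Carlo estimate of E[exp(-int_0^T mu(t, x0 + t) dt)] for the toy
    dynamics: Y is a decreasing jump-diffusion and mu mean-reverts to
    the level alpha(x) * Y."""
    dt = T / n_steps
    Y = np.ones(n_paths)
    mu = (a_gm + b_gm * c_gm ** x0) * np.ones(n_paths)
    integral = np.zeros(n_paths)
    for i in range(n_steps):
        x = x0 + i * dt
        target = (a_gm + b_gm * c_gm ** x) * Y    # reversion level alpha(x)*Y_t
        jumps = rng.poisson(lam * dt, n_paths) * jump_mean
        Y *= np.exp((drift_Y - 0.5 * sigma_Y ** 2) * dt
                    + sigma_Y * np.sqrt(dt) * rng.normal(size=n_paths) + jumps)
        mu += kappa * (target - mu) * dt + sigma_mu * np.sqrt(dt) * rng.normal(size=n_paths)
        mu = np.maximum(mu, 0.0)                   # keep intensities non-negative
        integral += mu * dt
    return np.exp(-integral).mean()

print(f"toy 20-year survival probability at age 65: {survival_prob():.3f}")
```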
Wavelet theory is known to be a powerful tool for compressing and processing time series or images. It consists of projecting a signal on an orthonormal basis of functions chosen to provide a sparse representation of the data. The first part of this article focuses on smoothing mortality curves by wavelet shrinkage. A chi-square test and a penalized likelihood approach are applied to determine the optimal degree of smoothing. The second part of this article is devoted to mortality forecasting. Wavelet coefficients exhibit clear trends for the Belgian population from 1965 to 2015; they are easy to forecast, which yields predicted future mortality rates. The wavelet-based approach is then compared with some popular actuarial models of Lee–Carter type fitted to the Belgian, UK, and US populations. The wavelet model outperforms all of them.
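The smoothing step is easy to sketch with PyWavelets. In the snippet below, the mortality curve is synthetic, and the universal soft threshold stands in for the chi-square and penalized likelihood rules used in the article to pick the degree of smoothing.

```python
import numpy as np
import pywt

rng = np.random.default_rng(2)

# Synthetic noisy log-mortality curve over ages 0..99 (a Gompertz-like
# trend plus noise, standing in for an observed curve).
ages = np.arange(100)
log_mx = -9.0 + 0.085 * ages + 0.15 * rng.normal(size=ages.size)

# Project the curve onto an orthonormal wavelet basis (Daubechies-4 here).
coeffs = pywt.wavedec(log_mx, 'db4', mode='periodization')

# Shrinkage: soft-threshold the detail coefficients. The universal
# threshold below is a common default, not the article's selection rule.
sigma = np.median(np.abs(coeffs[-1])) / 0.6745           # robust noise estimate
thresh = sigma * np.sqrt(2.0 * np.log(log_mx.size))
coeffs[1:] = [pywt.threshold(c, thresh, mode='soft') for c in coeffs[1:]]

smoothed = pywt.waverec(coeffs, 'db4', mode='periodization')[:ages.size]
print(np.round(smoothed[:5], 3))
```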
This article proposes a neural-network approach to predict and simulate human mortality rates. This semi-parametric model can detect and replicate non-linearities observed in the evolution of log-forces of mortality. The method proceeds in two steps. In the first step, a neural-network-based generalization of principal component analysis summarizes the information carried by the surface of log-mortality rates in a small number of latent factors. In the second step, these latent factors are forecast with an econometric model. The term structure of log-forces of mortality is then reconstructed by an inverse transformation. The neural analyzer is calibrated to French, UK, and US mortality rates over the period 1946–2000 and validated with data from 2001 to 2014. Numerical experiments reveal that the neural approach has excellent predictive power compared to the Lee–Carter model, with and without cohort effects.
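One common reading of a neural-network-based generalization of principal component analysis is an autoencoder whose bottleneck activations serve as the latent factors; the Keras sketch below takes that reading, with a synthetic mortality surface and a random walk with drift standing in for the article's econometric forecasting model. The architecture and all parameters are illustrative assumptions.

```python
import numpy as np
import tensorflow as tf

rng = np.random.default_rng(3)

# Synthetic stand-in for a log-mortality surface: years x ages.
n_years, n_ages, n_latent = 70, 90, 3
ages, years = np.arange(n_ages), np.arange(n_years)
surface = ((-9.0 + 0.08 * ages)[None, :] - 0.015 * years[:, None]
           + 0.05 * rng.normal(size=(n_years, n_ages)))

# Step 1: non-linear PCA via a small autoencoder; the bottleneck plays
# the role of the latent factors.
enc1 = tf.keras.layers.Dense(16, activation='tanh')
enc2 = tf.keras.layers.Dense(n_latent, activation='tanh')
dec1 = tf.keras.layers.Dense(16, activation='tanh')
dec2 = tf.keras.layers.Dense(n_ages)

inp = tf.keras.Input(shape=(n_ages,))
z = enc2(enc1(inp))
autoencoder = tf.keras.Model(inp, dec2(dec1(z)))
autoencoder.compile(optimizer='adam', loss='mse')
autoencoder.fit(surface, surface, epochs=300, verbose=0)

# Step 2: forecast each latent factor with a simple econometric model
# (random walk with drift, in place of the article's choice).
encoder = tf.keras.Model(inp, z)
factors = encoder.predict(surface, verbose=0)             # (n_years, n_latent)
drift = np.diff(factors, axis=0).mean(axis=0)
future = factors[-1] + drift * np.arange(1, 15)[:, None]  # 14-year horizon

# Step 3: inverse transformation of the latent forecasts back to the
# term structure of log-mortality.
z_in = tf.keras.Input(shape=(n_latent,))
decoder = tf.keras.Model(z_in, dec2(dec1(z_in)))
print(decoder.predict(future, verbose=0).shape)           # (14, n_ages)
```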
In this paper, we address the issue of determining the optimal contribution rate of a defined benefit pension fund. The affiliate's mortality is modelled by a jump process, and the benefits paid at retirement are a function of the evolution of future salaries. Assets of the fund are invested in cash, stocks, and a rolling bond, with interest rates driven by a Vasicek model. The objective is to minimize both the quadratic spread between the contribution rate and the normal cost, and the quadratic spread between the terminal wealth and the mathematical reserve required to cover benefits. The optimization is performed under a budget constraint that guarantees actuarial equilibrium between current assets and future contributions and benefits. The resolution method is based on the Cox–Huang approach and on dynamic programming.
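The interest-rate ingredient is the easiest to isolate. The Python sketch below simulates Vasicek short-rate paths via the exact Gaussian transition and uses them to estimate stochastic discount factors, of the kind needed to value future contributions and benefits in the budget constraint; all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(4)

# Vasicek dynamics dr_t = a*(b - r_t) dt + sigma dW_t (illustrative values).
a, b, sigma, r0 = 0.3, 0.03, 0.01, 0.02
T, n_steps, n_paths = 30.0, 360, 10_000
dt = T / n_steps

# Exact Gaussian transition of the Vasicek process (no Euler bias).
decay = np.exp(-a * dt)
std = sigma * np.sqrt((1.0 - decay ** 2) / (2.0 * a))
rates = np.empty((n_steps + 1, n_paths))
rates[0] = r0
for i in range(n_steps):
    rates[i + 1] = b + (rates[i] - b) * decay + std * rng.normal(size=n_paths)

# Stochastic discount factors exp(-int_0^T r_t dt), approximated on the grid.
discount = np.exp(-rates[:-1].sum(axis=0) * dt)
print(f"mean 30-year discount factor: {discount.mean():.4f}")
```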
This paper addresses a problem most retired individuals face: why, and in what proportion, should they invest in a life annuity to maximize the utility of their future consumption or of a bequest? The market considered in this work consists of three assets: a life annuity, a risky asset, and a cash account. As this problem does not admit a suitable explicit solution, it is solved numerically by the Markov chain approximation method developed by Kushner and Dupuis. Without a bequest motive, we observe that the optimal consumption plan is divided into two periods and that the optimal asset allocation should include the risky asset. Next, the influence of a bequest motive on consumption and investment patterns is examined. We demonstrate that even with a bequest motive, pensioners should allocate part of their wealth to the purchase of life annuities.
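The full problem needs the Kushner–Dupuis machinery, but the backward-induction idea behind it can be shown on a toy: a finite-horizon consumption problem on a wealth grid with a two-point risky return and no annuity, solved by dynamic programming. Everything below (utility, grids, returns) is an illustrative assumption, far simpler than the article's model.

```python
import numpy as np

# Toy finite-horizon consumption problem solved by backward induction on a
# wealth grid -- a much-simplified cousin of the Kushner-Dupuis scheme.
gamma, beta, T = 2.0, 0.96, 20            # CRRA coefficient, discount, horizon
r_grid = np.array([-0.05, 0.10])          # two-point risky return distribution
p_grid = np.array([0.5, 0.5])

w = np.linspace(0.1, 10.0, 200)           # wealth grid
c_frac = np.linspace(0.01, 0.99, 50)      # consumption as a fraction of wealth

def u(c):
    return c ** (1.0 - gamma) / (1.0 - gamma)

V = u(w)                                  # terminal value: consume remaining wealth
policy = np.zeros((T, w.size))
for t in reversed(range(T)):
    V_new = np.full(w.size, -np.inf)
    for f in c_frac:
        c = f * w
        w_next = (w - c)[None, :] * (1.0 + r_grid)[:, None]   # (2, n_w)
        cont = np.interp(w_next, w, V)    # interpolate V on the grid (clamped at ends)
        cand = u(c) + beta * (p_grid @ cont)
        better = cand > V_new
        V_new = np.where(better, cand, V_new)
        policy[t] = np.where(better, f, policy[t])
    V = V_new

print("t=0 consumption fraction at lowest/highest wealth:",
      policy[0][0], policy[0][-1])
```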