Chin, Stephanie, and Greg McNulty. 2023. “A Bayesian Model for Estimating Long Tailed Excess of Loss Reinsurance Loss Costs.” CAS E-Forum Summer (August).
Figures and Tables
  • Figure 1. Diagram of Hierarchical Bayesian Frequency Model
  • Table 1. Sample Frequency Model Data
  • Figure 2. Fitted Gamma to empirical distribution of $\hat{\lambda}^{prior}_{co[i]}$
  • Figure 3. Weibull cumulative development pattern with fixed shape and varying scale parameter
  • Figure 4. Weibull cumulative development pattern with fixed scale and varying shape parameter
  • Figure 5. Weibull parameter joint distribution resembles a Clayton copula
  • Figure 6. Fitted Clayton copula and Akaike information criterion rankings
  • Figure 7. Posterior Weibull parameters plotted against cedent data
  • Figure 8. Implied Frequency Credibility Comparison
  • Figure 9. Comparison of Weibull Parameter Joint Distributions
  • Figure 10. Bayesian frequency model posterior output and MCMC convergence plots
  • Figure 11. Weibull parameter joint distribution
  • Figure 12. Diagram of Hierarchical Bayesian Severity Model
  • Table 2. Pareto α Priors by Cedent
  • Figure 13. Fixed Pareto α development
  • Figure 14. Implied severity credibility for various observed claim counts
  • Figure 15. Bayesian severity model posterior output and MCMC convergence plots

Abstract

In this paper we describe a Bayesian model for excess of loss reinsurance pricing that has many advantages over existing methods. The model is currently used in production for multiple lines of business at one of the world’s largest reinsurers. It treats frequency and severity separately. In estimating ultimate frequency, the model analyzes nominal claim count data jointly against uncertain priors on the ultimate frequency and the development pattern, allowing more careful analysis of sparse claim count information and properly differentiating between triangulated and last-diagonal data. The severity model is pragmatic, yet accounts for development of the severity distribution and weighs the volume of data against prior distributions. The model is programmed in R and Stan, eliminating the need for extensive algebra and calculus and for restricting priors to conjugate distribution families. We compare this method with the more established Bühlmann-Straub credibility application to excess of loss pricing (for instance, in Cockroft) and with the more complex model given by Mildenhall, showing numerous advantages of our method.

Accepted: July 07, 2023 EDT
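
To make the frequency-model structure described in the abstract concrete, the sketch below is a minimal, illustrative Stan program, not the authors’ production specification: reported claim counts are modeled as Poisson draws whose mean is the product of an uncertain ultimate frequency (given a gamma prior) and a Weibull cumulative development pattern evaluated at each observation’s maturity. The data names (n_reported, age, exposure) and the priors on the Weibull parameters are assumptions made here for illustration, and the hierarchy across cedents described in the paper is omitted for brevity.

```stan
// Minimal single-cedent sketch of a Bayesian frequency model of the kind
// described in the abstract. Illustrative only: variable names and the
// Weibull-parameter priors are assumptions, not the paper's specification.
data {
  int<lower=1> N;                    // number of observations (e.g. accident years)
  array[N] int<lower=0> n_reported;  // nominal (reported-to-date) claim counts
  vector<lower=0>[N] age;            // development age of each observation
  vector<lower=0>[N] exposure;       // on-level exposure for each observation
  real<lower=0> lambda_prior_alpha;  // gamma prior on ultimate frequency
  real<lower=0> lambda_prior_beta;
}
parameters {
  real<lower=0> lambda;              // ultimate claim frequency per unit exposure
  real<lower=0> shape;               // Weibull development-pattern shape
  real<lower=0> scale;               // Weibull development-pattern scale
}
model {
  lambda ~ gamma(lambda_prior_alpha, lambda_prior_beta);
  shape ~ lognormal(0, 1);           // weakly informative priors (assumed)
  scale ~ lognormal(4, 1);
  for (i in 1:N) {
    // expected reported counts = ultimate frequency * exposure * % reported,
    // where % reported follows a Weibull CDF in development age
    real pct_reported = weibull_cdf(age[i] | shape, scale);
    n_reported[i] ~ poisson(lambda * exposure[i] * pct_reported);
  }
}
```

Fitting a program like this from R (for example with rstan or cmdstanr) produces joint posterior draws of the ultimate frequency and the Weibull development parameters, which is the sense in which the claim count data and the development pattern priors are analyzed together rather than sequentially.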