
1999 CAS SEMINAR ON RATEMAKING
OPRYLAND HOTEL CONVENTION CENTER
MARCH 11-12, 1999

MIS-43: APPLICATIONS OF THE MIXED EXPONENTIAL DISTRIBUTION
CLIVE L. KEATINGE AND JOHN NOBLE
Insurance Services Office

Features of the Mixed Exponential Increased Limits Procedure:

• Ability to include Excess, Umbrella, and Deductible data.
• Ability to treat policy limit censorship without an a priori distribution assumption. Promotes clarity in evaluating various distribution fits.
• Ability to reflect differences in severity distributions by policy limit.
• Use of mixed exponential distributions instead of mixed Pareto distributions. Allows closer fits to the underlying data.
• Lack of constraints across increased limits tables.

The two main steps of the mixed exponential increased limits procedure are:

1. Constructing an empirical distribution
2. Fitting a mixed exponential distribution to an empirical distribution

Constructing an empirical distribution

The construction of an empirical distribution will be illustrated with sample data in the following format:

Accident      Settlement Lag
Year          1     2     3     4
  1           *     *     *     *
  2           *     *     *
  3           *     *
  4           *

• Each * represents the collection of all settled occurrences in that cell. Each cell contains all occurrences settling at that lag for that accident year. A calendar accident year occurrence settling between January 1 and December 31 of that same year is defined to have settled in Lag 1.

  Settlement Lag = Average Payment Year - Accident Year + 1

• As a first step, all settled occurrences are trended to the average accident date for which a loss distribution is desired, allowing us to combine data for all accident years within each lag. Later, distributions for each lag will be weighted together to implicitly account for development.
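As a small illustration of the lag assignment and trending step just described, here is a minimal sketch in Python. Only the lag formula (Average Payment Year - Accident Year + 1) comes from the slide; the 6% annual trend, the years, and the function names are hypothetical.

```python
# A minimal sketch of the settlement lag assignment and the severity trending
# step. The lag formula comes from the slide; the annual trend rate, the years,
# and the function names are hypothetical.

def settlement_lag(average_payment_year, accident_year):
    """Lag 1 means the occurrence settled in its accident year."""
    return average_payment_year - accident_year + 1

def trend_loss(loss, accident_year, target_accident_year, annual_trend=0.06):
    """Trend a settled occurrence to the average accident date for which
    the loss distribution is desired."""
    return loss * (1.0 + annual_trend) ** (target_accident_year - accident_year)

lag = settlement_lag(average_payment_year=1997, accident_year=1995)        # lag 3
trended_loss = trend_loss(50_000, accident_year=1995, target_accident_year=1999)
```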

Data from each lag must be treated separately because:

1. Later lags tend to have a greater percentage of large losses than earlier lags, since occurrences that take longer to settle are generally more severe than claims that settle quickly.
2. Each lag's data includes different groups of accident years, which likely have different exposure amounts. This will be addressed in the lag weighting procedure.

For each lag, we want to obtain the empirical survival function (survival probability) at a number of different dollar amounts.

Survival Function = S(x) = 1 - Cumulative Distribution Function = 1 - F(x)

The survival probabilities at each dollar amount will be derived ("built up") using a series of conditional survival probabilities (CSPs). This will allow us to more easily include:

1. Losses censored by policy limit
2. Excess, Umbrella and Deductible data
3. Composite-rated risks data

To obtain the empirical survival function, we will use a variation of the Kaplan-Meier Product-Limit Estimator. This has historically been used extensively in survival analysis. For further details see:

• Loss Models: From Data to Decisions, by Klugman, Panjer and Willmot
• Survival Analysis: Techniques for Censored and Truncated Data, by Klein and Moeschberger

Both of these texts will be used for CAS/SOA Exams 3 and 4 beginning next year.

Illustrative Example: Trended Data for Lag One

Occurrence    Occurrence    Attachment    Policy
ID Number     Size          Point         Limit      Comment
     1         5,…               0          …
     2           …               0          …
     3           …               0          …        Loss Censored
     4         5,000         7,500       15,000      Deductible Data
     5         5,…               0          …
     6           …               0          …
     7           …               0          …
     8         …,000        15,000       30,000      Excess Data
     9        15,…               0          …
    10           …               0          …
    11           …               0          …
    12         …,000        15,…          …,000      Excess Data

We will obtain the empirical survival function at the following dollar amounts:

  10,000     20,000     40,000

These survival probabilities will be derived by multiplying the relevant conditional survival probabilities (CSPs) for each dollar amount.

Two conditions must be met in each CSP calculation so as not to bias the size of loss distribution:

1. Use only those occurrences with policy limit plus attachment point greater than or equal to the upper bound. This avoids a downward severity bias: we exclude those occurrences whose policy limits preclude them from penetrating the upper bound, which accounts for policy limit censorship.

2. Use only those occurrences with attachment points less than or equal to the lower bound. This avoids an upward severity bias: we have no information on occurrences below the attachment point (including their contribution to the CSPs), so allowing excess data to affect the CSPs below the attachment point would bias the severity distribution.

Conditional Survival Probabilities

                                                           Conditions
CSP_e1(10,000 | 0)      = P(X ≥ 10,000 | X > 0)            PL + AP ≥ 10,000;  AP = 0
CSP_e1(20,000 | 10,000) = P(X ≥ 20,000 | X > 10,000)       PL + AP ≥ 20,000;  AP ≤ 10,000
CSP_e1(40,000 | 20,000) = P(X ≥ 40,000 | X > 20,000)       PL + AP ≥ 40,000;  AP ≤ 20,000

AP  = Attachment Point
PL  = Policy Limit
X   = Gross Loss Amount
e1  = empirical, lag 1

CSP_e1(10,000 | 0) = P(X ≥ 10,000 | X > 0)

  Numerator:    number of occurrences with Occurrence Size + AP ≥ 10,000,
                Policy Limit + AP ≥ 10,000, and AP = 0
                = 6 (occurrences 3, 6, 7, 9, 10, 11)

  Denominator:  number of occurrences with Occurrence Size + AP > 0,
                Policy Limit + AP ≥ 10,000, and AP = 0
                = 9 (occurrences 1, 2, 3, 5, 6, 7, 9, 10, 11)

Only occurrences with policy limit plus attachment point greater than or equal to 10,000 are used.
Only occurrences with attachment point equal to zero are used.

CSP_e1(20,000 | 10,000) = P(X ≥ 20,000 | X > 10,000)

  Numerator:    number of occurrences with Occurrence Size + AP ≥ 20,000,
                Policy Limit + AP ≥ 20,000, and AP ≤ 10,000
                = 3 (occurrences 7, 10, 11)

  Denominator:  number of occurrences with Occurrence Size + AP > 10,000,
                Policy Limit + AP ≥ 20,000, and AP ≤ 10,000
                = 6 (occurrences 4, 6, 7, 9, 10, 11)

Only occurrences with policy limit plus attachment point greater than or equal to 20,000 are used.
Only occurrences with attachment point less than or equal to 10,000 are used.

CSP_e1(40,000 | 20,000) = P(X ≥ 40,000 | X > 20,000)

  Numerator:    number of occurrences with Occurrence Size + AP ≥ 40,000,
                Policy Limit + AP ≥ 40,000, and AP ≤ 20,000
                = 1 (occurrence 12)

  Denominator:  number of occurrences with Occurrence Size + AP > 20,000,
                Policy Limit + AP ≥ 40,000, and AP ≤ 20,000
                = 4 (occurrences 8, 10, 11, 12)

Only occurrences with policy limit plus attachment point greater than or equal to 40,000 are used.
Only occurrences with attachment point less than or equal to 20,000 are used.

We now calculate the empirical survival probabilities:

S_e1(10,000) = P(X ≥ 10,000)
             = CSP_e1(10,000 | 0)
             = P(X ≥ 10,000 | X > 0)
             = 6/9 = 2/3

S_e1(20,000) = P(X ≥ 20,000)
             = CSP_e1(10,000 | 0) * CSP_e1(20,000 | 10,000)
             = P(X ≥ 10,000 | X > 0) * P(X ≥ 20,000 | X > 10,000)
             = 6/9 * 3/6 = 1/3

S_e1(40,000) = P(X ≥ 40,000)
             = CSP_e1(10,000 | 0) * CSP_e1(20,000 | 10,000) * CSP_e1(40,000 | 20,000)
             = P(X ≥ 10,000 | X > 0) * P(X ≥ 20,000 | X > 10,000) * P(X ≥ 40,000 | X > 20,000)
             = 6/9 * 3/6 * 1/4 = 1/12
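The build-up of survival probabilities from conditional survival probabilities lends itself to a short algorithm. The sketch below illustrates the two screening conditions and the chaining of CSPs described on the preceding slides; it is not ISO's production code, and the occurrence records in the usage example are hypothetical. Occurrence sizes are recorded excess of the attachment point, so the gross loss is size plus attachment point, as in the slides.

```python
# A minimal sketch (not ISO's production code) of the conditional survival
# probability (CSP) build-up: for each layer from `lower` to `upper`, only
# occurrences passing the two conditions are used, and the CSPs are chained
# into survival probabilities.

def empirical_survival(occurrences, amounts):
    """occurrences: dicts with 'size', 'attach', 'limit' (size is excess of
    the attachment point). Returns the empirical survival probability at
    each dollar amount."""
    survival = {}
    cumulative = 1.0
    lower = 0
    for upper in amounts:
        eligible = [o for o in occurrences
                    if o["limit"] + o["attach"] >= upper      # condition 1
                    and o["attach"] <= lower]                 # condition 2
        denominator = [o for o in eligible if o["size"] + o["attach"] > lower]
        numerator = [o for o in denominator if o["size"] + o["attach"] >= upper]
        if denominator:
            cumulative *= len(numerator) / len(denominator)
        survival[upper] = cumulative
        lower = upper
    return survival

# Hypothetical lag-1 occurrences (size excess of attachment, attachment point, policy limit).
occurrences = [
    {"size": 5_000,  "attach": 0,      "limit": 100_000},
    {"size": 12_000, "attach": 0,      "limit": 25_000},
    {"size": 5_000,  "attach": 7_500,  "limit": 15_000},    # deductible data
    {"size": 22_000, "attach": 15_000, "limit": 30_000},    # excess data
    {"size": 45_000, "attach": 0,      "limit": 1_000_000},
]
print(empirical_survival(occurrences, [10_000, 20_000, 40_000]))
```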

Lag Weight Calculation

Lag weights are used to combine the survival probabilities of the individual lags into an overall combined empirical survival probability.

Note: Occurrences from policies with nonzero attachment points are not used here. The weights should reflect the lag distribution of ground-up occurrences, and including Excess and Umbrella data might distort the results.

The ratios between adjacent lag weights are:

  Lag 2 / Lag 1
  Lag 3 / Lag 2
  Lag 4 / Lag 3

Simplified example: Assume the following number of occurrences by accident year and lag.

Note: In practice, a settlement lag model is estimated by maximum likelihood.

The Lag Weights Are:

  Lag 1: …      Lag 2: …      Lag 3: …      Lag 4: …

The use of ratios of occurrences between lags avoids distortions that may otherwise result from:

• the shape of the experience period
• accident year exposure growth
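A minimal sketch of how lag weights could be derived from a triangle of occurrence counts using ratios between adjacent lags, restricted to accident years observed in both lags. The triangle and the summation-based form of the ratios are hypothetical illustrations; as noted above, in practice a settlement lag model is estimated by maximum likelihood.

```python
# A minimal sketch: adjacent-lag ratios from a hypothetical occurrence-count
# triangle, chained and normalized into lag weights. Restricting each ratio to
# accident years present in both lags is what avoids distortion from the shape
# of the experience period and from exposure growth.

counts = {   # counts[accident_year][lag] = settled occurrences (hypothetical)
    1: {1: 100, 2: 60, 3: 35, 4: 20},
    2: {1: 110, 2: 66, 3: 39},
    3: {1: 121, 2: 73},
    4: {1: 133},
}

n_lags = 4
ratios = []
for lag in range(2, n_lags + 1):
    years = [y for y in counts if lag in counts[y] and lag - 1 in counts[y]]
    ratios.append(sum(counts[y][lag] for y in years) /
                  sum(counts[y][lag - 1] for y in years))

# Chain the ratios to get relative weights, then normalize so they sum to 1.
relative = [1.0]
for r in ratios:
    relative.append(relative[-1] * r)
lag_weights = [w / sum(relative) for w in relative]
```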

Now we combine the empirical survival functions at each lag using the lag weights:

Empirical Survival Function
Dollar                      Lag
Amount          1        2        3        4
10,000          …        …        …        …
20,000          …        …        …        …
40,000          …        …        …        …

Lag Weights
Dollar                      Lag
Amount          1        2        3        4
10,000          …        …        …        …
20,000          …        …        …        …
40,000          …        …        …        …

Empirical survival function for 10,000, 20,000, and 40,000:

S_e(10,000) = .36 … .90 = .78
S_e(20,000) = .36 … .63 = .47
S_e(40,000) = .36 … .40 = .22
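The combination itself is just a weighted average of the lag survival functions at each dollar amount. A minimal sketch with hypothetical lag weights and hypothetical per-lag survival probabilities (the slide's actual figures are not reproduced here):

```python
# A minimal sketch of the lag-weighted combination, using hypothetical lag
# weights and hypothetical per-lag empirical survival probabilities.

lag_weights = [0.40, 0.30, 0.20, 0.10]          # hypothetical weights for lags 1-4
lag_survival = {                                 # hypothetical S_lag(x) by dollar amount
    10_000: [0.67, 0.75, 0.85, 0.92],
    20_000: [0.33, 0.45, 0.58, 0.70],
    40_000: [0.08, 0.18, 0.30, 0.45],
}

# Overall combined empirical survival probability at each dollar amount.
combined = {x: sum(w * s for w, s in zip(lag_weights, svals))
            for x, svals in lag_survival.items()}
```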

Typical dollar amounts at which the empirical survival function is calculated are:

      10           …            …        5,000,000
       …           …            …        6,000,000
       …           …            …        8,000,000
       …           …            …       10,000,000
       …           …            …       15,000,000
       …           …            …       20,000,000
   1,000       30,000           …       25,000,000
   1,500       40,000    1,000,000      30,000,000
   2,000       50,000    1,500,000      40,000,000
   2,500       60,000    2,000,000      50,000,000
   3,000       80,000    2,500,000      60,000,000
   4,000           …     3,000,000      80,000,000
   5,000           …     4,000,000     100,000,000

In actual practice:

• The latest five diagonals of data are used.
• Lags n and greater are combined when calculating the empirical survival function, since there is no clear difference among them. For General Liability, n is 7. For Commercial Automobile Liability, n is 5.
• A settlement lag model based on a multinomial distribution is used to include exposure to all settlement lags. An exponential decay assumption implicitly provides nonzero weight at all possible settlement lags.
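As one hedged illustration of what an exponential decay assumption for the later lag weights could look like, consider the sketch below. It shows the idea only and is not necessarily the form of ISO's multinomial settlement lag model; the observed weights and the decay factor are hypothetical.

```python
# A hypothetical sketch: extend the lag weights beyond the last separately
# estimated lag with a constant (exponential) decay factor, so that every
# settlement lag receives nonzero weight, then renormalize.

def extend_lag_weights(observed, decay, n_total=40):
    """observed: relative weights for lags 1..k; decay: ratio applied to each
    successive lag beyond lag k. Returns normalized weights for lags 1..n_total."""
    weights = list(observed)
    while len(weights) < n_total:
        weights.append(weights[-1] * decay)
    total = sum(weights)
    return [w / total for w in weights]

lag_weights = extend_lag_weights([0.40, 0.25, 0.15, 0.10, 0.06], decay=0.6)
```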

Fitting a Mixed Exponential Distribution to an Empirical Distribution

Once the overall empirical survival function has been calculated at a number of different dollar amounts, a curve may be fit. Maximum likelihood estimation or minimum distance estimation may be used to obtain the fitted mixed exponential parameters.

The assumption is made that the density function f(x) of the curve should decrease and gradually flatten out as x becomes large. Mathematically speaking, the curve should have alternating derivatives (f(x) > 0, f'(x) < 0, f''(x) > 0, ...) for all x. A theorem proved by S. Bernstein (1928) states that a function has alternating derivatives if and only if it can be written as a mixture of exponential distributions.
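For reference, the alternating derivative condition is what is usually called complete monotonicity, and Bernstein's theorem can be stated as follows (a standard textbook formulation added here for clarity, not taken from the slide):

```latex
% Complete monotonicity (alternating derivatives) of the density f on (0, \infty):
(-1)^k f^{(k)}(x) \ge 0 \qquad \text{for all } x > 0,\; k = 0, 1, 2, \ldots
% Bernstein's theorem: f is completely monotone if and only if, for some mixing
% distribution \mu on (0, \infty),
f(x) = \int_0^{\infty} \lambda \, e^{-\lambda x} \, d\mu(\lambda),
% i.e., f is a mixture of exponential densities.
```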

Many common distributions fit to losses have, or nearly have, the alternating derivative property. The Pareto distribution, in particular, has this property, since it is a mixture of exponential distributions, with the mixing distribution being an inverse gamma distribution (on the exponential mean).

The general mixed exponential distribution is extremely flexible, since there are no restrictions on the mixing distribution. In particular, the mixed exponential distribution will provide at least as good a fit as the Pareto distribution or any mixture of Pareto distributions, since the Pareto distribution is a special case of the mixed exponential distribution.
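To make the Pareto statement concrete, the mixture representation can be written out. This is a standard derivation; the gamma/inverse gamma parameterization below is one common choice, not necessarily the slide's:

```latex
% Mixing an exponential with rate \lambda over a gamma distribution on the rate
% (equivalently, an inverse gamma distribution on the exponential mean 1/\lambda):
S(x) = \int_0^{\infty} e^{-\lambda x} \,
       \frac{\theta^{\alpha} \lambda^{\alpha - 1} e^{-\theta \lambda}}{\Gamma(\alpha)} \, d\lambda
     = \left( \frac{\theta}{\theta + x} \right)^{\alpha},
% which is the Pareto survival function with shape \alpha and scale \theta.
```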

The fitted survival function, based on the mixed exponential distribution, is:

  S(x) = w_1 e^(-x/m_1) + w_2 e^(-x/m_2) + ... + w_n e^(-x/m_n),  with w_i ≥ 0 and w_1 + ... + w_n = 1.

• Optimal values of w_1, ..., w_n and m_1, ..., m_n can be obtained by grouped maximum likelihood estimation or minimum distance estimation. The dollar amounts at which the empirical survival function is calculated serve as the group boundaries (for maximum likelihood estimation) or as the points at which the distance function is evaluated (for minimum distance estimation).

• To obtain the optimal mixed exponential distribution, n is permitted to be as large as necessary. Generally, the optimal distribution is found when n is between 5 and 10. Increasing n above the optimal number cannot further increase the likelihood function; likewise, the distance function cannot be further decreased.
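A minimal sketch of a simplified, minimum-distance-style fit: the means are fixed on a hypothetical grid and only the weights are found, by nonnegative least squares on the survival probabilities. ISO's procedure optimizes both the weights and the means (or uses grouped maximum likelihood); this only shows the mechanics. The function name, the grid of means, and most of the sample points are hypothetical (the first three survival values echo the earlier combined example).

```python
# A minimal sketch of fitting mixed exponential weights to empirical survival
# probabilities by nonnegative least squares, with the means fixed on a grid.

import numpy as np
from scipy.optimize import nnls

def fit_mixed_exponential_weights(x, s_emp, means, penalty=1e3):
    """Find nonnegative weights w (summing to 1) so that
    S(x) = sum_i w_i * exp(-x / m_i) approximates the empirical survival s_emp."""
    x = np.asarray(x, dtype=float)
    design = np.exp(-np.outer(x, 1.0 / np.asarray(means, dtype=float)))
    # Append a heavily weighted row to softly enforce sum(w) = 1.
    A = np.vstack([design, np.full(len(means), penalty)])
    b = np.concatenate([np.asarray(s_emp, dtype=float), [penalty]])
    w, _ = nnls(A, b)
    return w / w.sum()

# Illustrative evaluation points and survival probabilities (mostly hypothetical).
x_points = [10_000, 20_000, 40_000, 100_000, 250_000, 500_000]
s_emp = [0.78, 0.47, 0.22, 0.08, 0.03, 0.01]
mean_grid = [1_000, 5_000, 25_000, 125_000, 625_000]     # hypothetical means
weights = fit_mixed_exponential_weights(x_points, s_emp, mean_grid)
```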

As an example, suppose a fit to the empirical loss size distribution produced the following optimal set of means and weights:

Mixed Exponential Distribution
           Mean           Weight
  1        1,…               …
  2          …               …
  3          …               …
  4          …               …
  5          …               …
  6        …,000,000       .014

The survival function is:

  S(x) = w_1 e^(-x/m_1) + ... + w_6 e^(-x/m_6), evaluated at the fitted means and weights above.

The limited average severity (LAS) function for the mixed exponential distribution is:

  LAS(x) = E[min(X, x)] = w_1 m_1 (1 - e^(-x/m_1)) + ... + w_n m_n (1 - e^(-x/m_n)),

where n equals the number of exponential distributions required; n = 6 in this example.
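A minimal sketch of turning a fitted mixed exponential into limited average severities and increased limits factors (ratios of LAS values), along the lines of the table on the next slide. The means and weights are hypothetical (the slide's fitted values are not reproduced), and a basic limit of 100,000 is assumed for the ILF denominator.

```python
# A minimal sketch: LAS(x) = sum_i w_i * m_i * (1 - exp(-x/m_i)) for a mixed
# exponential, and ILFs as ratios of LAS values to the basic-limit LAS.

import math

means   = [1_000, 5_000, 25_000, 100_000, 400_000, 1_500_000]   # hypothetical m_i
weights = [0.35, 0.25, 0.20, 0.12, 0.06, 0.02]                   # hypothetical w_i (sum to 1)

def las(x):
    """Limited average severity E[min(X, x)] for the mixed exponential."""
    return sum(w * m * (1.0 - math.exp(-x / m)) for w, m in zip(weights, means))

basic_limit = 100_000   # assumed basic limit for the ILF denominator
for limit in [100_000, 250_000, 500_000, 1_000_000, 2_000_000, 5_000_000, 10_000_000]:
    print(f"{limit:>12,}   LAS = {las(limit):>10,.0f}   ILF = {las(limit) / las(basic_limit):.3f}")
```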

Increased limits factors based on this model are as follows:

Policy          Limited Average
Limit           Severity            ILF*
    100,000         15,…              …
      …,000         25,…              …
      …,000         30,…              …
  …,000,000         38,…              …
  …,000,000         47,…              …
  …,000,000         63,…              …
  …,000,000         74,…              …

* Excluding LAE and risk load provisions

Additional Considerations

Other adjustments that must be made as part of the increased limits procedure:

• Composite-rated risk and excess & umbrella data are not segregated by increased limits severity table. A Bayesian allocation is used to distribute each such occurrence in a given lag, based on the relative volume of each table in the lag along with the empirical survival distributions for that lag.

• Because of the low volume of data in the tail of the distributions, adjustments may be appropriate to ensure stability in the tail for the individual severity tables as well as the relativities between them.

Tail of the Empirical Distribution: An Example

To smooth the tail of the empirical distribution:

1. Select a truncation point where the credibility of the empirical survival probabilities becomes low.
2. Over successive intervals just below the truncation point, use percentile matching to examine the indicated parameters of various distributions.
3. Select a distribution which shows stable parameter indications. There should be no parameter trend over this stable region.
4. Use this distribution type, with parameters fit to the stable region, to smooth the empirical distribution beyond the truncation point.
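A minimal sketch of the percentile matching step, assuming (purely for illustration) a single-parameter Pareto tail with S(x) proportional to x^(-q); the actual distribution type is whichever one shows stable parameter indications over the region examined. The empirical survival values below are hypothetical.

```python
# A minimal sketch: over each interval below the truncation point, back out
# the Pareto shape parameter q implied by the drop in the empirical survival
# function, then check whether the indications are stable (no trend).

import math

def implied_q(x1, s1, x2, s2):
    """Pareto shape q implied by S(x2)/S(x1) = (x1/x2)^q."""
    return math.log(s2 / s1) / math.log(x1 / x2)

# Hypothetical empirical survival values near the truncation point.
points = [(800_000, 0.0100), (1_500_000, 0.0052), (3_000_000, 0.0026)]
for (x1, s1), (x2, s2) in zip(points, points[1:]):
    print(f"{x1:>9,} - {x2:>9,}: indicated q = {implied_q(x1, s1, x2, s2):.2f}")
```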

Hypothetical Increased Limits Table
Empirical Distribution below Truncation Point

                    Empirical Survival     Indicated Q
Dollar Amount       Function               Parameter
    800,000               …                    …
  1,000,000               …                    …
  1,500,000               …                    …
  2,000,000               …                    …
  2,500,000               …                    …
  3,000,000               …                    …
  4,000,000               …                    …

Truncation Point: 4,000,000
Range of Stability: 800,000 - 4,000,000
Selected Distribution Type: Pareto

Hypothetical Increased Limits Table
Empirical Survival Function (without Tail) vs. Empirical Survival Function (with Tail), by Dollar Amount

COMPARISON OF FIT TO THE EMPIRICAL DATA
Hypothetical Increased Limits Table

                  Empirical Limited      Fitted Limited        Percentage
Policy Limit      Average Severities     Average Severities    Difference
     100,000            9,977                 9,…                  …%
     200,000           13,144                13,…                  …%
     250,000           14,254                14,…                  …%
     300,000           15,180                15,…                  …%
     500,000           17,811                17,…                  …%
     800,000           20,141                20,…                  …%
   1,000,000           21,137                21,…                  …%
   1,500,000           22,756                22,…                  …%
   2,000,000           23,663                23,…                  …%
   2,500,000           24,323                24,…                  …%
   3,000,000           24,837                24,…                  …%
   4,000,000           25,553                25,…                  …%
   5,000,000           26,045                26,…                  …%
  10,000,000           27,088                27,…                  …%

Current Status of the Mixed Exponential Increased Limits Procedure

ISO intends to use the new increased limits methodology in its reviews for Commercial Liability Lines. General Liability has been reviewed, and a filing based on this methodology is expected to be made later this year.

ISO has completed a review of Legal Professional Liability using this new methodology, which enabled us to easily include the relatively large amount of deductible data. This review was not filed and is available for separate purchase.

Research using the Mixed Exponential Increased Limits Procedure

ISO is currently reviewing and considering the filing of:

• Other lines of business, including Commercial Automobile Liability and Medical Professional Liability
• Separate Premises/Operations Liability tables by state group
• A change in increased limits table and state group compositions for Commercial Automobile Liability
• A change in the increased limits table class composition for General Liability
• A credibility procedure to complement the empirical survival function where necessary

Other Related Research Efforts:

• Evaluation of varying ALAE by policy limit
• Reevaluation of the current risk load procedure

Further details about the entire mixed exponential increased limits procedure are planned for future ISO Actuarial Services increased limits circulars. In addition, for those who are interested, an updated draft of a paper on the mixed exponential distribution will be available in a few weeks. You may sign up to receive a copy after the session.