Introduction to Experience Rating
Jim Sandor, American Re-Insurance
2003 CAS Ratemaking Seminar
Introduction to Experience Rating
- Classical Burning Cost Method
- Frequency Based Method
Classical Burning Cost Method
Basic Steps
- Obtain a large loss listing and calculate the nominal excess losses in the layer (e.g. 100K xs 100K).
- Apply trend factors; cap at policy limits.
- Apply loss development factors.
- Divide losses by adjusted subject premium to derive an expected loss cost.
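Below is a minimal Python sketch of these steps for a 100K xs 100K layer. It is illustrative only: the claim values, trend factors, and LDFs are placeholders in the spirit of the examples on the following slides, not the presentation's actual data.

```python
# Illustrative sketch of the classical burning cost method for a 100K xs 100K layer.
# Claim values and factors below are hypothetical placeholders.

RETENTION = 100_000
LAYER_WIDTH = 100_000   # 100K xs 100K

def loss_in_layer(trended_loss, policy_limit):
    """Cap the trended loss at the policy limit, then slice out the layer piece."""
    capped = min(trended_loss, policy_limit)
    return min(max(capped - RETENTION, 0.0), LAYER_WIDTH)

def burning_cost(claims, trend, ldf, adj_subject_premium):
    """claims: iterable of (accident_year, reported_loss, policy_limit) tuples."""
    developed_layer_losses = 0.0
    for ay, loss, policy_limit in claims:
        trended = loss * trend[ay]                        # trend the claim
        layer = loss_in_layer(trended, policy_limit)      # cap at policy limit, take the layer
        developed_layer_losses += layer * ldf[ay]         # develop to ultimate
    return developed_layer_losses / adj_subject_premium   # expected loss cost

# Hypothetical usage
claims = [(1998, 255_692, 300_000), (2000, 175_274, 1_000_000)]
trend  = {1998: 1.338, 2000: 1.191}
ldf    = {1998: 1.238, 2000: 2.302}
print(f"expected loss cost: {burning_cost(claims, trend, ldf, 40_000_000):.2%}")
```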
Classical Burning Cost
Step 1 - Collect Data

Log   AY   Rptd Loss    Pol Limit    Loss in Layer
  1   98     255,692      300,000        100,000
  5   98      75,324                           0
  6   98     130,235      100,000              0
 14   99   1,152,028    1,000,000        100,000
 19   00     175,274                      75,274
 38   02     360,044    1,000,000        100,000
Total      5,747,914                     997,631

Note: Losses include ALAE. Not all losses are displayed.
Classical Burning Cost
Step 2 - Trend

Log   AY   Trend Factor   Trended Loss   Policy Limit   Loss in Layer
  1   98      1.338           342,174        300,000         100,000
  5   98      1.338           100,801                            801
  6   98      1.338           174,284        100,000               0
 14   99      1.262         1,454,409      1,000,000         100,000
 19   00      1.191           208,754                         100,000
 38   02      1.060           381,647      1,000,000         100,000
Total                       6,907,025                       1,234,012
Total w/ freq trend                                         1,312,100
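To read the table: claim 1's trended loss of 342,174 exceeds its 300,000 policy limit, so it is capped at 300,000 and contributes 300,000 - 100,000 = 100,000 to the 100K xs 100K layer; claim 5 trends to just 100,801, leaving only 801 in the layer.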
Classical Burning Cost
Step 3 - Loss Development

        Trended XS                   Ultimate
AY      Loss in Layer     LDF        Loss in Layer
98          251,500      1.238           311,300
99          300,100      1.485           445,600
00          212,200      2.302           488,500
01          442,700      4.604         2,038,100
02          105,500     41.432         4,370,300
Total     1,312,100                    7,653,800
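To read the table: each year's trended excess losses are multiplied by its loss development factor, e.g. for AY 98, 251,500 × 1.238 ≈ 311,300. Note how the immature 2002 year, leveraged by a 41.432 factor, dominates the developed total.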
Classical Burning Cost
Step 4 - Divide by Subject Premium
(amounts in $000s)

                     Nominal          Trended          Tr & Dev
AY       Adj SEP      $      %         $      %          $       %
98        12,763    144.4   1.1%     251.5   2.0%      311.3    2.4%
99        18,233    215.5   1.2%     300.1   1.6%      445.6    2.4%
00        23,133    175.3   0.8%     212.2   0.9%      488.5    2.1%
01        26,460    362.5   1.4%     442.7   1.7%    2,038.1    7.7%
02        31,500    100.0   0.3%     105.5   0.3%    4,370.3   13.9%
Est '03   40,000    400.8   1.0%     533.6   1.3%      967.5    2.4%
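To read the table: each year's losses are divided by its adjusted subject earned premium, e.g. for AY 98 on a trended and developed basis, 311.3 / 12,763 ≈ 2.4%. The Est '03 row restates the selections against the estimated 2003 subject premium of 40,000.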
Classical Burning Cost
Potential Problems
- The presence or absence of a few large claims drives the indicated rates.
- The order in which development, trend, and capping are applied makes a difference.
- Trending individual claims past policy limits.
- Impact of the current policy limit profile vs. the historical profile.
- History not reflective of the current situation: reserving practices, type of business, coverage, etc.
Frequency Based Method
Basic Steps
- Estimate the number of claims above a data limit (e.g. 28 claims > $50,000).
- Use size-of-loss curves to project the number of claims above the retention (e.g. 14.4 claims above a $100,000 retention).
- Distribute the projected counts by policy limit; eliminate counts from policy limits at or below the retention (e.g. 12.25 claims remain if 15% of exposure has $100,000 limits).
- Use size-of-loss curves to project the average severity of claims in the layer (e.g. $69,495 severity in the 100K xs 100K layer).
- Multiply frequency by severity to get total losses.
- Divide by adjusted subject premium to get the expected loss cost.
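The sketch below strings these steps together in Python, using the Pareto frequency and severity formulas quoted on the later slides. The curve parameters B and Q, and therefore the printed result, are hypothetical placeholders; the presentation does not show its parameter values.

```python
# Sketch of the frequency based method for a layer of width L excess of retention R,
# assuming a shifted Pareto size-of-loss curve with scale B and shape Q.

def counts_above(n_above_dl, data_limit, retention, B, Q):
    """Project counts above the data limit to counts above the retention
    (the frequency formula quoted on the Step 2 slide)."""
    return n_above_dl * ((data_limit + B) / (retention + B)) ** Q

def avg_severity_in_layer(retention, width, B, Q):
    """Average severity in the layer, given a claim exceeds the retention
    (the severity formula quoted on the Step 4 slide)."""
    return (retention + B) / (Q - 1) * (
        1 - ((retention + B) / (retention + width + B)) ** (Q - 1)
    )

def expected_loss_cost(n_above_dl, data_limit, retention, width,
                       share_of_limits_at_or_below_retention,
                       adj_subject_premium, B, Q):
    freq = counts_above(n_above_dl, data_limit, retention, B, Q)
    freq *= 1 - share_of_limits_at_or_below_retention   # drop counts cut off by policy limits
    severity = avg_severity_in_layer(retention, width, B, Q)
    return freq * severity / adj_subject_premium

# Hypothetical B and Q; the other inputs mirror the examples in the bullets above.
loss_cost = expected_loss_cost(
    n_above_dl=28, data_limit=50_000, retention=100_000, width=100_000,
    share_of_limits_at_or_below_retention=0.15,
    adj_subject_premium=40_000_000, B=10_000, Q=1.5)
print(f"expected loss cost: {loss_cost:.2%}")
```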
Frequency Based
Step 1 - Project # of Claims Above Data Limit

        Detrended      Actual      Freq      Clm Cnt     Projected
AY      Data Limit     # > DDL     Trend     Dev Fctr    # > DL
98        37,363          6        1.104      1.050         6.96
99        39,605          8        1.082      1.155        10.00
00        41,981          5        1.061      1.559         8.27
01        44,500         13        1.040      2.339        31.63
02        47,170          5        1.020      5.847        29.82
Selected  50,000                                           28.00
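To read the table: the projected number of claims above the data limit is the actual count above the detrended data limit, multiplied by the frequency trend and the claim count development factor, e.g. for AY 98, 6 × 1.104 × 1.050 ≈ 6.96.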
Frequency Based
Step 1a - Selection Process

          Projected
AY        # > DL       Adj SEP     Frequency    # @ '03 Levels
98           6.96       12,763        .545           21.8
99          10.00       18,233        .549           21.9
00           8.27       23,153        .357           14.3
01          31.63       26,460       1.196           47.8
02          29.82       31,500        .947           37.9
Selected                40,000        .700          28.00
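To read the table: each year's projected count is divided by its adjusted subject earned premium to give a frequency, which is then restated at 2003 premium levels, e.g. for AY 98, (6.96 / 12,763) × 40,000 ≈ 21.8. The selected frequency of .700 implies 28.00 claims above the $50,000 data limit at the estimated 2003 premium of 40,000.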
Frequency Based
Step 2 - Project # of Claims Above Retention

                            Projected
Limit         Retention     # > Ret.
 50,000  xs    50,000         28.00
100,000  xs   100,000         14.41 *
300,000  xs   200,000          7.22 *
500,000  xs   500,000          2.84 *

* Note: these were derived from the Pareto size-of-loss curve frequency formula:
  N × [(DL + B) / (R + B)]^Q
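Under a shifted Pareto assumption the survival function is S(x) = [B / (x + B)]^Q, so the expected count above the retention R is the count N above the data limit DL scaled by the ratio of survival probabilities: N × S(R) / S(DL) = N × [(DL + B) / (R + B)]^Q, which is the formula above.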
Frequency Based
Step 3 - Include Impact of Policy Limits

                       Projected     # Clms by Pol Limit            New
Limit      Retention   # > Ret     100    300    500    1MM       # > Ret
 50,000      50,000     28.00     4.20   5.60   7.00   11.20       28.00
100,000     100,000     14.41     2.16   2.88   3.60    5.76       12.25
300,000     200,000      7.22     1.08   1.44   1.81    2.89        6.14
500,000     500,000      2.84      .43    .57    .71    1.14        1.14

'03 Policy Limit Distribution:    15%    20%    25%    40%

Note: Counts from policy limits at or below the retention (below the line on the original slide) are eliminated from the layer.
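To read the table: the projected counts are spread across the 2003 policy limit distribution, and the buckets whose limit is at or below the retention are dropped. For example, at the 100,000 retention the 15% of exposure written at a 100,000 limit cannot pierce the layer, so 14.41 × (1 - 15%) ≈ 12.25; at the 500,000 retention only the $1MM bucket survives, 2.84 × 40% ≈ 1.14.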
Frequency Based
Step 4 - Estimate Loss $ in Layer

                        Projected    Avg Sev.    Loss Cost
Limit      Retention    # > Ret.     in Layer    in Layer
100,000     100,000       14.41       69,495     1,001,423
100,000     100,000       12.25       69,495       851,210

Note: Average severities are from the Pareto size-of-loss curve severity formula:
  [(R + B) / (Q - 1)] × {1 - [(R + B) / (R + L + B)]^(Q - 1)}
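The loss cost column is simply frequency times severity: 14.41 × 69,495 ≈ 1,001,423 before the policy limit adjustment and 12.25 × 69,495 ≈ 851,300 after it (the slide's 851,210 reflects an unrounded claim count). The severity formula is the shifted Pareto's average layer loss per claim exceeding the retention, E[min(X, R + L) - R | X > R].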
Frequency Based Method
Step 5 - Divide by Subject Premium

Subject              Selected Loss Cost
Earned Prem.            $           %
40,000,000            851,210      2.1%
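As a check: 851,210 / 40,000,000 ≈ 2.1%, the frequency based estimate of the expected loss cost for the 100K xs 100K layer.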
Frequency Based Method
Potential Problems
- Credibility of claim count development factors
- Adjustment of development factors by data limit
- Picking an appropriate data limit
- Testing of size-of-loss assumptions
Frequency Based Method
Advanced Techniques
Goal: Fit individual claim data to a size-of-loss curve.
- Trend individual claims to a common accident date.
- Develop the trended individual claims to ultimate, using report year development factors if available.
- Fit the developed and trended claims to a size-of-loss curve.
- Test the curve against actual data and industry curves.
- Use the new fitted curve in the frequency based method to derive a new loss cost.
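A minimal sketch of the fitting step, assuming the size-of-loss curve is a shifted Pareto and fitting the shape Q by maximum likelihood to trended, developed claims above the data limit while holding the scale B fixed. The claim values and B below are illustrative assumptions, not data from the presentation.

```python
import math

def fit_pareto_shape(claims, data_limit, B):
    """MLE of the Pareto shape Q from claims above the data limit, with the
    scale B held fixed (a simplifying assumption for this sketch).
    Conditional density above the data limit t:
        f(x | x > t) = Q * (B + t)**Q / (B + x)**(Q + 1)
    whose log-likelihood is maximized by the closed form below."""
    exceed = [x for x in claims if x > data_limit]
    return len(exceed) / sum(math.log((B + x) / (B + data_limit)) for x in exceed)

# Hypothetical trended & developed claim sizes.
claims = [62_000, 85_000, 120_000, 145_000, 230_000, 410_000, 900_000]
Q_hat = fit_pareto_shape(claims, data_limit=50_000, B=25_000)
print(f"fitted Q = {Q_hat:.2f}")
```

The fitted curve can then replace the industry curve in the frequency and severity formulas of the earlier steps.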
Frequency Based Method
Advanced Techniques
[Chart: Comparison of Actual and Fitted Average Severities (in 000's)]
Frequency Based Method - Curve Fitting
Potential Problems
- Volume of individual claim data may be too low for a credible fit.
- No adjustment for IBNR claims.
  - The ultimate size-of-loss distribution can have a significantly different shape than the reported one.
- The result can be very sensitive to the trend assumption.
- Must make an adjustment for policy limits.
Experience Rating
Comparison of Methods

Classical Burning Cost       Original      Alternative
Est. Losses $               1,089,100         967,500
Est. Loss Cost %                 2.7%            2.4%

Frequency Based Method       Original      Co. Fitted
Est. Losses $                 851,210         955,118
Est. Loss Cost %                 2.1%            2.4%

Selected                    1,000,000            2.5%