1 Regression Models & Loss Reserve Variability Prakash Narayan Ph.D., ACAS 2001 Casualty Loss Reserve Seminar.


1 Regression Models & Loss Reserve Variability Prakash Narayan Ph.D., ACAS 2001 Casualty Loss Reserve Seminar

2 Regression Models and Loss Reserve Variability

Range for Loss Reserve:
- Ultimate loss will differ from any estimate
- A measure of variability, or a range, of the loss reserve is needed to monitor reserve levels
- We have methods that can be implemented in EXCEL to estimate reserve variability

3 Regression Models and Loss Reserve Variability

Ad Hoc Methods of Reserve Ranges:
- Use a % of the ultimate reserve based on line of business and professional judgment
- Use a variety of methods and select a range
- Use high and low development factors (various alternatives)

4 Regression Models and Loss Reserve Variability

Statistical Methods:
- Development factor variability models (Mack/Murphy)
- Least squares method (does not provide parameter uncertainty)
- Log regression method
- A variety of other methods (not discussed here)

5 Regression Models and Loss Reserve Variability

Notation:

x_{i,j} = losses paid (reported) for accident year i in development year j (incremental losses), i, j = 1, ..., n.

y_{i,j} = Σ_{k=1}^{j} x_{i,k}  (cumulative losses)

We observe x_{i,j} for i = 1, ..., n; j = 1, ..., n + 1 − i, and we are interested in estimating y_{i,k}, and the variability of these estimates, for k = n + 2 − i, ..., n.
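The incremental-to-cumulative step in the notation above can be sketched in Python; the triangle values and the NaN-for-unobserved convention here are illustrative assumptions, not data from the presentation:

```python
import numpy as np

# Hypothetical 4x4 incremental paid-loss triangle x[i, j]; NaN marks
# future cells not yet observed (accident year i, development year j).
x = np.array([
    [100.0, 50.0, 25.0, 10.0],
    [110.0, 55.0, 27.0, np.nan],
    [120.0, 60.0, np.nan, np.nan],
    [130.0, np.nan, np.nan, np.nan],
])

# Cumulative losses y[i, j] = sum of x[i, k] for k <= j.
# nancumsum treats NaN as 0, so re-mask the unobserved cells afterwards.
y = np.nancumsum(x, axis=1)
y[np.isnan(x)] = np.nan
```

The lower-right NaN cells of `y` are exactly the y_{i,k} the reserving methods below set out to estimate.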

6 Regression Models and Loss Reserve Variability

Regression Framework for Loss Development

Assumptions:
- The development factors f_j are independent of accident year
- All future development depends only on the most current evaluation
- The ε_{i,j} are errors, independent of accident year and delay
- The expected value of ε_{i,j} is 0; its variance may be a function of y_{i,j} and the development year

Note:
- We assume that losses are fully developed by period n and do not consider tail factors in this study
- These assumptions are helpful in deriving variance estimates of ultimate losses

7 Regression Models and Loss Reserve Variability

Alternate Loss Development - Method 1

Under the assumption that the error variance is proportional to y_{i,j}², i.e. Var(ε_{i,j}) = σ_j² y_{i,j}², the simple average development (SAD) factors

f̂_j = (1 / (n − j)) Σ_{i=1}^{n−j} y_{i,j+1} / y_{i,j}

are best linear unbiased estimates (BLUE) of f_j, and the sample variances of the individual link ratios about f̂_j are unbiased estimates of σ_j². The products of the f̂_j, and the current losses multiplied by those products, are unbiased estimates of the age-to-ultimate factors and of the ultimate losses, respectively.
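A minimal sketch of the SAD factors, assuming a cumulative triangle stored as a NumPy array with NaN for unobserved cells (the helper name `sad_factors` and the test data are illustrative, not from the presentation):

```python
import numpy as np

def sad_factors(y):
    """Simple-average development (SAD) factors: the plain mean of the
    observed individual link ratios y[i, j+1] / y[i, j] at each delay j.
    NaN cells (unobserved future values) are ignored by nanmean."""
    n_cols = y.shape[1]
    factors = []
    for j in range(n_cols - 1):
        ratios = y[:, j + 1] / y[:, j]   # NaN wherever either cell is NaN
        factors.append(np.nanmean(ratios))
    return np.array(factors)

# Hypothetical cumulative triangle (3 accident years, 3 delays).
y = np.array([
    [100.0, 200.0, 240.0],
    [200.0, 300.0, np.nan],
    [50.0,  np.nan, np.nan],
])
f = sad_factors(y)   # mean of (2.0, 1.5) = 1.75, then 240/200 = 1.2
```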

8 Regression Models and Loss Reserve Variability

Alternate Loss Development - Method 2

Under the assumption that the error variance is proportional to y_{i,j}, i.e. Var(ε_{i,j}) = σ_j² y_{i,j}, the weighted average development (WAD) factors

f̂_j = Σ_{i=1}^{n−j} y_{i,j+1} / Σ_{i=1}^{n−j} y_{i,j}

are best linear unbiased estimates (BLUE) of f_j, and the y_{i,j}-weighted sample variances of the individual link ratios about f̂_j are unbiased estimates of σ_j². Defining the age-to-ultimate factors as products of the f̂_j, the projected values ŷ_{i,n} are then unbiased estimates of the ultimate losses.
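The WAD factors are the classical chain-ladder factors; a sketch under the same NaN-triangle convention as above (`wad_factors` is a hypothetical helper name):

```python
import numpy as np

def wad_factors(y):
    """Weighted-average development (WAD) factors: the sum of the
    next-column cumulative losses over the sum of the current-column
    losses, taken over accident years observed in both columns."""
    n_cols = y.shape[1]
    factors = []
    for j in range(n_cols - 1):
        mask = ~np.isnan(y[:, j]) & ~np.isnan(y[:, j + 1])
        factors.append(y[mask, j + 1].sum() / y[mask, j].sum())
    return np.array(factors)

# Same hypothetical triangle as before; note WAD weights large years
# more heavily than SAD: (200 + 300) / (100 + 200) = 5/3, not 1.75.
y = np.array([
    [100.0, 200.0, 240.0],
    [200.0, 300.0, np.nan],
    [50.0,  np.nan, np.nan],
])
f = wad_factors(y)
```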

9 Regression Models and Loss Reserve Variability

Variance Estimation - Loss Development

The assumption of an underlying linear model allows estimation of the variance of the estimated ultimate losses. These variances may be useful in deriving confidence intervals for ultimate losses. The variance of the prediction is

E[(y_{i,n} − ŷ_{i,n})²] = E[(ŷ_{i,n} − E(y_{i,n}))²] + Var(y_{i,n})

The first term of this equation is parameter risk and the second term is process risk; the expected values of the cross-product terms are all zero under our assumptions.

Note: We have assumed k = n + 1 − i and used ŷ_{i,n} = y_{i,k} · f̂_k · f̂_{k+1} · ... · f̂_{n−1}.

10 Regression Models and Loss Reserve Variability

Variance Estimation (continued)

Parameter Risk: the parameter risk may be estimated by

ŷ_{i,n}² Σ_{j=k}^{n−1} (σ̂_j² / f̂_j²) (1 / Σ_{m=1}^{n−j} y_{m,j})

in which σ_j² is estimated by

σ̂_j² = (1 / (n − j − 1)) Σ_{i=1}^{n−j} y_{i,j} (y_{i,j+1} / y_{i,j} − f̂_j)²

Process Risk: the process risk may be estimated analogously by

ŷ_{i,n}² Σ_{j=k}^{n−1} (σ̂_j² / f̂_j²) (1 / ŷ_{i,j})

Note: We have used the estimates proposed by Thomas Mack (1994). Daniel Murphy (1994) uses slightly different estimators.
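The σ_j² estimator attributed to Mack (1994) can be sketched as follows; `mack_sigma2` is a hypothetical helper, and the last delay, where only one link ratio exists, is left as NaN here rather than extrapolated:

```python
import numpy as np

def mack_sigma2(y):
    """Mack-style variance-scale estimates sigma_j^2: a weighted sample
    variance of the individual link ratios about the chain-ladder
    (WAD) factor f_j, with weights y[i, j]. Delays with fewer than two
    observed ratios are returned as NaN (no variance estimate)."""
    n_cols = y.shape[1]
    sigma2 = np.full(n_cols - 1, np.nan)
    for j in range(n_cols - 1):
        mask = ~np.isnan(y[:, j]) & ~np.isnan(y[:, j + 1])
        m = int(mask.sum())
        if m < 2:
            continue  # cannot estimate a variance from a single ratio
        f_j = y[mask, j + 1].sum() / y[mask, j].sum()
        ratios = y[mask, j + 1] / y[mask, j]
        sigma2[j] = (y[mask, j] * (ratios - f_j) ** 2).sum() / (m - 1)
    return sigma2

# Tiny hypothetical triangle: two observed ratios (2.0 and 2.5)
# about the WAD factor 2.25 give sigma2 = 2 * 100 * 0.0625 / 1 = 12.5.
y = np.array([[100.0, 200.0], [100.0, 250.0], [100.0, np.nan]])
s = mack_sigma2(y)
```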

11 Regression Models and Loss Reserve Variability

Variance Estimation (continued)

The process risk is assumed independent across accident years. However, the age-to-ultimate factors are correlated, so to compute the parameter risk for the total ultimate losses one must account for the covariance of the ultimate losses among accident years. The algebra is a little messy; after simplification, the covariance estimate for accident years r and s (r < s) is given by

ŷ_{r,n} ŷ_{s,n} Σ_{j=n+1−r}^{n−1} (σ̂_j² / f̂_j²) (1 / Σ_{m=1}^{n−j} y_{m,j})

and the estimate of the total covariance for accident years 2 through n is twice the sum of this quantity over all pairs r < s.

12 Regression Models and Loss Reserve Variability

Least Squares Method

X_{i,j} = r_i c_j + δ_{i,j}

Estimate r_i and c_j by minimizing

Σ_i Σ_j (X_{i,j} − r_i c_j)²

- It has been shown that the least squares equations have a solution, which can be obtained by an iterative numerical method.
- Parameter uncertainty cannot be estimated.
- There are only 2n − 1 parameters: the model is unchanged if we multiply each r_i by a constant and divide each c_j by the same constant.
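One concrete form of the iterative numerical method mentioned above is alternating least squares: holding the c_j fixed, each r_i has a closed-form solution, and vice versa. This is a sketch under assumed names and data, not the presentation's implementation:

```python
import numpy as np

def fit_rc(X, iters=300):
    """Alternating least squares for X[i, j] ~ r_i * c_j with NaN
    marking unobserved cells. Because the split of scale between r and
    c is arbitrary (the 2n - 1 parameter point on the slide), c is
    normalised so that c[0] = 1."""
    mask = ~np.isnan(X)
    Xf = np.where(mask, X, 0.0)
    r = np.nansum(X, axis=1)                  # crude start: row sums
    c = np.ones(X.shape[1])
    for _ in range(iters):
        # r_i = sum_j x_ij c_j / sum_j c_j^2, over observed cells only
        r = (Xf * c).sum(axis=1) / (mask * c**2).sum(axis=1)
        # c_j = sum_i x_ij r_i / sum_i r_i^2, over observed cells only
        c = (Xf.T * r).sum(axis=1) / (mask.T * r**2).sum(axis=1)
        r, c = r * c[0], c / c[0]             # fix the scale indeterminacy
    return r, c

# Hypothetical incremental triangle generated exactly as r_i * c_j,
# so the iteration should drive the fitted residuals to ~0.
X = np.array([[10.0, 5.0, 2.0],
              [20.0, 10.0, np.nan],
              [30.0, np.nan, np.nan]])
r, c = fit_rc(X)
```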

13 Regression Models and Loss Reserve Variability

Log Regression Model

Assume X_{i,j} = r_i c_j δ_{i,j}. Taking logarithms and redefining notation, we can write

Z_{i,j} = α_i + β_j + e_{i,j}

or

Z_{i,j} = μ + α_i + β_j + e_{i,j}, with α_1 = β_1 = 0

In matrix notation,

Z = Aθ + ε, E(ε) = 0, V(ε) = σ²I

The least squares estimates are

θ̂ = (A′A)⁻¹ A′Z
σ̂² = (Z′Z − θ̂′A′Z) / r, where r = (n − 1)(n − 2) / 2
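The fit can be sketched by building the design matrix A explicitly and solving the normal equations with a least squares routine (a Python illustration rather than the EXCEL implementation the presentation has in mind; `fit_log_model` is a hypothetical helper):

```python
import numpy as np

def fit_log_model(X):
    """Fit Z_ij = log X_ij = mu + alpha_i + beta_j (alpha_1 = beta_1 = 0)
    by ordinary least squares over the observed cells of an n x n
    triangle. Returns theta = (mu, alpha_2..alpha_n, beta_2..beta_n)
    and sigma2 = RSS / ((n-1)(n-2)/2), the slide's residual d.f."""
    n = X.shape[0]
    rows, cols = np.where(~np.isnan(X))
    Z = np.log(X[rows, cols])
    p = 2 * n - 1                      # mu, n-1 alphas, n-1 betas
    A = np.zeros((len(Z), p))
    A[:, 0] = 1.0                      # mu column
    for k, (i, j) in enumerate(zip(rows, cols)):
        if i > 0:
            A[k, i] = 1.0              # alpha_i (0-based row i -> col i)
        if j > 0:
            A[k, n - 1 + j] = 1.0      # beta_j
    theta, *_ = np.linalg.lstsq(A, Z, rcond=None)
    rss = float(((Z - A @ theta) ** 2).sum())
    dof = (n - 1) * (n - 2) // 2
    sigma2 = rss / dof if dof > 0 else np.nan
    return theta, sigma2

# Hypothetical exactly multiplicative 3x3 triangle, so the fit should
# recover the generating parameters and leave sigma2 ~ 0.
X = np.full((3, 3), np.nan)
mu, a, b = 1.0, [0.0, 0.2, 0.4], [0.0, -0.1, -0.3]
for i in range(3):
    for j in range(3 - i):
        X[i, j] = np.exp(mu + a[i] + b[j])
theta, sigma2 = fit_log_model(X)
```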

14 Regression Models and Loss Reserve Variability

Log Regression Model

The unknown elements of the loss process can be written as E(Z̃) = Bθ, so the vector Z̃ can be estimated by Bθ̂. However, our aim is to estimate X_{i,j}, not Z_{i,j}. The formulae for unbiased estimates of the X_{i,j} are a bit complex, but the corresponding estimates can all be computed in EXCEL. Details are given in Verrall's paper in the 1994 CAS Forum.

15 Regression Models and Loss Reserve Variability

Log Regression Model (continued)

- The fitted model has too many parameters
- Many parameters may not be significant
- A tail factor may be needed
- Calendar year inflation may be distorting the observations

One can choose alternate models, for example

Z_{i,j} = μ + α_i + β_j, with α_i = (i − 1)α and β_j = γ log(j)

Model parameters can be tested for statistical significance. If the variance changes by payment year, weighted least squares may be used.
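The reduced model above can be fitted the same way as the full log model but with only three parameters; again a hypothetical sketch with made-up data, not the presenter's implementation:

```python
import numpy as np

def fit_reduced_model(X):
    """Fit the reduced parameterisation Z_ij = mu + (i - 1) * alpha
    + gamma * log(j): a single accident-year trend and a single
    log-delay slope instead of a free parameter per row and column.
    Returns theta = (mu, alpha, gamma)."""
    rows, cols = np.where(~np.isnan(X))
    Z = np.log(X[rows, cols])
    A = np.column_stack([
        np.ones_like(Z),
        rows.astype(float),        # (i - 1), since rows are 0-based
        np.log(cols + 1.0),        # log(j), since cols are 0-based
    ])
    theta, *_ = np.linalg.lstsq(A, Z, rcond=None)
    return theta

# Hypothetical 4x4 triangle generated exactly from the reduced model,
# so the three parameters should be recovered.
X = np.full((4, 4), np.nan)
for i in range(4):
    for j in range(4 - i):
        X[i, j] = np.exp(0.5 + 0.1 * i - 0.7 * np.log(j + 1.0))
theta = fit_reduced_model(X)
```

With only 3 parameters against 2n − 1, the residual degrees of freedom grow, which is what makes the significance tests mentioned above meaningful.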