Kunal Jain March 24, 2010 Economics 201FS

Presentation 3
Kunal Jain March 24, 2010 Economics 201FS

HAR-RV Model
The Heterogeneous Autoregressive model of Realized Volatility (HAR-RV), Corsi (2003), uses average realized variance over daily, weekly, and monthly time intervals to build a conditional volatility model: h = 1 corresponds to the daily period, h = 5 to the weekly period, and h = 22 to the monthly period. One-day-ahead realized variance is modeled using the variances realized over the trailing 1-day, 5-day, and 1-month (22-day) windows.
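As a point of reference, the standard Corsi specification can be written as follows (a sketch in LaTeX; the notation matches the coefficient tables below):

```latex
% Standard HAR-RV regression (Corsi): one-day-ahead realized variance on
% daily, weekly (5-day average), and monthly (22-day average) realized variance.
\[
RV_{t+1} \;=\; \beta_0 \;+\; \beta_D\,RV_t \;+\; \beta_W\,RV_{t-5,t} \;+\; \beta_M\,RV_{t-22,t} \;+\; \varepsilon_{t+1},
\qquad
RV_{t-h,t} \;=\; \frac{1}{h}\sum_{j=0}^{h-1} RV_{t-j}.
\]
```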

HAR-RV Model
Model implemented in MATLAB. Regression results (AMZN). The coefficient sum approximates one. Constant? Reject the null hypothesis?

Dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.3921        0.0220           17.82*
RVt-5,t     0.3638        0.0339           10.73*
RVt-22,t    0.1766        0.0282            6.26
Constant    0.0001        0.0000            0.000
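A minimal MATLAB sketch of how such a regression could be set up, assuming rv is a T-by-1 vector of daily realized variances already computed from the high-frequency data (variable names here are illustrative, not taken from the original slides):

```matlab
% Minimal HAR-RV regression sketch (illustrative; assumes rv is a T-by-1
% vector of daily realized variances).
T   = length(rv);
rvW = filter(ones(5,1)/5,   1, rv);     % 5-day trailing average, RV_{t-5,t}
rvM = filter(ones(22,1)/22, 1, rv);     % 22-day trailing average, RV_{t-22,t}

y = rv(23:T);                                            % left-hand side: RV_{t+1}
X = [ones(T-22,1) rv(22:T-1) rvW(22:T-1) rvM(22:T-1)];   % constant + HAR terms

[b, ~, resid, ~, stats] = regress(y, X);      % OLS (Statistics Toolbox)
se    = sqrt(stats(4) * diag(inv(X'*X)));     % standard errors from error variance
tstat = b ./ se;                              % t-statistics
```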

HAR-RV Model: Kernel Density
Kernel density plot of the residuals (observed minus expected): a non-parametric way of estimating the probability density function of a random variable. Ideally the residual density resembles a normal distribution.
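A short sketch of the kernel density comparison in MATLAB, assuming resid holds the residuals from the regression above (illustrative):

```matlab
% Kernel density of HAR-RV residuals vs. a normal with matched moments (illustrative).
[f, xi] = ksdensity(resid);                               % non-parametric density estimate
plot(xi, f); hold on;
plot(xi, normpdf(xi, mean(resid), std(resid)), '--');     % normal benchmark
legend('Residual kernel density', 'Normal benchmark');
```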

HAR-RV: Squared Overnight
Including squared overnight returns to look at volatility.

Dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.3742        0.0217           17.24
RVt-5,t     0.3542        0.0334           10.60
RVt-22,t    0.1786        0.0277            6.45
BON         0.0656        0.0067            9.79
Constant    0.0001        0.0000            0.000
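One way the overnight term could be added, sketched in MATLAB. Here on is assumed to be a vector of overnight returns aligned one-for-one with rv; this is an assumption for illustration, not taken from the slides:

```matlab
% Augment the HAR-RV design matrix with squared overnight returns (illustrative;
% 'on' is assumed to be overnight returns aligned with rv).
Xon = [X on(22:T-1).^2];                % add overnight_t^2 as an extra regressor
bOn = regress(y, Xon);
```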

HAR-RV Model: Kernel Density (including overnight returns)

HAR-RV Model
Normalizing overnight returns: [Sqrt(RV) & Abs(Overnight)] vs. [RV & Overnight^2]. Normalizing outliers raises the t-statistic on the overnight term. Which specification is better?

[RV & Overnight^2], dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.3742        0.0217           17.24
RVt-5,t     0.3542        0.0334           10.60
RVt-22,t    0.1786        0.0277            6.45
BON         0.0656        0.0067            9.79
Constant    0.0001        0.0000            0.000

[Sqrt(RV) & Abs(Overnight)], dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.3856        0.0216           17.85
RVt-5,t     0.3472        0.0326           10.65
RVt-22,t    0.1886        0.0250            7.54
BON         0.1012        0.0082           12.34
Constant    0.0011        0.0004            2.75
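A hedged sketch of the normalized specification, under the assumption that every realized-variance term enters in square-root form and the overnight return enters in absolute value; this is one possible reading of [Sqrt(RV) & Abs(Overnight)], and the names are illustrative:

```matlab
% Alternative normalized specification (illustrative; assumes sqrt() is applied
% to each realized-variance term and abs() to the overnight return).
yN = sqrt(y);
XN = [ones(T-22,1) sqrt(rv(22:T-1)) sqrt(rvW(22:T-1)) sqrt(rvM(22:T-1)) abs(on(22:T-1))];
bN = regress(yN, XN);
```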

HAR-RV Model: MedV
Use MedV as a dummy variable in the regression. Regression results with MedV: z-values at the 5% significance level (10-minute interval); t-distribution with 5 degrees of freedom.

Dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.2731        0.0185           14.76
RVt-5,t     0.1683        0.0286            5.88
RVt-22,t    0.1524        0.0233            6.54
BMedV       0.0030        0.0001           30
Constant    0.0003        0.0000            -
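A sketch of how the MedV dummy might be built and included, assuming a vector zstat of daily MedRV-based jump-test statistics has already been computed; both the vector and the cutoff construction are assumptions for illustration:

```matlab
% MedV jump-day dummy (illustrative; 'zstat' is assumed to hold daily jump-test
% statistics; days exceeding the 5% one-sided critical value are flagged).
crit  = norminv(0.95);                  % 5% critical value for a standard normal
dMedV = double(zstat(22:T-1) > crit);   % dummy = 1 on flagged jump days
XMedV = [X dMedV];
bMedV = regress(y, XMedV);
```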

HAR-RV Model
Regression results with: squared overnight returns (RV & Overnight^2) and the MedV dummy variable (5% significance z-statistic, 10-minute interval).

Dependent variable: RVt+1
            Coefficient   Standard Error   T-Statistic
RVt         0.2508        0.0182           13.78
RVt-5,t     0.1533        0.0280            5.475
RVt-22,t    0.1469        0.0228            6.44
BON         0.0108        0.0009           12
BMedV       0.0030        0.0001           30
Constant    0.0002        -                 -

Further Research
Significance levels; STATA integration; more stocks; new regressors; integrate earnings surprises.