Slide 1: Use of Gaussian Integration (Quadrature) in Stata
Alan H. Feiveson, NASA Johnson Space Center
Outline:
- What is Gaussian quadrature?
- Gauss-Legendre case
- A simple example
- A more-complicated example
- Speed issues
Slide 2: What is Gaussian quadrature?
Wish to evaluate ∫_a^b H(x) dx.
1. Represent H(x) as the product H(x) = W(x) f(x).
2. Use the approximation ∫_a^b W(x) f(x) dx ≈ Σ_{j=1}^M w_j f(x_j), where the x_j and w_j depend on W, a, and b.
Slide 3: What is Gaussian quadrature? (cont.)
Choose W(x) such that f(x) is "smooth".
Slide 4: What is Gaussian quadrature? (cont.)
Choose W(x) such that f(x) is "smooth".
f(x) is approximated by a linear combination of orthogonal polynomials {P_j(x)}: ∫_a^b W(x) P_i(x) P_j(x) dx = 0 for i ≠ j.
Slide 5: What is Gaussian quadrature? (cont.)
Choose W(x) such that f(x) is "smooth".
f(x) is approximated by a linear combination of orthogonal polynomials {P_j(x)}: ∫_a^b W(x) P_i(x) P_j(x) dx = 0 for i ≠ j.
The x_j are the zeros of P_M(x).
Slide 6: What is Gaussian quadrature? (cont.)
Choose W(x) such that f(x) is "smooth".
f(x) is approximated by a linear combination of orthogonal polynomials {P_j(x)}: ∫_a^b W(x) P_i(x) P_j(x) dx = 0 for i ≠ j.
The x_j are the zeros of P_M(x).
w_j is a weight that depends on W and the x_j.
Slide 7: Some types of Gaussian quadrature
W(x) = 1, (a, b) = any finite interval: Gauss-Legendre quadrature
W(x) = x^α e^(-x), (a, b) = (0, ∞): Gauss-Laguerre quadrature
W(x) = e^(-x²), (a, b) = (-∞, ∞): Gauss-Hermite quadrature
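As an aside (my illustration, not from the slides), the Gauss-Hermite case can be checked numerically with a hard-coded 3-point rule: the abscissas 0 and ±sqrt(3/2) and weights 2*sqrt(pi)/3 and sqrt(pi)/6 are the standard published values. The sketch below approximates E[Z^2] = 1 for Z ~ N(0,1), using the substitution z = sqrt(2)*x that produces the e^(-x²) weight; it assumes nothing beyond base Stata.

* 3-point Gauss-Hermite check (illustration only, not the author's code):
* E[g(Z)] = (1/sqrt(pi)) * integral exp(-x^2) g(sqrt(2)*x) dx
*         ~ (1/sqrt(pi)) * sum_j w_j g(sqrt(2)*x_j)
clear
set obs 3
gen double x = 0
gen double w = 2*sqrt(_pi)/3
replace x =  sqrt(1.5)   in 2
replace x = -sqrt(1.5)   in 3
replace w =  sqrt(_pi)/6 in 2/3
gen double term = w*(sqrt(2)*x)^2/sqrt(_pi)   // g(z) = z^2
quietly summarize term
display "3-point Gauss-Hermite estimate of E[Z^2] (exact value 1): " r(sum)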
Slide 8: Example 1: -xtlogit-
The likelihood at each positive outcome is an integral of the form ∫_{-∞}^{∞} H(x) dx.
Estimate by maximum likelihood.
Slide 9: Example 1: -xtlogit- (cont.)
The likelihood at each positive outcome is ∫_{-∞}^{∞} H(x) dx, where the integrand factors as H(x) = W(x) f(x) with f(x) "smooth".
Slide 10: Example 1: -xtlogit- (cont.)
Maximum likelihood: the likelihood at each obs. is of the form ∫ H(x) dx, and the integrand is H(x) = W(x) f(x), where f(x) is "smooth".
W(x) = 1, (a, b) = any finite interval: Gauss-Legendre quadrature
W(x) = x^α e^(-x), (a, b) = (0, ∞): Gauss-Laguerre quadrature
W(x) = e^(-x²), (a, b) = (-∞, ∞): Gauss-Hermite quadrature
Slide 11: Example 1: -xtlogit- (cont.)
Maximum likelihood: the likelihood at each obs. is of the form ∫_{-∞}^{∞} H(x) dx with H(x) = W(x) f(x), f(x) "smooth".
W(x) = 1, (a, b) = any finite interval: Gauss-Legendre quadrature
W(x) = x^α e^(-x), (a, b) = (0, ∞): Gauss-Laguerre quadrature
W(x) = e^(-x²), (a, b) = (-∞, ∞): Gauss-Hermite quadrature
-xtlogit- uses Gauss-Hermite quadrature; see p. 139 (-xtlogit-) and pp. 9-14 (-quadchk-).
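For readers who want to try this (my illustration, not from the deck): the number of Gauss-Hermite points used by -xtlogit- can be set directly, and -quadchk- refits the model at different numbers of points to check whether the quadrature approximation is adequate. The option name below, intpoints(), is from later Stata releases; in the Stata 8 era of this talk the corresponding option was quad(). The variables y, x1, x2, and id are placeholders.

* Hedged sketch: random-effects logit with 12 quadrature points,
* followed by a sensitivity check of the quadrature approximation.
xtset id
xtlogit y x1 x2, re intpoints(12)
quadchk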
Slide 12: Example 2
Wish to evaluate ∫_0^∞ H(x) dx.
Let W(x) = x^α e^(-x). Then H(x) = W(x) f(x), where f(x) = H(x)/W(x) is "smooth" (Gauss-Laguerre quadrature).
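A quick numerical check of the Gauss-Laguerre idea (my addition, not from the slides), using the standard 2-point rule for W(x) = e^(-x): abscissas 2 - sqrt(2) and 2 + sqrt(2), weights (2 + sqrt(2))/4 and (2 - sqrt(2))/4. With f(x) = x it reproduces ∫_0^∞ x e^(-x) dx = 1 exactly, since a 2-point rule is exact for polynomials up to degree 3.

* 2-point Gauss-Laguerre check (illustration only):
* integral_0^inf exp(-x) f(x) dx  ~  sum_j w_j f(x_j), here with f(x) = x.
clear
set obs 2
gen double x = cond(_n==1, 2 - sqrt(2), 2 + sqrt(2))          // abscissas
gen double w = cond(_n==1, (2 + sqrt(2))/4, (2 - sqrt(2))/4)  // weights
gen double term = w*x
quietly summarize term
display "2-point Gauss-Laguerre estimate (exact value 1): " r(sum)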
Slide 13: Use of Gaussian Integration (Quadrature) in Stata
Outline: What is Gaussian quadrature?; Gauss-Legendre case; A simple example; A more-complicated example; Speed issues.
Slide 14: Gauss-Legendre case (W(x) = 1)
Standard form: ∫_{-1}^{1} f(x) dx ≈ Σ_{j=1}^M w_j f(x_j)
General form: ∫_a^b f(x) dx ≈ ((b-a)/2) Σ_{j=1}^M w_j f( ((b-a)/2) x_j + (a+b)/2 )
Slide 15: Gauss-Legendre quadrature abscissas and weights
Slide 16: [Figure: Gauss-Legendre quadrature abscissas and weights, 5th order]
Slide 17: Use of Gaussian Integration (Quadrature) in Stata
Outline: What is Gaussian quadrature?; Gauss-Legendre case; A simple example; A more-complicated example; Speed issues.
Slide 18: A simple example
A = ∫_0^3 φ(x) dx, where φ is the standard normal density
  = norm(3) - norm(0) in Stata
  = 0.4986500740
Slide 19: A simple example (cont.)
Approximation = 0.4986490676
A = norm(3) - norm(0) in Stata = 0.4986500740
Slide 20: A simple example (cont.)
. range x 0 3 `K'
. gen fx = normd(x)
. integ fx x
A = 0.4986489713 (K = 26)
Approximation = 0.4986490676
A = norm(3) - norm(0) in Stata = 0.4986500740
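For comparison (my sketch, not from the deck), the same integral can be computed with the hard-coded 5-point Gauss-Legendre abscissas and weights, mapped from (-1, 1) onto (0, 3) with the (b-a)/2 factor of the general form on slide 14. The current function names normalden() and normal() are used in place of the older normd() and norm().

* Hedged sketch: 5-point Gauss-Legendre evaluation of A = integral_0^3 phi(x) dx.
clear
set obs 5
gen double t = 0
gen double w = 0.5688888889
replace t =  0.5384693101 in 2
replace t = -0.5384693101 in 3
replace w =  0.4786286705 in 2/3
replace t =  0.9061798459 in 4
replace t = -0.9061798459 in 5
replace w =  0.2369268851 in 4/5
local a = 0
local b = 3
gen double x = (`a'+`b')/2 + (`b'-`a')/2*t     // map (-1,1) onto (a,b)
gen double term = w*normalden(x)
quietly summarize term
display "5-point Gauss-Legendre: " (`b'-`a')/2*r(sum)
display "normal(3) - normal(0):  " normal(3) - normal(0)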
Slide 21: Use of Gaussian Integration (Quadrature) in Stata
Outline: What is Gaussian integration?; Gauss-Legendre case; A simple example; A more-complicated example; Speed issues.
Slide 22: [figure]
Slide 23: A more-complicated example (cont.): nonlinear regression
Model: y_i = F(x_i) + e_i, with e_i ~ N(0, σ²), where F(x) = ibeta(p, q, x), the cumulative (incomplete) beta distribution function with parameters p and q.
Slide 24: [figure]
Slide 25: Estimate p and q by nonlinear least squares using -nl-.
Slide 26: A more-complicated example (cont.)
Slide 27: Incorporation of M-th order Gaussian quadrature abscissas and weights into the data set

Contains data
  obs:            50
 vars:            14
 size:         4,800  (99.9% of memory free)
-------------------------------------------------------------------------------
              storage  display     value
variable name   type   format      label      variable label
-------------------------------------------------------------------------------
y               float  %9.0g
x               float  %9.0g
a               float  %9.0g                  lower limit
b               float  %9.0g                  upper limit
u1              double %10.0g                 abscissa #1
u2              double %10.0g                 abscissa #2
u3              double %10.0g                 abscissa #3
u4              double %10.0g                 abscissa #4
u5              double %10.0g                 abscissa #5
w1              double %10.0g                 weight #1
w2              double %10.0g                 weight #2
w3              double %10.0g                 weight #3
w4              double %10.0g                 weight #4
w5              double %10.0g                 weight #5
-------------------------------------------------------------------------------
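The u*/w* variables above are created by an ancillary program, -gaussrow-, which is called in the initialization code on slide 29 but not listed in the deck. As a hedged sketch of what that step could look like for the 5-point Gauss-Legendre case, with the abscissas mapped into each observation's (a, b) and the weights left on their standard (-1, 1) scale, as the iteration code on slide 30 expects:

* Hedged sketch only; the deck's own -gaussrow- program is not shown.
local t1 = 0
local t2 =  0.5384693101
local t3 = -0.5384693101
local t4 =  0.9061798459
local t5 = -0.9061798459
local v1 = 0.5688888889
local v2 = 0.4786286705
local v3 = 0.4786286705
local v4 = 0.2369268851
local v5 = 0.2369268851
forvalues j = 1/5 {
    gen double u`j' = (a + b)/2 + (b - a)/2*`t`j''   // abscissa #`j' on (a,b)
    gen double w`j' = `v`j''                         // weight #`j'
}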
Slide 28: [figure]
Slide 29:
program nlgtest
        version 8.2
        /* ---------- Initialization section ------------ */
        args y x np a b
        if "`1'" == "?" {
                global NP = `np'        /* no. of integration points */
                global S_1 "P Q"
                global P = 1            /* initialization (poor) */
                global Q = 1            /* same as above */
                /* initialize Gaussian integration variables */
                forv i = 1(1)$NP {
                        cap gen double lnf`i' = .    /* log f(p,q,u_i) */
                        cap gen double wf`i' = .     /* w_i*f(p,q,u_i) */
                        /* merge Gaussian abscissas (u_i) and weights (w_i) */
                        run gaussrow _u w $NP `a' `b' u
                }
                exit
        }
Slide 30:
        /* ---------- Iteration section ------------ */
        local np = $NP
        forv i = 1(1)`np' {              /* loop over quadrature points */
                qui replace lnf`i' = lngamma($P+$Q) - lngamma($P) - lngamma($Q) /*
                        */ + ($P-1)*log(u`i') + ($Q-1)*log(1-u`i')
                qui replace wf`i' = w`i'*exp(lnf`i')
        }
        cap drop Eyx
        egen double Eyx = rsum(wf*)
        replace Eyx = Eyx*(`b'-`a')/2
        replace `1' = Eyx
end
Slide 31: nl-estimation (5 points)

. nl gtest y x five a b
(obs = 46)

Iteration 0:  residual SS =  1.143306
Iteration 1:  residual SS =  .2593769
Iteration 2:  residual SS =  .119276
Iteration 3:  residual SS =  .1104239
Iteration 4:  residual SS =  .1103775
Iteration 5:  residual SS =  .1103775

      Source |       SS       df       MS          Number of obs =        46
-------------+------------------------------       F(  2,    44) =   4385.65
       Model |  22.0034704     2  11.0017352       Prob > F      =    0.0000
    Residual |  .110377456    44  .002508579       R-squared     =    0.9950
-------------+------------------------------       Adj R-squared =    0.9948
       Total |  22.1138479    46  .480735823       Root MSE      =  .0500857
                                                   Res. dev.     = -146.9522
(gtest)
------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           P |   1.939152   .1521358    12.75   0.000     1.632542    2.245761
           Q |   2.922393   .2346094    12.46   0.000     2.449569    3.395218
------------------------------------------------------------------------------
(SEs, P values, CIs, and correlations are asymptotic approximations)
Slide 32:
. nl gtest y x five a b
(obs = 46)

      Source |       SS       df       MS          Number of obs =        46
-------------+------------------------------       F(  2,    44) =   4385.65
       Model |  22.0034704     2  11.0017352       Prob > F      =    0.0000
    Residual |  .110377456    44  .002508579       R-squared     =    0.9950
-------------+------------------------------       Adj R-squared =    0.9948
       Total |  22.1138479    46  .480735823       Root MSE      =  .0500857
                                                   Res. dev.     = -146.9522
(gtest)
------------------------------------------------------------------------------
           y |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
           P |   1.939152   .1521358    12.75   0.000     1.632542    2.245761
           Q |   2.922393   .2346094    12.46   0.000     2.449569    3.395218
------------------------------------------------------------------------------
(SEs, P values, CIs, and correlations are asymptotic approximations)
Slide 33: [figure]
Slide 34: Use of Gaussian Integration (Quadrature) in Stata
Outline: What is Gaussian integration?; Gauss-Legendre case; A simple example; A more-complicated example; Speed issues.
Slide 35: Speed issues

forv i = 1(1)`np' {
        qui replace lnf`i' = ... + ($Q-1)*log(1-u`i')
        qui replace wf`i' = w`i'*exp(lnf`i')
}
cap drop Eyx
egen double Eyx = rsum(wf*)
replace Eyx = Eyx*(`b'-`a')/2
Slide 36: Speed issues
1. Sum directly in the loop; don't use -egen-.

cap gen Eyx = .
replace Eyx = 0
forv i = 1(1)`np' {
        qui replace lnf`i' = ... + ($Q-1)*log(1-u`i')
        qui replace wf`i' = w`i'*exp(lnf`i')
        replace Eyx = Eyx + wf`i'
}
replace Eyx = Eyx*(`b'-`a')/2
Slide 37: Speed issues
1. Sum directly in the loop; don't use -egen-.
2. Use just one "lnf" and one "wf" variable.

cap gen Eyx = .
replace Eyx = 0
forv i = 1(1)`np' {
        qui replace lnf = ... + ($Q-1)*log(1-u`i')
        qui replace wf = w`i'*exp(lnf)
        replace Eyx = Eyx + wf
}
replace Eyx = Eyx*(`b'-`a')/2
Slide 38: Speed issues

forv i = 1(1)`np' {
        qui replace lnf`i' = ... + ($Q-1)*log(1-u`i')
        qui replace wf`i' = w`i'*exp(lnf`i')
}
cap drop Eyx
egen double Eyx = rsum(wf*)
replace Eyx = Eyx*(`b'-`a')/2
Slide 39: References

Weisstein, E. W. "Gaussian Quadrature." From MathWorld--A Wolfram Web Resource. http://mathworld.wolfram.com/GaussianQuadrature.html

Press, W. H., Flannery, B. P., Teukolsky, S. A., and Vetterling, W. T. "Gaussian Quadratures and Orthogonal Polynomials." §4.5 in Numerical Recipes in FORTRAN: The Art of Scientific Computing, 2nd ed. Cambridge, England: Cambridge University Press, pp. 140-155, 1992. http://www.library.cornell.edu/nr/bookfpdf/f4-5.pdf

Abramowitz, M. and Stegun, I. A. (Eds.). Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 9th printing. New York: Dover, pp. 887-888, 1972.
Slide 40: Over-dispersed binomial models
n_i ~ B(N_i, p_i),  p_i ~ g(p)  (0 < p_i < 1)
Example: logit p_i ~ N(μ, σ²)  (the -xtlogit- model)
Estimate μ and σ² by ML on observations of N and n.
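To connect this with the Gauss-Hermite setup described for -xtlogit- earlier, here is a sketch of the change of variables (my derivation, not from the slides; the notation p(v) = invlogit(μ + σv) is mine). Writing logit p = μ + σv with v ~ N(0, 1) and substituting v = √2 u turns the marginal likelihood of one observation into the e^(-u²) weight form:

L_i = \int_{-\infty}^{\infty} \binom{N_i}{n_i}\, p(v)^{n_i} \bigl(1-p(v)\bigr)^{N_i-n_i}\, \frac{e^{-v^2/2}}{\sqrt{2\pi}}\, dv,
      \qquad p(v) = \mathrm{invlogit}(\mu + \sigma v)
    = \frac{1}{\sqrt{\pi}} \int_{-\infty}^{\infty} e^{-u^2}\, \binom{N_i}{n_i}\, p(\sqrt{2}\,u)^{n_i} \bigl(1-p(\sqrt{2}\,u)\bigr)^{N_i-n_i}\, du
    \approx \frac{1}{\sqrt{\pi}} \sum_{j=1}^{M} w_j\, \binom{N_i}{n_i}\, p(\sqrt{2}\,x_j)^{n_i} \bigl(1-p(\sqrt{2}\,x_j)\bigr)^{N_i-n_i},

where the x_j and w_j are the M-point Gauss-Hermite abscissas and weights.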
Slide 41: Possible approaches to incorporating N-th order Gaussian quadrature abscissas and weights into the data set
2. "Long": expand M and add 2 new variables (w and u).