Engineering Analysis ENG 3420 Fall 2009 Dan C. Marinescu Office: HEC 439 B Office hours: Tu-Th 11:00-12:00.


Lecture 21

Last time:
- Relaxation
- Nonlinear systems
- Random variables, probability distributions, MATLAB support for random variables

Today:
- Histograms
- Linear regression
- Linear least-squares regression
- Nonlinear data models

Next time:
- Multiple linear regression
- General linear least squares

Statistics built-in functions

MATLAB built-in statistics functions for a column vector s:
- mean(s), median(s), mode(s): the mean, median, and mode of s (mode is part of the Statistics Toolbox).
- min(s), max(s): the minimum and maximum values in s.
- var(s), std(s): the variance and standard deviation of s.
If a matrix is given, the statistics are returned for each column.
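The slides use MATLAB; as an illustrative counterpart, the same summary statistics can be sketched with Python's standard statistics module (the sample vector s is made up):

```python
import statistics

s = [2.0, 3.0, 3.0, 5.0, 7.0]  # made-up sample vector

mean_s = statistics.mean(s)        # 4.0
median_s = statistics.median(s)    # 3.0
mode_s = statistics.mode(s)        # 3.0
min_s, max_s = min(s), max(s)      # 2.0, 7.0
var_s = statistics.variance(s)     # 4.0; sample variance (n-1 denominator), like MATLAB's var
std_s = statistics.stdev(s)        # 2.0
```

Like MATLAB's var and std, the statistics module uses the sample (n-1) denominator by default.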

Histograms

[n, x] = hist(s, x)
- Counts the number of elements of s falling in each bin.
- x is a vector containing the center values of the bins.
[n, x] = hist(s, m)
- Counts the elements of s using m equally spaced bins; x returns the bin centers.
- The default is m = 10.
hist(s, x), hist(s, m), or hist(s)
- With no output arguments, hist plots the histogram.
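A hedged NumPy counterpart of [n, x] = hist(s, m), with made-up data: note that np.histogram returns bin edges rather than MATLAB-style bin centers, so the centers are computed from the edges.

```python
import numpy as np

s = np.array([1.1, 1.9, 2.2, 2.8, 3.1, 3.3, 4.0, 4.7])  # made-up data

# Unlike MATLAB's hist, np.histogram returns counts and bin *edges*.
counts, edges = np.histogram(s, bins=4)
centers = 0.5 * (edges[:-1] + edges[1:])  # convert edges to bin centers
```

matplotlib's plt.hist(s, bins=4) plays the role of the no-output-argument form of hist.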

Histogram Example

Linear Least-Squares Regression

Linear least-squares regression is a method to determine the "best" coefficients of a linear model for a given data set. "Best" for least-squares regression means minimizing the sum of the squares of the estimate residuals. For a straight-line model y = a_0 + a_1 x, this means minimizing

S_r = sum_{i=1}^{n} (y_i - a_0 - a_1 x_i)^2

This method yields a unique line for a given set of data.

Least-Squares Fit of a Straight Line

Using the model y = a_0 + a_1 x, the slope and intercept producing the best fit can be found from:

a_1 = (n sum(x_i y_i) - sum(x_i) sum(y_i)) / (n sum(x_i^2) - (sum(x_i))^2)
a_0 = ybar - a_1 xbar

where xbar and ybar are the means of the x_i and y_i values.

Example: force F (N) versus velocity v (m/s) data, tabulated with columns i, x_i, y_i, x_i^2, and x_i y_i and their column sums (table values not recoverable from the transcript).
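The slide's tabulated numbers do not survive in the transcript, but the column sums feed directly into the slope and intercept formulas. A minimal Python sketch, with made-up data lying exactly on y = 1 + 2x:

```python
def linreg(x, y):
    """Least-squares slope a1 and intercept a0 of the line y = a0 + a1*x."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))  # sum of x_i * y_i
    sxx = sum(xi * xi for xi in x)              # sum of x_i^2
    a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a0 = sy / n - a1 * sx / n                   # ybar - a1 * xbar
    return a0, a1

x = [1.0, 2.0, 3.0, 4.0]
y = [3.0, 5.0, 7.0, 9.0]   # made-up data, exactly y = 1 + 2x
a0, a1 = linreg(x, y)      # a0 = 1.0, a1 = 2.0
```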

Nonlinear models

Linear regression assumes that the relationship between the dependent and independent variables is linear; this is not always the case. Three common nonlinear models are:

- the exponential model: y = alpha_1 e^(beta_1 x)
- the power model: y = alpha_2 x^(beta_2)
- the saturation-growth-rate model: y = alpha_3 x / (beta_3 + x)

Linearization of nonlinear models

Taking logarithms or reciprocals turns each of these models into a straight-line form that ordinary linear regression can fit:

- exponential: ln y = ln alpha_1 + beta_1 x
- power: log y = log alpha_2 + beta_2 log x
- saturation-growth-rate: 1/y = 1/alpha_3 + (beta_3/alpha_3)(1/x)
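A sketch of the exponential case, assuming made-up data that lies exactly on y = 2 e^(0.5 x): a straight-line least-squares fit of (x, ln y) recovers beta as the slope and ln(alpha) as the intercept.

```python
import math

x = [0.0, 1.0, 2.0, 3.0]
y = [2.0 * math.exp(0.5 * xi) for xi in x]   # made-up data on y = 2*e^(0.5x)

# straight-line least squares on (x, ln y)
ly = [math.log(yi) for yi in y]
n = len(x)
sx, sy = sum(x), sum(ly)
sxy = sum(a * b for a, b in zip(x, ly))
sxx = sum(a * a for a in x)
beta = (n * sxy - sx * sy) / (n * sxx - sx * sx)  # slope  -> beta  ~ 0.5
alpha = math.exp(sy / n - beta * sx / n)          # e^intercept -> alpha ~ 2.0
```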

Transformation Examples

Linear Regression Program

Polynomial least-squares fit

MATLAB has a built-in function polyfit that fits a least-squares n-th order polynomial to data:

p = polyfit(x, y, n)
- x: independent data
- y: dependent data
- n: order of the polynomial to fit
- p: coefficients of the polynomial f(x) = p_1 x^n + p_2 x^(n-1) + ... + p_n x + p_(n+1)

MATLAB's polyval command evaluates the polynomial from its coefficients:

y = polyval(p, x)
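NumPy provides the same pair of functions under the same names; a sketch with made-up data lying exactly on f(x) = 3x^2 - x + 1:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = 3.0 * x**2 - x + 1.0      # made-up data on f(x) = 3x^2 - x + 1

p = np.polyfit(x, y, 2)       # coefficients, highest power first: ~[3.0, -1.0, 1.0]
y5 = np.polyval(p, 5.0)       # evaluate the fitted polynomial at x = 5: ~71.0
```

As in MATLAB, the coefficient vector is ordered from the highest power down to the constant term.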

Polynomial Regression

The least-squares procedure can be extended to fit data with a higher-order polynomial. The idea is again to minimize the sum of the squares of the estimate residuals. The figure shows the same data fit with: (a) a first-order polynomial and (b) a second-order polynomial.

Process and Measures of Fit

For a second-order polynomial y = a_0 + a_1 x + a_2 x^2, the best fit minimizes

S_r = sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - a_2 x_i^2)^2

In general, fitting an m-th order polynomial means minimizing

S_r = sum_{i=1}^{n} (y_i - a_0 - a_1 x_i - ... - a_m x_i^m)^2

The standard error for fitting an m-th order polynomial to n data points is

s_{y/x} = sqrt(S_r / (n - (m+1)))

because the m-th order polynomial has (m+1) coefficients. The coefficient of determination r^2 is still found using

r^2 = (S_t - S_r) / S_t, where S_t = sum (y_i - ybar)^2
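These measures of fit can be sketched for the straight-line case (m = 1) with made-up, roughly linear data; the variable names a0, a1, Sr, St, syx, and r2 mirror the symbols above:

```python
import math

x = [1.0, 2.0, 3.0, 4.0, 5.0]
y = [2.1, 3.9, 6.2, 8.0, 9.8]   # made-up, roughly linear data

# straight-line least-squares fit (m = 1)
n = len(x)
sx, sy = sum(x), sum(y)
sxy = sum(a * b for a, b in zip(x, y))
sxx = sum(a * a for a in x)
a1 = (n * sxy - sx * sy) / (n * sxx - sx * sx)   # slope, 1.95
a0 = sy / n - a1 * sx / n                        # intercept, 0.15

ybar = sy / n
Sr = sum((yi - a0 - a1 * xi) ** 2 for xi, yi in zip(x, y))  # residual sum of squares
St = sum((yi - ybar) ** 2 for yi in y)                      # total sum of squares

m = 1
syx = math.sqrt(Sr / (n - (m + 1)))   # standard error of the estimate
r2 = (St - Sr) / St                   # coefficient of determination, ~0.998
```

An r2 close to 1 means the fitted line explains nearly all of the spread of the data about its mean.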