Multimodal and Sensorial Interfaces for Mobile Robots, course task. Nicola Piotto, a.y. 2007/2008.


Specifics about the task Robertino has been positioned at different distances from an obstacle (i.e. 0.125, 0.25, 0.5, 1, 2, 3 meters). At each distance, several measurements from the frontal IR sensor have been collected. The final goal is to define a function mapping the noisy sensor data to the real object distance.

Initial data observations As the object distance increases, the measurement noise increases as well. (Table: mean and variance of the readings at each distance 0.125, 0.25, 0.5, 1, 2, 3 m.)
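A minimal sketch of this noise-characterisation step. All reading values below are hypothetical placeholders, not the measurements from the slide's table:

```python
# Noise characterisation sketch: summarise the repeated IR readings at each
# calibration distance by their mean and variance. The reading values here
# are invented for illustration; Robertino's real data are not reproduced.

def summarize(readings):
    """Return (mean, variance) of a list of sensor readings."""
    n = len(readings)
    mean = sum(readings) / n
    var = sum((r - mean) ** 2 for r in readings) / n
    return mean, var

# hypothetical repeated readings at two of the calibration distances [m]
samples = {
    0.125: [84.1, 84.3, 83.9, 84.2],        # near: tight spread
    3.0:   [410.0, 452.0, 389.0, 431.0],    # far: much larger spread
}

for dist in sorted(samples):
    mean, var = summarize(samples[dist])
    print(f"distance {dist} m: mean = {mean:.2f}, variance = {var:.4f}")
```

Running this on real data would reproduce the pattern observed on the slide: the variance grows with the object distance.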

Employed solution The problem can be addressed with a linear regression over the acquired sensor data. In this way it is possible to analytically define a linear function through least-squares error minimization (data fitting). The derived function maps sensor readings to object distances.

Considerations An attempt was also made to fit a higher-degree interpolating function (e.g. polynomial, quadratic) using a Support Vector Regression (SVR) procedure; however, due to the noise in the observed data, it was not possible to complete that task successfully (the final result was unreliable).
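The SVR procedure itself is not reproduced here. Purely as an illustration of the kind of higher-degree model that was attempted, the sketch below fits a quadratic y = a0 + a1*x + a2*x^2 by ordinary least squares via the normal equations, on invented data:

```python
# Illustrative quadratic fit (ordinary least squares, not SVR): solve the
# 3x3 normal equations (X^T X) a = X^T y for y = a0 + a1*x + a2*x^2.
# All data passed to these functions are hypothetical.

def solve3(A, v):
    """Solve a 3x3 linear system A a = v by Gaussian elimination."""
    M = [row[:] + [rhs] for row, rhs in zip(A, v)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))  # partial pivoting
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    a = [0.0] * 3
    for i in range(2, -1, -1):  # back-substitution
        a[i] = (M[i][3] - sum(M[i][c] * a[c] for c in range(i + 1, 3))) / M[i][i]
    return a

def fit_quadratic(xs, ys):
    """Least-squares coefficients (a0, a1, a2) of y = a0 + a1*x + a2*x^2."""
    rows = [[1.0, x, x * x] for x in xs]          # design matrix X
    A = [[sum(r[i] * r[j] for r in rows) for j in range(3)] for i in range(3)]
    v = [sum(r[i] * y for r, y in zip(rows, ys)) for i in range(3)]
    return solve3(A, v)

print(fit_quadratic([0.0, 1.0, 2.0, 3.0], [1.0, 6.0, 17.0, 34.0]))
```

With heavily noisy data such a higher-degree fit tends to chase the noise, which is consistent with the unreliable result reported on the slide.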

Considerations (2)

Some specifications The linear regression has been implemented in the Matlab environment, using the built-in regress function. Note that regress expects the response vector first and a design matrix that includes a column of ones for the intercept: coeff = regress(y, [ones(size(x)) x]), so that c = coeff(1), b = coeff(2), and Y = c + b*X. Y is the estimated object distance, X is a sensor measurement, x is the vector of training sensor readings, and y is the vector of corresponding true distances.

Results The matrix z stores all the training data in two columns: the sensor readings (1) and the distances they refer to (2). Bracketed results refer to the different data sets considered: b = 0.013 (0.014) (0.016); MSE = 0.0849 (0.033) (8.9269*10^-5).

Results (2) Data sets considered [m]: {0.125, 0.25}; {0.125, 0.25, 0.5}; {0.125, 0.25, 0.5, 1, 2, 3}.

Considerations Including also the noisy data from the larger object distances leads to a calibration function that is not particularly precise (high MSE). Considering only the less noisy data from the smallest object distances (up to m) instead leads to a more reliable calibration function.
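The effect described here can be sketched as follows. The readings are invented, but the pattern mirrors the slide's finding: adding far, noisy points inflates the MSE of the fitted calibration line:

```python
# Compare the MSE of a line fitted on low-noise near readings only versus
# a fit that also includes noisy far readings. All values are hypothetical.

def fit_line(xs, ys):
    """Least-squares intercept c and slope b for y = c + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def mse(xs, ys, c, b):
    """Mean squared error of the line y = c + b*x on the given data."""
    return sum((y - (c + b * x)) ** 2 for x, y in zip(xs, ys)) / len(xs)

# hypothetical readings: near points lie on a line, far points scatter badly
near_x, near_y = [10.0, 20.0, 40.0], [0.125, 0.25, 0.5]
far_x, far_y = [70.0, 95.0, 260.0], [1.0, 2.0, 3.0]

c1, b1 = fit_line(near_x, near_y)
c2, b2 = fit_line(near_x + far_x, near_y + far_y)
print("near-only MSE:", mse(near_x, near_y, c1, b1))
print("all-data MSE:", mse(near_x + far_x, near_y + far_y, c2, b2))
```

On the invented data the near-only fit is essentially exact, while including the far points leaves a clearly larger residual error, as with the real measurements on the slide.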

Considerations (2) Given the impossibility of processing the information from the noisier distances, it is suggested to employ the IR sensor to estimate object distances only up to 1 meter. For larger distances sufficient precision is not achieved, so it may be better to employ a different kind of sensor.

Considerations (3)