A meeting to celebrate Murray Aitkin’s 70th Birthday.

Presentation transcript:


Changes to programme  Mikis Stasinopoulos replaces Goeran Kauermann (flight problems because of volcanic ash).  Brian Francis probably replaces Peter van der Heijden (laryngitis).

 Development of statistical modelling over the last 30 years has been rapid and extensive.  This meeting celebrates Murray Aitkin’s work over this period and what it has led to.  His large body of work is heavily cited.  It covers a wide spectrum of topics: simultaneous test procedures, applications of the EM algorithm, survival models, likelihood inference, latent class models, random effects models, item response theory, mixture models, discrete random effects, Bayes factors, and new approaches to Bayesian inference.

 Bock and Aitkin (1981) Marginal maximum likelihood estimation of item parameters: application of an EM algorithm. 967 citations.  Aitkin, Hinde and Anderson (1981) Statistical modelling of data on teaching styles. 276 citations.  Anderson and Aitkin (1985) Variance component models with a binary response. 240 citations.  Aitkin and Clayton (1980) Fitting of exponential, Weibull and extreme value distributions. 211 citations.  Aitkin and Rubin (1985) Estimation and hypothesis testing in finite mixture models. 205 citations.

 Developed through reanalysis of data on teaching styles (formal, informal, mixed).  An exciting new area in the 1980s, with distinct research teams developing the work and ideas.  Now a vast body of work by numerous researchers.  Speakers: Hinde, Goldstein, Alfo.

 Contributions in Monte Carlo testing, EM fitting of models in the GLM framework and, more recently, new inferential procedures for determining the number of groups.  Speaker: Longford.
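As an illustration of the kind of EM fitting mentioned above, here is a minimal sketch of EM for a two-component normal mixture. This is not Aitkin’s original GLIM code; the function name, initialisation, and fixed iteration count are my own illustrative choices.

```python
import numpy as np

def em_two_component_normal(x, n_iter=200):
    """Fit a two-component normal mixture by EM (illustrative sketch).

    Returns (weights, means, sds) after n_iter EM steps.
    """
    # Crude initialisation: split the sorted data in half.
    xs = np.sort(x)
    mu = np.array([xs[: len(x) // 2].mean(), xs[len(x) // 2 :].mean()])
    sd = np.array([x.std(), x.std()])
    w = np.array([0.5, 0.5])
    for _ in range(n_iter):
        # E-step: posterior responsibility of each component for each point.
        dens = (w / (sd * np.sqrt(2 * np.pi))
                * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2))
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: weighted mixing proportions, means and sds.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sd = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)
    return w, mu, sd
```

The number of components is fixed at two here; choosing it from the data is exactly the "number of groups" question the slide refers to.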

 Investigation of a wide variety of new models for data in survival analysis, contingency table analysis and item response theory.  Work comparing correspondence analysis and log-linear models.  Involvement in the Statistical Modelling workshops.  Speakers: Kauermann, de Falguerolles, Firth, Stasinopoulos.

 Likelihood-based: the book on Statistical Modelling in GLIM.  New approaches to Bayesian analysis: a new book out in July.  Speakers: Liu, Aitkin.
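GLIM-style likelihood fitting rests on iteratively reweighted least squares (IRLS). A minimal sketch for a Poisson log-linear model follows; the function name and iteration count are illustrative assumptions, not taken from GLIM itself.

```python
import numpy as np

def irls_poisson(X, y, n_iter=25):
    """Fit a Poisson log-linear model by IRLS (illustrative sketch).

    X is an n x p design matrix (include a column of ones for an
    intercept); y holds the observed counts.
    """
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        eta = X @ beta          # linear predictor
        mu = np.exp(eta)        # inverse of the log link
        # Working response and weights for the Poisson/log-link case.
        z = eta + (y - mu) / mu
        W = mu
        beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
    return beta
```

Each iteration is a weighted least-squares fit to the working response z, which is the same scheme GLIM used for the whole exponential family.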