Computing Interval Estimates for Components of Statistical Information with Respect to Judgements on Probability Density Functions
Victor G. Krymsky, Ufa State Aviation Technical University, Russia


Computing Interval Estimates for Components of Statistical Information with Respect to Judgements on Probability Density Functions
Victor G. Krymsky, Ufa State Aviation Technical University, Russia
Kgs. Lyngby, Denmark, 2004

Imprecise Prevision Theory (IPT)
Starting points: the fundamental publications [1], [2].
[1] Walley P. Statistical Reasoning with Imprecise Probabilities. Chapman and Hall, New York, 1991.
[2] Kuznetsov V. Interval Statistical Models. Radio and Sviaz, Moscow, 1991 (in Russian).

Traditional Problem Formulation in the Framework of IPT
Constraints: the unknown probability density ρ(x), x ≥ 0, must satisfy
∫_0^∞ ρ(x)dx = 1, ρ(x) ≥ 0, a_i ≤ ∫_0^∞ f_i(x)ρ(x)dx ≤ b_i, i = 1, 2, …, n. (1)
It is necessary to find
J_min = inf ∫_0^∞ g(x)ρ(x)dx as well as J_max = sup ∫_0^∞ g(x)ρ(x)dx (2)
subject to constraints (1).

Dual for the Initial Problem Statement
Let us find:
J_min = sup { c_0 + Σ_{i=1}^n (c_i a_i − d_i b_i) } (3)
subject to c_i ≥ 0, d_i ≥ 0 and, for any x ≥ 0, i = 1, 2, …, n:
c_0 + Σ_{i=1}^n (c_i − d_i) f_i(x) ≤ g(x); (4)
and
J_max = inf { c_0 + Σ_{i=1}^n (c_i b_i − d_i a_i) } (5)
subject to c_i ≥ 0, d_i ≥ 0 and, for any x ≥ 0, i = 1, 2, …, n:
c_0 + Σ_{i=1}^n (c_i − d_i) f_i(x) ≥ g(x). (6)

Important Conclusion Concerning Optimal Solutions (L. Utkin and I. Kozine [3])
[3] Utkin L. and Kozine I. Different faces of the natural extension. In: Proceedings of the Second International Symposium on Imprecise Probabilities and Their Applications (ISIPTA '01), 2001.
Optimal solutions belong to a family of DEGENERATE distributions: such probability densities are composed of δ-functions.

Distribution of Probabilistic Masses
Masses are concentrated in fixed points.
[Figure: a density consisting of δ-spikes at isolated points of the x axis, each of width Δx → 0 and unbounded height.]
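This conclusion can be checked numerically. In the sketch below (an invented toy setup, not from the slides) we maximize E[X²] over distributions on [0, 1] subject to the single moment judgement E[X] = 0.3; a vertex of the corresponding linear program has at most two atoms (normalization plus one moment constraint), so scanning pairs of atoms locates the optimum.

```python
def best_two_atom(mean, grid):
    """Among two-atom distributions p*delta(a) + (1-p)*delta(b) on the grid
    with the prescribed mean, find the one maximizing E[X^2]."""
    best_val, best_atoms = -1.0, None
    for a in grid:
        for b in grid:
            if a >= b or not (a <= mean <= b):
                continue
            p = (b - mean) / (b - a)          # weight placed on atom a
            val = p * a ** 2 + (1 - p) * b ** 2
            if val > best_val:
                best_val, best_atoms = val, (a, b)
    return best_val, best_atoms

grid = [i / 100 for i in range(101)]          # discretization of [0, 1]
val, atoms = best_two_atom(0.3, grid)
```

The search returns atoms (0.0, 1.0): the whole probability mass sits in two δ-spikes pushed to the endpoints of the support, exactly the degenerate shape described above.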

Use of Additional Judgements
An additional judgement can be reflected by the inequality
ρ(x) ≤ K = const, (7)
where K is such that K·T ≥ 1 on the support [0, T].
[Figure: a density ρ(x) on [0, T] bounded above by the level K.]

Main Goal (Theorem)
If there is no finite interval on which the function g(x) can be represented in the form
g(x) = c_0 + Σ_{i=1}^n c_i f_i(x), (8)
where c_0, c_1, …, c_n are constants, then the function ρ(x) providing the solution of the optimization problem mentioned above belongs to the class of step functions with minimum value equal to 0 and maximum value equal to K.

Some Comments
To provide (8), the system of equations determining the coefficients c_0, c_1, …, c_n must have at least one solution which is independent of x in some interval.

Applying the Methodology of the Calculus of Variations
The inequalities
0 ≤ ρ(x) ≤ K (9)
should be excluded from direct consideration in order to allow operating in the open domain with the values of the function:
the requirement ρ(x) ≥ 0 can be replaced by denoting ρ(x) = w²(x); (10)
the requirement ρ(x) ≤ K can be reflected by the equality ρ(x) + v²(x) = K, (11)
where w(x) and v(x) are newly introduced functions.

Modified Formulation of the Problem
We would like to estimate
J_min = inf ∫_0^T g(x)ρ(x)dx and J_max = sup ∫_0^T g(x)ρ(x)dx (12)
subject to
∫_0^T ρ(x)dx = 1, (13)
a_i ≤ ∫_0^T f_i(x)ρ(x)dx ≤ b_i, i = 1, 2, …, n, (14)
ρ(x) = w²(x), (15)
ρ(x) + v²(x) = K. (16)

Lagrange Approach
Introduce multipliers λ_0, λ_1, …, λ_n for the constraints (13)–(14) and a function μ(x) for (16), and form the Lagrangian L in the unknown functions w(x), v(x). The equations of Euler–Lagrange are
∂L/∂w = 0, ∂L/∂v = 0. (17)

The Necessary Conditions of Optimality
The equations look here as follows:
2w(x)[g(x) + λ_0 + Σ_{i=1}^n λ_i f_i(x) + μ(x)] = 0, 2v(x)μ(x) = 0. (18)
Let us fix any interval inside the support.
Case 1: μ(x) = 0 inside the interval. Then either w(x) = 0, so ρ(x) = 0; or g(x) + λ_0 + Σ_{i=1}^n λ_i f_i(x) = 0 on the interval, which is excluded by the assumption of the theorem.
Case 2: μ(x) ≠ 0, so v(x) = 0 and therefore ρ(x) = K.

Practical Implementation
Optimal probability density: a step function equal to K on the intervals [x_1, x_2], [x_3, x_4], …, [x_{2m−1}, x_{2m}] and equal to 0 elsewhere; the switch points 0 ≤ x_1 ≤ x_2 ≤ … ≤ x_{2m} ≤ T become the new unknowns.
[Figure: an alternating 0/K step density with switch points x_1, x_2, …, x_{2m} on the x axis.]
Denote:
G(x) = ∫_0^x g(t)dt, (19)
F_i(x) = ∫_0^x f_i(t)dt. (20)

Reformulation of the Problem Statement
We would like to estimate
J = K Σ_{j=1}^m [G(x_{2j}) − G(x_{2j−1})] → inf (sup) (21)
subject to
K Σ_{j=1}^m (x_{2j} − x_{2j−1}) = 1, (22)
a_i ≤ K Σ_{j=1}^m [F_i(x_{2j}) − F_i(x_{2j−1})] ≤ b_i, i = 1, 2, …, n, (23)
0 ≤ x_1 ≤ x_2 ≤ … ≤ x_{2m} ≤ T. (24)
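The reformulated problem is finite-dimensional: once the antiderivatives G and F_i are known, the objective and the mass constraint are plain functions of the switch points. A minimal sketch (the function names are assumptions for illustration, not from the slides):

```python
def step_objective(K, G, xs):
    """J = K * sum_j [G(x_{2j}) - G(x_{2j-1})] for a density equal to K on
    [x1, x2], [x3, x4], ... and 0 elsewhere; xs lists x1 <= x2 <= ... <= x_{2m}."""
    assert len(xs) % 2 == 0 and all(p <= q for p, q in zip(xs, xs[1:]))
    return K * sum(G(xs[j + 1]) - G(xs[j]) for j in range(0, len(xs), 2))

def step_mass(K, xs):
    """Total probability mass K * sum_j (x_{2j} - x_{2j-1}); must equal 1."""
    return K * sum(xs[j + 1] - xs[j] for j in range(0, len(xs), 2))

def G(x):
    return x * x / 2          # antiderivative of g(x) = x (the expectation case)

print(step_mass(2.0, [0.0, 0.5]))          # 1.0, i.e. constraint (22) holds
print(step_objective(2.0, G, [0.0, 0.5]))  # 0.25
```

With K = 2 and the single K-interval [0, 0.5] this reproduces the value 1/(2K) = 0.25 that appears as the lower bound in Example 1 below.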

Example 1
The information concerning a continuous random variable X is only the bound 0 ≤ ρ(x) ≤ K on the support [0, T], where K, T are fixed positive numbers. What are the bounds for the expectation M(X)?
* * *
Let us choose m = 0. Objective function: g(x) = x.

Solution of Optimization Problem
Lower and upper bounds of the J interval:
J_min = 1/(2K), attained by ρ(x) = K on [0, 1/K] and ρ(x) = 0 elsewhere;
J_max = T − 1/(2K), attained by ρ(x) = K on [T − 1/K, T] and ρ(x) = 0 elsewhere.
[Figures: the two extremal step densities of height K on [0, T].]
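These bounds admit a simple numerical cross-check (a sketch, not from the slides): for g(x) = x the extremal bounded density "fills" mass at level K from the left end of [0, T] (minimum of the expectation) or from the right end (maximum), which a discretized greedy fill reproduces.

```python
def expectation_bounds(K, T, n=200000):
    """Approximate bounds on E[X] over densities with 0 <= rho(x) <= K on [0, T],
    obtained by greedily placing mass at level K from the left (minimum) and
    from the right (maximum) until the total mass reaches 1."""
    dx = T / n
    mass = lo = hi = 0.0
    for i in range(n):
        m = min(K * dx, 1.0 - mass)      # mass this grid cell can still take
        lo += m * ((i + 0.5) * dx)       # cell midpoint near the left end
        hi += m * (T - (i + 0.5) * dx)   # mirrored cell near the right end
        mass += m
        if mass >= 1.0:
            break
    return lo, hi

lo, hi = expectation_bounds(K=2.0, T=1.0)
# lo is close to 1/(2K) = 0.25 and hi to T - 1/(2K) = 0.75, as on the slide.
```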

Example 2
We add the constraint
a_1 ≤ ∫_0^T f_1(x)ρ(x)dx ≤ b_1, f_1(x) = I(x),
where I(x) is the indicator function. Here also no finite interval of x values on which g(x) = c_0 + c_1 I(x) can be found, so the theorem can be applied. Further analysis shows that m = 1 is the best choice for such a situation.
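The slide's formulas for this example did not survive transcription, so the following is only a hypothetical reconstruction: with m = 1 the optimal density is K on a single interval [x_1, x_1 + 1/K], and a one-dimensional search over x_1 handles an added judgement on P(t_1 ≤ X ≤ t_2). All numerical values (K, T, t_1, t_2, the probability bound) are invented for illustration.

```python
def mean_bounds_with_prob_constraint(K, T, t1, t2, p_min, steps=20000):
    """Bounds on E[X] over step densities equal to K on one interval
    [x1, x1 + 1/K] inside [0, T], subject to P(t1 <= X <= t2) >= p_min.
    Brute-force grid search over the single free switch point x1."""
    width = 1.0 / K                          # the K-interval carries mass K*width = 1
    lo, hi = float("inf"), float("-inf")
    for s in range(steps + 1):
        x1 = (T - width) * s / steps
        x2 = x1 + width
        overlap = max(0.0, min(x2, t2) - max(x1, t1))
        if K * overlap < p_min:              # probability judgement violated
            continue
        mean = (x1 + x2) / 2                 # mean of a uniform block of height K
        lo, hi = min(lo, mean), max(hi, mean)
    return lo, hi

lo, hi = mean_bounds_with_prob_constraint(K=2.0, T=1.0, t1=0.4, t2=0.6, p_min=0.3)
# With these invented numbers the bounds come out near 0.30 and 0.70.
```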

Example 2 (Continuation 1)
To provide the added constraint we have to set:
(i) if … [the case condition and the corresponding switch-point formulas are missing from the transcript]

Example 2 (Continuation 2)
(ii) if … [the second case condition and formulas are missing from the transcript]
As the result, … or … [the two resulting bound expressions are missing from the transcript]

Acknowledgements
The research was initiated by Dr. Igor Kozine of Risø National Laboratory, Denmark, whose kind attention to this work is gratefully acknowledged. The work was partially supported by the grant T of the Russian Ministry for Education, which is also acknowledged.