YAP: Yet Another Puzzle Boudewijn R. Haverkort November 16, 2007.


Collatz Conjecture (1937)

For n = 1, 2, 3, ..., form the sequence of n's such that
  n := n/2       if n is even
  n := 3n + 1    if n is odd

Conjecture: for all n ≥ 1, the above sequence returns to 1 ("Always Eventually 1").

A few cases...

Sequence of n such that n := n/2 if n is even, n := 3n + 1 if n is odd:
  5, 16, 8, 4, 2, 1
  6, 3, 10, 5, 16, 8, 4, 2, 1
  27, 82, 41, ... steps ..., 4, 2, 1

Computer-checked up to 2.88 × 10^18.
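The cases above are easy to reproduce by machine. A minimal Python sketch of the iteration (not part of the original slides; the function name is mine) that returns the full sequence from a given starting value:

```python
def collatz_sequence(n):
    """Return the Collatz sequence starting at n, ending at 1.

    Applies n := n/2 for even n and n := 3n + 1 for odd n,
    assuming (as the conjecture states) that 1 is eventually reached.
    """
    seq = [n]
    while n != 1:
        n = n // 2 if n % 2 == 0 else 3 * n + 1
        seq.append(n)
    return seq

# The examples from the slide:
print(collatz_sequence(5))   # -> [5, 16, 8, 4, 2, 1]
print(collatz_sequence(6))   # -> [6, 3, 10, 5, 16, 8, 4, 2, 1]
print(collatz_sequence(27)[-3:])  # -> [4, 2, 1], after many intermediate steps
```

Note that for n = 27 the sequence climbs well above its starting value before descending, which is why the conjecture resists a simple monotonicity argument.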

Probabilistic interpretation

View the sequence of n's as a DTMC on the positive natural numbers, with "probability 1" transitions.

pCTL-style model checking:
-- find the satisfaction set for P ≥ 1 [AE "1"]
-- this set should be equal to {1, 2, ...}

Use all we know about Markov chains: abstraction, bisimulation, infinite-state Markov chains, regular model checking, ...

http://en.wikipedia.org/wiki/Collatz_conjecture
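In this DTMC view every state has exactly one successor, taken with probability 1, so P ≥ 1 [AE "1"] holds in state n exactly when the unique path from n reaches 1. A small Python sketch of a bounded check of this reachability property (an illustration under my own naming, not the verification approach proposed on the slide, which targets the full infinite-state chain):

```python
def successor(n):
    """The probability-1 successor of state n in the Collatz DTMC."""
    return n // 2 if n % 2 == 0 else 3 * n + 1

def satisfies_ae_one(n, max_steps=10**6):
    """Bounded check of 'Always Eventually 1' for state n.

    Since each state has a single outgoing transition, we simply
    follow the unique path for at most max_steps transitions.
    A True result is conclusive; False only means the bound was hit.
    """
    for _ in range(max_steps):
        if n == 1:
            return True
        n = successor(n)
    return False

# Bounded evidence that the satisfaction set contains {1, ..., 9999}:
print(all(satisfies_ae_one(n) for n in range(1, 10**4)))
```

Proving that the satisfaction set equals all of {1, 2, ...} would settle the conjecture; the bounded check above only mirrors, in miniature, the exhaustive computer verification mentioned on the previous slide.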