AI-Class.com Dan Vasicek
Overview of the course (so far)
– Problem Solving
– Probability in AI
– Probabilistic Inference
– Machine Learning
– Unsupervised Learning
– Representation with Logic
– Planning
– Planning with Uncertainty
– Reinforcement Learning
– Hidden Markov Models
– Markov Decision Processes
– Midterm Exam
General information
– Huge student base: ~135,000 students
– Publicized in the WSJ and other non-technical outlets
– Random people you meet on the street know about the class
– The class is big enough to have a significant social impact
Course Textbook
– Artificial Intelligence: A Modern Approach, 3rd Ed. (1152 pages), by Stuart Russell and Peter Norvig
– Paperback version available from Amazon for $114
– PDF version is available for download for free (45 megabytes, ~7 minutes; searching for the book title will find the PDF)
– CD containing the 45-megabyte PDF available from Daniel Vasicek
Course Mechanics
– Approximately 20 one-hour lectures, each presented as 30 or so two-minute video clips
– Lectures are also available on YouTube: 5e935u9hE
– Non-credit quizzes are given during the lectures
– Credit is given for homework once per week
– The midterm, final exam, and homework all count toward your grade
Planning with Uncertainty: Markov Decision Processes
– Carl Dahlman's Theorem guarantees convergence of value iteration
– On each iteration, select the policy that maximizes the value of each position
– Assume we know V(S') for each successor state S'. Then compute a new value by maximizing over policies:
  V(S) = R(S) + Sum over S' of P(S'|S, policy) V(S')
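The value-update rule above can be sketched in code. This is a minimal illustration, not the course's implementation: the four-state chain, the rewards, the discount factor, and the tolerance are all my own assumptions.

```python
# Value iteration on a tiny chain of states (illustrative sketch).
# States 0..3; state 3 is a terminal goal worth +1, other states cost -0.04.

GAMMA = 0.9            # discount factor (assumed)
STATES = [0, 1, 2, 3]
ACTIONS = ["left", "right"]

def reward(s):
    return 1.0 if s == 3 else -0.04

def transitions(s, a):
    """Deterministic moves for simplicity: P(s'|s,a) = 1 for one s'."""
    if s == 3:                       # terminal state stays put
        return [(s, 1.0)]
    s2 = s + 1 if a == "right" else max(s - 1, 0)
    return [(s2, 1.0)]

def value_iteration(tol=1e-6):
    V = {s: 0.0 for s in STATES}
    while True:
        delta = 0.0
        for s in STATES:
            # V(S) = R(S) + gamma * max over actions of Sum P(S'|S,a) V(S')
            best = max(sum(p * V[s2] for s2, p in transitions(s, a))
                       for a in ACTIONS)
            new_v = reward(s) + GAMMA * best
            delta = max(delta, abs(new_v - V[s]))
            V[s] = new_v
        if delta < tol:
            return V

V = value_iteration()
# V increases monotonically toward the goal state
```

Because the update is a contraction (each sweep shrinks the maximum error by at least the discount factor), the loop is guaranteed to terminate.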
Partially Observable Markov Decision Process (POMDP): adds information gathering to the MDP
Reinforcement Learning: learn information about the environment
– Supervised learning (previous lecture)
– Unsupervised learning (previous lecture)
– Reinforcement learning: find an optimal policy when the environment is not known
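A standard way to find a policy without knowing the environment is tabular Q-learning. The sketch below is my own illustration, not the course's code: the chain environment, learning rate, exploration rate, and episode count are all assumptions.

```python
import random

# Tabular Q-learning on a tiny chain of five states. The agent learns a
# policy purely from sampled transitions, without being told the
# transition model or reward function in advance.

random.seed(0)
N = 5                       # states 0..4; state 4 is the terminal goal
ACTIONS = (-1, +1)          # step left / step right
ALPHA, GAMMA, EPS = 0.5, 0.9, 0.1
Q = {(s, a): 1.0 for s in range(N) for a in ACTIONS}  # optimistic init aids exploration

def step(s, a):
    """One environment transition: move, clamp to the chain, reward at goal."""
    s2 = min(max(s + a, 0), N - 1)
    return s2, (1.0 if s2 == N - 1 else 0.0), s2 == N - 1

for _ in range(500):        # learning episodes
    s, done = 0, False
    while not done:
        # epsilon-greedy action selection
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda act: Q[(s, act)])
        s2, r, done = step(s, a)
        # Q-learning update toward r + gamma * max_a' Q(s', a')
        target = r if done else r + GAMMA * max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])
        s = s2

# The greedy policy is now "move right" in every non-terminal state
```

Optimistic initialization (starting all Q-values high) is one common trick to make the agent try every action at least a few times before settling on a policy.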
Statistical Independence
Bayes' Theorem for events a and b:
– P(a and b) = P(a|b) P(b) = P(b|a) P(a)
– Given a suitable three of the five quantities P(a and b), P(a|b), P(b), P(b|a), P(a), this identity lets us evaluate the other two
Statistically independent means:
– P(a and b) = P(a) P(b)
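The "know three, evaluate the others" use of Bayes' Theorem is easy to show numerically. The probabilities below are made-up illustration values, not from the lecture.

```python
# Bayes' Theorem in code: recover P(a|b) from P(b|a), P(a), and P(b).

def bayes(p_b_given_a, p_a, p_b):
    """P(a|b) = P(b|a) P(a) / P(b), rearranging P(a and b) = P(a|b)P(b) = P(b|a)P(a)."""
    return p_b_given_a * p_a / p_b

p_a = 0.01                  # prior probability of event a (assumed)
p_b_given_a = 0.9
p_b_given_not_a = 0.05
# P(b) by total probability over a and not-a
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

posterior = bayes(p_b_given_a, p_a, p_b)
# posterior = 0.009 / 0.0585 ≈ 0.154
```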
Conditional Independence
Conditionally independent (given c) means:
– P(a | b and c) = P(a|c), or equivalently
– P(a and b | c) = P(a|c) P(b|c)
Conditional independence does not imply statistical independence
Statistical independence does not imply conditional independence
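The first non-implication can be checked with a concrete distribution. The numbers here are my own made-up example, not from the lecture: c picks one of two biased coins, and a and b are two flips of the chosen coin. Given c the flips are independent, but marginally they are correlated.

```python
from itertools import product

# Conditional independence does NOT imply statistical independence.
p_c = {0: 0.5, 1: 0.5}
p_heads = {0: 0.9, 1: 0.1}         # coin 0 favors heads, coin 1 favors tails

def p_abc(a, b, c):
    """Joint P(a, b, c); the flips a and b are independent given c."""
    pa = p_heads[c] if a else 1 - p_heads[c]
    pb = p_heads[c] if b else 1 - p_heads[c]
    return p_c[c] * pa * pb

# Marginalize out c to get the joint distribution of the two flips
p_ab = {(a, b): sum(p_abc(a, b, c) for c in (0, 1))
        for a, b in product((0, 1), repeat=2)}
p_a1 = p_ab[(1, 0)] + p_ab[(1, 1)]
p_b1 = p_ab[(0, 1)] + p_ab[(1, 1)]

# P(a=1 and b=1) = 0.5*0.81 + 0.5*0.01 = 0.41, but P(a=1)P(b=1) = 0.25
```

Seeing one flip come up heads makes coin 0 more likely, which in turn makes the other flip more likely to be heads; that is exactly the marginal dependence.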
Binary Variables: Battery Age converted from a continuous variable to a binary one (too old versus young)
Bayes Networks
Laplace Smoothing (not Laplacian!!)
– Laplacian smoothing is averaging of adjacent values
– Laplace smoothing avoids overfitting by adding weight to unobserved (but possible) cases
Spam detection: vocabulary size for spam versus ham?
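Laplace (add-one) smoothing is simple to show for the spam/ham word probabilities mentioned above. The tiny corpora here are made-up illustrations, not course data.

```python
# Laplace (add-k) smoothing for word probabilities in a toy spam model.

spam_words = "offer money money win".split()
ham_words = "hi mom money later".split()
vocab = set(spam_words) | set(ham_words)    # shared vocabulary, 6 words
k = 1                                       # Laplace smoothing strength

def smoothed_prob(word, corpus):
    """P(word|class) with add-k smoothing: (count + k) / (N + k*|vocab|)."""
    return (corpus.count(word) + k) / (len(corpus) + k * len(vocab))

# Without smoothing, P("win"|ham) would be exactly 0, and one unseen word
# would veto an entire message; with smoothing it is small but nonzero.
p_win_ham = smoothed_prob("win", ham_words)       # (0+1)/(4+6) = 0.1
p_money_spam = smoothed_prob("money", spam_words) # (2+1)/(4+6) = 0.3
```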
K-Means: there seems to be some confusion between K-means (unsupervised clustering around K mean values) and K-nearest neighbors (supervised classification)
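To make the distinction concrete, here is K-means itself: it repeatedly assigns points to the nearest of K centers and moves each center to the mean of its cluster. The one-dimensional data and iteration count are my own made-up illustration.

```python
import random

# One-dimensional K-means (illustrative sketch): unsupervised clustering
# of unlabeled points around K mean values.

def kmeans_1d(points, k, iters=20):
    centers = sorted(random.sample(points, k))     # pick k points as seeds
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                           # assign to nearest center
            i = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        # move each center to the mean of its cluster (keep it if empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

random.seed(1)
data = [1.0, 1.2, 0.8, 9.0, 9.3, 8.7]
centers = kmeans_1d(data, k=2)
# The two centers settle at the cluster means, 1.0 and 9.0
```

Note there are no labels anywhere: K-means discovers the two groups on its own, which is what makes it unsupervised, unlike the classifier below.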
K-Nearest Neighbors: classify unknown points by taking the majority vote of the K nearest training examples
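The majority-vote rule can be sketched directly. The sample points and labels below are made up for illustration, not course data.

```python
from collections import Counter

# Minimal K-nearest-neighbors classifier: supervised classification by
# majority vote among the k closest labeled training examples.

def knn_classify(point, training, k=3):
    """Return the majority label of the k training examples nearest `point`."""
    dist = lambda p, q: sum((a - b) ** 2 for a, b in zip(p, q))  # squared Euclidean
    nearest = sorted(training, key=lambda ex: dist(point, ex[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

training = [((0, 0), "A"), ((1, 0), "A"), ((0, 1), "A"),
            ((5, 5), "B"), ((6, 5), "B"), ((5, 6), "B")]
label = knn_classify((0.5, 0.5), training, k=3)
# A query near the origin is voted class "A"
```

Choosing an odd k avoids ties in two-class problems; larger k smooths the decision boundary at the cost of blurring small clusters.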
Contact me if you wish Daniel Vasicek Cell Phone Metzgar Road SW Land Line