Midterm Review
The Midterm
- Covers everything we have talked about so far
- Includes material from the HW
  – I won't ask you to do calculations as complicated as the HW
- You don't need a calculator
- No books / notes
Maximum Likelihood Estimation
- How to apply the maximum likelihood principle
  – Write the log likelihood, take the derivative, set it to 0, and solve
  – You should know how to do this for Bernoulli trials and a 1-D Gaussian
- Conjugate distributions
  – Dirichlet, Beta
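As a quick sanity check, the log-likelihood + derivative + solve-for-0 recipe gives closed forms for both required cases: the Bernoulli MLE is the fraction of successes, and the 1-D Gaussian MLEs are the sample mean and the (1/n, biased) sample variance. A minimal sketch with made-up data:

```python
# Bernoulli trials: L(p) = sum_i x_i log p + (n - sum_i x_i) log(1 - p).
# Setting dL/dp = 0 gives p_hat = (# successes) / n.
xs = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]
p_hat = sum(xs) / len(xs)

# 1-D Gaussian: the same recipe yields the sample mean and the
# 1/n (not 1/(n-1)) sample variance as the MLEs.
ys = [2.1, 1.9, 2.4, 2.0, 1.6]
mu_hat = sum(ys) / len(ys)
var_hat = sum((y - mu_hat) ** 2 for y in ys) / len(ys)

print(p_hat, mu_hat, var_hat)  # 0.7, 2.0, 0.068
```

On the exam you would derive these by hand; the point of the closed forms is that no numerical optimization is needed.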
Mixture Models and EM
- What does the EM algorithm do?
  – Understand the E-step and M-step
- Log-sum-exp trick
  – You should be able to derive this
  – You should understand why we need to use it
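The log-sum-exp trick matters in EM because the E-step responsibilities involve sums of exponentials of log-probabilities, which underflow to 0 in floating point. The derivation is one line: with m = max_i v_i, log Σ_i exp(v_i) = m + log Σ_i exp(v_i − m). A minimal sketch:

```python
import math

def log_sum_exp(vals):
    """Stably compute log(sum_i exp(v_i)) by factoring out the max:
    log sum_i exp(v_i) = m + log sum_i exp(v_i - m), where m = max_i v_i.
    After subtracting m, the largest exponent is 0, so nothing underflows
    to the point of losing the answer."""
    m = max(vals)
    return m + math.log(sum(math.exp(v - m) for v in vals))

# Naive evaluation fails: exp(-1000) rounds to 0.0 in double precision,
# so log(0 + 0) would be -inf. The trick recovers the exact answer.
logs = [-1000.0, -1000.0]
print(log_sum_exp(logs))  # -1000 + log(2)
```

This is why mixture-model code works with log-probabilities throughout and only exponentiates differences.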
Hidden Markov Models
- Viterbi
  – What does it do?
  – What is the running time?
- Forward-backward
  – What does it do?
- Be able to compute the probability of a "parse"
  – Joint probability of a sequence of observed and hidden states
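Both skills above can be sketched on a toy two-state HMM (all probabilities below are made up for illustration): the joint probability of a parse is just initial × transitions × emissions multiplied along the sequence, and Viterbi finds the parse maximizing it in O(T·S²) time for T observations and S states.

```python
import math

states = ["H", "C"]                                   # hidden states
start = {"H": 0.6, "C": 0.4}                          # initial distribution
trans = {"H": {"H": 0.7, "C": 0.3},                   # transition probs
         "C": {"H": 0.4, "C": 0.6}}
emit = {"H": {"1": 0.1, "2": 0.4, "3": 0.5},          # emission probs
        "C": {"1": 0.5, "2": 0.4, "3": 0.1}}

def joint_log_prob(obs, hidden):
    """log P(obs, hidden) = log pi(z1) + sum_t log A(z_{t-1}, z_t)
       + sum_t log B(z_t, x_t): the probability of one "parse"."""
    lp = math.log(start[hidden[0]]) + math.log(emit[hidden[0]][obs[0]])
    for t in range(1, len(obs)):
        lp += math.log(trans[hidden[t - 1]][hidden[t]])
        lp += math.log(emit[hidden[t]][obs[t]])
    return lp

def viterbi(obs):
    """Most likely hidden sequence, O(T * S^2): at each step, for each
    state, keep only the best-scoring predecessor."""
    V = [{s: math.log(start[s]) + math.log(emit[s][obs[0]]) for s in states}]
    back = []
    for t in range(1, len(obs)):
        row, ptr = {}, {}
        for s in states:
            best = max(states, key=lambda p: V[-1][p] + math.log(trans[p][s]))
            row[s] = V[-1][best] + math.log(trans[best][s]) + math.log(emit[s][obs[t]])
            ptr[s] = best
        V.append(row)
        back.append(ptr)
    path = [max(states, key=lambda s: V[-1][s])]
    for ptr in reversed(back):                         # follow backpointers
        path.append(ptr[path[-1]])
    return list(reversed(path))

obs = ["3", "1", "3"]
path = viterbi(obs)
print(path, joint_log_prob(obs, path))
```

Brute-forcing all Sᵀ parses and taking the argmax gives the same answer on small inputs, which is a good way to check a Viterbi implementation.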
Bayesian Networks
- Understand the d-separation criteria
- Be able to answer simple questions about whether variables are independent given some evidence
- Markov blanket
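The classic independence question is the v-structure (collider) A → C ← B: d-separation says A and B are marginally independent but become dependent once C is observed ("explaining away"). A minimal sketch that checks both claims by enumerating a toy joint distribution (all CPT numbers are made up):

```python
import itertools

pA = {0: 0.7, 1: 0.3}                                  # P(A)
pB = {0: 0.6, 1: 0.4}                                  # P(B)
pC1 = {(0, 0): 0.1, (0, 1): 0.5,                       # P(C=1 | A, B)
       (1, 0): 0.6, (1, 1): 0.99}

def joint(a, b, c):
    """P(A, B, C) factorizes as P(A) P(B) P(C | A, B) in the v-structure."""
    p = pC1[(a, b)]
    return pA[a] * pB[b] * (p if c == 1 else 1 - p)

def prob(**fixed):
    """Marginal probability of an assignment to any subset of {a, b, c}."""
    return sum(joint(a, b, c)
               for a, b, c in itertools.product([0, 1], repeat=3)
               if all({"a": a, "b": b, "c": c}[k] == v for k, v in fixed.items()))

# Marginally independent: P(A=1, B=1) == P(A=1) P(B=1)
print(prob(a=1, b=1), prob(a=1) * prob(b=1))
# Conditioning on the collider breaks independence:
# P(A=1, B=1 | C=1) != P(A=1 | C=1) P(B=1 | C=1)
print(prob(a=1, b=1, c=1) / prob(c=1),
      (prob(a=1, c=1) / prob(c=1)) * (prob(b=1, c=1) / prob(c=1)))
```

The same enumeration check works for the other two canonical structures (chain and common cause), where conditioning on the middle node creates independence instead of destroying it.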
Markov Networks / Belief Propagation
- Moralizing a graph (converting a Bayesian network into a Markov network)
- Belief propagation
  – What does it do? When is it guaranteed to converge to the correct posterior distribution?
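On a tree, sum-product belief propagation is guaranteed to converge to the exact marginals; a chain is the simplest case. A minimal sketch on a 3-variable chain x1 – x2 – x3 with arbitrary made-up pairwise potentials, comparing the BP belief at x2 against brute-force enumeration:

```python
import itertools

# Pairwise potentials for the chain x1 - x2 - x3 (arbitrary positive values).
psi12 = {(0, 0): 1.0, (0, 1): 0.5, (1, 0): 0.2, (1, 1): 2.0}
psi23 = {(0, 0): 0.8, (0, 1): 1.5, (1, 0): 1.0, (1, 1): 0.3}

def bp_marginal_x2():
    """Sum-product: x2's belief is the normalized product of the messages
    arriving from its two neighbors."""
    m12 = {x2: sum(psi12[(x1, x2)] for x1 in (0, 1)) for x2 in (0, 1)}
    m32 = {x2: sum(psi23[(x2, x3)] for x3 in (0, 1)) for x2 in (0, 1)}
    belief = {x2: m12[x2] * m32[x2] for x2 in (0, 1)}
    z = sum(belief.values())
    return {x2: b / z for x2, b in belief.items()}

def brute_marginal_x2():
    """Sum the unnormalized joint over all assignments, then normalize."""
    p = {0: 0.0, 1: 0.0}
    for x1, x2, x3 in itertools.product((0, 1), repeat=3):
        p[x2] += psi12[(x1, x2)] * psi23[(x2, x3)]
    z = sum(p.values())
    return {x2: v / z for x2, v in p.items()}

print(bp_marginal_x2(), brute_marginal_x2())  # identical: the chain is a tree
```

On a graph with loops, the same message updates ("loopy BP") may not converge, and when they do the beliefs are only approximations; exactness is guaranteed only on trees.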