- Classroom: Engineering Building x, Room xxx
- Textbooks
- D. Koller and N. Friedman, Probabilistic Graphical Models: Principles and Techniques, MIT Press, 2009 [book] [pdf]
- Kevin Patrick Murphy, Machine Learning: a Probabilistic Perspective, MIT Press, 2012 [link] [pdf] [pdf2]
- C. E. Rasmussen and C. K. I. Williams, Gaussian Processes for Machine Learning, MIT Press, 2006 [link]

- References
- Stan [link]
- PyTorch [link]
- TensorFlow [link]
- Pyro [link]
- Theoretical concepts of machine learning [pdf]
- CMU lecture on probabilistic graphical models [link]
- Stanford lecture on probabilistic graphical models [link]
- Caltech lecture on probabilistic graphical models [link]
- NYU lecture on probabilistic graphical models [link]
- UChicago lecture on probabilistic graphical models [link]
- Brown U lecture on probabilistic graphical models [link]

- Assignment 1: Solve problem sets
- MLAPP: Exercises 10.1, 10.2, 10.4

- Assignment 2: Solve problem sets
- MLAPP: Exercises 19.1, 19.5
- MLAPP: Derive Eq.(19.43) and Eq. (19.60)

- Assignment 3: [pdf]

- Lecture 0: Introduction [pdf]
- Lecture 1: Bayesian machine learning [pdf]
- Lecture 2: Bayesian statistics [pdf]
- Lecture 3: Frequentist statistics [pdf]
- [ref] Cramer-Rao Lower Bound and Information Geometry [pdf]

- Lecture 4: Directed models [pdf] [pdf]
- Lecture 5: Mixture models and the EM algorithm [pdf]
- Lecture 6: Undirected graphical models [pdf] [pdf]
- Lecture 7: Exact inference [pdf] [pdf]
- Lecture 8: Variational inference [pdf] [pdf]
- Lecture 9: Markov chain Monte Carlo (MCMC) inference [pdf]
- Lecture 10: Gaussian processes
- Lecture 11: Bayesian neural networks
- Lecture 12: Uncertainty in deep learning
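The lecture on Gaussian processes lends itself to a short worked example. Below is a minimal numpy sketch of GP regression with a squared-exponential kernel, in the spirit of Rasmussen & Williams, Ch. 2; the data, kernel hyperparameters, and noise level are made up for illustration and are not taken from the slides.

```python
# Minimal Gaussian-process regression sketch (illustrative values only).
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    """Squared-exponential covariance between two sets of 1-D inputs."""
    d = X1[:, None] - X2[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def gp_posterior(X_train, y_train, X_test, noise=1e-2):
    """Posterior mean and covariance of a zero-mean GP at X_test."""
    K = rbf_kernel(X_train, X_train) + noise * np.eye(len(X_train))
    K_s = rbf_kernel(X_train, X_test)
    K_ss = rbf_kernel(X_test, X_test)
    alpha = np.linalg.solve(K, y_train)              # (K + noise*I)^{-1} y
    mean = K_s.T @ alpha
    cov = K_ss - K_s.T @ np.linalg.solve(K, K_s)
    return mean, cov

X = np.array([-2.0, -1.0, 0.0, 1.0, 2.0])
y = np.sin(X)
mean, cov = gp_posterior(X, y, X)      # predict back at the training inputs
print(np.max(np.abs(mean - y)))        # residual at training inputs (small for low noise)
```

With a small noise term the posterior mean closely tracks the training targets, while the posterior covariance shrinks near observed inputs.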

- Lecture 1: Introduction, Bayesian networks [pdf]
- Bayesian Networks [pdf]
- Lecture 2: Undirected graphical models [pdf]
- Lecture 3: Undirected graphical models II [pdf]
- Undirected graphical models [pdf]
- Lecture 4: Exact inference [pdf]
- Variable Elimination [pdf]
- Complexity [pdf]
- Variable Elimination: Basic Ideas [pdf]
- Variable Elimination: Algorithm [pdf]
- Lecture 5: Exact inference II (Junction tree algorithm) [pdf]
- Belief propagation [pdf]
- Other slides
- Lecture 6: Exact inference III (Junction tree algorithm) [Message passing I] [Message passing II]
- Lecture 7: Inference as optimization: Message passing [pdf], Mean field approximation [pdf], Variational approximation [pdf]
- [ref] Variational inference [pdf]
- Application: Latent Dirichlet Allocation (LDA) [pdf]
- Exponential families [pdf]
- KKT condition [pdf]
- Additional material: Mean field approximation [pdf]
- Lecture 8: Inference as optimization: Loopy belief propagation [pdf] [pdf]
- Lecture 9: Monte Carlo methods for inference: [pdf] [pdf]
- Lecture 10: MAP inference: [pdf]
- [ref] MAP and dual decomposition [pdf]
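The exact-inference lectures above center on variable elimination. As a minimal sketch, the snippet below computes a marginal on a three-node chain A → B → C by summing out A and then B, and checks the answer against brute-force enumeration; the CPT values are made up for illustration.

```python
# Toy variable elimination on a chain A -> B -> C (illustrative CPTs).
import numpy as np

# Binary CPTs: P(A), P(B|A), P(C|B)
P_A = np.array([0.6, 0.4])
P_B_given_A = np.array([[0.7, 0.3],    # rows indexed by a, columns by b
                        [0.2, 0.8]])
P_C_given_B = np.array([[0.9, 0.1],    # rows indexed by b, columns by c
                        [0.4, 0.6]])

# Variable elimination: sum out A first, then B.
tau_B = P_A @ P_B_given_A      # tau(b) = sum_a P(a) P(b|a)
P_C = tau_B @ P_C_given_B      # P(c)   = sum_b tau(b) P(c|b)

# Brute-force check: enumerate the full joint and marginalize.
joint = (P_A[:, None, None]
         * P_B_given_A[:, :, None]
         * P_C_given_B[None, :, :])
P_C_brute = joint.sum(axis=(0, 1))

print(P_C, P_C_brute)  # the two marginals agree
```

On a chain, elimination touches only pairwise factors, so the cost is linear in the number of variables, versus exponential for the brute-force enumeration.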

- Lecture 11: Learning in undirected graphical models: [pdf]
- Lecture 12: Partially observed data - Parameter estimation: [pdf]
- Lecture 13: Structure learning: [pdf]
- Lecture 14: Topic models and Dirichlet processes: [pdf] [pdf]
- Lecture 15: Markov logic network [pdf]
- [ref] Paper on Markov logic network [pdf]

- Lecture 16: Bayesian Nonparametrics in Document and Language Modeling [pdf]
- Lecture 17: Dual decomposition [pdf]
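The lecture above on parameter estimation from partially observed data centers on the EM algorithm. Below is a minimal sketch for a two-component Bernoulli mixture (the classic "two unknown coins" example); the flip counts, initialization, and iteration budget are all illustrative, not from the course materials.

```python
# Minimal EM sketch for a two-coin Bernoulli mixture (illustrative data).
import numpy as np

# Each row: (heads, tails) from 10 flips of one of two unknown coins.
counts = np.array([[9, 1], [8, 2], [2, 8], [1, 9], [7, 3], [3, 7]])
h, t = counts[:, 0], counts[:, 1]

def log_likelihood(theta, pi):
    """Mixture log-likelihood (binomial coefficients omitted; constant)."""
    comp = np.stack([pi[k] * theta[k] ** h * (1 - theta[k]) ** t
                     for k in range(2)])
    return np.sum(np.log(comp.sum(axis=0)))

theta = np.array([0.6, 0.5])   # initial coin biases
pi = np.array([0.5, 0.5])      # initial mixing weights
lls = []
for _ in range(20):
    lls.append(log_likelihood(theta, pi))
    # E-step: responsibility of each coin for each batch of flips
    comp = np.stack([pi[k] * theta[k] ** h * (1 - theta[k]) ** t
                     for k in range(2)])
    resp = comp / comp.sum(axis=0)
    # M-step: re-estimate biases and mixing weights from responsibilities
    theta = (resp @ h) / (resp @ (h + t))
    pi = resp.sum(axis=1) / len(counts)

print(theta)  # one bias pulled toward the heads-heavy rows, one toward the tails-heavy
```

The recorded log-likelihoods are non-decreasing across iterations, which is the defining guarantee of EM and a useful sanity check in any implementation.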