Course syllabus: Bayesian model selection
Part 1
- Lecture 1: Introduction
- Lecture 2: Naive Bayes classifier, exponential family of distributions
- Exercise 1
- Lecture 3: Bayesian linear regression and model evidence
- Lecture 4: Model evidence
- Test 1
- Lecture 5: Analysis of model evidence and statistical significance
- Task 2
- Practice 1
- Data for Practice 1
- Lecture 6: Bayesian logistic regression and feature selection, EM algorithm
- Task 3
- Lecture 7: EM algorithm and variational EM algorithm, missing data
- Lecture 8: Variational EM algorithm
- Lecture 9: Gaussian processes and evolution of models over time
- Lecture 10: Construction of adequate multi-models
- Task 4
- Lecture 11: Markov chain Monte Carlo methods
- Practice 1 (continued)
- Lecture 12: Hamiltonian Monte Carlo methods
- Lecture 13: Bayesian optimization
- Written report
Part 2
- Lecture 1: EM algorithm
- Lecture 2: Applications of the EM algorithm
- Lecture 3: Variational EM algorithm
- Lecture 3: Practice on EM and the variational EM algorithm
- Lecture 4: Hamiltonian Monte Carlo methods and comparison with the variational EM algorithm
- Lecture 4: Practice on the variational EM algorithm and comparison with HMC
- Lecture 5: Graphical models, conditional independence of variables
- Practice 1
- Data for Practice 1
- Lecture 6: Directed and undirected graphical models and the relationship between them
- Lecture 7: Factor graphs and exact inference in acyclic graphical models
- Lecture 8: Max-sum algorithm and hidden Markov models
- Lecture 9: Baum-Welch algorithm for estimating the parameters of hidden Markov models
- Lecture 9: Practice on the Baum-Welch algorithm
- Lecture 10: Algorithms for finding the minimum cut in graphs for inference in graphical models
- Lecture 11: TRW algorithm for inference in cyclic graphical models with general energies
- Lecture 12: Estimation of hyperparameters of graphical models
- Competition
- Exam
References
- David MacKay, 2005, Information Theory, Inference, and Learning Algorithms
- Christopher Bishop, 2006, Pattern Recognition and Machine Learning
- David Barber, 2014, Bayesian Reasoning and Machine Learning
- Daphne Koller and Nir Friedman, 2009, Probabilistic Graphical Models
- Kevin P. Murphy, 2012, Machine Learning: A Probabilistic Perspective