Course syllabus: Bayesian model selection and multimodeling

From Research management course

This lecture course addresses the central problem of machine learning: model selection. One can set a heuristic model and optimize its parameters, select a model from a class, distill the knowledge of a teacher model into a student model, or build an ensemble of models. Behind all these strategies lies one fundamental technique: Bayesian inference. It states hypotheses about the measured data set, the model parameters, and even the model structure, and from them deduces the error function to optimize. This is the Minimum Description Length principle: it selects simple, stable, and precise models. The course combines the theory of model selection and multimodeling with practical lab works. Course page
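The fit-versus-complexity trade-off described above can be illustrated with a short sketch. The data, the polynomial candidate models, and the use of BIC are all illustrative assumptions, not part of the course materials; BIC is used here as a large-sample approximation to the negative log-evidence, and hence to the description length that the MDL principle minimizes.

```python
# Illustrative sketch: select a polynomial degree by (approximate) description
# length. BIC = k*ln(n) - 2*log_likelihood penalizes complexity, so the
# simplest model that explains the data well wins -- the MDL idea in miniature.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-1, 1, 50)
# Synthetic data generated by a degree-2 polynomial plus Gaussian noise.
y = 1.0 + 2.0 * x - 3.0 * x**2 + rng.normal(0.0, 0.1, x.size)

def bic(degree):
    # Least-squares fit is the maximum-likelihood estimate under Gaussian noise.
    coeffs = np.polyfit(x, y, degree)
    resid = y - np.polyval(coeffs, x)
    n, k = x.size, degree + 1
    sigma2 = resid @ resid / n                       # MLE of the noise variance
    log_lik = -0.5 * n * (np.log(2 * np.pi * sigma2) + 1.0)
    return k * np.log(n) - 2.0 * log_lik             # lower = shorter description

scores = {d: bic(d) for d in range(6)}
best = min(scores, key=scores.get)                   # typically degree 2 here
```

Degrees above 2 fit the noise slightly better, but the `k*ln(n)` penalty outweighs that gain, so the score points back to the simple, stable model.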

Grading

  • Labs: 6 in total
  • Forms: 1 in total
  • Reports: 2 in total
  • The maximum raw score is 11; the final score is min(10, score)

Syllabus

  1. 8.09 Intro
  2. 15.09 Distributions, expectation, likelihood
  3. 22.09 Bayesian inference
  4. 29.09 MDL, Minimum description length principle
  5. 6.10 Probabilistic metric spaces
  6. 13.10 Generative and discriminative models
  7. 20.10 Data generation, VAE, GAN
  8. 27.10 Probabilistic graphical models
  9. 3.11 Variational inference
  10. 10.11 Variational inference 2
  11. 17.11 Hyperparameter optimization
  12. 24.11 Meta-optimization
  13. 1.12 Bayesian PCA, GLM and NN
  14. 8.12 Gaussian processes

References

  1. Background reading to catch up: Bishop; Barber; Murphy; Rasmussen and Williams; Taboga.
  2. Kuznetsov M.P., Tokmakova A.A., Strijov V.V. Analytic and stochastic methods of structure parameter estimation // Informatica, 2016, 27(3): 607–624.
  3. Bakhteev O.Y., Strijov V.V. Deep learning model selection of suboptimal complexity // Automation and Remote Control, 2018, 79(8): 1474–1488.
  4. Bakhteev O.Y., Strijov V.V. Comprehensive analysis of gradient-based hyperparameter optimization algorithms // Annals of Operations Research, 2020: 1–15.

Syllabus: Bayesian models and ensembles (variant)

  1. Models, distributions, expectation, likelihood
  2. Algebra of distributions: marginalisation and reconstruction of the joint distribution
  3. Bayesian inference
  4. Probabilistic metric spaces
  5. Generative and discriminative models
  6. Data generation, VAE, GAN
  7. Probabilistic graphical models
  8. Variational inference
  9. Bayesian PCA, GLM and NN
  10. Gaussian processes
  11. Belief propagation, networks, and hierarchical models
  12. Bayesian inference for model selection
  13. Structure priors and model selection
  14. Informative prior and MCMC
  15. Sampling, importance, Metropolis-Hastings
  16. Random processes and genetic algorithms for model generation
  17. Model ensembles
  18. Mixture of experts
  19. Distillation and privileged learning
  20. Transfer learning, multitask learning
  21. Domain adaptation
  22. Projection to latent space
  23. Bayesian agents
  24. Multi-agent systems and reinforcement learning