<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en-GB">
	<id>https://m1p.org/index.php?action=history&amp;feed=atom&amp;title=Course_syllabus%3A_Machine_Learning</id>
	<title>Course syllabus: Machine Learning - Revision history</title>
	<link rel="self" type="application/atom+xml" href="https://m1p.org/index.php?action=history&amp;feed=atom&amp;title=Course_syllabus%3A_Machine_Learning"/>
	<link rel="alternate" type="text/html" href="https://m1p.org/index.php?title=Course_syllabus:_Machine_Learning&amp;action=history"/>
	<updated>2026-04-14T15:56:03Z</updated>
	<subtitle>Revision history for this page on the wiki</subtitle>
	<generator>MediaWiki 1.34.0</generator>
	<entry>
		<id>https://m1p.org/index.php?title=Course_syllabus:_Machine_Learning&amp;diff=1418&amp;oldid=prev</id>
		<title>Vs at 21:00, 13 February 2024</title>
		<link rel="alternate" type="text/html" href="https://m1p.org/index.php?title=Course_syllabus:_Machine_Learning&amp;diff=1418&amp;oldid=prev"/>
		<updated>2024-02-13T21:00:42Z</updated>

		<summary type="html">&lt;p&gt;&lt;/p&gt;
&lt;table class=&quot;diff diff-contentalign-left&quot; data-mw=&quot;interface&quot;&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;col class=&quot;diff-marker&quot; /&gt;
				&lt;col class=&quot;diff-content&quot; /&gt;
				&lt;tr class=&quot;diff-title&quot; lang=&quot;en-GB&quot;&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;← Older revision&lt;/td&gt;
				&lt;td colspan=&quot;2&quot; style=&quot;background-color: #fff; color: #222; text-align: center;&quot;&gt;Revision as of 21:00, 13 February 2024&lt;/td&gt;
				&lt;/tr&gt;&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l1&quot; &gt;Line 1:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 1:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The module ''Master's Machine Learning'' explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem correctly, according to theoretical principles. This module explains what hypotheses should be applied to the dataset, how to set error functions, how to select an optimal model&lt;del class=&quot;diffchange diffchange-inline&quot;&gt;, &lt;/del&gt;and analyze its features. The module includes lectures on clustering, classification, regression, model selection, and multi-modeling. It includes a series of homework problems based on real-life projects.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;{{#seo:&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt; |title=Master's Machine Learning&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt; |titlemode=replace&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt; |keywords=Machine Learning&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt; |description=The module ''Master's Machine Learning'' explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem correctly, according to theoretical principles.&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt; }}&lt;/ins&gt;&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot;&gt; &lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;The module ''Master's Machine Learning'' explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem correctly, according to theoretical principles. This module explains what hypotheses should be applied to the dataset, how to set error functions, how to select an optimal model and analyze its features. The module includes lectures on clustering, classification, regression, model selection, and multi-modeling. It includes a series of homework problems based on real-life projects.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;   &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;   &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Probabilistic nature of datasets. Data generation hypothesis. Conditional and marginal distributions. Exponential family.  &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Probabilistic nature of datasets. Data generation hypothesis. Conditional and marginal distributions. Exponential family.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot; id=&quot;mw-diff-left-l29&quot; &gt;Line 29:&lt;/td&gt;
&lt;td colspan=&quot;2&quot; class=&quot;diff-lineno&quot;&gt;Line 35:&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Multi-modeling. Adaboost. Tree-based voting. Mixture of models. Mixture of experts.  &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Multi-modeling. Adaboost. Tree-based voting. Mixture of models. Mixture of experts.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;#* Practice: sociological data modeling.  &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;#* Practice: sociological data modeling.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt;−&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #ffe49c; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Review: clustering, regression and classification models. Model selection. Parameter and error analysis.  &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt;+&lt;/td&gt;&lt;td style=&quot;color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #a3d3ff; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;Review: clustering, regression&lt;ins class=&quot;diffchange diffchange-inline&quot;&gt;, &lt;/ins&gt;and classification models. Model selection. Parameter and error analysis.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;#* Practice: creating error analysis report.  &lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;#* Practice: creating error analysis report.  &lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;tr&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Exam.&lt;/div&gt;&lt;/td&gt;&lt;td class='diff-marker'&gt; &lt;/td&gt;&lt;td style=&quot;background-color: #f8f9fa; color: #222; font-size: 88%; border-style: solid; border-width: 1px 1px 1px 4px; border-radius: 0.33em; border-color: #eaecf0; vertical-align: top; white-space: pre-wrap;&quot;&gt;&lt;div&gt;# Exam.&lt;/div&gt;&lt;/td&gt;&lt;/tr&gt;
&lt;/table&gt;</summary>
		<author><name>Vs</name></author>
		
	</entry>
	<entry>
		<id>https://m1p.org/index.php?title=Course_syllabus:_Machine_Learning&amp;diff=1204&amp;oldid=prev</id>
		<title>Wiki: Created page with &quot;The module ''Master's Machine Learning'' explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem cor...&quot;</title>
		<link rel="alternate" type="text/html" href="https://m1p.org/index.php?title=Course_syllabus:_Machine_Learning&amp;diff=1204&amp;oldid=prev"/>
		<updated>2023-03-06T16:30:51Z</updated>

		<summary type="html">&lt;p&gt;Created page with &amp;quot;The module &amp;#039;&amp;#039;Master&amp;#039;s Machine Learning&amp;#039;&amp;#039; explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem cor...&amp;quot;&lt;/p&gt;
&lt;p&gt;&lt;b&gt;New page&lt;/b&gt;&lt;/p&gt;&lt;div&gt;The module ''Master's Machine Learning'' explores the principles of machine learning algorithms. The main goal of the module is to discuss how to state a practical problem correctly, according to theoretical principles. This module explains what hypotheses should be applied to the dataset, how to set error functions, how to select an optimal model, and analyze its features. The module includes lectures on clustering, classification, regression, model selection, and multi-modeling. It includes a series of homework problems based on real-life projects.&lt;br /&gt;
 &lt;br /&gt;
# Probabilistic nature of datasets. Data generation hypothesis. Conditional and marginal distributions. Exponential family. &lt;br /&gt;
#* Practice: small-size bio-medical dataset processing.&lt;br /&gt;
# Regression models. Maximum likelihood and error function. Bias-variance decomposition. Bayesian regression. &lt;br /&gt;
#* Practice: energy consumption forecasting.&lt;br /&gt;
# Classification models. Generative and discriminative models. Model parameters covariance. Laplace approximation. &lt;br /&gt;
#* Practice: constructing a scoring model.&lt;br /&gt;
# Feature selection algorithms. Add and Del strategies, group method for data handling. Analysis of variance. Regularization, Lasso, LARS. Multicollinearity tests. &lt;br /&gt;
#* Practice: feature selection for immunological data. &lt;br /&gt;
# Neural networks. Parameter and structure optimization. Regularization. Optimal brain surgery. Bayesian networks. &lt;br /&gt;
#* Practice: network structure selection for the red wine quality dataset. &lt;br /&gt;
# Graphical models. Bayesian networks. Conditional independence. Bayesian inference. Learning the graph structure. &lt;br /&gt;
#* Practice: image de-noising.  &lt;br /&gt;
# Kernel methods and support vector machines. Radial basis functions and networks. Maximum margin classifiers. Relevance vector machines. Privileged learning. &lt;br /&gt;
#* Practice: document ranking.&lt;br /&gt;
# Principal component analysis and continuous latent variables. Probabilistic PCA. Kernel PCA. Non-linear PCA. Manifold learning. &lt;br /&gt;
#* Practice: constructing an ecological footprint integral indicator. &lt;br /&gt;
# Mixture models and Expectation-maximization. Mixture of Gaussians. EM for Bayesian regression. &lt;br /&gt;
#* Practice: image segmentation.&lt;br /&gt;
# Clustering. Metrics learning. Kohonen maps. Topological mapping. Deformation energy functions. &lt;br /&gt;
#* Practice: trajectory clustering.&lt;br /&gt;
# Representation learning. Restricted Boltzmann machine. Autoencoder. Predictive sparse decomposition. Probabilistic and direct encoding models. &lt;br /&gt;
#* Practice: human physical activity classification using deep learning.&lt;br /&gt;
# Approximate inference. Variational linear and logistic regression. Variational mixture of Gaussians. Expectation propagation.&lt;br /&gt;
#* Practice: comparison of parameter estimation methods.&lt;br /&gt;
# Model selection. Model evidence. Coherent Bayesian inference. Minimum description length principle. &lt;br /&gt;
#* Practice: econometric model selection.&lt;br /&gt;
# Multi-modeling. Adaboost. Tree-based voting. Mixture of models. Mixture of experts. &lt;br /&gt;
#* Practice: sociological data modeling. &lt;br /&gt;
Review: clustering, regression and classification models. Model selection. Parameter and error analysis. &lt;br /&gt;
#* Practice: creating an error analysis report. &lt;br /&gt;
# Exam.&lt;br /&gt;
&lt;br /&gt;
===References===&lt;br /&gt;
# Christopher Bishop, 2006. Pattern Recognition and Machine Learning&lt;br /&gt;
# David MacKay, 2003. Information Theory, Inference, and Learning Algorithms&lt;br /&gt;
# David Barber, 2012. Bayesian Reasoning and Machine Learning&lt;br /&gt;
===Supplementary===&lt;br /&gt;
# Peter Flach, 2013. Machine Learning: The Art and Science of Algorithms that Make Sense of Data&lt;br /&gt;
# Trevor Hastie, Robert Tibshirani, and Jerome Friedman, 2013. The Elements of Statistical Learning: Data Mining, Inference, and Prediction&lt;br /&gt;
# Shai Shalev-Shwartz and Shai Ben-David, 2014. Understanding Machine Learning: From Theory to Algorithms&lt;br /&gt;
===Prerequisites===&lt;br /&gt;
Calculus, Linear algebra, Probability, Statistics, Python (Matlab)&lt;/div&gt;</summary>
		<author><name>Wiki</name></author>
		
	</entry>
</feed>