At the end of this course, the student
- is able to formulate a probabilistic model for regression, classification, and density estimation
- understands the principle of maximum likelihood estimation as well as the full Bayesian approach
- is able to derive algorithms using these principles for a wide class of models, such as the linear Gaussian model, linear models for regression, logistic regression, and Gaussian mixtures (the EM algorithm)
- is able to understand and implement mathematically described methods from modern statistical machine learning
Machine learning is concerned with methods for making decisions based on data. In statistical machine learning, these methods are built on probabilistic models and statistical inference methods, including maximum likelihood estimation and Bayesian learning. These methods have a wide variety of applications, such as visual object recognition and the analysis of genetic, financial, or neuroscience data. In this course we provide a principled treatment of the basic models and methods from statistical machine learning. This requires a certain mathematical depth, but we will take ample time to acquire the necessary mathematical knowledge and skills through exercises, (computer) assignments, and, optionally, student projects on more advanced state-of-the-art machine learning methods (such as Gaussian processes, support vector machines, graphical models, and Markov chain Monte Carlo).
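As a small illustration of the maximum likelihood principle mentioned above, the sketch below fits a one-dimensional Gaussian to data using the closed-form maximum likelihood estimates. This is only an informal example to give a flavour of the course material; the function name is ours, not part of the course.

```python
def gaussian_mle(data):
    """Closed-form maximum likelihood estimates for a 1-D Gaussian.

    The MLE of the mean is the sample mean; the MLE of the variance
    is the average squared deviation (dividing by N, not N - 1, so it
    is the biased estimator).
    """
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    return mu, var

# Example: estimate mean and variance from a small sample
mu, var = gaussian_mle([2.0, 4.0, 6.0])
print(mu, var)  # mean 4.0, variance 8/3
```

Note that, unlike the unbiased sample variance, the maximum likelihood variance estimate divides by N; the distinction between the two is a typical topic in a course like this.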
• Christopher M. Bishop, Pattern Recognition and Machine Learning, Springer. NB (strongly advised): students are allowed to consult the book (paper copy!) during the written exam.
• 24 hours lecture
• 24 hours problem session
• 120 hours individual study period
Extra information on teaching methods: lectures; exercise class; take-home assignments; individual study period and/or an optional student project
This course is intended for Master's students in Computer Science and Artificial Intelligence. Students in Physics and Mathematics who are interested in this topic are advised to follow the course Inleiding Machine Learning in the Bachelor's phase and the course Machine Learning in the Master's phase. The course is given if there is sufficient interest.
• Probability theory, Bayesian probabilities
• Learning, generalization, and over-fitting
• Decision theory
• Information theory
• Probability models, Gaussian distribution, exponential family
• Probability density estimation, maximum likelihood and Bayesian inference
• Linear models for regression and classification
• Laplace approximation
• Bayesian model selection
• Mixture models and EM
• Advanced methods in modern statistical machine learning
• Written exam (open book)
• Take-home assignments
• Project report (optional)
"Wiskunde 1 en 2 voor Kunstmatige Intelligentie" (Mathematics 1 and 2 for Artificial Intelligence) or equivalent (basic calculus and linear algebra)
Instructional modes (attendance mandatory for all):
• Course
• Lecture
• Tutorial
• Self-study
Tests:
• Written exam (Tentamen); test weight: 1; test type: Exam; opportunities: Block KW2, Block KW4