**Description**

As the World Wide Web keeps growing, computer science is evolving away from its traditional form, gradually becoming the art of creating intelligent software and hardware systems that draw relevant information from the enormous amounts of available data.

Why? Let's look at the facts: billions of web pages are at our disposal, 20 hours of video are uploaded to YouTube every minute, and the supermarket chain Walmart alone handles more than one million transactions per hour, feeding a database of more than 2.5 petabytes of information. John Naisbitt has stated the problem very clearly:

* "We are drowning in information and starving for knowledge." *

Machine learning will therefore be a core technology in the future of computer science. Not only that: machine learning is already the technology that promises the best computer science jobs. Hal Varian, Google's Chief Economist, put it like this in 2009:

* "I keep saying the sexy job in the next ten years
will be statisticians and machine learners. People think I am joking,
but who would have guessed that computer engineers would have been the
sexy job of the 1990s? The ability to take data, to be able to
understand it, to process it, to extract value from it, to visualize it,
to communicate it, that is going to be a hugely important skill in the
next decades.* "

Accordingly, this lecture serves as an introduction to machine learning. Special emphasis is placed on a clear presentation of the lecture's contents, supplemented by small sample problems for each topic. The teacher pays particular attention to his interaction with the participants of the lecture, asking multiple questions and appreciating enthusiastic students.

**Contents**

The course gives an introduction to statistical machine learning methods. The following topics are expected to be covered throughout the semester:

- Probability Distributions
- Linear Models for Regression and Classification
- Kernel Methods, Graphical Models
- Mixture Models and EM
- Approximate Inference
- Continuous Latent Variables
- Neural Networks
- Hidden Markov Models
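
To give a flavour of the topics above, here is a minimal, purely illustrative sketch (not course material) of a linear model for regression: fitting y = w0 + w1·x by ordinary least squares on synthetic data. All names and parameter choices here are our own assumptions.

```python
import numpy as np

# Synthetic data: a noisy line y = 2 + 3x (illustrative choice).
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 + 3.0 * x + rng.normal(scale=0.1, size=x.size)

# Design matrix with a bias column; solve the least-squares problem.
X = np.column_stack([np.ones_like(x), x])
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(w)  # weights close to [2.0, 3.0]
```

The course treats such models far more generally (basis functions, regularization, Bayesian treatment); this only shows the simplest case.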

**Requirements**

Math classes from the bachelor's degree, basic programming abilities, and introductory computer science classes.

**Literature**

The most important books for this class are:

- C.M. Bishop. *Pattern Recognition and Machine Learning*, Springer (free online copy)
- K.P. Murphy. *Machine Learning: A Probabilistic Perspective*, MIT Press

Additionally, the following books might be useful for specific topics:

- D. Barber. *Bayesian Reasoning and Machine Learning*, Cambridge University Press (free online copy)
- T. Hastie, R. Tibshirani, and J. Friedman. *The Elements of Statistical Learning*, Springer Verlag (free online copy)
- D. MacKay. *Information Theory, Inference, and Learning Algorithms*, Cambridge University Press (free online copy)
- R. O. Duda, P. E. Hart, and D. G. Stork. *Pattern Classification*, Wiley-Interscience
- T. M. Mitchell. *Machine Learning*, McGraw-Hill
- R. Sutton, A. Barto. *Reinforcement Learning: An Introduction*, MIT Press (free online copy)
- M. Jordan. *An Introduction to Probabilistic Graphical Models* (free online copy)

**Additional Material**

Here are some tutorials on relevant topics for the lecture.

- An overview of gradient descent optimization algorithms (Sebastian Ruder)
- Covariant/natural gradient notes (Marc Toussaint)
- Bayesian Linear Regression, video (Jeff Miller)
- GP, video (John Cunningham)
- GP Regression, video (Jeff Miller)
- SVM, pdf (Andrew Ng)
- SVM, video (Patrick Winston)
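
As a tiny companion to the gradient descent tutorial listed above, here is a hedged sketch of plain (batch) gradient descent on the quadratic f(w) = (w − 3)², the baseline that variants like momentum or Adam build upon. The learning rate and iteration count are illustrative choices of ours, not values from the tutorial.

```python
# Gradient of f(w) = (w - 3)^2.
def grad(w):
    return 2.0 * (w - 3.0)

w = 0.0    # starting point (illustrative)
lr = 0.1   # learning rate (illustrative)
for _ in range(100):
    w -= lr * grad(w)  # step against the gradient

print(w)  # converges towards the minimum at w = 3
```

Each step multiplies the distance to the optimum by (1 − 2·lr), so convergence here is geometric; the linked overview explains when and why plain descent needs the more elaborate update rules.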

**Homework**

- The homework will be given during the course.
- Homework is **not compulsory**, but we strongly suggest that you complete it, as it provides hands-on experience with the topics.
- Credit for the final exam may be earned by regularly handing in homework.
- Details will be announced during the class.

- Lecturer: Kristian Kersting
- Lecturer: Karl Stelzner
- Lecturer: Claas Alexander Völcker