### Organisation

| Time | Wednesday, 8.15h-11.30h |
|---|---|
| Room | S104 |
| Credits | 4 SWS / 5 ECTS |
| Exam | written, 60 min |

#### Announcements

- First lesson in WS 1920: 09.10.2019

### Machine Learning

Machine Learning is currently one of the hottest topics in computer science. Almost daily, new press releases report groundbreaking improvements across a wide range of applications, e.g. object recognition, speech recognition, automatic translation, digital assistants, robotics, autonomous driving, intelligent web search, recommendation systems, computer games and many more. Global players such as Google, Apple, Facebook, IBM, Bosch and Daimler, but also many medium-sized enterprises, have recognized the potential of intelligent, self-learning systems (see e.g. Die Zeit - Interview Sebastian Thrun).

*So, what is Machine Learning?*
Machine Learning is the science of building computer systems that automatically improve with experience. In contrast to conventional computer systems, adaptive systems that integrate Machine Learning algorithms do not just process data according to a manually programmed sequence of instructions. ML systems learn and automatically adapt the way they process data, based on the knowledge (experience) they have acquired. The learning process usually requires large amounts of data. If latent correlations or patterns exist, the learning algorithms will find them and integrate the corresponding knowledge into their processing model.
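The idea of "learning from data" described above can be illustrated with a minimal sketch (not part of the lecture materials; the hidden pattern `y = 2x + 1` and all parameter values are invented for this example). The model parameters are not programmed by hand, but estimated from noisy examples alone:

```python
# Minimal "learning from experience" sketch: data with a latent linear
# pattern y = 2*x + 1 plus noise; a least-squares fit recovers the
# pattern purely from the examples, without it being hard-coded.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=200)                 # training inputs
y = 2.0 * x + 1.0 + rng.normal(0, 0.5, size=200) # noisy targets

# The "experience" (data) determines the model parameters.
A = np.stack([x, np.ones_like(x)], axis=1)
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

print(slope, intercept)  # close to the hidden values 2.0 and 1.0
```

With more (or cleaner) data, the estimates move closer to the latent pattern, which is exactly the "improvement with experience" meant above.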

This lecture provides an introduction to the currently most relevant machine learning algorithms and their applications. All categories of machine learning - supervised learning, unsupervised learning and reinforcement learning - are covered. Fundamental concepts, e.g. training, testing and evaluation in general, are covered, as are well-established conventional algorithms (e.g. Support Vector Machines) and the recently top-performing Deep Learning algorithms.

### Structure, Contents, Documents

The repository containing all slide sets and Jupyter notebooks: Lecture's repo. For getting started with Jupyter notebooks, see the hints here.

Note: The following table contains the contents and links to the slides of term WS 1819. The contents for term WS 1920 will be adapted to recent developments, in particular in the field of Deep Learning.

Lecture | Contents | Additional Material |
---|---|---|
1. Introduction | What is Machine Learning? Applications; Definitions; Categorization into supervised, unsupervised and reinforcement learning; Classification, Regression, Clustering, Model evaluation and selection | Intro Example Classification; Intro Example Regression |
2. Parametric, Generative Models | Probability Theory; Parametric, generative models; Bayesian Classification; Linear Regression | Example ML Estimation, Parametric Classification, Linear Regression |
3. Gaussian Process | Gaussian Process Regression | Gaussian Process Regression |
4. Linear Discriminants | Linear Discriminants, Logistic Regression, Activation Functions | |
5. Gradient Descent Learning | Loss functions, Gradient Descent, Stochastic Gradient Descent, Regularisation | |
6. Support Vector Machines | Classification, Regression | Character Recognition, Temperature Prediction |
7. Neural Networks 1 | SLP, MLP, Backpropagation, Dropout | SLP, MLP |
8. Deep Neural Networks | Convolutional Neural Networks, Deep Belief Nets, Stacked Autoencoders | CNN, MLP and CNN in Keras, Keras: Apply pretrained nets, Keras: Apply and adapt pretrained nets |
9. Generative Adversarial Networks (GANs) | DCGAN | |
10. Recurrent Neural Networks | RNN, LSTM, GRU, Bidirectional RNN | RNNs, Temperature Prediction with RNNs |
11. Unsupervised Learning | | |
12. (Deep) Reinforcement Learning | Value Iteration, Q-Learning, Q-Network, AlphaGo | |
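Gradient descent appears in the table as the learning engine behind several of the listed topics (linear models, SVMs, neural networks). A hedged sketch of the core idea, not taken from the course notebooks (the data, true weight 3.0 and learning rate are invented for illustration):

```python
# Gradient descent on a squared-error loss for a 1-D linear model y = w*x:
# repeatedly step the weight w against the gradient of the mean squared error.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = 3.0 * x                  # data generated by the (hidden) true weight w = 3

w = 0.0                      # initial guess
lr = 0.01                    # learning rate
for _ in range(500):
    grad = np.mean(2 * (w * x - y) * x)  # d/dw of mean((w*x - y)**2)
    w -= lr * grad                        # step downhill on the loss

print(round(w, 3))  # converges towards 3.0
```

Stochastic gradient descent (lecture 5) replaces the mean over all examples with the gradient on a single example or mini-batch per step; the update rule itself is unchanged.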