### Organisation

| | |
|---|---|
| Term | only SS (summer semester) |
| Time | Tuesday, 14.15h-17.30h |
| Room | 136 |
| Credits | 3 SWS / 5 ECTS |
| Exam | Lab Work |

#### Announcements

**This lecture will be held next time in summer term 2018**

#### Prerequisites for participation

- Successful completion of either the CSM Machine Learning Lecture or the CSM Object Recognition Lecture

### Structure, Contents, Documents

In this lab, student groups implement selected applications from Artificial Intelligence, Machine Learning, Object Recognition, and Natural Language Processing. All applications must be implemented as Python Jupyter notebooks, using the Python machine learning libraries scikit-learn, Tensorflow, and Keras.

I recommend downloading Anaconda for Python 3.6. Then create a new virtual environment with

`conda create -n pia python=3.6 anaconda`

In this virtual environment, use `pip install` to install `tensorflow`, `keras`, `gensim`, and other required modules.

Each of the lab exercises (applications) will be graded. The final grade of the course is the mean of the exercise grades.

### Preliminary Timeplan and link to resources

| Lecture | Contents | Document Links |
|---|---|---|
| 20.03.2018 | Short Introduction and Registration | |
| 27.03.2018 | Introduction to Tensorflow | |
| 03.04.2018 | Data Mining Process in Scikit-Learn | |
| 10.04.2018 | Generative Adversarial Networks (GAN) in Tensorflow | |
| 17.04.2018 | Word Embeddings and Deep Neural Networks for Document Classification (Keras) | |
| 24.04.2018 | Deep Reinforcement Learning with Tensorflow (1) | |
| 08.05.2018 | Deep Reinforcement Learning with Tensorflow (2) | |
| 15.05.2018 | LSTM - Time Series Modelling with Keras | |
| 29.05.2018 | Music Generation (Tensorflow) | |

### Description of the Lab-Exercises

#### 1. Introduction: The Machine Learning Process in General

All of the following items will be demonstrated by scikit-learn implementations:

- Access and clean data
- Understand data: Descriptive statistics and visualization
- Transformation of categorical attributes
- Feature selection and extraction
- Normalisation and Scaling
- Test, evaluate and optimize learned models
- Performance measures and visualization of performance

Goals of this lecture:

- Understand all steps required to implement and solve typical machine-learning tasks
- Know how to define and design machine-learning projects
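The steps above can be sketched as a single scikit-learn pipeline. This is a minimal illustration, not the lab's actual exercise; the Iris dataset and logistic regression classifier are chosen here only because they keep the example self-contained:

```python
# Minimal end-to-end sketch of the machine-learning process:
# access data, scale features, train a model, evaluate performance.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# The pipeline chains scaling and classification, so the test data is
# transformed with statistics learned on the training data only.
model = Pipeline([
    ("scale", StandardScaler()),
    ("clf", LogisticRegression(max_iter=200)),
])
model.fit(X_train, y_train)
acc = accuracy_score(y_test, model.predict(X_test))
print(f"test accuracy: {acc:.2f}")
```

The same pipeline object can be passed to `GridSearchCV` or `cross_val_score` to cover the optimization and evaluation steps.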

#### 2. Introduction to Tensorflow

This lecture will introduce the Tensorflow framework for Deep Learning. The students will build a simple neural network to classify the MNIST dataset.

Goals of this lecture:

- Understand the basics of Tensorflow Graphs, lazy execution and why this is important for performance
- Build a basic classifier for the MNIST Dataset using Tensorflow
- Train the classifier using mini-batches
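The "graph first, execute later" idea can be illustrated without Tensorflow itself. The following toy sketch (pure Python, all names invented for illustration) mimics how Tensorflow 1.x separates graph construction from execution:

```python
# Toy illustration of a deferred computation graph, the idea behind
# Tensorflow 1.x: building nodes only records operations; nothing is
# computed until the graph is explicitly run (like sess.run()).
class Node:
    def __init__(self, op, inputs=(), value=None):
        self.op, self.inputs, self.value = op, inputs, value

def const(v):
    return Node("const", value=v)

def add(a, b):
    return Node("add", (a, b))

def mul(a, b):
    return Node("mul", (a, b))

def run(node):
    """Evaluate a node lazily by recursing over its inputs."""
    if node.op == "const":
        return node.value
    a, b = (run(n) for n in node.inputs)
    return a + b if node.op == "add" else a * b

# Build the graph first -- no arithmetic happens on these two lines ...
x = const(3)
y = mul(add(x, const(2)), const(4))   # (3 + 2) * 4

# ... then execute it, analogous to sess.run(y) in Tensorflow.
print(run(y))   # 20
```

Because the full graph is known before execution, a real framework can optimize it and place operations on a GPU, which is why lazy execution matters for performance.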

#### 3. MNIST GAN (Tensorflow)

In this lecture the students will build a generative adversarial neural network (GAN) to generate images similar to the MNIST dataset.

Goals of this lecture:

- Understand the basics of GANs
- Understand more advanced Tensorflow features (Variable Scopes, Variable reuse, Tensorboard)
- Build and train a simple GAN network to generate MNIST-like images
- Use the Deeplearn machines @ HdM
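The adversarial objective at the heart of a GAN can be written down in a few lines. This is a hedged numpy sketch of the two loss terms only (the discriminator outputs are made-up numbers, not the output of a trained network):

```python
import numpy as np

def bce(p, target):
    """Binary cross-entropy of probabilities p against a constant label."""
    eps = 1e-7
    p = np.clip(p, eps, 1 - eps)
    return -np.mean(target * np.log(p) + (1 - target) * np.log(1 - p))

# Discriminator outputs D(x) on real images and D(G(z)) on generated ones.
d_real = np.array([0.90, 0.80, 0.95])   # should be pushed towards 1
d_fake = np.array([0.10, 0.20, 0.05])   # should be pushed towards 0

# Discriminator loss: classify real as 1 and fake as 0.
d_loss = bce(d_real, 1.0) + bce(d_fake, 0.0)
# Generator loss: fool the discriminator, i.e. push D(G(z)) towards 1.
g_loss = bce(d_fake, 1.0)
print(d_loss, g_loss)
```

In the exercise, these two losses are minimized alternately with two optimizers, one updating only the discriminator variables and one only the generator variables, which is where Tensorflow variable scopes come in.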

#### 4. Deep Reinforcement Learning (Tensorflow)

In this lecture the students will build various reinforcement learning agents. Starting from simple gridworld environments, the goal is to implement and understand the core concepts of “classical” reinforcement learning. In the second part the students will build a Deep Q-Network agent (DQN) that is able to play Atari video games like Pong or Breakout.

Goals of this lecture:

- Understand/Recap the core concepts of reinforcment learning (agent-environment loop, observations, policies, exploration, etc.)
- Solve a gridworld environment with dynamic programming
- Implement a gridworld agent with tabular Q-learning
- Implement a video game playing agent with Deep Q-Networks (DQN)
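The tabular Q-learning part can be sketched on a toy environment. The 1-D "gridworld", rewards, and hyperparameters below are illustrative only, not the lab's actual environment:

```python
import random
random.seed(0)

# Toy 1-D gridworld: states 0..4, start at state 0, reward 1 at the goal.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]                      # move left / move right

Q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, eps = 0.5, 0.9, 0.1       # learning rate, discount, exploration

for episode in range(200):
    s = 0
    while s != GOAL:
        # epsilon-greedy action selection
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda a: Q[(s, a)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == GOAL else 0.0
        # Q-learning update: bootstrap from the best action in the next state
        best_next = max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

# The learned greedy policy should move right in every state before the goal.
policy = {s: max(ACTIONS, key=lambda a: Q[(s, a)]) for s in range(GOAL)}
print(policy)
```

DQN replaces the Q-table with a neural network that maps raw observations (Atari frames) to Q-values, but the update rule is the same Bellman bootstrap shown above.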

#### 5. Word Embeddings and Text Classification (Keras)

This lecture comprises:

- Access and preprocess text data
- Distributional semantics and word-embeddings
- Learn word-embeddings from Wikipedia dump
- Apply word-embeddings for text classification with deep neural networks (CNNs)

Goals of this lecture:

- Understand the concept and potential of word-embeddings
- Know how to build a deep neural network for text-classification
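The core idea — that related words get nearby vectors, and that documents can be represented by combining word vectors — can be shown with toy embeddings. The vectors below are hand-made for illustration; in the lab they would be learned from a Wikipedia dump (e.g. with gensim):

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Toy 4-dimensional word embeddings (hand-made, purely illustrative).
emb = {
    "king":  np.array([0.9, 0.8, 0.1, 0.0]),
    "queen": np.array([0.8, 0.9, 0.1, 0.1]),
    "apple": np.array([0.0, 0.1, 0.9, 0.8]),
}

print(cosine(emb["king"], emb["queen"]))   # high: semantically related
print(cosine(emb["king"], emb["apple"]))   # low: unrelated

# A simple document representation: the mean of its word vectors,
# which can then be fed into a classifier.
doc = np.mean([emb[w] for w in ["king", "queen"]], axis=0)
print(doc.shape)
```

For the CNN classifier in the exercise, the embedding vectors are not averaged but stacked into a matrix (one row per word), which the convolutional layers then scan like an image.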

#### 6. Time-Series

This lecture will introduce how recurrent neural networks (RNNs) can learn from temporal sequences of data to predict future values. For this, the students will build and train an LSTM network that composes new music given an initial *seed* of music. The LSTM will be trained on classical music.

Goals of this lecture:

- Understand the basics of RNNs and LSTM networks
- Understand how RNNs are able to predict the future (and why uncertainty increases over time)
- Build a model that can compose new music
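In the lab the LSTM comes ready-made as a Keras layer, but the gating mechanism it implements can be sketched in a few lines of numpy. This is a minimal single-cell illustration (weights are random, shapes and names are chosen for readability):

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step: gates control what is forgotten from,
    written to, and exposed from the memory cell c."""
    z = W @ x + U @ h + b                          # all four gates at once
    i, f, o, g = np.split(z, 4)
    i, f, o = sigmoid(i), sigmoid(f), sigmoid(o)   # input/forget/output gates
    g = np.tanh(g)                                 # candidate cell values
    c_new = f * c + i * g                          # update the memory cell
    h_new = o * np.tanh(c_new)                     # new hidden state (output)
    return h_new, c_new

rng = np.random.default_rng(0)
n_in, n_hid = 3, 4
W = rng.normal(size=(4 * n_hid, n_in)) * 0.1
U = rng.normal(size=(4 * n_hid, n_hid)) * 0.1
b = np.zeros(4 * n_hid)

# Unroll the cell over a short random input sequence.
h = c = np.zeros(n_hid)
for x in rng.normal(size=(5, n_in)):
    h, c = lstm_step(x, h, c, W, U, b)
print(h.shape, c.shape)
```

The memory cell `c` is what lets the network carry information over many time steps; for music generation, the trained model is fed its own prediction back as the next input, which is also why uncertainty accumulates the further it composes beyond the seed.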