As we would like an early headcount of how many students are interested in this lecture, please enroll using the key learnDL if you plan to attend.
In recent years, deep learning has steadily grown in popularity, mainly due to its state-of-the-art performance in image and speech recognition, text mining, and related tasks. Deep neural networks automatically learn multi-level representations and features of data and are able to uncover complex underlying data structures.
The lecture aims to provide a basic theoretical and practical understanding of modern neural network approaches. We will start by covering the necessary background on traditional artificial neural networks, backpropagation, online learning, and regularization. We will then cover special methods used in deep learning, such as dropout and rectified linear units, and discuss further advanced topics such as convolutional layers, recurrent neural networks, and autoencoders.
- Teacher: Bernd Bischl
- Teacher: Susanne Dandl
- Teacher: Emilio Dorigatti
- Teacher: Julia Moosbauer
- Teacher: Mina Rezaei