The repository latex-math is used; please read its README here: https://github.com/compstat-lmu/latex-math
Please observe the following rules when creating and editing the lecture and exercise slides:
- Clone this repository
- Clone the latex-math repository into the main directory of this repository
- Navigate to the folder containing the slide set, e.g. `2020/01-introduction`
- If there is a Makefile in the folder, run `make -f Makefile`; otherwise render the slides with `knitr::knit2pdf("slides.Rnw")`
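The setup steps above can be sketched as a shell session; `<this-repo-url>` is a placeholder for the actual clone URL, and `2020/01-introduction` is just the example folder from above:

```shell
# Sketch of the build steps; <this-repo-url> is a placeholder.
git clone <this-repo-url> lecture-repo
cd lecture-repo
# latex-math must sit in the main directory, since the slides include its macros
git clone https://github.com/compstat-lmu/latex-math
cd 2020/01-introduction
# Prefer the Makefile if one exists, otherwise knit the .Rnw directly
if [ -f Makefile ]; then
    make -f Makefile
else
    Rscript -e 'knitr::knit2pdf("slides.Rnw")'
fi
```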
Topics may still be added or reworked; the standard ones are already covered by the slides:
- Introduction, Overview, and a Brief History of Deep Learning
- Deep Feed-Forward Neural Networks, Gradient Descent, Backpropagation, Hardware and Software
- Regularization of Neural Networks, Early Stopping
- Dropout and Challenges in Optimization
- Advances in Optimization
- Activation Functions and Initialization
- Convolutional Neural Networks, Variants of CNNs, Applications
- Modern CNNs and Overview of some Applications
- Recurrent Neural Networks
- Modern RNNs and Applications
- Deep Unsupervised Learning
- Autoencoders, AE Regularization and Variants
- Manifold Learning
- Deep Generative Models, VAE, GANs
- Math within a text line is set with `$...$`; separate (display) equation lines are set with `$$...$$`
- The abbreviations (macros) defined in the latex-math header files should always be used in the slide sources for simplification
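The two math modes and the header macros can be combined as in this sketch; `\xv` and `\fx` are examples of latex-math macros (check the header files for the exact names defined there):

```latex
% Inline math uses $...$, display equations use $$...$$.
% \xv (feature vector) and \fx (model prediction) are assumed to be
% macros from the latex-math header files included by the slides.
The model maps an input $\xv \in \mathbb{R}^p$ to a prediction $\fx$:
$$
\fx = \sigma(\mathbf{W} \xv + \mathbf{b})
$$
```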
- Alex J. Smola et al. (2020): Dive into Deep Learning. An interactive deep learning book with code, math, and discussions; provides NumPy/MXNet, PyTorch, and TensorFlow implementations (free HTML version)
- Goodfellow, Bengio, Courville (2016): Deep Learning (free HTML version)
- Awesome Deep Learning
- Andrej Karpathy blog
- Coursera course "Neural Networks for Machine Learning"
- YouTube talk by Geoff Hinton: "Recent Developments in Deep Learning"
- Practical Deep Learning For Coders - contains many detailed python notebooks on how to implement different DL architectures
- The Matrix Calculus You Need For Deep Learning
- Tensorflow Neural Network Playground
- A Weird Introduction to Deep Learning
- Deep Learning Achievements Over the Past Year
- Scalable Deep Learning (Talk)
- Deep Learning Resources
- distill.pub: in-depth explanations of important concepts, worth checking out periodically for new material
- Why Momentum Really Works
- Adam -- latest trends in deep learning optimization
- Overview of Gradient Descent Optimization Algorithms
- Yes you should understand backprop
- A Recipe for Training Neural Networks
- The Sobel and Laplacian Edge Detectors
- Keras Blog: How convolutional neural networks see the world
- The 9 Deep Learning Papers You Need To Know About
- Python based visualization repo for CNNs
- Computing Receptive Fields of Convolutional Neural Networks
- Deconvolution and Checkerboard Artifacts
- Understanding Convolution in Deep Learning
- What do we learn from region-based object detectors (Faster R-CNN, R-FCN, FPN)
- How Convolutional Neural Networks see the World
- Attention in Neural Networks and How to Use It
- Neural Networks - A Systematic Introduction (FU Berlin)
- Deep Learning - The Straight Dope (contains notebooks designed to teach deep learning)
- Stanford: Convolutional Neural Networks for Visual Recognition
- Understanding LSTM and its diagrams
- The most comprehensive yet simple and fun RNN/LSTM tutorial on the Internet.
- R vs Python: Image Classification with Keras
- H2O-related resources:
- other
- PyData-2017-TF_TFS: Slides for Spark + Tensorflow + Notebooks