Learn PyTorch with examples.
Toy example that shows how to write a customized Function and Module.
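As a rough illustration of the pattern (not the repository's actual toy example), here is a minimal custom autograd Function (a hand-written ReLU) used inside a custom Module; all names are illustrative.

```python
import torch
from torch import nn
from torch.autograd import Function


class MyReLU(Function):
    """Custom autograd Function: forward and backward written by hand."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)          # stash the input for the backward pass
        return x.clamp(min=0)

    @staticmethod
    def backward(ctx, grad_output):
        (x,) = ctx.saved_tensors
        grad_input = grad_output.clone()
        grad_input[x < 0] = 0             # gradient is zero where the input was negative
        return grad_input


class TwoLayerNet(nn.Module):
    """Custom Module that calls the custom Function via .apply()."""

    def __init__(self, d_in, d_hidden, d_out):
        super().__init__()
        self.fc1 = nn.Linear(d_in, d_hidden)
        self.fc2 = nn.Linear(d_hidden, d_out)

    def forward(self, x):
        return self.fc2(MyReLU.apply(self.fc1(x)))


if __name__ == "__main__":
    net = TwoLayerNet(4, 8, 2)
    out = net(torch.randn(3, 4))
    out.sum().backward()                  # gradients flow through the custom Function
```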
Implementation of RNNs (vanilla RNN, GRU, LSTM, LSTMP) from scratch. Gradients are clipped to avoid explosion, using PyTorch Variable's register_hook function.
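A minimal sketch of the clipping idea with a hand-written vanilla RNN cell (newer PyTorch exposes register_hook directly on tensors); the cell sizes and clipping threshold here are illustrative, not the repository's settings.

```python
import torch
from torch import nn


class VanillaRNNCell(nn.Module):
    """h_t = tanh(x_t @ W_ih + h_{t-1} @ W_hh + b), written out by hand."""

    def __init__(self, input_size, hidden_size, clip=1.0):
        super().__init__()
        self.w_ih = nn.Parameter(torch.randn(input_size, hidden_size) * 0.1)
        self.w_hh = nn.Parameter(torch.randn(hidden_size, hidden_size) * 0.1)
        self.bias = nn.Parameter(torch.zeros(hidden_size))
        self.clip = clip

    def forward(self, x, h):
        h = torch.tanh(x @ self.w_ih + h @ self.w_hh + self.bias)
        if h.requires_grad:
            # register_hook clips the gradient flowing into this hidden state,
            # limiting gradient explosion through time.
            h.register_hook(lambda g: g.clamp(-self.clip, self.clip))
        return h


if __name__ == "__main__":
    cell = VanillaRNNCell(3, 5)
    h = torch.zeros(2, 5)
    for t in range(10):                   # unroll over 10 time steps
        h = cell(torch.randn(2, 3), h)
    h.sum().backward()
```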
BinaryNet with PyTorch. Learning is manipulated either by 1) modifying the optimizer (mlp.py) or 2) using param_groups (cnn.py).
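A minimal sketch of the param_groups approach, assuming an arbitrary two-layer model rather than the repository's BinaryNet: each group carries its own learning rate, and the rate can be edited in place during training.

```python
import torch
from torch import nn

model = nn.Sequential(nn.Linear(10, 20), nn.ReLU(), nn.Linear(20, 2))

# Two parameter groups, each with its own learning rate.
optimizer = torch.optim.Adam([
    {"params": model[0].parameters(), "lr": 1e-2},
    {"params": model[2].parameters(), "lr": 1e-3},
])

# Hyper-parameters can be changed per group at any time,
# e.g. to scale the updates applied to binarized layers.
for group in optimizer.param_groups:
    group["lr"] *= 0.5
```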
Extend PyTorch with cffi.
meProp training (back-propagation using only the top-k components of the gradient)
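meProp keeps only the k largest-magnitude components of a linear layer's output gradient during the backward pass. A rough sketch of that idea via a custom autograd Function, with k and the layer sizes chosen arbitrarily:

```python
import torch
from torch import nn
from torch.autograd import Function


class TopKLinearFunction(Function):
    """Linear layer whose backward pass keeps only the k largest-magnitude
    entries of the output gradient per example, as in meProp."""

    @staticmethod
    def forward(ctx, x, weight, bias, k):
        ctx.save_for_backward(x, weight)
        ctx.k = k
        return x @ weight.t() + bias

    @staticmethod
    def backward(ctx, grad_out):
        x, weight = ctx.saved_tensors
        # Sparsify: keep the top-k entries of grad_out per row, zero the rest.
        _, idx = grad_out.abs().topk(ctx.k, dim=1)
        sparse_grad = torch.zeros_like(grad_out).scatter_(1, idx, grad_out.gather(1, idx))
        grad_x = sparse_grad @ weight
        grad_w = sparse_grad.t() @ x
        grad_b = sparse_grad.sum(dim=0)
        return grad_x, grad_w, grad_b, None


class TopKLinear(nn.Module):
    def __init__(self, in_features, out_features, k=8):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.01)
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.k = k

    def forward(self, x):
        return TopKLinearFunction.apply(x, self.weight, self.bias, self.k)


if __name__ == "__main__":
    layer = TopKLinear(32, 16, k=4)
    layer(torch.randn(5, 32)).sum().backward()
```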
Focal Loss
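A minimal multi-class focal loss sketch. A single scalar alpha is used here for simplicity (per-class weighting is omitted); gamma=2 and alpha=0.25 follow common defaults, and all names are illustrative.

```python
import torch
import torch.nn.functional as F
from torch import nn


class FocalLoss(nn.Module):
    """FL(p_t) = -alpha * (1 - p_t)^gamma * log(p_t), averaged over the batch."""

    def __init__(self, gamma=2.0, alpha=0.25):
        super().__init__()
        self.gamma = gamma
        self.alpha = alpha

    def forward(self, logits, target):
        log_p = F.log_softmax(logits, dim=1)
        log_pt = log_p.gather(1, target.unsqueeze(1)).squeeze(1)  # log-prob of the true class
        pt = log_pt.exp()
        loss = -self.alpha * (1 - pt) ** self.gamma * log_pt      # down-weight easy examples
        return loss.mean()


if __name__ == "__main__":
    loss = FocalLoss()(torch.randn(4, 10), torch.randint(0, 10, (4,)))
```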
Squeeze-and-Excitation Networks
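A minimal sketch of one SE block: squeeze by global average pooling, excitation by two fully connected layers with a sigmoid gate, then channel-wise rescaling. The reduction ratio of 16 is the paper's default; other names are illustrative.

```python
import torch
from torch import nn


class SEBlock(nn.Module):
    """Squeeze-and-Excitation block: recalibrates channels of a feature map."""

    def __init__(self, channels, reduction=16):
        super().__init__()
        self.fc1 = nn.Linear(channels, channels // reduction)
        self.fc2 = nn.Linear(channels // reduction, channels)

    def forward(self, x):                        # x: (N, C, H, W)
        s = x.mean(dim=(2, 3))                   # squeeze: global average pooling -> (N, C)
        s = torch.relu(self.fc1(s))
        s = torch.sigmoid(self.fc2(s))           # excitation: per-channel gates in (0, 1)
        return x * s.view(x.size(0), -1, 1, 1)   # rescale the channels


if __name__ == "__main__":
    y = SEBlock(64)(torch.randn(2, 64, 8, 8))
```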
Sigmoid-weighted Linear Units (i.e. Swish activation)
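The activation itself is one line; recent PyTorch releases also ship it built in as nn.SiLU.

```python
import torch
from torch import nn


class Swish(nn.Module):
    """Sigmoid-weighted linear unit: f(x) = x * sigmoid(x)."""

    def forward(self, x):
        return x * torch.sigmoid(x)


y = Swish()(torch.randn(3))
```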
Modified (corrected) Adam optimizer.
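The item does not say which correction is meant. As one common example, here is a sketch of Adam with decoupled weight decay (AdamW-style), written as a custom torch.optim.Optimizer; treat it as an illustration, not the repository's implementation.

```python
import math
import torch
from torch.optim import Optimizer


class DecoupledAdam(Optimizer):
    """Adam variant with weight decay applied directly to the weights
    instead of being folded into the gradient (AdamW-style)."""

    def __init__(self, params, lr=1e-3, betas=(0.9, 0.999), eps=1e-8, weight_decay=1e-2):
        super().__init__(params, dict(lr=lr, betas=betas, eps=eps, weight_decay=weight_decay))

    @torch.no_grad()
    def step(self):
        for group in self.param_groups:
            beta1, beta2 = group["betas"]
            for p in group["params"]:
                if p.grad is None:
                    continue
                state = self.state[p]
                if len(state) == 0:
                    state["step"] = 0
                    state["exp_avg"] = torch.zeros_like(p)
                    state["exp_avg_sq"] = torch.zeros_like(p)
                exp_avg, exp_avg_sq = state["exp_avg"], state["exp_avg_sq"]
                state["step"] += 1
                t = state["step"]

                exp_avg.mul_(beta1).add_(p.grad, alpha=1 - beta1)
                exp_avg_sq.mul_(beta2).addcmul_(p.grad, p.grad, value=1 - beta2)

                step_size = group["lr"] * math.sqrt(1 - beta2 ** t) / (1 - beta1 ** t)

                # Decoupled weight decay: shrink the weights directly.
                p.mul_(1 - group["lr"] * group["weight_decay"])
                p.addcdiv_(exp_avg, exp_avg_sq.sqrt().add_(group["eps"]), value=-step_size)
```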
CUDA extension built by compiling CUDA kernels on the fly (just-in-time).
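A minimal sketch using torch.utils.cpp_extension.load_inline, which compiles the kernel just-in-time when the script runs; it needs a CUDA device plus the nvcc toolchain, and the kernel, names, and sizes are illustrative.

```python
import torch
from torch.utils.cpp_extension import load_inline

cuda_source = r"""
#include <torch/extension.h>

__global__ void scale_kernel(const float* x, float* y, float a, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) y[i] = a * x[i];
}

torch::Tensor scale(torch::Tensor x, float a) {
    auto y = torch::empty_like(x);
    int n = x.numel();
    int threads = 256;
    int blocks = (n + threads - 1) / threads;
    scale_kernel<<<blocks, threads>>>(x.data_ptr<float>(), y.data_ptr<float>(), a, n);
    return y;
}
"""

# Declaring the function in cpp_sources lets load_inline generate the Python binding.
module = load_inline(
    name="scale_ext",
    cpp_sources="torch::Tensor scale(torch::Tensor x, float a);",
    cuda_sources=cuda_source,
    functions=["scale"],
    verbose=True,
)

x = torch.arange(4, dtype=torch.float32, device="cuda")
print(module.scale(x, 2.0))  # tensor([0., 2., 4., 6.], device='cuda:0')
```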
Deep Gradient Compression.
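The core of Deep Gradient Compression is to exchange only the largest-magnitude gradient entries each step and to accumulate the rest locally as a residual. A rough sketch of that part only (momentum correction and the actual all-reduce between workers are omitted); the sparsity value and class name are illustrative. A typical step would be loss.backward(), then compressor.compress(model), then optimizer.step().

```python
import torch


class GradientCompressor:
    """Keep only the largest-magnitude gradient entries for communication;
    accumulate everything else locally and add it back on later steps."""

    def __init__(self, model, sparsity=0.01):
        self.sparsity = sparsity
        self.residuals = {n: torch.zeros_like(p) for n, p in model.named_parameters()}

    @torch.no_grad()
    def compress(self, model):
        sparse_grads = {}
        for name, p in model.named_parameters():
            if p.grad is None:
                continue
            acc = self.residuals[name] + p.grad           # add leftover gradient from earlier steps
            k = max(1, int(acc.numel() * self.sparsity))  # number of entries actually transmitted
            flat = acc.flatten()
            _, idx = flat.abs().topk(k)
            sparse = torch.zeros_like(flat)
            sparse[idx] = flat[idx]
            self.residuals[name] = (flat - sparse).view_as(acc)  # keep the rest for later
            sparse_grads[name] = sparse.view_as(acc)      # this is what would be all-reduced
            p.grad.copy_(sparse_grads[name])              # use the sparsified gradient locally
        return sparse_grads
```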