This is a description of classic convolutional neural networks and their implementations in PyTorch.
We briefly introduce the structure, the number of parameters (#params), and the FLOPs of each network.
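The parameter counts can be checked directly in PyTorch; a minimal counting sketch is shown below. The MAdd/FLOPs figures quoted later look like profiler output (e.g. torchstat), but that tool name is an assumption, not something stated here.

```python
# Minimal sketch: count trainable parameters of any nn.Module.
import torch.nn as nn

def count_params(model: nn.Module) -> int:
    """Total number of trainable parameters."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)

# Example: a single 3x3 conv layer with 16 output channels on an RGB input.
print(count_params(nn.Conv2d(3, 16, kernel_size=3)))  # 3*3*3*16 + 16 = 448
```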
LeNet, proposed by LeCun in 1998, contains 7 layers: 2 conv, 2 pooling, and 3 fc layers.
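A minimal LeNet-5 sketch, assuming the classic 32x32 grayscale input and the 6-16-120-84-10 layer sizes from the original paper:

```python
import torch
import torch.nn as nn

class LeNet5(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 6, kernel_size=5),   # conv1: 1x32x32 -> 6x28x28
            nn.Tanh(),
            nn.AvgPool2d(2),                  # pool1: 6x28x28 -> 6x14x14
            nn.Conv2d(6, 16, kernel_size=5),  # conv2: 6x14x14 -> 16x10x10
            nn.Tanh(),
            nn.AvgPool2d(2),                  # pool2: 16x10x10 -> 16x5x5
        )
        self.classifier = nn.Sequential(
            nn.Linear(16 * 5 * 5, 120),       # fc1
            nn.Tanh(),
            nn.Linear(120, 84),               # fc2
            nn.Tanh(),
            nn.Linear(84, num_classes),       # fc3
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)

# Quick shape check:
# LeNet5()(torch.randn(1, 1, 32, 32)).shape  # torch.Size([1, 10])
```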
AlexNet, proposed in 2012, contains 8 layers: 5 conv and 3 fc layers. It introduced several techniques, listed here and visible in the sketch after this list:
ReLU activations (instead of tanh/sigmoid)
Dropout (in the fully connected layers)
Local Response Normalization (LRN)
Overlapping Pooling (3x3 pooling windows with stride 2)
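A sketch of AlexNet using the single-GPU channel counts from the paper (96-256-384-384-256, an assumption about this repo's exact configuration); all four techniques above appear in the module:

```python
import torch
import torch.nn as nn

class AlexNet(nn.Module):
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 96, kernel_size=11, stride=4, padding=2),       # conv1
            nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),  # LRN
            nn.MaxPool2d(kernel_size=3, stride=2),                       # overlapping pooling
            nn.Conv2d(96, 256, kernel_size=5, padding=2),                # conv2
            nn.ReLU(inplace=True),
            nn.LocalResponseNorm(size=5, alpha=1e-4, beta=0.75, k=2.0),
            nn.MaxPool2d(kernel_size=3, stride=2),
            nn.Conv2d(256, 384, kernel_size=3, padding=1),               # conv3
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 384, kernel_size=3, padding=1),               # conv4
            nn.ReLU(inplace=True),
            nn.Conv2d(384, 256, kernel_size=3, padding=1),               # conv5
            nn.ReLU(inplace=True),
            nn.MaxPool2d(kernel_size=3, stride=2),
        )
        self.classifier = nn.Sequential(
            nn.Dropout(0.5),
            nn.Linear(256 * 6 * 6, 4096),   # fc1 (assumes 3x224x224 input)
            nn.ReLU(inplace=True),
            nn.Dropout(0.5),
            nn.Linear(4096, 4096),          # fc2
            nn.ReLU(inplace=True),
            nn.Linear(4096, num_classes),   # fc3
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        x = torch.flatten(x, 1)
        return self.classifier(x)
```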
VGG, proposed in 2014, is a family of networks with 16 to 19 weight layers.
We only implement VGG16, which consists of 13 conv, 5 pooling, and 3 fc layers.
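A compact VGG16 sketch built from the usual configuration list (13 conv layers in five blocks, each block closed by a 2x2 max-pool, then 3 fc layers); the dropout placement follows the common torchvision-style layout, which is an assumption about this repo's implementation:

```python
import torch
import torch.nn as nn

VGG16_CFG = [64, 64, "M", 128, 128, "M", 256, 256, 256, "M",
             512, 512, 512, "M", 512, 512, 512, "M"]

def make_vgg16_features() -> nn.Sequential:
    layers, in_ch = [], 3
    for v in VGG16_CFG:
        if v == "M":
            layers.append(nn.MaxPool2d(kernel_size=2, stride=2))
        else:
            layers += [nn.Conv2d(in_ch, v, kernel_size=3, padding=1),
                       nn.ReLU(inplace=True)]
            in_ch = v
    return nn.Sequential(*layers)

class VGG16(nn.Module):
    def __init__(self, num_classes: int = 1000):
        super().__init__()
        self.features = make_vgg16_features()
        self.classifier = nn.Sequential(
            nn.Linear(512 * 7 * 7, 4096), nn.ReLU(inplace=True), nn.Dropout(0.5),
            nn.Linear(4096, 4096), nn.ReLU(inplace=True), nn.Dropout(0.5),
            nn.Linear(4096, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)     # 3x224x224 -> 512x7x7
        x = torch.flatten(x, 1)
        return self.classifier(x)
```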
SqueezeNet, proposed in 2016, is a lightweight network that targets AlexNet-level accuracy with roughly 50x fewer parameters.
Statistics of our reimplementation:
Total params: 1,244,448
MAdd: 1.67G
FLOPs: 838.94M
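The building block behind these numbers is SqueezeNet's Fire module: a 1x1 "squeeze" layer followed by parallel 1x1 and 3x3 "expand" layers whose outputs are concatenated along the channel axis. A minimal sketch (the channel sizes in the usage comment are only illustrative):

```python
import torch
import torch.nn as nn

class Fire(nn.Module):
    def __init__(self, in_ch: int, squeeze_ch: int, expand1x1_ch: int, expand3x3_ch: int):
        super().__init__()
        self.squeeze = nn.Sequential(
            nn.Conv2d(in_ch, squeeze_ch, kernel_size=1), nn.ReLU(inplace=True))
        self.expand1x1 = nn.Sequential(
            nn.Conv2d(squeeze_ch, expand1x1_ch, kernel_size=1), nn.ReLU(inplace=True))
        self.expand3x3 = nn.Sequential(
            nn.Conv2d(squeeze_ch, expand3x3_ch, kernel_size=3, padding=1), nn.ReLU(inplace=True))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.squeeze(x)
        # Concatenate the two expand branches along the channel dimension.
        return torch.cat([self.expand1x1(x), self.expand3x3(x)], dim=1)

# Example: Fire(96, 16, 64, 64) maps a 96-channel feature map to a 128-channel one.
```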