In this blog, I’m going to discuss a code implementation of LeNet using Keras. The implementation follows the original research paper: http://yann.lecun.com/exdb/publis/pdf/lecun-98.pdf.
Excluding the input, LeNet-5 has seven layers, each containing trainable parameters. Every layer produces multiple feature maps; each feature map is extracted from the input by means of a convolution filter, and each feature map contains multiple neurons.
First, we need the basic imports.
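The post does not show the exact import block, but for a Keras build like this one the imports might look as follows (assuming the TensorFlow-bundled Keras):

```python
# Keras pieces used throughout the post (assuming TensorFlow's bundled Keras)
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPool2D, Flatten, Dense
```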
Next, we define a Sequential model in which Conv2D and MaxPool2D layers are stacked in alternating, repeated fashion, with Dense layers at the end. The Conv2D and MaxPool2D layers are used for feature extraction, and the Dense layers are used for classification.
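A minimal sketch of such a stack is below. Since the post does not reproduce the exact code, the 28×28×1 (MNIST-style) input shape, the ReLU activations, and the compile settings are my assumptions; the layer sizes (6 and 16 feature maps, 120/84/10 units) follow the LeNet-5 paper.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPool2D, Flatten, Dense

# LeNet-style stack: Conv2D + MaxPool2D repeated for feature extraction,
# Dense layers for classification. The 28x28x1 input and ReLU are assumptions.
model = Sequential([
    Input(shape=(28, 28, 1)),                      # single-channel image
    Conv2D(6, kernel_size=5, activation="relu"),   # 6 feature maps, 5x5 filters -> 24x24
    MaxPool2D(pool_size=2),                        # downsample to 12x12
    Conv2D(16, kernel_size=5, activation="relu"),  # 16 feature maps -> 8x8
    MaxPool2D(pool_size=2),                        # downsample to 4x4
    Flatten(),                                     # 4*4*16 = 256 features
    Dense(120, activation="relu"),
    Dense(84, activation="relu"),
    Dense(10, activation="softmax"),               # 10 digit classes
])
model.compile(optimizer="adam",
              loss="categorical_crossentropy",
              metrics=["accuracy"])
```

With these shapes the model ends up with 44,426 trainable parameters; `model.summary()` prints the layer-by-layer breakdown.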
I made some changes to the activation function and the input shape: the original paper uses the tanh activation function and a 32*32 input shape. You can see the complete architecture in Fig. 1.
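For comparison, a paper-style variant might be sketched like this. This is a common simplification, not the paper verbatim: I use AveragePooling2D as a stand-in for the paper's subsampling layers, fully connected C3 (the paper uses a sparse connection table), and a softmax output instead of the paper's RBF layer.

```python
from tensorflow.keras import Input
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, AveragePooling2D, Flatten, Dense

# Paper-style variant: tanh activations and 32x32 inputs, with average
# pooling approximating the paper's subsampling layers.
lenet5 = Sequential([
    Input(shape=(32, 32, 1)),
    Conv2D(6, kernel_size=5, activation="tanh"),   # C1: 6 maps, 28x28
    AveragePooling2D(pool_size=2),                 # S2: 14x14
    Conv2D(16, kernel_size=5, activation="tanh"),  # C3: 16 maps, 10x10
    AveragePooling2D(pool_size=2),                 # S4: 5x5
    Flatten(),                                     # 5*5*16 = 400 features
    Dense(120, activation="tanh"),                 # C5
    Dense(84, activation="tanh"),                  # F6
    Dense(10, activation="softmax"),               # output (softmax, not RBF)
])
```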
To get the complete notebook, use this link: https://github.com/anjanimsp/LeNet_From_Scratch