Deep neural networks have been shown to excel at image classification tasks, but they are harder to train. Their complexity and the vanishing-gradient problem typically demand more time and computational power to train deeper networks. Deep residual networks (ResNets) can make the training process faster and achieve higher accuracy than their equivalent plain neural networks. ResNets accomplish this improvement by adding an identity skip connection in parallel with the layers of a convolutional neural network. In this project, we first design a ResNet model to perform the image classification task on the Tiny ImageNet dataset with high accuracy. We then compare the performance of this ResNet model with that of its equivalent convolutional network (ConvNet). Our findings show that ResNets are more prone to overfitting despite their higher accuracy.
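The identity skip connection described above can be sketched in a few lines. The following is an illustrative NumPy toy, not the paper's actual convolutional model: the `residual_block` function and its weight matrices `w1`, `w2` are assumed names, and fully connected layers stand in for convolutions. It shows the core idea that the block computes F(x) + x, so representing the identity mapping only requires the learned residual F to be near zero.

```python
import numpy as np

def relu(x):
    # Elementwise rectified linear unit.
    return np.maximum(0.0, x)

def residual_block(x, w1, w2):
    # Two weight layers computing the residual mapping F(x),
    # followed by the identity skip connection: relu(F(x) + x).
    out = relu(w1 @ x)   # first layer with nonlinearity
    out = w2 @ out       # second layer, no activation yet
    return relu(out + x) # add the shortcut, then apply ReLU

# With zero weights the residual F(x) vanishes, so the block
# reduces to relu(x): the identity mapping is trivial to represent.
x = np.array([1.0, -2.0, 3.0])
w = np.zeros((3, 3))
print(residual_block(x, w, w))  # → [1. 0. 3.]
```

This illustrates why residual blocks ease optimization: pushing the residual toward zero is easier for gradient descent than forcing a stack of plain layers to approximate an identity function.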
Dr. PREETHA
Keywords: Deep neural networks, ResNet, ReLU
