Master of Engineering Science
Electrical and Computer Engineering
Convolutional neural networks (CNNs) are widely known in the literature to be extremely effective at classifying images. Some of the filters learned in the first layer of a trained CNN resemble Gabor filters, which are highly effective at extracting features from an image. Motivated by this, we replace the first layer of a CNN with a bank of Gabor filters to increase training speed and classification accuracy. We created two simple 5-layer AlexNet-like CNNs to compare grid search with random search for initializing the Gabor filter bank. We trained on MNIST, CIFAR-10, and CIFAR-100, as well as a rock dataset created at Western University to study the classification of rock images using a CNN. When training on this rock dataset, we use an architecture from the literature and apply our Gabor filter substitution method to demonstrate its use. The resulting Gabor convolutional neural network (GCNN) showed improvements in training speed across all datasets tested. We also found that the GCNN underperforms when dropout is added, even when overfitting becomes an issue. The size of the Gabor filter bank becomes a hyperparameter that can be tuned per dataset. Applying our Gabor filter replacement method to a 3-layer CNN reduced final accuracy at epoch 200 by 1.16% but greatly improved the speed of convergence during training, reaching 93.44% accuracy on a validation set after 10 epochs compared to the original network's 82.19%.
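The random-search initialization of a Gabor filter bank described above can be sketched as follows. This is a minimal illustration, not the thesis's actual code: the function names, parameter ranges, and fixed aspect ratio/phase are assumptions, and the standard real-valued Gabor form is used.

```python
import numpy as np

def gabor_kernel(size, theta, lam, sigma, gamma=0.5, psi=0.0):
    """Real-valued Gabor kernel of shape (size, size).

    theta: orientation, lam: wavelength, sigma: Gaussian envelope width,
    gamma: spatial aspect ratio, psi: phase offset (last two fixed here
    for brevity; in practice they could also be searched over).
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates by theta.
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    # Gaussian envelope times a cosine carrier.
    return (np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * xr / lam + psi))

def random_gabor_bank(n_filters, size, seed=None):
    """Random-search initialization: sample each filter's parameters
    uniformly from plausible ranges (ranges here are illustrative)."""
    rng = np.random.default_rng(seed)
    return np.stack([
        gabor_kernel(size,
                     theta=rng.uniform(0.0, np.pi),
                     lam=rng.uniform(2.0, float(size)),
                     sigma=rng.uniform(1.0, size / 2))
        for _ in range(n_filters)
    ])  # shape: (n_filters, size, size)

bank = random_gabor_bank(32, 5, seed=0)
print(bank.shape)  # (32, 5, 5)
```

The bank's size (here 32 filters of 5x5) is the hyperparameter the abstract notes can be tuned per dataset; the resulting array would be copied into the first convolutional layer's weights in place of randomly initialized kernels.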
Pham, Long, "Gabor Filter Initialization And Parameterization Strategies In Convolutional Neural Networks" (2019). Electronic Thesis and Dissertation Repository. 6155.