Doctor of Philosophy (PhD)
Deep neural networks currently play a prominent role in solving problems across a wide variety of disciplines. Improving the performance of deep learning models and reducing their training times remain ongoing challenges. Increasing the depth of a network improves performance but suffers from the problem of vanishing gradients and increased training times. In this research, we design methods to address these challenges in deep neural networks and demonstrate deep learning applications in several domains. We propose a gradient amplification based approach to train deep neural networks, which improves their training and testing accuracies, addresses vanishing gradients, and reduces training time by reaching higher accuracies even at higher learning rates. We also develop an integrated training strategy that enables or disables amplification during certain epochs. Detailed analysis is performed on different neural networks using random amplification, in which the layers to be amplified are selected randomly. The effects of gradient amplification with respect to the number of layers, types of layers, amplification factors, training strategies, and learning rates are studied in detail. With this knowledge, effective ways to update gradients are designed to perform amplification at both the layer level and the neuron level. Lastly, we provide applications of deep learning methods to challenging problems in the areas of smart grids and bioinformatics. Deep neural networks with feedforward architectures are used to detect data integrity attacks in smart grids. We propose an image based preprocessing method to convert heterogeneous genomic sequences into images, which are then classified to detect Hepatitis C virus (HCV) infection stages. In summary, this research advances deep learning techniques and their applications to real world problems.
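The core idea of random gradient amplification described above can be sketched in a few lines: during backpropagation, the gradients of a randomly chosen subset of layers are scaled by an amplification factor before the weight update. The sketch below is a minimal, framework-free illustration under assumed hyperparameters; the function name `amplify_gradients`, the amplification factor, and the fraction of layers amplified are hypothetical choices for illustration, not values from the dissertation.

```python
import random

def amplify_gradients(layer_grads, amp_factor=2.0, frac=0.5, rng=None):
    """Scale the gradients of a randomly selected subset of layers.

    layer_grads : list of per-layer gradient values (floats here for
                  simplicity; in practice these would be tensors).
    amp_factor  : multiplier applied to the chosen layers' gradients
                  (illustrative value, not from the dissertation).
    frac        : fraction of layers to amplify (illustrative value).
    """
    rng = rng or random.Random(0)  # fixed seed for reproducibility
    n_pick = max(1, int(len(layer_grads) * frac))
    chosen = set(rng.sample(range(len(layer_grads)), n_pick))
    # Amplify gradients only for the randomly chosen layers.
    return [g * amp_factor if i in chosen else g
            for i, g in enumerate(layer_grads)]

# Example: amplify half of a 4-layer network's gradients by 3x.
amplified = amplify_gradients([1.0, 1.0, 1.0, 1.0], amp_factor=3.0, frac=0.5)
```

In a real training loop this scaling would be applied to each layer's gradient tensor between the backward pass and the optimizer step (e.g., via backward hooks in an autograd framework), and could be toggled on or off at chosen epochs to realize the integrated training strategy.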
Basodi, Sunitha, "Advances in Deep Learning through Gradient Amplification and Applications." Dissertation, Georgia State University, 2020.