Doctor of Philosophy (PhD)
Two problems have plagued artificial neural networks since their birth in the mid-20th century. The first is a tendency to lose previously acquired knowledge when there is a large shift in the underlying data distribution, a phenomenon provocatively known as catastrophic forgetting. The second is an inability to know what they don't know, resulting in excessively confident behavior, even in uncertain or novel conditions. This text provides an in-depth history of these obstacles, complete with formal problem definitions and literature reviews. Most importantly, the solutions proposed herein demonstrate that these challenges can be overcome with the right architectures and training objectives. As this text will show, a thorough investigation of these topics necessitated several distinct approaches, each of which, considered in isolation, offers evidence that these problems are likely temporary obstacles on the path to true human-level intelligence. Lastly, we present a new learning framework called Hyper-Learning, which might allow both of these problems to be mitigated by a single architecture when coupled with the right training algorithm.
Camp, Brendan Blake, "Hyper-Learning with Deep Artificial Neurons." Dissertation, Georgia State University, 2023.