Date of Award

8-2018

Document Type

Thesis

Degree Name

Master of Science (MS)

Department

Electrical and Computer Engineering (Holcomb Dept. of)

Committee Member

Dr. Robert J. Schalkoff, Committee Chair

Committee Member

Dr. Harlan B. Russell

Committee Member

Dr. Ilya Safro

Abstract

In recent times, we have seen a surge in the use of Convolutional Neural Networks (CNNs) to solve all kinds of problems, from handwriting recognition to object recognition and from natural language processing to detecting exoplanets. Though the technology has been around for quite some time, there is still much scope for research into what is really happening 'under the hood' of a CNN model.

CNNs are often considered black boxes that learn something from complex data and produce the desired results. In this thesis, an effort has been made to explain what exactly CNNs are learning by training the network with carefully selected input data. The data considered here are one-dimensional time-varying signals, and hence 1-D convolutional neural networks are used for training and testing and for analyzing the learned weights.

The field of digital signal processing (DSP) gives considerable insight into the seemingly random weights learned by a CNN. In particular, the concepts of the Fourier transform, Savitzky-Golay filters, Gaussian filters, and FIR filter design light up the seemingly dark alley of CNNs. As a result of this study, a few interesting inferences can be made regarding dropout regularization, optimal kernel length, and the optimal number of convolution layers.
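The connection drawn above rests on the fact that a single-channel 1-D convolution layer is mathematically an FIR filter applied to the input signal. A minimal sketch of this idea (not code from the thesis; the Gaussian kernel here simply stands in for a set of learned weights) is:

```python
import numpy as np

def gaussian_kernel(length=9, sigma=1.5):
    """Gaussian FIR taps, normalized so the DC gain is 1."""
    n = np.arange(length) - (length - 1) / 2
    k = np.exp(-0.5 * (n / sigma) ** 2)
    return k / k.sum()

def conv1d(signal, kernel):
    """'Same'-length output, as a padded 1-D convolution layer would give."""
    return np.convolve(signal, kernel, mode="same")

# A noisy sinusoid as a stand-in for a one-dimensional time-varying signal.
t = np.linspace(0, 1, 400)
rng = np.random.default_rng(0)
clean = np.sin(2 * np.pi * 3 * t)
noisy = clean + 0.2 * rng.standard_normal(t.size)

# Convolving with the Gaussian kernel smooths the signal: the residual
# against the clean sinusoid shrinks, exactly what a low-pass FIR filter
# (or a conv layer that has learned one) would do.
smooth = conv1d(noisy, gaussian_kernel())
```

If a trained convolution kernel resembles such a set of filter taps, its frequency response (via the Fourier transform) tells us directly which components of the input the layer passes or suppresses.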
