Neural and Recurrent Neural Networks

Morten Thøfner & Christoffer Balle Refsgaard

Student thesis: Master's theses

Abstract

In this thesis, we investigate the theory behind Neural Networks (NN), Recurrent Neural Networks (RNN), and an extension of RNNs called Long Short-Term Memory (LSTM). Standard NN models are slow in their calculation of the gradient, so we propose different methods to speed up training. Furthermore, standard NN models suffer from vanishing and exploding gradients, and we explain various methods to address this. The problem turns out to be even more severe in RNN models, which motivates the LSTM extension of the RNN. We have gathered data with the Twitter API, which gives us both the tweet itself and numerical metadata for each tweet. To model the tweets and the numerical data simultaneously, we suggest stacking an NN on top of an LSTM. For the LSTM model to understand sentence inputs, we propose different methods of representing words. With the stacked network we train a model that predicts changes in Tesla's stock price. Using these predictions to decide how much to invest in Tesla at a given time, we introduce a trading strategy from which we can profit.
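The abstract's stacked architecture (an LSTM read over the word vectors of a tweet, whose final hidden state is concatenated with the tweet's numerical metadata and passed through a dense output layer) can be sketched roughly as below. This is a minimal forward-pass illustration only, not the thesis's implementation; all dimensions, weight shapes, and names are hypothetical assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    # One LSTM cell step; the four gates (input, forget, output,
    # candidate) are stacked row-wise in W, U and b.
    z = W @ x + U @ h + b
    H = h.size
    i = sigmoid(z[:H])          # input gate
    f = sigmoid(z[H:2 * H])     # forget gate
    o = sigmoid(z[2 * H:3 * H]) # output gate
    g = np.tanh(z[3 * H:])      # candidate cell state
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def stacked_forward(tokens, meta, params):
    # Run the LSTM over the tweet's word vectors, then feed the final
    # hidden state together with the numerical metadata through a
    # dense layer stacked on top.
    W, U, b, V, b2 = params
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x in tokens:
        h, c = lstm_step(x, h, c, W, U, b)
    features = np.concatenate([h, meta])  # text features + metadata
    return V @ features + b2              # e.g. predicted price change

# Hypothetical sizes: embedding dim D, hidden size H, metadata dim M.
rng = np.random.default_rng(0)
D, H, M = 8, 4, 3
params = (rng.normal(size=(4 * H, D)),      # W
          rng.normal(size=(4 * H, H)),      # U
          np.zeros(4 * H),                  # b
          rng.normal(size=(1, H + M)),      # V (dense layer on top)
          np.zeros(1))                      # b2
tokens = rng.normal(size=(5, D))  # 5 word vectors standing in for a tweet
meta = rng.normal(size=M)         # numerical metadata for the tweet
print(stacked_forward(tokens, meta, params).shape)  # → (1,)
```

In practice one would of course learn the word vectors and weights by gradient descent; the sketch only shows how the two input modalities meet in the stacked network.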

Programme: Cand.merc.mat Business Administration and Mathematics (MSc), final thesis
Language: Danish
Publication date: 2019
Number of pages: 86
Supervisors: Søren Feodor Nielsen