A Window-Based Approach to Training Deep Neural Networks for Predictive Sequence Modeling

Kirori, Zachary; Wasike, Jotham
Keywords: Deep Learning, Predictive Sequence Modeling, Time Series Data Analysis, Multi-Layer Perceptron, Deep Learning Optimization, Fixed Window Method


Deep machine learning potentially holds the key to unlocking modern applied computational intelligence. It is now increasingly possible to process vast amounts of data, whether static or arriving in streams at varying velocities, using deep learning models. Applications are innumerable, ranging from time series modeling, signal processing, image analysis, and natural language processing to object recognition, among others. The critical area of predictive data modeling requires efficient, carefully selected algorithms and models for effective and accurate predictions. In this paper, we present a novel deep neural network for predictive tasks based on a fixed-size window of time steps, tested on a well-known dataset of passenger arrivals to an airline. At the core of the architecture is a Multi-Layer Perceptron (MLP), a classical deep learning neural network optimized along several dimensions, including the training algorithm, batch size, number of iterations, and the loss function. We present experimental results and conclude that, after tuning and optimization, classical neural networks such as the MLP have predictive ability comparable to more advanced architectures such as Recurrent Neural Networks (RNNs) and Convolutional Neural Networks (CNNs).
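To make the fixed-window framing concrete, the sketch below shows one common way such a dataset can be constructed: each training input is a window of consecutive observations and the target is the observation that immediately follows. The function name and the toy series values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def make_windows(series, window_size):
    """Turn a 1-D time series into (samples, window_size) inputs
    and next-step targets -- the fixed-window framing for an MLP.
    Illustrative helper; not the authors' actual code."""
    X, y = [], []
    for i in range(len(series) - window_size):
        X.append(series[i:i + window_size])   # lagged observations
        y.append(series[i + window_size])     # value to predict
    return np.array(X), np.array(y)

# Toy values standing in for monthly airline passenger counts
series = np.array([112, 118, 132, 129, 121, 135, 148, 148, 136, 119],
                  dtype=float)
X, y = make_windows(series, window_size=3)
print(X.shape, y.shape)  # (7, 3) (7,)
```

Each row of `X` can then be fed to a standard feed-forward MLP as an ordinary fixed-length feature vector, which is what allows a classical architecture to be applied to sequential data at all.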