
Applied Big Data Analytics in Operations Management


After identifying the variables influencing the gauge values, this section aims to develop a model to predict gauge widening as a function of various external factors. In this study, artificial neural networks (ANNs) are used to predict gauge widening.

Figure 5. Impact of repair on degradation of curves: relationship between "Gauge Value Change per Month" and "Route"


Figure 6. The relationship between different variables and "Gauge Value Change per Month"

Artificial Neural Networks

Artificial neural networks, commonly termed neural networks or neural models, are a class of information processing systems consisting of a set of interconnected processing elements called neurons (Mazloumi, Rose, Currie, and Moridpour, 2011; Mazloumi, Moridpour, Currie, and Rose, 2012). Each neuron in the network carries out simple computational tasks such as receiving inputs from other neurons and computing an output that is sent on to other neurons. A group of neurons constitutes a layer. The layer that receives the input data from external sources is called the "input layer", and the one that outputs the final results of the network is termed the "output layer". Between the input and output layers there are often one or more hidden layers, where the main data processing is conducted.

The number of layers in a neural network determines whether it is a single-, two-, or multi-layer neural network. When connections exist in only one direction between adjacent layers, i.e. they do not form a feedback loop, the network is said to have a feed-forward architecture. The most common structure of neural networks is the multi-layer feed-forward neural network. Figure 5 shows the typical topology of a three-layer feed-forward neural network. Each inter-connection in the network has an associated weight (e.g. Wij or Wjk), and each neuron has an assigned threshold (e.g. θj is the threshold associated with neuron j). These weights and thresholds are the model parameters, calibrated through a "training procedure".
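As an illustration, the forward pass of such a three-layer feed-forward network can be sketched in a few lines of NumPy. The sigmoid activation, the layer sizes, and all variable names below are illustrative assumptions, not details taken from the study:

```python
import numpy as np

def forward_pass(x, W1, theta1, W2, theta2):
    """Forward pass through a three-layer feed-forward network.

    x      -- input vector (the input layer)
    W1     -- weights Wij between the input and hidden layers
    theta1 -- thresholds of the hidden neurons
    W2     -- weights Wjk between the hidden and output layers
    theta2 -- thresholds of the output neurons
    """
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
    # Each hidden neuron j computes f(sum_i Wij * x_i - theta_j)
    hidden = sigmoid(W1 @ x - theta1)
    # The output layer repeats the same computation on the hidden activations
    output = sigmoid(W2 @ hidden - theta2)
    return output

# Example: 3 inputs, 4 hidden neurons, 1 output, random parameters
rng = np.random.default_rng(0)
W1, theta1 = rng.normal(size=(4, 3)), rng.normal(size=4)
W2, theta2 = rng.normal(size=(1, 4)), rng.normal(size=1)
y = forward_pass(np.array([0.5, -1.0, 2.0]), W1, theta1, W2, theta2)
```

In a trained network the weights and thresholds would come from the training procedure described next, rather than from a random generator.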

In the training procedure, a number of input-output pairs (training examples) are introduced to the model, which then learns the relationship between the inputs and the corresponding outputs by adjusting its parameters. Training thus amounts to finding optimal values for the model parameters. In this study, a back-propagation training algorithm has been adopted.
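A minimal back-propagation trainer can be sketched as below, assuming a sigmoid one-hidden-layer network trained by gradient descent on the squared error; the study does not specify its exact configuration, so the learning rate, activation, and network sizes here are assumptions for illustration:

```python
import numpy as np

def train_backprop(X, y, n_hidden=4, lr=0.5, epochs=2000, seed=0):
    """Fit a one-hidden-layer network to (X, y) with plain back-propagation.

    The error at the output layer is propagated backwards to compute
    gradients for every weight and threshold, which are then adjusted
    by gradient descent.
    """
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(n_hidden, X.shape[1]))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(scale=0.5, size=(1, n_hidden))
    b2 = np.zeros(1)
    sig = lambda z: 1.0 / (1.0 + np.exp(-z))
    for _ in range(epochs):
        h = sig(X @ W1.T + b1)            # hidden-layer activations
        out = sig(h @ W2.T + b2)          # network outputs
        # Error signal at the output layer (derivative of squared error)
        d_out = (out - y[:, None]) * out * (1 - out)
        # Error propagated back to the hidden layer
        d_hid = (d_out @ W2) * h * (1 - h)
        W2 -= lr * d_out.T @ h / len(X)
        b2 -= lr * d_out.mean(axis=0)
        W1 -= lr * d_hid.T @ X / len(X)
        b1 -= lr * d_hid.mean(axis=0)
    return lambda Xq: sig(sig(Xq @ W1.T + b1) @ W2.T + b2)

# Usage: learn a toy binary target
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
predict = train_backprop(X, y, epochs=5000, lr=1.0)
p = predict(X)
```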

An important factor influencing the outcomes of a neural network is the initial point from which the search for the optimal solution starts: different initial model parameters may lead to different local optima. One way to alleviate this problem, and to increase the likelihood of reaching a near-optimal local minimum, is to train several neural networks from random sets of initial weights and to make predictions based on an ensemble of these networks.

In this study, the idea of ensemble prediction is adopted: each neural network has one hidden layer and is trained 10 times. The number of neurons in the hidden layer is selected automatically by the model to achieve the best prediction accuracy. The results are then averaged to obtain the final prediction for a given set of input values. For ease of reference, the neural network ensemble is referred to as a neural network model, and the reported results are the averages over the ensemble. In addition, for the first time in light rail maintenance prediction, the prediction model is time-dependent, and the degradation of tram tracks is predicted over time. It should be noted that in existing models, degradation predictions are based on the mass passing over the tracks. Understanding the influence of different mass values on track degradation can be difficult, even for experts in this area, whereas a rate of degradation over time is easy for anyone to understand and interpret.
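The ensemble scheme described above can be sketched as follows. The study's own toolchain is not named, so scikit-learn's `MLPRegressor` stands in for the network, and the data is a made-up time-versus-gauge example; only the overall idea (10 networks from different random initial weights, averaged predictions) comes from the text:

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

def ensemble_predict(X_train, y_train, X_new, n_models=10, n_hidden=8):
    """Train n_models networks from different random initial weights
    and average their predictions over the ensemble."""
    preds = []
    for seed in range(n_models):
        # random_state varies the initial weights of each member network
        net = MLPRegressor(hidden_layer_sizes=(n_hidden,),
                           max_iter=2000, random_state=seed)
        net.fit(X_train, y_train)
        preds.append(net.predict(X_new))
    return np.mean(preds, axis=0)

# Toy degradation-style data: gauge change as a noisy function of time
rng = np.random.default_rng(1)
t = rng.uniform(0, 10, size=(200, 1))
gauge = 0.3 * t.ravel() + rng.normal(scale=0.2, size=200)

# Ensemble prediction of gauge widening at two points in time
pred = ensemble_predict(t, gauge, np.array([[2.0], [8.0]]))
```

Averaging over differently initialised networks smooths out members that converged to poor local minima, which is the motivation given in the text.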

