Table 16 Summary of ML-based fault prediction

From: A comprehensive survey on machine learning for networking: evolution, applications and research opportunities

| Ref. | ML technique | Network (location) | Dataset (training) | Features | Output | Evaluation: settings | Evaluation: results |
|---|---|---|---|---|---|---|---|
| Hood et al. [193] | Supervised: BN | Campus network | Data collected from a router | Management information base (MIB) variables for the following network functions: interface group, IP group, UDP group | Prediction of network health | 500 samples for each of 14 MIB variables across the 3 network functions | Predicts approximately 8 min before fault occurrence |
| Kogeda et al. [248] | Supervised: BN | Cellular network | Simulation with fault injection | Power, multiplexer, cell, transmission | Faulty or not | 4 nodes, each with 3 states | Confidence level of 99.8% |
| Snow et al. [414] | Supervised: NN (MLP) | Wireless network | Generated using discrete-time event simulation | Mean time to failure, mean time to restore, time profile, run time | Dependability of a network: survivability, availability, failed components, reportable outages | 14 inputs; 10 and 5 nodes in the first and second hidden layers, respectively | Closely approximates reportable outages |
| Wang et al. [466] | Supervised: DT (J4.8), rule learners (JRip), SVM, BN, ensemble | Wireless sensor network | Generated using a sensor network testbed | Received signal strength indication, send and forward buffer sizes, channel load assessment, forward and backward | Link quality estimation | 10-fold cross-validation with 5000 samples | Accuracy: 82% for J4.8, 80% for JRip |
| Lu et al. [285] | Manifold learning: SHLLE | Distributed systems | Generated from a testbed of a distributed environment with a file transfer application | System performance: interface group, IP group, TCP group, UDP group | Prediction of network, CPU, and memory failures | Not provided | Precision: 0.452; recall: 0.456; false positive rate: 0.152 |
| Pellegrini et al. [355] | Different ML methods: linear regression, M5P, REP-Tree, LASSO, SVM, least-squares SVM | Multi-tier e-commerce web application | Generated from a testbed of a virtual architecture | Different system performance metrics | Remaining Time to Failure (RTTF) | Not provided | Soft mean absolute error: linear regression 137.600, M5P 79.182, REP-Tree 69.832, LASSO as a predictor 405.187, SVM 132.668, least-squares SVM 132.675 |
| Wang et al. [469] | Supervised: double-exponential smoothing (DES) and SVM | Optical network | Real data collected from an optical network of a telecommunications operator | Indicators in board data: input optical power, laser bias current, laser temperature offset, output optical power, environmental temperature, unusable time | Prediction of equipment failure | 10-fold cross-validation used to test model accuracy | DES with SVM: prediction accuracy of 95% |
| Kumar et al. [255] | Unsupervised: DNN with autoencoders | Cellular network | Fault data from a national mobile operator in the USA, collected over one month | Historical fault occurrences and their inter-arrival times | Prediction of the inter-arrival time of faults | 10 neurons in the hidden layer | DNN with autoencoders: NRMSE 0.122092, RMSE 0.504425 |
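To make the MLP configuration reported for Snow et al. [414] in the table above concrete, the sketch below wires up a network with 14 inputs and hidden layers of 10 and 5 neurons predicting four dependability measures. The scikit-learn estimator and the synthetic data are illustrative assumptions, not the authors' original simulation pipeline.

```python
# Minimal sketch of an MLP sized as in Snow et al. [414]: 14 inputs,
# hidden layers of 10 and 5 neurons, and 4 dependability outputs
# (survivability, availability, failed components, reportable outages).
# Synthetic data stands in for the discrete-time event simulation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.random((1000, 14))   # 14 simulated failure/restoration features
y = rng.random((1000, 4))    # 4 dependability targets (placeholder values)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlp = MLPRegressor(hidden_layer_sizes=(10, 5), max_iter=2000, random_state=0)
mlp.fit(X_train, y_train)
print("R^2 on held-out data:", mlp.score(X_test, y_test))
```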
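The 10-fold cross-validation protocol listed for Wang et al. [466] can be reproduced roughly as follows; scikit-learn's CART decision tree stands in for WEKA's J4.8 (C4.5), and the 5000 synthetic link-quality samples are placeholders for the testbed measurements.

```python
# Rough sketch of the evaluation setting of Wang et al. [466]:
# 10-fold cross-validation over 5000 samples with a decision tree
# classifier predicting link quality (good vs. bad).
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
X = rng.random((5000, 4))            # e.g. RSSI, buffer sizes, channel load, ...
y = rng.integers(0, 2, size=5000)    # link quality label (placeholder)

scores = cross_val_score(DecisionTreeClassifier(random_state=0), X, y, cv=10)
print("Mean 10-fold accuracy:", scores.mean())
```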
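The error metrics quoted for Kumar et al. [255] are standard: RMSE is the root of the mean squared prediction error, and NRMSE normalizes it, here by the observed range of the target. The survey does not state which normalization the authors used, so that choice is an assumption.

```python
# RMSE and NRMSE as used to report inter-arrival-time prediction error.
# NRMSE here divides by the observed range; other conventions divide by
# the mean or standard deviation.
import numpy as np

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def nrmse(y_true, y_pred):
    y_true = np.asarray(y_true)
    return rmse(y_true, y_pred) / float(y_true.max() - y_true.min())

# Example: observed vs. predicted fault inter-arrival times (arbitrary units)
observed = [1.2, 0.8, 2.5, 3.1, 0.4]
predicted = [1.0, 1.1, 2.2, 2.8, 0.9]
print(rmse(observed, predicted), nrmse(observed, predicted))
```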