An Improved Self-Structuring Neural Network

Rami M. Mohammad, Fadi Thabtah, Lee McCluskey

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

4 Citations (Scopus)

Abstract

Creating a neural network based classification model is traditionally accomplished using the trial-and-error technique. However, the trial-and-error structuring method normally suffers from several difficulties, including overtraining. In this article, a new algorithm that simplifies structuring neural network classification models is proposed. It aims at creating a large structure to derive classifiers from the training dataset that have generally good predictive accuracy on domain applications. The proposed algorithm tunes crucial NN model thresholds during the training phase in order to cope with the dynamic behaviour of the learning process. This may reduce the chance of overfitting the training dataset or of early convergence of the model. Several experiments using our algorithm, as well as other classification algorithms, have been conducted on a number of datasets from the University of California Irvine (UCI) repository. The experiments assess the pros and cons of our proposed NN method. The derived results show that our algorithm outperforms the compared classification algorithms with respect to several performance measures.
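The abstract does not disclose the algorithm's details, so the following is only a rough, hypothetical sketch of the general idea it describes: instead of fixing training thresholds up front by trial and error, monitor the training error and adapt key thresholds (here, a learning rate and an acceptable-error target — both names and the adaptation rule are assumptions, not the authors' method) whenever learning stalls.

```python
def train_with_adaptive_thresholds(errors, lr=0.5, error_threshold=0.1,
                                   lr_decay=0.5, threshold_relax=1.2,
                                   patience=3):
    """Hypothetical sketch: adapt NN training thresholds dynamically.

    `errors` stands in for the per-epoch training error a real NN would
    produce. When the error fails to improve for `patience` epochs, the
    learning rate is shrunk and the stopping target relaxed, instead of
    keeping both fixed as plain trial-and-error structuring would.
    """
    best = float("inf")
    stall = 0
    history = []
    for epoch, err in enumerate(errors):
        if err < best - 1e-9:        # genuine improvement this epoch
            best = err
            stall = 0
        else:
            stall += 1
        if stall >= patience:        # learning has stalled: retune thresholds
            lr *= lr_decay                    # smaller steps when stuck
            error_threshold *= threshold_relax  # relax the stopping target
            stall = 0
        history.append((epoch, err, lr, error_threshold))
        if err <= error_threshold:   # converged under the current target
            break
    return history
```

In this sketch the threshold adaptation is what distinguishes the approach from a fixed-configuration training loop: a run that plateaus triggers a retune rather than either stopping early or overtraining on the same settings.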
Original language: English
Title of host publication: Trends and Applications in Knowledge Discovery and Data Mining
Subtitle of host publication: PAKDD 2016 Workshops, BDM, MLSDA, PACC, WDMBF, Auckland, New Zealand, April 19, 2016, Revised Selected Papers
Editors: Huiping Cao, Jinyan Li, Ruili Wang
Place of publication: Cham
Publisher: Springer, Cham
Pages: 35-47
Number of pages: 13
Volume: LNAI 9794
ISBN (Electronic): 9783319429960
ISBN (Print): 9783319429953
DOI: 10.1007/978-3-319-42996-0_4
Publication status: Published - 15 Jul 2016
Event: 20th Pacific-Asia Conference on Knowledge Discovery and Data Mining - Auckland, New Zealand
Duration: 19 Apr 2016 - 22 Apr 2016
Conference number: 20
Conference website: https://pakdd16.wordpress.fos.auckland.ac.nz/

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer Verlag
Volume: 9794
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 20th Pacific-Asia Conference on Knowledge Discovery and Data Mining
Abbreviated title: PAKDD 2016
Country: New Zealand
City: Auckland
Period: 19/04/16 - 22/04/16
Internet address: https://pakdd16.wordpress.fos.auckland.ac.nz/

Cite this

Mohammad, R. M., Thabtah, F., & McCluskey, L. (2016). An Improved Self-Structuring Neural Network. In H. Cao, J. Li, & R. Wang (Eds.), Trends and Applications in Knowledge Discovery and Data Mining: PAKDD 2016 Workshops, BDM, MLSDA, PACC, WDMBF, Auckland, New Zealand, April 19, 2016, Revised Selected Papers (Vol. LNAI9794, pp. 35-47). (Lecture Notes in Computer Science; Vol. 9794). Cham: Springer, Cham. https://doi.org/10.1007/978-3-319-42996-0_4
Keywords: Classification, Neural network, Phishing, Pruning, Structure
