Mitigating Overfitting in Extreme Learning Machine Classifier Through Dropout Regularization

Authors

  • Fateh Alrahman Kamal Qasem Alnagashi Faculty of Electrical Engineering & Technology, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia https://orcid.org/0000-0002-2560-6944
  • Norasmadi Abdul Rahim Faculty of Electrical Engineering & Technology, Universiti Malaysia Perlis, Pauh Putra Campus, 02600 Arau, Perlis, Malaysia
  • Shazmin Aniza Abdul Shukor Centre of Excellence for Intelligent Robotics & Autonomous System (CIRAS), Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
  • Mohamad Hanif Ab. Hamid Centre of Excellence for Intelligent Robotics & Autonomous System (CIRAS), Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia

DOI:

https://doi.org/10.58915/amci.v13iNo.1.561

Abstract

Machine learning models are often limited by the scarcity of diverse training data, a problem made worse by the small sample sizes common in real-world scenarios. In this study, we address this issue in classification tasks by integrating the Dropout technique into the Extreme Learning Machine (ELM) classifier. Dropout-ELM mitigates overfitting, particularly when data is scarce, and thereby improves generalization. Experiments on synthetic and real-world datasets consistently show that Dropout-ELM outperforms the traditional ELM, with accuracy improvements ranging from 0.19% to 16.20%. Applying dropout during training discourages reliance on specific features or neurons, yielding models that are more adaptable and resilient across diverse datasets. Dropout-ELM is therefore an effective tool for countering overfitting and improving the performance of ELM-based classifiers when training data is limited, enhancing the reliability and generalization of machine learning models under these constraints.
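
For readers unfamiliar with how dropout can be combined with an ELM, the following is a minimal, illustrative sketch, not the authors' implementation: it assumes inverted dropout is applied to the hidden-layer activations of a standard single-hidden-layer ELM before the output weights are solved with the Moore–Penrose pseudoinverse. All names (`DropoutELM`, `n_hidden`, `dropout_rate`) are hypothetical.

```python
import numpy as np

class DropoutELM:
    """Illustrative Dropout-ELM sketch (assumed design, not the paper's code).

    Input weights and biases are random and fixed, as in a standard ELM;
    the output weights are solved in closed form with a pseudoinverse.
    Inverted dropout is applied to the hidden activations at training time,
    so no rescaling is needed at prediction time.
    """

    def __init__(self, n_hidden=100, dropout_rate=0.2, seed=0):
        self.n_hidden = n_hidden
        self.dropout_rate = dropout_rate
        self.rng = np.random.default_rng(seed)

    def _hidden(self, X):
        # Sigmoid activation of the fixed random projection.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit(self, X, y):
        # y is assumed to hold integer class labels 0..K-1.
        n_features = X.shape[1]
        n_classes = int(y.max()) + 1

        # Standard ELM step: random, untrained input weights and biases.
        self.W = self.rng.normal(size=(n_features, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)

        H = self._hidden(X)

        # Inverted dropout: zero a random subset of hidden units and
        # rescale the survivors by the keep probability.
        keep = 1.0 - self.dropout_rate
        mask = self.rng.random(H.shape) < keep
        H = H * mask / keep

        # One-hot targets, then closed-form output weights via pseudoinverse.
        T = np.eye(n_classes)[y]
        self.beta = np.linalg.pinv(H) @ T
        return self

    def predict(self, X):
        return np.argmax(self._hidden(X) @ self.beta, axis=1)


# Hypothetical usage on a small dataset:
# model = DropoutELM(n_hidden=200, dropout_rate=0.3).fit(X_train, y_train)
# accuracy = (model.predict(X_test) == y_test).mean()
```

The key difference from a plain ELM is only the masking and rescaling of the hidden activations before the pseudoinverse step; the random input weights and closed-form output solution are unchanged.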

Keywords:

Artificial datasets, Classification, Dropout, Machine learning, Real-world datasets, Small sample size, Test accuracy

Published

2024-02-14

How to Cite

Fateh Alrahman Kamal Qasem Alnagashi, Norasmadi Abdul Rahim, Shazmin Aniza Abdul Shukor, & Mohamad Hanif Ab. Hamid. (2024). Mitigating Overfitting in Extreme Learning Machine Classifier Through Dropout Regularization. Applied Mathematics and Computational Intelligence (AMCI), 13(No.1), 26–35. https://doi.org/10.58915/amci.v13iNo.1.561