The Effects of Bi-Label Classification on the Learning Efficiency of Feed-Forward Neural Networks

EasyChair Preprint 15138

9 pages · Date: September 28, 2024

Abstract

This study examines the impact of bi-label classification on the performance of feed-forward neural networks (FFNNs) by comparing it with single-label classification models. The research focuses on key performance metrics such as convergence speed, loss reduction, and model stability across three case studies of varying complexity. The findings indicate that bi-label classification consistently achieves faster convergence, lower loss values, and greater stability compared to single-label classification, particularly in simpler datasets. As the complexity of the data increases, the performance gap between the two models narrows, though bi-label classification continues to maintain a slight advantage in terms of loss reduction, generalization, and the ability to handle diverse datasets. These results suggest that bi-label classification can significantly enhance the efficiency of deep learning models, making them more effective in solving tasks involving multiple dependent variables. The study’s findings are particularly relevant to fields like predictive analytics and large-scale data classification, where processing speed and model stability are critical.
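To make the comparison concrete, the bi-label setup described in the abstract can be sketched as a feed-forward network whose single shared hidden layer feeds an output layer that predicts two dependent binary labels jointly, trained with a joint binary cross-entropy loss. This is a minimal illustrative sketch only: the toy data, layer sizes, and learning rate are assumptions, not the paper's actual case studies.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy data (assumed): 200 samples, 4 features, 2 dependent binary labels.
X = rng.normal(size=(200, 4))
y = np.stack([(X[:, 0] + X[:, 1] > 0).astype(float),
              (X[:, 0] - X[:, 2] > 0).astype(float)], axis=1)

# Shared hidden layer (4 -> 8) and a bi-label output head (8 -> 2).
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 2)); b2 = np.zeros(2)

lr = 1.0
for _ in range(2000):
    # Forward pass: one sigmoid probability per label.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)                      # shape (200, 2)
    # Joint binary cross-entropy averaged over both labels.
    loss = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    # Backward pass: gradient of the mean BCE w.r.t. each parameter.
    dz2 = (p - y) / y.size
    dW2 = h.T @ dz2; db2 = dz2.sum(0)
    dh = dz2 @ W2.T * (1 - h ** 2)                # tanh derivative
    dW1 = X.T @ dh; db1 = dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

accuracy = ((p > 0.5) == y).mean()
```

A single-label baseline in this framing would train one such network per label (an 8 -> 1 head each); the shared head above is what lets the bi-label model exploit dependence between the two targets, which is the mechanism the abstract's convergence and stability comparisons rest on.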

Keyphrases: multi-label classification, bi-label classification, single-label classification, deep learning, neural networks

BibTeX entry
BibTeX has no entry type for preprints; the following workaround produces the correct reference:
@booklet{EasyChair:15138,
  author    = {Panupong Jirasetsiri},
  title     = {The Effects of Bi-Label Classification on the Learning Efficiency of Feed-Forward Neural Networks},
  howpublished = {EasyChair Preprint 15138},
  year      = {EasyChair, 2024}}