RMIT University

Nonsmooth Optimization-Based Hyperparameter-Free Neural Networks for Large-Scale Regression

journal contribution
posted on 2024-11-03, 10:17 authored by Napsu Karmitsa, Sona TaheriSona Taheri, Kaisa Joki, Pauliina Paasivirta, Adil Baghirov, Marko Makela
In this paper, a new nonsmooth optimization-based algorithm for solving large-scale regression problems is introduced. The regression problem is modeled as a fully connected feedforward neural network with one hidden layer, a piecewise linear activation function, and the ℓ1 loss function. A modified version of the limited memory bundle method is applied to minimize this nonsmooth objective. In addition, a novel constructive approach is developed for automatically determining a suitable number of hidden nodes. Finally, large real-world data sets are used to evaluate the proposed algorithm and to compare it with some state-of-the-art neural network algorithms for regression. The results demonstrate the superiority of the proposed algorithm as a predictive tool on most of the data sets used in the numerical experiments.
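The nonsmooth objective described in the abstract can be illustrated with a minimal sketch: a one-hidden-layer network with a piecewise linear activation (ReLU is assumed here for concreteness) whose absolute residuals are summed to form the ℓ1 loss. All names, shapes, and the choice of ReLU are illustrative assumptions, not the authors' implementation; the paper minimizes this kind of objective with a modified limited memory bundle method rather than gradient descent.

```python
def relu(z):
    """Piecewise linear activation (assumed here; any piecewise linear map works)."""
    return z if z > 0.0 else 0.0

def predict(x, W1, b1, w2, b2):
    """One-hidden-layer network: hidden = relu(W1 @ x + b1), output = w2 . hidden + b2."""
    hidden = [relu(sum(wi * xi for wi, xi in zip(row, x)) + bi)
              for row, bi in zip(W1, b1)]
    return sum(v * h for v, h in zip(w2, hidden)) + b2

def l1_loss(X, y, W1, b1, w2, b2):
    """Nonsmooth l1 objective: sum of absolute residuals over the data set."""
    return sum(abs(predict(x, W1, b1, w2, b2) - t) for x, t in zip(X, y))
```

Because both the activation and the absolute value are nondifferentiable at kinks, this objective is nonsmooth everywhere along those surfaces, which is what motivates a bundle-type method instead of a smooth optimizer.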

History

Related Materials

  1. DOI - Is published in 10.3390/a16090444
  2. ISSN - Is published in 1999-4893

Journal

Algorithms

Volume

16

Number

444

Issue

9

Start page

1

End page

18

Total pages

18

Publisher

MDPI AG

Place published

Switzerland

Language

English

Copyright

Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/)

Former Identifier

2006125990

Esploro creation date

2023-10-06
