
Two-hidden-layer extreme learning machine for regression and classification

Journal contribution
Posted on 2024-11-02, 00:49; authored by B. Qu, B. Lang, J. Liang, A. Qin, O. Crisalle
As a single-hidden-layer feedforward neural network, an extreme learning machine (ELM) randomizes the weights between the input layer and the hidden layer as well as the biases of the hidden neurons, and analytically determines the weights between the hidden layer and the output layer using the least-squares method. This paper proposes a two-hidden-layer ELM (denoted TELM) by introducing a novel method for obtaining the parameters of the second hidden layer (the connection weights between the first and second hidden layers and the biases of the second hidden layer), thereby bringing the actual hidden-layer output closer to the expected hidden-layer output in the two-hidden-layer feedforward network. At the same time, the TELM method inherits the randomness of the ELM technique for the first hidden layer (the connection weights between the input layer and the first hidden layer and the biases of the first hidden layer). Experiments on several regression problems and some popular classification datasets demonstrate that the proposed TELM can consistently outperform the original ELM, as well as some existing multilayer ELM variants, in terms of average accuracy and the number of hidden neurons.
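
As a concrete illustration of the basic ELM training step summarized in the abstract, the sketch below randomizes the input-to-hidden weights and biases and then solves for the hidden-to-output weights in closed form with a least-squares (pseudoinverse) fit. Function names, the sigmoid activation, and the toy regression task are illustrative assumptions, not the authors' implementation.

    import numpy as np

    def elm_fit(X, T, n_hidden=50, seed=0):
        """Train a basic single-hidden-layer ELM for regression (illustrative sketch)."""
        rng = np.random.default_rng(seed)
        # Randomize input-to-hidden weights and hidden-neuron biases; these stay fixed.
        W = rng.standard_normal((X.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        # Hidden-layer output matrix H (sigmoid activation assumed here).
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        # Hidden-to-output weights determined analytically by least squares (pseudoinverse).
        beta = np.linalg.pinv(H) @ T
        return W, b, beta

    def elm_predict(X, W, b, beta):
        H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
        return H @ beta

    # Toy usage: fit a noisy sine curve and report the training error.
    X = np.linspace(0.0, 2.0 * np.pi, 200).reshape(-1, 1)
    T = np.sin(X) + 0.05 * np.random.default_rng(1).standard_normal(X.shape)
    W, b, beta = elm_fit(X, T, n_hidden=30)
    print("training MSE:", float(np.mean((elm_predict(X, W, b, beta) - T) ** 2)))

The TELM method described in the abstract adds a second hidden layer whose connection weights and biases are likewise obtained with least-squares machinery so that the actual hidden-layer output approaches the expected one; the precise update is given in the paper and is not reproduced here.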


Related Materials

  1. DOI (is published in): 10.1016/j.neucom.2015.11.009
  2. ISSN (is published in): 0925-2312

Journal

Neurocomputing

Volume

175

Start page

826

End page

834

Total pages

9

Publisher

Elsevier

Place published

United States

Language

English

Copyright

© 2015 Elsevier B.V. All rights reserved.

Former Identifier

2006061075

Esploro creation date

2020-06-22

Fedora creation date

2016-05-05
