A multi-output two-stage locally regularized model construction method using the extreme learning machine

D. Du, K. Li, X. Li, M. Fei, H. Wang

Research output: Contribution to journal › Article › peer-review

9 Citations (Scopus)
351 Downloads (Pure)

Abstract

This paper investigates the construction of linear-in-the-parameters (LITP) models for multi-output regression problems. Most existing stepwise forward algorithms choose the regressor terms one by one, each time maximizing the model error reduction ratio. The drawback of such procedures is that they cannot guarantee a sparse model, especially under highly noisy learning conditions. The main objective of this paper is to improve the sparsity and generalization capability of models for multi-output regression problems while reducing the computational complexity. This is achieved by a novel multi-output two-stage locally regularized model construction (MTLRMC) method using the extreme learning machine (ELM). In the new algorithm, the nonlinear parameters of each candidate term, such as the width of a Gaussian function or the power of a polynomial term, are first assigned by the ELM. An initial multi-output LITP model is then generated according to the termination criteria in the first stage. In the second stage, the significance of each selected regressor is checked and the insignificant ones are replaced. The proposed method produces an optimized compact model by using the regularized parameters. Further, to reduce the computational complexity, a proper regression context is used to allow fast implementation of the proposed method. Simulation results confirm the effectiveness of the proposed technique.
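The flavor of the two-stage procedure the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' MTLRMC algorithm: the toy data, candidate counts, and all variable names are hypothetical; a single global ridge parameter `lam` stands in for the paper's local regularization, and plain residual sum of squares replaces the error reduction ratio as the selection criterion. The ELM idea enters through the randomly assigned Gaussian centres and widths of the candidate regressors.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy multi-output regression data (illustrative only)
X = rng.uniform(-1, 1, size=(200, 2))
Y = np.column_stack([np.sin(np.pi * X[:, 0]), X[:, 0] * X[:, 1]])
Y += 0.05 * rng.normal(size=Y.shape)

# ELM step: nonlinear parameters (centres, widths) of the candidate
# Gaussian regressors are assigned randomly rather than optimized.
n_cand = 50
centers = rng.uniform(-1, 1, size=(n_cand, 2))
widths = rng.uniform(0.5, 2.0, size=n_cand)

def regressor(X, c, w):
    # Gaussian basis function with ELM-assigned centre c and width w
    return np.exp(-np.sum((X - c) ** 2, axis=1) / (2.0 * w ** 2))

# Full candidate regression matrix, one column per candidate term
P = np.column_stack([regressor(X, centers[i], widths[i]) for i in range(n_cand)])

def sse(cols, lam=1e-3):
    # Ridge-regularized least-squares fit of the multi-output weights
    # for the chosen columns; returns the total sum of squared errors.
    A = P[:, cols]
    W = np.linalg.solve(A.T @ A + lam * np.eye(len(cols)), A.T @ Y)
    return np.sum((Y - A @ W) ** 2)

# Stage 1: stepwise forward selection, each step adding the candidate
# that most reduces the (regularized) fitting error.
selected, remaining = [], list(range(n_cand))
n_terms = 8
for _ in range(n_terms):
    best = min(remaining, key=lambda j: sse(selected + [j]))
    selected.append(best)
    remaining.remove(best)
stage1_sse = sse(selected)

# Stage 2: revisit each selected term and swap it for an unused
# candidate whenever that strictly improves the fit.
for _ in range(5):  # a few refinement passes over the model
    changed = False
    for k in range(len(selected)):
        others = selected[:k] + selected[k + 1:]
        pool = remaining + [selected[k]]
        best = min(pool, key=lambda j: sse(others + [j]))
        if best != selected[k] and sse(others + [best]) < sse(selected) - 1e-10:
            remaining.remove(best)
            remaining.append(selected[k])
            selected[k] = best
            changed = True
    if not changed:
        break
```

Because every stage-2 swap strictly lowers the fitting error, the refined model is never worse than the stage-1 model; the real algorithm additionally tunes a regularization parameter per term, which promotes the sparsity the abstract emphasizes.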
Original language: English
Pages (from-to): 104-112
Number of pages: 9
Journal: Neurocomputing
Volume: 128
Publication status: Published - 27 Mar 2014
