Soft-constrained Linear Programming Support Vector Regression for Nonlinear Black-box Systems Identification
Abstract
As an innovative sparse kernel modeling method, support vector regression (SVR) has been regarded as a state-of-the-art technique for regression and approximation. In support vector regression, Vapnik introduced the ε-insensitive loss function as a trade-off between the robust loss function of Huber and one that promotes sparsity within the support vectors. The support vector kernel expansion provides a potential avenue for representing nonlinear dynamical systems and underpinning advanced analysis. However, the standard quadratic programming support vector regression (QP-SVR) is computationally expensive to implement, and sufficient model sparsity cannot be guaranteed. In an attempt to surmount these drawbacks, this article focuses on the application of soft-constrained linear programming support vector regression (LP-SVR) to nonlinear black-box system identification, and the simulation results demonstrate that LP-SVR is superior to QP-SVR in both model sparsity and computational efficiency.
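The idea behind LP-SVR can be sketched in code. A minimal illustration, not the authors' exact formulation: the kernel expansion f(x) = Σⱼ αⱼ k(x, xⱼ) + b is fitted by minimizing the 1-norm of the coefficients plus C-weighted slacks, subject to soft ε-insensitive constraints |yᵢ − f(xᵢ)| ≤ ε + ξᵢ. Splitting α and b into nonnegative parts makes this a standard linear program; all hyperparameter values (C, ε, the RBF width gamma) and the toy sin(x) data are assumptions for the demo.

```python
import numpy as np
from scipy.optimize import linprog

def rbf_kernel(X, Z, gamma=1.0):
    """Gaussian RBF kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lp_svr_fit(X, y, C=10.0, eps=0.05, gamma=1.0):
    """Soft-constrained LP-SVR: min ||alpha||_1 + C*sum(xi)
    s.t. |y_i - f(x_i)| <= eps + xi_i, with f = K@alpha + b."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    # Decision vector: [a_plus (n), a_minus (n), b_plus, b_minus, xi (n)]
    c = np.concatenate([np.ones(n), np.ones(n), [0.0, 0.0], C * np.ones(n)])
    ones = np.ones((n, 1))
    I = np.eye(n)
    # Two one-sided forms of the eps-insensitive constraint:
    #  y - f <= eps + xi   and   f - y <= eps + xi
    A_ub = np.block([
        [-K,  K, -ones,  ones, -I],
        [ K, -K,  ones, -ones, -I],
    ])
    b_ub = np.concatenate([eps - y, eps + y])
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (3 * n + 2), method="highs")
    alpha = res.x[:n] - res.x[n:2 * n]
    b = res.x[2 * n] - res.x[2 * n + 1]
    return alpha, b

def lp_svr_predict(Xnew, X, alpha, b, gamma=1.0):
    return rbf_kernel(Xnew, X, gamma) @ alpha + b

# Toy nonlinear black-box data: noisy samples of y = sin(x)
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 60).reshape(-1, 1)
y = np.sin(X).ravel() + 0.02 * rng.standard_normal(60)
alpha, b = lp_svr_fit(X, y)
# The 1-norm objective drives most coefficients to exactly zero,
# so only a few training points act as support vectors.
n_sv = int((np.abs(alpha) > 1e-6).sum())
```

Because the objective penalizes ||α||₁ rather than the quadratic form of QP-SVR, the solver is a generic LP (here SciPy's HiGHS backend) and the resulting expansion is typically much sparser, which is the trade-off the abstract highlights.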