dc.description.abstract |
In this paper, motivated by the works on twin parametric insensitive support vector regression (TPISVR) (Peng in Neurocomputing 79(1):26–38, 2012) and Lagrangian twin support vector regression (Balasundaram and Tanveer in Neural Comput Appl 22(1):257–267, 2013), we propose a new efficient approach, Lagrangian twin parametric insensitive support vector regression (LTPISVR). In order to make the objective function strongly convex, we consider the square of the 2-norm of the slack variables in the optimization problem. To reduce the computational cost, the solution of the proposed LTPISVR is obtained by solving simple linearly convergent iterative schemes instead of the quadratic programming problems used in TPISVR. The proposed LTPISVR does not require any optimization toolbox. To demonstrate the effectiveness of the proposed method, we present numerical results on well-known synthetic and real-world datasets. The results clearly show that the proposed method achieves similar or better generalization performance with less training time in comparison with support vector regression, twin support vector regression, and twin parametric insensitive support vector regression. |
en_US |