Journal of Systems Engineering and Electronics ›› 2021, Vol. 32 ›› Issue (1): 151-162. doi: 10.23919/JSEE.2021.000014
• SYSTEMS ENGINEERING •
Lei HU, Guoxing YI*, Chao HUANG
Received: 2020-06-02
Online: 2021-02-25
Published: 2021-02-25
Contact: Guoxing YI
E-mail: maple_hsjz@163.com; ygx@hit.edu.cn; huangchao198311@126.com
Lei HU, Guoxing YI, Chao HUANG. A sparse algorithm for adaptive pruning least square support vector regression machine based on global representative point ranking[J]. Journal of Systems Engineering and Electronics, 2021, 32(1): 151-162.
Table 1  Description of datasets
Dataset name | Input dimension | Output dimension | Dataset size | Training size | Testing size |
Heart_failure | 12 | 1 | 299 | 270 | 29 |
Mpg | 7 | 1 | 390 | 351 | 39 |
Real_estate | 6 | 1 | 410 | 369 | 41 |
Housing | 13 | 1 | 500 | 450 | 50 |
ENB2012 | 9 | 1 | 760 | 684 | 76 |
Qsar_fish_toxicity | 6 | 1 | 908 | 818 | 90 |
Stock | 9 | 1 | 940 | 846 | 94 |
ConcreteData | 8 | 1 | 1030 | 927 | 103 |
Sin | 1 | 1 | 1600 | 1440 | 160 |
Sample | 1 | 1 | 3600 | 3240 | 360 |
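The training and testing sizes in Table 1 correspond to a roughly 90%/10% split of each dataset. Below is a minimal sketch of such a split in Python with scikit-learn, using synthetic stand-in data; the feature matrix, dataset choice, and random seed are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the ConcreteData set: 1030 samples, 8 input features,
# 1 output (the real data would be loaded from its repository instead).
rng = np.random.default_rng(0)
X = rng.standard_normal((1030, 8))
y = rng.standard_normal(1030)

# Hold out 10% for testing, matching the Training/Testing sizes in Table 1.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.1, random_state=0)
print(X_train.shape, X_test.shape)  # (927, 8) (103, 8)
```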
Table 2  Parameter settings (two hyperparameter values per algorithm)
Dataset | LSSVR | | S-LSSVR | | AI-LSSVR | | IAP-LSSVR | | PEM-LSSVR | | DSAP-LSSVR | |
Heart_failure | 4 | 4 | 2 | 4 | 16 | 8 | 4 | 4 | | 4 | 2 | 8 |
Mpg | 512 | 16 | 16 | 4 | 16 | 4 | 16 | 4 | 16 | 4 | 16 | 4 |
Real_estate | 32 | 2 | 16 | 1 | 64 | 8 | 8 | 4 | 16 | 1 | 0.5 | 4 |
Housing | 128 | 8 | 16 | 4 | | 8 | 32 | 8 | | 8 | 256 | 8 |
ENB2012 | | 16 | | 16 | 128 | 16 | | 16 | 128 | 16 | 64 | 16 |
Qsar_fish_toxicity | 16 | 0.5 | 2 | 1 | 32 | 8 | 32 | 8 | 32 | 8 | | 4 |
Stock | 256 | 1 | 256 | 1 | 238 | 1 | 512 | 4 | 16 | 2 | 512 | 4 |
ConcreteData | 32 | 2 | 64 | 2 | 512 | 8 | 512 | 8 | 16 | 4 | 512 | 8 |
Sin | 64 | | 64 | | 32 | 0.25 | 64 | | 32 | 0.25 | 32 | 0.25 |
Sample | | 2 | 16 | 4 | | 1 | | 1 | 512 | 2 | 512 | 2 |
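Table 2 lists two tuned hyperparameter values per algorithm and dataset; for an RBF-kernel LSSVR these are usually the regularization parameter γ and the kernel width σ², which is assumed here because the sub-header row naming them is not reproduced above. The sketch below shows plain (non-sparse) LSSVR training and prediction, i.e., the baseline the pruning algorithms start from; it is an illustrative implementation, not the paper's code.

```python
import numpy as np

def rbf_kernel(A, B, sigma2):
    # K[i, j] = exp(-||A_i - B_j||^2 / sigma2)
    sq = (np.sum(A ** 2, axis=1)[:, None]
          + np.sum(B ** 2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-sq / sigma2)

def lssvr_fit(X, y, gamma, sigma2):
    """Solve the standard LS-SVM regression system:
       [ 0   1^T         ] [ b     ]   [ 0 ]
       [ 1   K + I/gamma ] [ alpha ] = [ y ]"""
    n = X.shape[0]
    K = rbf_kernel(X, X, sigma2)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvr_predict(X_train, alpha, b, X_new, sigma2):
    # f(x) = sum_i alpha_i * k(x, x_i) + b
    return rbf_kernel(X_new, X_train, sigma2) @ alpha + b
```

Because every training sample receives a nonzero dual coefficient in this dense solution, the pruning schemes compared in Tables 3-12 are needed to shrink the support-vector set.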
Table 3  Experimental results on the Heart_failure dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 270 | 2.4×10⁻⁴ | 0.91 | 2.8×10⁻² | 2.1×10⁻³ |
S-LSSVR | 257 | 2.4×10⁻⁴ | 0.90 | 7.3×10⁻² | 2.3×10⁻³ |
AI-LSSVR | 141 | 2.6×10⁻⁴ | 0.88 | 5.4×10⁻² | 1.3×10⁻⁴ |
IAP-LSSVR | 141 | 2.4×10⁻⁴ | 0.89 | 5.6×10⁻² | 1.4×10⁻⁴ |
PEM-LSSVR | 141 | 2.5×10⁻⁴ | 0.89 | 5.4×10⁻² | 1.4×10⁻⁴ |
GRPR-AP-LSSVR | 120 | 6.8×10⁻⁴ | 0.88 | 5.8×10⁻² | 1.5×10⁻⁴ |
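Tables 3-12 report, for each algorithm, the number of retained support vectors (#SV), the root mean square error (RMSE), the Willmott index of agreement (WIA), and training and testing times in seconds. Below is a minimal sketch of the two accuracy metrics, assuming WIA denotes the standard Willmott index; the excerpt does not give the paper's exact formula.

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root mean square error between observed and predicted outputs.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def wia(y_true, y_pred):
    # Willmott's index of agreement: 1 indicates a perfect fit, values
    # near 0 indicate poor agreement.
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    mean_obs = np.mean(y_true)
    num = np.sum((y_pred - y_true) ** 2)
    den = np.sum((np.abs(y_pred - mean_obs) + np.abs(y_true - mean_obs)) ** 2)
    return 1.0 - num / den
```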
Table 4  Experimental results on the Mpg dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 351 | 1.4×10⁻⁴ | 0.96 | 4.7×10⁻² | 3.3×10⁻³ |
S-LSSVR | 334 | 1.4×10⁻⁴ | 0.96 | 0.12 | 3.4×10⁻³ |
AI-LSSVR | 120 | 1.5×10⁻⁴ | 0.96 | 3.6×10⁻² | 1.4×10⁻⁴ |
IAP-LSSVR | 120 | 1.5×10⁻⁴ | 0.96 | 3.6×10⁻² | 1.3×10⁻⁴ |
PEM-LSSVR | 119 | 1.5×10⁻⁴ | 0.96 | 3.7×10⁻² | 1.3×10⁻⁴ |
GRPR-AP-LSSVR | 117 | 1.5×10⁻⁴ | 0.96 | 4.2×10⁻² | 1.6×10⁻⁴ |
Table 5  Experimental results on the Real_estate dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 369 | 1.3×10⁻⁴ | 0.89 | 0.57 | 4.1×10⁻³ |
S-LSSVR | 351 | 1.4×10⁻⁴ | 0.88 | 0.89 | 4×10⁻³ |
AI-LSSVR | 227 | 1.5×10⁻⁴ | 0.86 | 0.15 | 2×10⁻³ |
IAP-LSSVR | 228 | 2.7×10⁻⁴ | 0.83 | 0.16 | 2.1×10⁻³ |
PEM-LSSVR | 232 | 3.9×10⁻⁴ | 0.81 | 0.16 | 2.8×10⁻³ |
GRPR-AP-LSSVR | 195 | 4.9×10⁻⁴ | 0.82 | 0.15 | 1.9×10⁻³ |
Table 6  Experimental results on the Housing dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 450 | 1.2×10⁻⁴ | 0.95 | 0.86 | 5.4×10⁻³ |
S-LSSVR | 428 | 1.2×10⁻⁴ | 0.95 | 1.01 | 5.4×10⁻³ |
AI-LSSVR | 204 | 1.3×10⁻⁴ | 0.95 | 0.12 | 2.5×10⁻³ |
IAP-LSSVR | 205 | 1.9×10⁻⁴ | 0.92 | 0.12 | 2.1×10⁻³ |
PEM-LSSVR | 206 | 1.3×10⁻⁴ | 0.95 | 0.12 | 2.3×10⁻³ |
GRPR-AP-LSSVR | 191 | 1.7×10⁻⁴ | 0.94 | 0.13 | 2.1×10⁻³ |
Table 7  Experimental results on the ENB2012 dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 684 | 3.4×10⁻⁵ | 0.99 | 0.29 | 1.2×10⁻² |
S-LSSVR | 650 | 3.4×10⁻⁵ | 0.99 | 0.48 | 1.2×10⁻² |
AI-LSSVR | 79 | 3.6×10⁻⁵ | 0.99 | 1.5×10⁻² | 1.3×10⁻³ |
IAP-LSSVR | 78 | 3.5×10⁻⁵ | 0.99 | 1.5×10⁻² | 1.2×10⁻³ |
PEM-LSSVR | 82 | 3.6×10⁻⁵ | 0.99 | 1.7×10⁻² | 1.4×10⁻³ |
GRPR-AP-LSSVR | 68 | 3.7×10⁻⁵ | 0.98 | 5.5×10⁻² | 1.1×10⁻³ |
Table 8  Experimental results on the Qsar_fish_toxicity dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 818 | 1×10⁻⁴ | 0.87 | 0.53 | 1.8×10⁻² |
S-LSSVR | 778 | 1.3×10⁻⁴ | 0.87 | 0.71 | 1.7×10⁻² |
AI-LSSVR | 350 | 1.4×10⁻⁴ | 0.81 | 0.56 | 1.2×10⁻² |
IAP-LSSVR | 376 | 3.2×10⁻⁴ | 0.80 | 0.78 | 1.1×10⁻² |
PEM-LSSVR | 384 | 2.5×10⁻⁴ | 0.80 | 0.91 | 1.3×10⁻² |
GRPR-AP-LSSVR | 146 | 1.4×10⁻⁴ | 0.83 | 0.1 | 1.1×10⁻² |
Table 9  Experimental results on the Stock dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 846 | 5.2×10⁻⁶ | 0.99 | 0.57 | 1.9×10⁻² |
S-LSSVR | 804 | 5.5×10⁻⁶ | 0.99 | 0.78 | 1.9×10⁻² |
AI-LSSVR | 191 | 1.5×10⁻⁵ | 0.99 | 0.12 | 4.1×10⁻³ |
IAP-LSSVR | 195 | 1.6×10⁻⁵ | 0.99 | 0.13 | 4.1×10⁻³ |
PEM-LSSVR | 196 | 1.4×10⁻⁵ | 0.99 | 0.14 | 4.7×10⁻³ |
GRPR-AP-LSSVR | 192 | 1.5×10⁻⁵ | 0.99 | 0.13 | 3.8×10⁻³ |
Table 10  Experimental results on the ConcreteData dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 927 | 5.7×10⁻⁵ | 0.96 | 0.73 | 2.4×10⁻² |
S-LSSVR | 881 | 5.4×10⁻⁵ | 0.96 | 0.94 | 2.4×10⁻² |
AI-LSSVR | 780 | 8.8×10⁻⁵ | 0.94 | 9.25 | 1.6×10⁻² |
IAP-LSSVR | 782 | 8.9×10⁻⁵ | 0.94 | 9.26 | 1.6×10⁻² |
PEM-LSSVR | 778 | 9×10⁻⁵ | 0.94 | 9.1 | 1.6×10⁻² |
GRPR-AP-LSSVR | 724 | 8.9×10⁻⁵ | 0.94 | 7.71 | 1.5×10⁻² |
Table 11  Experimental results on the Sin dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 1440 | 5×10⁻¹¹ | 1 | 2.28 | 1.1×10⁻² |
S-LSSVR | 296 | 4×10⁻⁷ | 1 | 5.08 | 3.3×10⁻³ |
AI-LSSVR | 59 | 2.8×10⁻⁸ | 1 | 1.3×10⁻² | 7×10⁻⁴ |
IAP-LSSVR | 56 | 2.3×10⁻⁸ | 1 | 1.3×10⁻² | 8×10⁻⁴ |
PEM-LSSVR | 58 | 2.6×10⁻⁸ | 1 | 1.3×10⁻² | 7×10⁻⁴ |
GRPR-AP-LSSVR | 30 | 3.5×10⁻⁸ | 1 | 1.2×10⁻² | 5×10⁻⁴ |
Table 12  Experimental results on the Sample dataset
Algorithm | #SV | RMSE | WIA | Training time/s | Testing time/s |
LSSVR | 3240 | 2×10⁻¹³ | 1 | 26.4 | 5.6×10⁻² |
S-LSSVR | 324 | 1.6×10⁻⁷ | 1 | 39.4 | 7.7×10⁻³ |
AI-LSSVR | 60 | 1×10⁻⁸ | 1 | 1.8×10⁻² | 1.8×10⁻³ |
IAP-LSSVR | 62 | 1×10⁻⁸ | 1 | 1.9×10⁻² | 1.6×10⁻³ |
PEM-LSSVR | 52 | 1.5×10⁻⁸ | 1 | 1.1×10⁻² | 1.6×10⁻³ |
GRPR-AP-LSSVR | 31 | 1.5×10⁻⁸ | 1 | 1.1×10⁻² | 1.5×10⁻³ |
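Across Tables 3-12, testing time falls roughly in proportion to #SV, because each prediction evaluates one kernel term per retained support vector. Below is a minimal sketch of that sparse prediction step; the variable names and sigma2 parameter follow the hypothetical helpers sketched after Table 2.

```python
import numpy as np

def sparse_predict(X_sv, alpha_sv, b, X_new, sigma2):
    # Prediction cost is O(n_test * n_sv * d), so pruning support vectors
    # (a smaller #SV) directly shortens the testing times in Tables 3-12.
    sq = (np.sum(X_new ** 2, axis=1)[:, None]
          + np.sum(X_sv ** 2, axis=1)[None, :]
          - 2.0 * X_new @ X_sv.T)
    return np.exp(-sq / sigma2) @ alpha_sv + b
```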