
A new method of linear support vector regression with interval data
International Journal of Nonlinear Analysis and Applications
Volume 12, Issue 2, February 2021, Pages 857-868; Full Text PDF (457.28 K)
Article Type: Research Paper
DOI: 10.22075/ijnaa.2020.19163.2067
Authors
Mojtaba Baymani* 1; Hoda Saffaran 2; Nima Salehi-M. 3
1 Applied Mathematics, Department of Mathematics, Quchan University of Technology, Quchan, Iran
2 Department of Computer Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
3 Department of Computer Engineering, Faculty of Engineering, Ferdowsi University of Mashhad, Mashhad, Iran
Received: 02 December 2019; Revised: 11 January 2020; Accepted: 29 January 2020
Abstract
In this paper, a linear support vector regression approach is proposed for solving the regression problem with interval data, called interval support vector regression (ISVR). The ISVR approach is equivalent to solving a linearly constrained quadratic programming problem (QPP) with an interval cost coefficient, in which the value of the objective function lies in an interval. Instead of solving the interval QPP directly, we solve two ordinary QPPs and prove that their optimal values are the lower and upper bounds of the optimal value of the interval QPP. We show that these two QPPs are equivalent to two support vector regression problems, the first applied to the lower bounds of the data and the second to the upper bounds, from which the regression function is obtained. Experiments are carried out to compare the performance of our method with known algorithms on several artificial, benchmark, and real-world datasets.
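As a rough illustration of the two-QPP idea described in the abstract, the sketch below fits one ordinary linear epsilon-SVR to the lower bounds of interval-valued targets and another to the upper bounds, so that the pair of fitted models gives an interval-valued prediction. It is a minimal sketch only: scikit-learn's LinearSVR, the synthetic data, and the epsilon and C values are stand-ins chosen for the example and are not taken from the paper's formulation.

# Minimal sketch (not the authors' implementation): replace one interval QPP
# by two ordinary linear SVR fits, one on the lower and one on the upper bounds.
import numpy as np
from sklearn.svm import LinearSVR

rng = np.random.default_rng(0)

# Hypothetical interval-valued targets: each sample i has [y_lo[i], y_hi[i]].
X = rng.uniform(-1.0, 1.0, size=(200, 3))
y_center = X @ np.array([2.0, -1.0, 0.5]) + 0.3
half_width = rng.uniform(0.1, 0.4, size=200)
y_lo, y_hi = y_center - half_width, y_center + half_width

# Solve two ordinary epsilon-SVR problems: one on the lower bounds,
# one on the upper bounds (the two QPPs mentioned in the abstract).
svr_lo = LinearSVR(epsilon=0.1, C=10.0, max_iter=10_000).fit(X, y_lo)
svr_hi = LinearSVR(epsilon=0.1, C=10.0, max_iter=10_000).fit(X, y_hi)

# The pair of fitted models yields an interval-valued prediction for a new point.
x_new = rng.uniform(-1.0, 1.0, size=(1, 3))
print("predicted interval:", [svr_lo.predict(x_new)[0], svr_hi.predict(x_new)[0]])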
Keywords
Quadratic programming; Computing methodologies and applications; Linear regression.
Statistics: Article views: 15,518; Full-text downloads: 402