Support vector regression (SVR) is regarded as a state-of-the-art method for function approximation and regression. The importance of the kernel function, the so-called admissible support vector kernel (SV kernel) in SVR, has motivated many studies on its construction. The Gaussian (RBF) kernel is often regarded as the "best" choice of SV kernel by non-expert users of SVR, although there is no evidence, beyond its strong performance on some practical applications, to support this view. It is well known that a reproducing kernel (R.K) is also an SV kernel and possesses many important properties, e.g. positive definiteness, the reproducing property, and the ability to compose complex R.Ks from simpler ones. However, only a limited number of R.Ks have explicit forms, and consequently few quantitative comparison studies exist in practice. In this paper, two R.Ks, i.e. SV kernels, composed as the sum and the product of a translation-invariant kernel in a Sobolev space are proposed. An exploratory study of the performance of SVR based on a general R.K is presented through a systematic comparison with that of RBF using multiple criteria and synthetic problems. The results show that the R.K is an equivalent or even better SV kernel than RBF for problems with more input variables (more than 5, especially more than 10) and higher nonlinearity.
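The closure properties invoked above, that the sum and the (pointwise) product of admissible kernels remain admissible, i.e. positive semi-definite, can be checked numerically. The sketch below is illustrative only: it uses two Gaussian kernels as stand-ins, since the Sobolev-space kernel proposed in the paper is not specified in this abstract; the function name `gaussian_gram` and all parameter values are assumptions for the example.

```python
import numpy as np

def gaussian_gram(X, gamma=1.0):
    """Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

rng = np.random.default_rng(0)
X = rng.standard_normal((30, 5))  # 30 samples, 5 input variables

K1 = gaussian_gram(X, gamma=0.5)
K2 = gaussian_gram(X, gamma=2.0)

K_sum = K1 + K2    # sum of admissible kernels
K_prod = K1 * K2   # elementwise (Schur) product of admissible kernels

# Both composites should remain positive semi-definite
# (smallest eigenvalue non-negative up to floating-point error).
for K in (K_sum, K_prod):
    assert np.linalg.eigvalsh(K).min() > -1e-8
```

The same construction pattern applies to any base kernel with an explicit form: given its Gram matrix, composite SV kernels can be formed by summation or by the Schur product, which is the composition route the abstract describes for the Sobolev-space kernel.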