36 results for "Geometric function theory"
Search Results
2. An efficient method for clustered multi-metric learning.
- Author
-
Nguyen, Bac, Ferri, Francesc J., Morell, Carlos, and De Baets, Bernard
- Subjects
-
KERNEL (Mathematics), KERNEL functions, SUPPORT vector machines, GEOMETRIC function theory, COMPLEX variables
- Abstract
Distance metric learning, which aims at finding a distance metric that separates examples of one class from examples of the other classes, is key to the success of many machine learning tasks. Although there has been increasing interest in this field, learning a global distance metric is insufficient to obtain satisfactory results when dealing with heterogeneously distributed data. A simple solution for this kind of data is based on kernel embedding methods. However, it quickly becomes computationally intractable as the number of examples increases. In this paper, we propose an efficient method that learns multiple local distance metrics instead of a single global one. More specifically, the training examples are divided into several disjoint clusters, in each of which a distance metric is trained to separate the data locally. Additionally, a global regularization is introduced to preserve some common properties of different clusters in the learned metric space. By learning multiple distance metrics jointly within a single unified optimization framework, our method consistently outperforms single distance metric learning methods, while being more efficient than other state-of-the-art multi-metric learning methods. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
3. Applying a kernel function on time-dependent data to provide supervised-learning guarantees.
- Author
-
de Carvalho Pagliosa, Lucas and de Mello, Rodrigo Fernandes
- Subjects
-
COMPLEX variables, KERNEL functions, GEOMETRIC function theory, COGNITIVE structures, MACHINE learning
- Abstract
The Statistical Learning Theory (SLT) defines five assumptions to ensure learning for supervised algorithms. Data independency is one of those assumptions, since the SLT relies on the Law of Large Numbers to ensure learning bounds. As a consequence, this assumption imposes a strong limitation on guaranteeing learning in time-dependent scenarios. In order to tackle this issue, some researchers relax this assumption, at the cost of invalidating all theoretical results provided by the SLT. In this paper we apply a kernel function, more precisely the Takens’ immersion theorem, to reconstruct time-dependent open-ended sequences of observations, also referred to as data streams in the context of Machine Learning, into multidimensional spaces (a.k.a. phase spaces) in an attempt to hold the data independency assumption. At first, we study the best immersion parameterization for our kernel function using the Distance-Weighted Nearest Neighbors (DWNN). Next, we use this best immersion to recursively forecast next observations based on the prediction horizon, estimated using the Lyapunov exponent. Afterwards, predicted observations are compared against the expected ones using the Mean Distance from the Diagonal Line (MDDL). Theoretical and experimental results based on a cross-validation strategy provide stronger evidence of generalization, which allows us to conclude that one can learn from time-dependent data after using our approach. This opens up a very important possibility for ensuring supervised learning on time-dependent data, useful for applications in climate, animal tracking, biology and other domains. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
4. Higher-order probabilistic sensitivity calculations using the multicomplex score function method.
- Author
-
Garza, J. and Millwater, H.
- Subjects
-
KERNEL functions, PROBABILITY theory, MATRICES (Mathematics), COMPLEX variables, GEOMETRIC function theory
- Abstract
The score function method used to compute first-order probabilistic sensitivities is extended in this work to arbitrary-order derivatives, including mixed partial derivatives, through the use of multicomplex mathematics. Multicomplex mathematics provides an effective and convenient numerical means to compute the high-order kernel functions with respect to natural parameters or moments (mean and standard deviation), obviating the need to determine the kernel functions analytically. Using these numerical kernel functions, high-order derivatives of the response moments or the probability-of-failure with respect to the parameters of the input distributions can be obtained. Numerical results indicate that the high-order probabilistic sensitivities converge with respect to the number of samples at the same rate as standard Monte Carlo estimates. Implementation of multicomplex mathematics is facilitated through the use of the Cauchy–Riemann matrices; therefore, the extension of common engineering probability distributions to matrix form is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
5. Approximate analytical solutions of nonlocal fractional boundary value problems.
- Author
-
Li, Xiuying and Wu, Boying
- Subjects
-
DIFFERENTIAL equations, MATHEMATICAL physics, KERNEL functions, COMPLEX variables, GEOMETRIC function theory
- Abstract
A new computational method is proposed for solving fractional differential equations with nonlocal boundary conditions based on the reproducing kernel method (RKM). The present method can avoid finding a reproducing kernel satisfying nonlocal boundary conditions. Four numerical examples are provided to demonstrate the accuracy and efficiency of the method. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
6. Spectral restoration using semi-blind deconvolution method with detail-preserving regularization.
- Author
-
Zhu, Hu and Deng, Lizhen
- Subjects
-
DECONVOLUTION (Mathematics), INTEGRAL equations, NUMERICAL solutions to differential equations, MATHEMATICAL convolutions, KERNEL functions, COMPLEX variables, GEOMETRIC function theory
- Abstract
Peak details are often smoothed away when deconvolution methods are used for spectral restoration. In order to preserve spectral details, a detail-preserving regularization is devised and a semi-blind deconvolution method with detail-preserving regularization (SBD-DP) is proposed. The cost function of SBD-DP is formulated and the numerical solution processes are deduced for restoring spectra and estimating the blur-kernel parameter. The deconvolution results on simulated spectra demonstrate that the proposed SBD-DP restores the spectrum effectively, preserves peak details, and estimates the blur-kernel parameter accurately. The deconvolution result on an experimental Raman spectrum further indicates the effectiveness of the proposed SBD-DP method in improving spectral resolution. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
7. Feature Selection Based Hybrid Anomaly Intrusion Detection System Using K Means and RBF Kernel Function.
- Author
-
Ravale, Ujwala, Marathe, Nilesh, and Padiya, Puja
- Subjects
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, MATHEMATICAL functions, REAL variables
- Abstract
In information security, intrusion detection is the act of detecting actions that attempt to compromise security goals. Among the primary challenges to intrusion detection are misjudgment, misdetection and the lack of real-time response to attacks. Various data mining techniques such as clustering, classification and association rule discovery are being used for intrusion detection. The proposed hybrid technique combines the K-Means clustering algorithm with the RBF kernel function of a Support Vector Machine as a classification module. The main purpose of the proposed technique is to decrease the number of attributes associated with each data point, so that it can perform better in terms of detection rate and accuracy when applied to the KDDCUP’99 data set. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
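The hybrid scheme summarized in the abstract above (K-Means clustering followed by an RBF-kernel classifier) can be illustrated in miniature. This is a toy sketch, not the authors' implementation; the function names, the toy centroids, and the `gamma` value are assumptions:

```python
import numpy as np

def rbf_kernel(x, y, gamma=0.5):
    """RBF (Gaussian) kernel: k(x, y) = exp(-gamma * ||x - y||^2)."""
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    return float(np.exp(-gamma * np.sum((x - y) ** 2)))

def assign_to_centroid(point, centroids):
    """K-Means-style assignment: index of the nearest centroid."""
    d = [np.sum((np.asarray(point) - c) ** 2) for c in centroids]
    return int(np.argmin(d))

# Toy data: two cluster centres, as a K-Means step might produce them.
centroids = np.array([[0.0, 0.0], [5.0, 5.0]])
point = np.array([0.2, -0.1])

cluster = assign_to_centroid(point, centroids)      # nearest cluster index
similarity = rbf_kernel(point, centroids[cluster])  # RBF similarity to that centre
```

In the paper's pipeline the clustering step reduces the data each classifier sees, and the RBF kernel is then used inside the SVM rather than evaluated directly as here.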
8. View independent face detection based on horizontal rectangular features and accuracy improvement using combination kernel of various sizes
- Author
-
Hotta, Kazuhiro
- Subjects
-
RESEARCH & development, KERNEL functions, GEOMETRIC function theory, ALGORITHMS
- Abstract
This paper proposes a view-independent face detection method based on horizontal rectangular features, and an accuracy improvement obtained by combining kernels of various sizes. Since view changes of faces induce large variation in appearance in the horizontal direction, local kernels are applied to horizontal rectangular regions to model such appearance changes. Local kernels are integrated by summation, and then used as a summation kernel for a support vector machine (SVM). View independence is shown to be realized by the integration of local horizontal rectangular kernels. However, in general, local kernels (features) of various sizes have different similarity measures, such as detailed and rough similarity, and thus their error patterns differ. If the local and global kernels are combined well, generalization ability improves. This research demonstrates that combining the global kernel and local kernels of various sizes as a summation kernel for SVM is more effective than using only the global kernel, only a combination of local kernels, or Adaboost over SVMs with a single kind of local kernel. [Copyright Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
9. A boundary method for outlier detection based on support vector domain description
- Author
-
Guo, S.M., Chen, L.C., and Tsai, J.S.H.
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, BERGMAN kernel functions
- Abstract
The support vector domain description (SVDD) is a popular kernel method for outlier detection, which tries to fit a class of data with a sphere and uses a few target objects to support its decision boundary. The problem is that even with a flexible Gaussian kernel function, the SVDD can sometimes generate such a loose decision boundary that its discrimination ability becomes poor. Therefore, a computationally intensive procedure called kernel whitening is often required to improve performance. In this paper, we propose a simple post-processing method which modifies the SVDD boundary in order to achieve a tight data description without kernel whitening. With the derivation of the distance between an object and its nearest boundary point in input space, the proposed method can efficiently construct a new decision boundary based on the SVDD boundary. The improvement from the proposed method is demonstrated on synthetic and real-world datasets. The results show that the proposed decision boundary fits the shape of synthetic data distributions closely and achieves better or comparable classification performance on real-world datasets. [Copyright Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
10. Using kernel density function as an urban analysis tool: Investigating the association between nightlight exposure and the incidence of breast cancer in Haifa, Israel
- Author
-
Kloog, Itai, Haim, Abraham, and Portnov, Boris A.
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, DENSITY functionals
- Abstract
The kernel density (KD) function estimates the intensity of events across a surface by calculating the overall number of cases situated within a given search radius from a target point. To form a continuous surface from individual observations, the KD technique does not require the presence of a parameter’s value in a given location (e.g., the incidence rate of a disease). This feature of KD smoothing is especially beneficial for empirical studies in which individual observations are represented by geographic coordinates only and have no other attributes required by more commonly used smoothing techniques, such as spline and kriging. In the present study, we illustrate the use of the KD technique in a study of the association between the geographical distribution of breast cancer cases and exposure to artificial illumination during nighttime (light-at-night, or LAN) in the city of Haifa, Israel. [Copyright Elsevier]
- Published
- 2009
- Full Text
- View/download PDF
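The kernel density calculation described in the abstract above (intensity computed from cases within a search radius of a target point) can be sketched as follows. The quartic (biweight) kernel and the area normalization are common GIS choices, assumed here for illustration rather than taken from the paper:

```python
import numpy as np

def kernel_density(target, events, radius):
    """Kernel density at `target`: quartic-kernel-weighted count of the
    events falling within `radius`, divided by the circle's area."""
    target = np.asarray(target, dtype=float)
    d = np.sqrt(np.sum((np.asarray(events, dtype=float) - target) ** 2, axis=1))
    u = d / radius
    w = np.where(u <= 1.0, (1.0 - u ** 2) ** 2, 0.0)  # quartic (biweight) kernel
    return float(w.sum() / (np.pi * radius ** 2))

# Two events near the origin, one far away.
events = [(0.0, 0.0), (0.1, 0.1), (3.0, 3.0)]
near = kernel_density((0.0, 0.0), events, radius=1.0)   # positive: two events in range
far = kernel_density((10.0, 10.0), events, radius=1.0)  # zero: no events in range
```

Evaluating this function over a grid of target points yields the continuous surface the abstract refers to.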
11. Parameterized algorithmics for linear arrangement problems
- Author
-
Fernau, Henning
- Subjects
-
ALGORITHMS, KERNEL functions, COMPUTATIONAL mathematics, GEOMETRIC function theory, MATHEMATICAL analysis
- Abstract
We discuss different variants of linear arrangement problems from a parameterized perspective. More specifically, we concentrate on developing simple search tree algorithms for these problems. Despite this simplicity, the analysis of the algorithms is often rather intricate. For the newly introduced problem linear arrangement by deleting edges, we also show how to derive a small problem kernel. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
12. Infinite sequence of parallel cracks in an anisotropic piezoelectric solid
- Author
-
Yang, P.S., Liou, J.Y., and Sung, J.C.
- Subjects
-
KERNEL functions, GEOMETRIC function theory, COMPLEX variables, FUNCTIONAL equations
- Abstract
In this paper, the problem of an infinite sequence of parallel cracks in an infinitely extended piezoelectric solid is analyzed. A system of singular integral equations is formulated for general anisotropic piezoelectric materials, for which the kernel functions developed are in complex form. For commonly used transversely isotropic piezoelectric materials, the kernel functions are given in real form. Furthermore, the obtained real kernel functions may be reduced, respectively, to the kernel functions for purely elastic and purely electric problems when the coupled mechanical and electric effects disappear. The system of singular integral equations is solved numerically and the coupling effects of the mechanical and electric phenomena are presented via the generalized stress intensity factors. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
13. On the granularity of summative kernels
- Author
-
Loquin, Kevin and Strauss, Olivier
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, GRANULAR computing
- Abstract
In this paper, we propose granularity as a new index to characterize the non-specificity of a summative kernel. This index is intended to reflect the behavior of a kernel in usual signal processing applications. We show, in different experiments, that two kernels having the same granularity have very similar behavior. This granularity-based adaptation is compared to other adaptation methods. These experiments highlight the ability of the granularity index to measure the spreading and collecting properties of a summative kernel. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
14. Wavelet support vector machine for induction machine fault diagnosis based on transient current signal
- Author
-
Widodo, Achmad and Yang, Bo-Suk
- Subjects
-
INDUCTION motors, KERNEL functions, COMPLEX variables, INDUCTION machinery, GEOMETRIC function theory, CAPACITOR motors
- Abstract
This paper presents an intelligent system for fault detection and classification of induction motors using a wavelet support vector machine (W-SVM). The support vector machine (SVM) is well known as an intelligent classifier with strong generalization ability, and nonlinear SVMs using kernel functions are widely applied to multi-class classification. In this paper, a kernel function built from wavelets is introduced and applied to an SVM multi-class classifier. Moreover, the feature vectors for training the classification routine are obtained from transient current signals preprocessed by the discrete wavelet transform. Principal component analysis (PCA) and kernel PCA are performed to reduce the dimension of the features and to extract the useful features for the classification process. Hence, a relatively new intelligent fault detection and classification method, W-SVM, is established and applied to induction motors for fault classification based on transient current signals. Experimental results show that the classification achieves high accuracy. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
15. Dually discrete spaces
- Author
-
Alas, Ofelia T., Junqueira, Lucia R., and Wilson, Richard G.
- Subjects
-
SOCIAL groups, KERNEL functions, COMPLEX variables, GEOMETRIC function theory
- Abstract
A neighbourhood assignment in a space X is a family {O_x : x ∈ X} of open subsets of X such that x ∈ O_x for any x ∈ X. A set D ⊆ X is a kernel of {O_x : x ∈ X} if X = ⋃{O_x : x ∈ D}. We obtain some new results concerning dually discrete spaces, being those spaces for which every neighbourhood assignment has a discrete kernel. This is a strictly larger class than the class of D-spaces of [E.K. van Douwen, W.F. Pfeffer, Some properties of the Sorgenfrey line and related spaces, Pacific J. Math. 81 (2) (1979) 371–377]. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
16. A new extension of kernel feature and its application for visual recognition
- Author
-
Liu, Qingshan, Jin, Hongliang, Tang, Xiaoou, Lu, Hanqing, and Ma, Songde
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, SENSORY perception
- Abstract
In this paper, we first conceive a new perception of the kernel feature. Kernel subspace methods can be regarded as two independent steps: an explicit kernel feature extraction step and a linear subspace analysis step on the extracted kernel features. The kernel feature vector of an image is composed of dot products between the image and all the training images using a nonlinear dot product kernel. Then, based on this perception, we further extend the kernel feature vector of an image to a kernel feature matrix for visual recognition. This extension takes different representation cues of images into account, whereas only global average information is used in traditional kernel methods. Viewing the dot product as a similarity, this extension means using multiple similarities to measure two images, which is more accordant with human vision. In order to deal efficiently with the numerical computation involved, a matrix-based kernel discriminant analysis algorithm is employed to learn discriminating kernel features for visual recognition. Experiments on the FERET face database, the COIL-100 object database, and the Wang's nature image database show the advantage of the proposed method. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
17. A kernel optimization method based on the localized kernel Fisher criterion
- Author
-
Chen, Bo, Liu, Hongwei, and Bao, Zheng
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, ELECTRONIC pulse techniques
- Abstract
It is widely recognized that whether the selected kernel matches the data determines the performance of kernel-based methods. Ideally, the data is expected to be linearly separable in the kernel-induced feature space, so the Fisher linear discriminant criterion can be used as a cost function to optimize the kernel function. However, in many applications the data may not be linearly separable even after the kernel transformation; for example, the data may have a multimodally distributed structure. In this case a nonlinear classifier is preferred, and the Fisher criterion is obviously not a suitable choice as a kernel optimization rule. Motivated by this issue, we propose a localized kernel Fisher criterion, instead of the traditional Fisher criterion, as the kernel optimization rule to increase the local margins between embedded classes in the kernel-induced feature space. Experimental results on benchmark data and measured radar high-resolution range profile (HRRP) data show that classification performance can be improved by the proposed method. [Copyright Elsevier]
- Published
- 2008
- Full Text
- View/download PDF
18. Learning the kernel matrix by maximizing a KFD-based class separability criterion
- Author
-
Yeung, Dit-Yan, Chang, Hong, and Dai, Guang
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, GEOMETRY
- Abstract
The advantage of a kernel method often depends critically on a proper choice of the kernel function. A promising approach is to learn the kernel from data automatically. In this paper, we propose a novel method for learning the kernel matrix based on maximizing a class separability criterion similar to those used by linear discriminant analysis (LDA) and kernel Fisher discriminant (KFD). It is interesting to note that optimizing this criterion function does not require inverting the possibly singular within-class scatter matrix, which is a computational problem encountered by many LDA and KFD methods. We have conducted experiments on both synthetic data and real-world data from UCI and FERET, showing that our method consistently outperforms some previous kernel learning methods. [Copyright Elsevier]
- Published
- 2007
- Full Text
- View/download PDF
19. A new edge detector based on Fresnel diffraction
- Author
-
Diao, Luhong, Yu, Bin, and Li, Hua
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, MATHEMATICS, FILTERS & filtration, QUANTITATIVE research
- Abstract
This paper proposes an edge detection scheme deduced from Fresnel diffraction. Analysis in this paper shows that the Fresnel convolution kernel function performs well for edge enhancement when images are transformed into complex functions. Due to its mathematical complexity, the method is simplified into a linear convolution filter, and the new edge detector is designed based on this simplified linear filter. Experimental results indicate that the new detector gives quantitative results equal to the Canny detector while being simpler to implement. [Copyright Elsevier]
- Published
- 2007
- Full Text
- View/download PDF
20. A new large-update interior point algorithm for P∗(κ) LCPs based on kernel functions
- Author
-
Cho, Gyeong-Mi and Kim, Min-Kyung
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, ALGORITHMS
- Abstract
In this paper we propose a new large-update primal-dual interior point algorithm for P∗(κ) linear complementarity problems (LCPs). Recently, Peng et al. introduced self-regular barrier functions for primal-dual interior point methods (IPMs) for linear optimization (LO) problems and reduced the gap between the practical behavior of the algorithm and its theoretical worst-case complexity. We introduce a new class of kernel functions, neither logarithmic barrier nor self-regular, for the complexity analysis of IPMs for P∗(κ) LCPs. New search directions and proximity measures are proposed based on the kernel function. We show that if a strictly feasible starting point is available, then the new large-update primal-dual interior point algorithms for solving P∗(κ) LCPs have polynomial complexity better than that of the classical large-update primal-dual algorithm based on the classical logarithmic barrier function. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
21. Data representations and generalization error in kernel based learning machines
- Author
-
Ancona, Nicola, Maglietta, Rosalia, and Stella, Ettore
- Subjects
-
KERNEL functions, MACHINE learning, GEOMETRIC function theory, MACHINE theory
- Abstract
This paper focuses on how data representation influences the generalization error of kernel-based learning machines such as support vector machines (SVM) for classification. Frame theory provides a well-founded mathematical framework for representing data in many different ways. We analyze the effects of sparse and dense data representations on the generalization error of such learning machines, measured by the leave-one-out error given a finite amount of training data. We show that, in the case of sparse data representations, the generalization error of an SVM trained using polynomial or Gaussian kernel functions is equal to that of a linear SVM. This is equivalent to saying that the capacity for separating points of functions belonging to hypothesis spaces induced by polynomial or Gaussian kernel functions reduces to the capacity of a separating hyperplane in the input space. Moreover, we show that, in general, sparse data representations increase or leave unchanged the generalization error of kernel-based methods, while dense data representations reduce it in the case of very large frames. We use two different schemes for representing data in overcomplete systems of Haar and Gabor functions, and measure SVM generalization error on benchmark data sets. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
22. Strictly positive definite kernels on subsets of the complex plane
- Author
-
Menegatto, V.A., Oliveira, C.P., and Peron, A.P.
- Subjects
-
EQUATIONS, KERNEL functions, COMPLEX variables, GEOMETRIC function theory, MATHEMATICAL functions
- Abstract
Let K(z, w), (z, w) ∈ ℂ × ℂ, be a positive definite kernel and B a subset of ℂ. In this paper, we seek conditions in order that the restriction of K to B × B be strictly positive definite. Since this problem has been solved recently in the cases in which B is either ℂ or the unit circle in ℂ, our purpose here is twofold: to present some results we obtained when attempting to solve the problem for the above and other choices of B, and to acquaint the audience with some other questions that remain. For two different classes of subsets, we completely characterize the strict positive definiteness of the kernel. We include a complete discussion of the case in which B is the unit circle of ℂ, making a comparison with the classical problem of strict positive definiteness on the real circle. [Copyright Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
23. A kernel-based subtractive clustering method
- Author
-
Kim, Dae-Won, Lee, KiYoung, Lee, Doheon, and Lee, Kwang H.
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory
- Abstract
In this paper the conventional subtractive clustering method is extended by calculating the mountain value of each data point based on a kernel-induced distance instead of the conventional sum-of-squares distance. The kernel function is a generalization of the distance metric that measures the distance between two data points as they are mapped into a high-dimensional space. Use of the kernel function makes it possible to cluster data that is linearly non-separable in the original space into homogeneous groups in the transformed high-dimensional space. Application of the conventional subtractive method and the kernel-based subtractive method to well-known data sets showed the superiority of the proposed approach. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
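The mountain value described in the abstract above, computed from a kernel-induced similarity rather than the conventional sum-of-squares distance, might be sketched like this. The Gaussian kernel and the `gamma` value are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def mountain_values(data, gamma=0.5):
    """Mountain value of each point: the sum over all points of a Gaussian
    kernel similarity exp(-gamma * ||x_i - x_j||^2). Conventional subtractive
    clustering uses the squared Euclidean distance directly instead."""
    data = np.asarray(data, dtype=float)
    sq = np.sum((data[:, None, :] - data[None, :, :]) ** 2, axis=-1)
    return np.exp(-gamma * sq).sum(axis=1)

# Three points forming a tight cluster, plus one outlier.
data = [(0.0, 0.0), (0.1, 0.0), (0.2, 0.0), (5.0, 5.0)]
m = mountain_values(data)
# The first cluster-centre candidate is the point with the largest mountain value.
first_centre = int(np.argmax(m))
```

In the full algorithm the winning point's contribution is then subtracted from the remaining mountain values and the selection repeats, which is where "subtractive" comes from.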
24. Explicit 3-D RKPM shape functions in terms of kernel function moments for accelerated computation
- Author
-
Zhou, J.X., Wang, X.M., Zhang, Z.Q., and Zhang, L.
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, FINITE element method
- Abstract
The construction of meshless shape functions is more time-consuming than the evaluation of FEM shape functions; it is therefore of great importance to speed up the computation of meshless shape functions. In the context of the reproducing kernel particle method (RKPM), 3-D meshless shape functions and their derivatives are, for the very first time, expressed explicitly in terms of kernel function moments. This avoids the solutions of linear algebraic equations and the numerical inversions encountered in standard RKPM implementations, and thus speeds up the computation of meshless shape functions. A numerical test is performed in a hexahedral domain with the sole purpose of comparing the computation time for shape function construction between the standard RKPM implementation and the enhanced procedure. Two 3-D elastostatics numerical examples are then presented, which demonstrate that the proposed treatment of RKPM shape functions is especially effective. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
25. Kernel of the variation operator and periodicity of open books
- Author
-
Kauffman, Louis H. and Krylov, Nikolai A.
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, DIFFEOMORPHISMS
- Abstract
We consider a parallelizable 2n-manifold F which has the homotopy type of the wedge product of n-spheres and show that the group of pseudo-isotopy classes of orientation-preserving diffeomorphisms that keep the boundary ∂F pointwise fixed and induce the trivial variation operator is a central extension of the group of all homotopy -spheres by . We then apply this result to study the periodicity properties of branched cyclic covers of manifolds with simple open book decompositions, and extend the previous results of Durfee, Kauffman and Stevens to dimensions 7 and 15. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
26. Uncorrelated discriminant vectors using a kernel method
- Author
-
Liang, Zhizheng and Shi, Pengfei
- Subjects
-
KERNEL functions, LINEAR statistical models, COMPLEX variables, GEOMETRIC function theory
- Abstract
Uncorrelated discriminant vectors using a kernel method are proposed in this paper. In some sense, kernel uncorrelated discriminant vectors extend Jin's method, and several related theorems are then stated. Most importantly, the proposed method can deal with nonlinear problems. Finally, experimental results on handwritten numeral characters show that the proposed method is effective and feasible. [Copyright Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
27. Kernel sections for non-autonomous strongly damped wave equations of non-degenerate Kirchhoff-type
- Author
-
Fan, Xiaoming and Zhou, Shengfan
- Subjects
-
KERNEL functions, COMPLEX variables, GEOMETRIC function theory, ELLIPTIC functions, INVARIANT wave equations
- Abstract
We prove the existence of compact kernel sections for the process generated by strongly damped wave equations of non-degenerate Kirchhoff type modelling the non-linear vibrations of an elastic string. We also obtain a precise estimate of the upper bound of the Hausdorff dimension of the kernel sections, which, under some conditions, decreases as the strong damping grows for large strong damping, particularly in the autonomous case. [Copyright Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
28. Graphs with not all possible path-kernels
- Author
-
Aldred, R.E.L. and Thomassen, Carsten
- Subjects
-
GRAPHIC methods, KERNEL functions, LEAST squares, CHARTS, diagrams, etc., GEOMETRIC function theory
- Abstract
The Path Partition Conjecture states that the vertices of a graph G with longest path of length c may be partitioned into two parts X and Y such that the longest path in the subgraph of G induced by X has length at most a and the longest path in the subgraph of G induced by Y has length at most b, where a + b = c. Moreover, for each pair a, b such that a + b = c there is a partition with this property. A stronger conjecture by Broere, Hajnal and Mihók, raised as a problem by Mihók in 1985, states the following: for every graph G and each integer k, c ⩾ k ⩾ 2, there is a partition of V(G) into two parts (K, K̄) such that the subgraph G[K] of G induced by K has no path on more than k − 1 vertices and each vertex in K̄ is adjacent to an endvertex of a path on k − 1 vertices in G[K]. In this paper we provide a counterexample to this conjecture. [Copyright Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
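The partition property in the abstract above is easy to test by brute force on small graphs. The sketch below is a hypothetical illustration (it does not reproduce the paper's counterexample, which is larger): for a given k it searches for a part K such that G[K] has no path on more than k − 1 vertices and every vertex outside K is adjacent to an endvertex of a path on k − 1 vertices in G[K]. The 5-cycle used here is just a toy example.

```python
from itertools import combinations

def paths_from(adj, verts, v, visited):
    """Yield every simple path in the subgraph induced by `verts`
    that starts at v (paths are tuples of vertices)."""
    yield (v,)
    for u in adj[v]:
        if u in verts and u not in visited:
            for tail in paths_from(adj, verts, u, visited | {u}):
                yield (v,) + tail

def all_paths(adj, verts):
    for v in verts:
        yield from paths_from(adj, verts, v, {v})

def has_partition(adj, k):
    """Search for a part K such that G[K] has no path on more than k-1
    vertices and every vertex outside K is adjacent to an endvertex of
    a path on exactly k-1 vertices in G[K]."""
    vertices = set(adj)
    for r in range(len(vertices) + 1):
        for K in map(set, combinations(sorted(vertices), r)):
            paths = list(all_paths(adj, K))
            if any(len(p) > k - 1 for p in paths):
                continue  # G[K] contains a path that is too long
            ends = ({p[0] for p in paths if len(p) == k - 1}
                    | {p[-1] for p in paths if len(p) == k - 1})
            if all(adj[v] & ends for v in vertices - K):
                return True
    return False

# Toy example: the 5-cycle, whose longest path has length c = 4
c5 = {i: {(i - 1) % 5, (i + 1) % 5} for i in range(5)}
print([has_partition(c5, k) for k in (2, 3, 4)])  # → [True, True, True]
```

Enumerating all simple paths is exponential, so this is only feasible for very small graphs; verifying a genuine counterexample this way requires checking that no K works for some admissible k.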
29. Kernel smoothing of periodograms under Kullback–Leibler discrepancy
- Author
-
Hannig, Jan and Lee, Thomas C.M.
- Subjects
- *
KERNEL functions , *DIGITAL communications , *COMPLEX variables , *GEOMETRIC function theory - Abstract
Kernel smoothing of the periodogram is a popular nonparametric method for spectral density estimation. Most important in the implementation of this method is the choice of the bandwidth, or span, for smoothing. One idealized way of choosing the bandwidth is as the value that minimizes the Kullback–Leibler (KL) discrepancy between the smoothed estimate and the true spectrum. This method fails in practice, however, because the KL discrepancy is an unknown quantity. This paper introduces an estimator for the discrepancy, so that the bandwidth minimizing the unknown discrepancy can be approximated empirically by minimizing the estimate. It is shown that this discrepancy estimator is consistent. Numerical results also suggest that this empirical choice of bandwidth often outperforms some other commonly used bandwidth choices. The same idea is also applied to choose the bandwidth for log-periodogram smoothing. [Copyright © Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
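The basic operation in the abstract above, smoothing a raw periodogram with a kernel over neighbouring Fourier frequencies, can be sketched in a few lines. This is only a minimal illustration with a fixed, hand-picked bandwidth; the paper's contribution, estimating the KL discrepancy in order to choose the bandwidth, is not implemented here.

```python
import numpy as np

def periodogram(x):
    """Raw periodogram of a real series at the nonzero Fourier frequencies."""
    n = len(x)
    fx = np.fft.rfft(x - x.mean())
    return (np.abs(fx) ** 2 / n)[1:]

def kernel_smooth(pgram, bandwidth):
    """Gaussian-kernel smoothing of the periodogram over frequency index.
    The bandwidth is fixed by hand, not chosen by the KL criterion."""
    idx = np.arange(len(pgram))
    out = np.empty_like(pgram)
    for j in idx:
        w = np.exp(-0.5 * ((idx - j) / bandwidth) ** 2)
        out[j] = (w / w.sum()) @ pgram
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal(512)   # white noise: the true spectrum is flat
raw = periodogram(x)
smooth = kernel_smooth(raw, bandwidth=20.0)
```

For a white-noise input the true spectrum is flat, so the variance of the smoothed estimate drops far below that of the raw periodogram.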
30. On problem-oriented kernel refining
- Author
-
Parrado-Hernández, E., Arenas-García, J., Mora-Jiménez, I., and Navia-Vázquez, A.
- Subjects
- *
MACHINE learning , *KERNEL functions , *GEOMETRIC function theory - Abstract
Much attention has recently been devoted to the machine learning procedures known as kernel methods, the Support Vector Machine being an instance of them. Their performance depends heavily on the particular ‘distance measurement’ between patterns, a function also known as the ‘kernel’, which represents a dot product in a projection space. Although some attempts have been made to decide a priori which kernel function is most suitable for a problem, no definitive solution for this task has been found yet, since choosing the best kernel very often reduces to selecting among different possibilities by cross-validation. In this paper, we propose a method for solving classification problems that relies on the ad hoc determination of a kernel for every problem at hand, i.e., a problem-oriented kernel design method. We iteratively obtain a semiparametric function projecting the input data into a space whose dimension is low enough to avoid both overfitting and complexity explosion of the resulting machine, yet powerful enough to solve the classification problems with good accuracy. The performance of the proposed method is illustrated on standard databases, and we further discuss its suitability for developing problem-oriented feature extraction procedures. [Copyright © Elsevier]
- Published
- 2003
- Full Text
- View/download PDF
31. Plug-in bandwidth selection in kernel hazard estimation from dependent data
- Author
-
Quintela-del-Río, Alejandro
- Subjects
- *
KERNEL functions , *COMPLEX variables , *GEOMETRIC function theory , *SIMULATION methods & models - Abstract
Abstract: The plug-in bandwidth selection method in nonparametric kernel hazard estimation is considered, with weak dependence assumed among the sample data. A general result of asymptotic optimality for the plug-in bandwidth is presented that is valid for the hazard function as well as for the density and distribution functions. In a simulation study, this method is compared with the “leave more than one out” cross-validation criterion under dependence. Simulations show that smaller errors and much less sample variability can be achieved, and that a good selection of the pilot bandwidth can be made by means of “leave one out” cross-validation. Finally, an application to an earthquake data set is presented. [Copyright © Elsevier]
- Published
- 2007
- Full Text
- View/download PDF
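The paper's plug-in selector for hazard estimation under dependence is not reproduced here; as a simpler, hypothetical illustration of the plug-in idea it builds on, the sketch below uses the classical normal-reference rule for density estimation, in which the unknown functional of the target density is replaced by its value under a fitted normal.

```python
import numpy as np

def silverman_bandwidth(data):
    """Normal-reference plug-in rule: the unknown roughness of the target
    density is replaced by that of a fitted normal distribution."""
    n = len(data)
    iqr = np.quantile(data, 0.75) - np.quantile(data, 0.25)
    sigma = min(np.std(data, ddof=1), iqr / 1.349)
    return 0.9 * sigma * n ** (-1 / 5)

def kde(grid, data, h):
    """Gaussian kernel density estimate evaluated on a grid."""
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
data = rng.standard_normal(400)
h = silverman_bandwidth(data)    # roughly 0.27 for a sample of this size
grid = np.linspace(-4, 4, 201)
fhat = kde(grid, data, h)
```

The resulting estimate integrates to one and peaks near the true mode; plug-in selectors of the kind studied in the paper refine this idea by estimating the required density functionals nonparametrically.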
32. A strong uniform convergence rate of kernel conditional quantile estimator under random censorship
- Author
-
Ould-Saïd, Elias
- Subjects
- *
KERNEL functions , *GEOMETRIC function theory , *STOCHASTIC convergence , *MATHEMATICS - Abstract
Abstract: In this paper, we study kernel conditional quantile estimation for censored data, and a uniform strong convergence rate of the resulting estimator is established. The rate obtained here is the same as that for the uncensored case. [Copyright © Elsevier]
- Published
- 2006
- Full Text
- View/download PDF
33. On boundary correction in kernel density estimation.
- Author
-
Karunamuni, R.J. and Alberts, T.
- Subjects
KERNEL functions ,COMPLEX variables ,GEOMETRIC function theory ,CONTRACT proposals - Abstract
Abstract: It is now well known that kernel density estimators are not consistent when estimating a density near the finite end points of the support of the density to be estimated. This is due to boundary effects that occur in nonparametric curve estimation problems. A number of proposals have been made in the kernel density estimation context, with some success. As yet, there appears to be no single dominating solution that corrects the boundary problem for all shapes of densities. In this paper, we propose a new general method of boundary correction for univariate kernel density estimation. The proposed method generates a class of boundary-corrected estimators, all of which possess desirable properties such as local adaptivity and non-negativity. In simulations, the proposed method is observed to perform quite well compared with other existing methods in the literature for most shapes of densities, showing a very important robustness property of the method. The theory behind the new approach and the bias and variance of the proposed estimators are given. Results of a data analysis are also given. [Copyright © Elsevier]
- Published
- 2005
- Full Text
- View/download PDF
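The boundary effect described above is easy to reproduce: near a support endpoint the standard estimator loses roughly half the kernel mass. The sketch below uses the classical reflection correction as a baseline for comparison; it is not the paper's proposed estimator, whose construction is more general.

```python
import numpy as np

def kde_naive(grid, data, h):
    """Standard Gaussian KDE; inconsistent at a support boundary."""
    u = (grid[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

def kde_reflect(grid, data, h, boundary=0.0):
    """Reflection correction for a density supported on [boundary, inf):
    the sample is mirrored across the boundary so that no kernel mass
    leaks outside the support."""
    mirrored = 2 * boundary - data
    u = (grid[:, None] - data[None, :]) / h
    v = (grid[:, None] - mirrored[None, :]) / h
    k = np.exp(-0.5 * u ** 2) + np.exp(-0.5 * v ** 2)
    return k.mean(axis=1) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(2)
data = rng.exponential(size=1000)   # true density has f(0) = 1
h = 0.3
naive0 = kde_naive(np.array([0.0]), data, h)[0]
refl0 = kde_reflect(np.array([0.0]), data, h)[0]
```

At the boundary itself, reflection exactly doubles the naive estimate, bringing it much closer to the true value f(0) = 1 for the exponential density.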
34. Convergence rate of empirical Bayes estimation for two-dimensional truncation parameters under linex loss.
- Author
-
Yimun Shi, Sheseng Gao, and Xiaolin Shi
- Subjects
- *
BAYES' estimation , *STATISTICAL decision making , *GAME theory , *KERNEL functions , *COMPLEX variables , *GEOMETRIC function theory , *MATHEMATICAL models - Abstract
This paper deals with empirical Bayes (EB) estimation for two-dimensional truncation parameters under linex loss. Using kernel estimation of the density function, we construct the EB estimator of the parameters. Under some suitable conditions, we prove that the constructed EB estimator is asymptotically optimal and we obtain its convergence rate. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
35. Hopf bifurcation in a Volterra prey–predator model with strong kernel
- Author
-
Li, Shaowen, Liao, Xiaofeng, and Li, Chunguang
- Subjects
- *
EQUATIONS , *KERNEL functions , *COMPLEX variables , *GEOMETRIC function theory - Abstract
In this paper, a Volterra prey–predator model with distributed delays and a strong kernel is investigated. By applying the frequency-domain approach and analyzing the associated characteristic equation, the existence of a bifurcation point is determined. Furthermore, when the density coefficient of the prey is used as the bifurcation parameter, it is found that a Hopf bifurcation occurs for the strong kernel. The direction and stability of the bifurcating periodic solutions are determined by the Nyquist criterion and the graphical Hopf bifurcation theorem. Some numerical simulations justifying the theoretical analysis are also given. [Copyright © Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
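The "strong kernel" G(t) = a² t e^(−at) is exactly the case in which a distributed delay can be converted to ordinary differential equations by the linear chain trick, which makes such models easy to simulate. The system below is a generic, hypothetical prey–predator sketch with delayed prey self-limitation, not the specific model analyzed in the paper, and the parameter values are arbitrary.

```python
import numpy as np

def rhs(state, a=1.0, d=0.5):
    """Prey x, predator y, and the two linear-chain variables w, z that
    replace the distributed delay with strong kernel a^2 * t * exp(-a*t):
    z(t) = integral of G(t-s) x(s) ds satisfies w' = a(x-w), z' = a(w-z)."""
    x, y, w, z = state
    return np.array([
        x * (1.0 - z - y),   # prey, self-limited by delayed density z
        y * (x - d),         # predator
        a * (x - w),         # first stage of the linear chain
        a * (w - z),         # second stage: z is the delayed prey density
    ])

def rk4(state, dt, steps):
    """Classical fourth-order Runge-Kutta integration."""
    traj = [state]
    for _ in range(steps):
        k1 = rhs(state)
        k2 = rhs(state + 0.5 * dt * k1)
        k3 = rhs(state + 0.5 * dt * k2)
        k4 = rhs(state + dt * k3)
        state = state + dt * (k1 + 2 * k2 + 2 * k3 + k4) / 6
        traj.append(state)
    return np.array(traj)

traj = rk4(np.array([0.6, 0.4, 0.6, 0.6]), dt=0.05, steps=4000)
```

Varying a (the reciprocal of the mean delay) can move such a system through the kind of Hopf bifurcation the paper analyzes; near the chosen parameters the trajectory stays positive and bounded.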
36. Practical bandwidth selection in deconvolution kernel density estimation
- Author
-
Delaigle, A. and Gijbels, I.
- Subjects
- *
KERNEL functions , *BANDWIDTHS , *COMPLEX variables , *GEOMETRIC function theory - Abstract
Kernel estimation of a density based on contaminated data is considered, and the important issue of how to choose the bandwidth parameter in practice is discussed. Some plug-in (PI) bandwidth selectors, based on nonparametric estimation of an approximation of the mean integrated squared error, are proposed. The selectors are a refinement of the simple normal reference bandwidth selector, which is obtained by parametrically estimating the approximated mean integrated squared error by reference to a normal density. A simulation study compares these PI bandwidth selectors with a bootstrap (BT) and a cross-validated (CV) bandwidth selector. It is concluded that in finite samples, an appropriately chosen PI bandwidth selector and the BT bandwidth selector perform comparably, and both outperform the CV bandwidth selector. The use of the various practical bandwidth selectors is illustrated on a real data example. [Copyright © Elsevier]
- Published
- 2004
- Full Text
- View/download PDF
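A minimal sketch of the estimator the bandwidth selectors above are tuning, under the simplest assumptions: Gaussian kernel and N(0, σ²) measurement error, in which case the deconvoluting kernel is itself a Gaussian density with variance 1 − σ²/h², valid only when h > σ. The bandwidth below is fixed by hand; the paper's PI, BT and CV selectors are not implemented.

```python
import numpy as np

def deconv_kde(x, w, h, sigma):
    """Deconvolution KDE with Gaussian kernel and N(0, sigma^2) error.
    The deconvoluting kernel is a Gaussian density with variance
    1 - (sigma/h)^2, which exists only when h > sigma."""
    assert h > sigma, "bandwidth must exceed the error scale"
    var = 1.0 - (sigma / h) ** 2
    u = (x[:, None] - w[None, :]) / h
    return (np.exp(-0.5 * u ** 2 / var)
            / np.sqrt(2 * np.pi * var)).mean(axis=1) / h

def naive_kde(x, w, h):
    """Ordinary Gaussian KDE applied directly to the contaminated data."""
    u = (x[:, None] - w[None, :]) / h
    return np.exp(-0.5 * u ** 2).mean(axis=1) / (h * np.sqrt(2 * np.pi))

rng = np.random.default_rng(3)
n, sigma, h = 2000, 0.5, 0.7
x_true = rng.standard_normal(n)              # target density: N(0, 1)
w = x_true + sigma * rng.standard_normal(n)  # observed with error
grid = np.array([0.0])                       # evaluate at the mode
naive = naive_kde(grid, w, h)
dec = deconv_kde(grid, w, h, sigma)
```

Compared with a naive kernel estimate applied to the contaminated data, the deconvolution estimate at the mode is visibly higher, since it undoes part of the blur caused by the measurement error.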