10 results for "Tanveer, M."
Search Results
2. Automated Identification System for Focal EEG Signals Using Fractal Dimension of FAWT-Based Sub-bands Signals
- Author
-
Dalal, M., Tanveer, M., and Pachori, Ram Bilas; Tanveer, M. and Pachori, Ram Bilas, editors
- Published
- 2019
- Full Text
- View/download PDF
3. Smooth Twin Support Vector Machines via Unconstrained Convex Minimization
- Author
-
Tanveer, M. and Shubham, K.
- Published
- 2017
4. Advancing Supervised Learning with the Wave Loss Function: A Robust and Smooth Approach.
- Author
-
Akhtar, Mushir, Tanveer, M., and Arshad, Mohd.
- Subjects
-
SUPERVISED learning, MACHINE learning, SUPPORT vector machines, WAVE functions, ALZHEIMER'S disease
- Abstract
The loss function plays a vital role in supervised learning frameworks, and selecting an appropriate one can substantially affect the performance of the learned model. The training of supervised learning algorithms inherently adheres to a predetermined loss function during optimization. In this paper, we present a novel contribution to supervised machine learning: an asymmetric loss function named wave loss. It exhibits robustness against outliers, insensitivity to noise, boundedness, and a crucial smoothness property. Theoretically, we establish that the proposed wave loss function is classification-calibrated. Leveraging this, we incorporate the wave loss function into the least squares setting of support vector machines (SVM) and twin support vector machines (TSVM), resulting in two robust and smooth models termed Wave-SVM and Wave-TSVM, respectively. To address the optimization problem inherent in Wave-SVM, we utilize the adaptive moment estimation (Adam) algorithm, which confers multiple benefits, including adaptive learning rates, efficient memory utilization, and faster convergence during training. Notably, this paper marks the first application of Adam to solve an SVM model. Further, we devise an iterative algorithm to solve the optimization problems of Wave-TSVM. To empirically showcase the effectiveness of the proposed Wave-SVM and Wave-TSVM, we evaluate them on benchmark UCI and KEEL datasets (with and without feature noise) from diverse domains. Moreover, to exemplify the applicability of Wave-SVM in the biomedical domain, we evaluate it on the Alzheimer's Disease Neuroimaging Initiative (ADNI) dataset. The experimental outcomes reveal the superior prediction accuracy of Wave-SVM and Wave-TSVM over the baseline models.
The source codes of the proposed models are publicly available at https://github.com/mtanveer1/Wave-SVM.
• A new asymmetric, bounded, and smooth loss function termed wave loss is proposed.
• Theoretically, we analyzed the classification-calibrated characteristic of the wave loss function.
• Two new robust and smooth models, termed Wave-SVM and Wave-TSVM, are proposed.
• The Adam algorithm is used to solve the optimization problem of the Wave-SVM.
• The optimization problem of the Wave-TSVM is solved by an efficient iterative method.
[ABSTRACT FROM AUTHOR]
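The abstract notes that Adam is used to solve the Wave-SVM optimization problem. The sketch below only illustrates that idea under stated assumptions: a generic smooth logistic surrogate stands in for the paper's wave loss (whose exact form is not reproduced here), and the data, hyperparameters, and function name are invented for the demo.

```python
import numpy as np

def adam_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200,
                    b1=0.9, b2=0.999, eps=1e-8):
    """Train a linear classifier's weights with Adam on a smooth margin loss.

    The logistic surrogate log(1 + exp(-margin)) is a stand-in for the
    paper's wave loss; it is smooth, so gradient-based Adam applies.
    """
    n, d = X.shape
    w = np.zeros(d)
    m = np.zeros(d)  # first-moment (mean) estimate
    v = np.zeros(d)  # second-moment (uncentered variance) estimate
    for t in range(1, epochs + 1):
        margins = y * (X @ w)
        # gradient of mean log(1 + exp(-margin)) plus L2 regularization
        g = -(X * (y / (1.0 + np.exp(margins)))[:, None]).mean(axis=0) + lam * w
        m = b1 * m + (1 - b1) * g
        v = b2 * v + (1 - b2) * g * g
        m_hat = m / (1 - b1 ** t)   # bias-corrected moments
        v_hat = v / (1 - b2 ** t)
        w -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w

# toy linearly separable data (illustrative only)
rng = np.random.default_rng(0)
Xp = rng.normal(2.0, 0.5, (50, 2))
Xn = rng.normal(-2.0, 0.5, (50, 2))
X = np.vstack([Xp, Xn])
y = np.r_[np.ones(50), -np.ones(50)]
w = adam_linear_svm(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

On this toy separable problem, the Adam-trained hyperplane separates essentially all points, reflecting the fast-convergence benefit the abstract attributes to Adam.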
- Published
- 2024
- Full Text
- View/download PDF
5. Pinball Loss Twin Support Vector Clustering.
- Author
-
Tanveer, M., Gupta, Tarun, and Shah, Miten
- Subjects
-
SUPPORT vector machines, FACIAL expression, FUZZY algorithms, ALGORITHMS
- Abstract
Twin Support Vector Clustering (TWSVC) is a clustering algorithm inspired by the principles of the Twin Support Vector Machine (TWSVM). TWSVC has already outperformed other traditional plane-based clustering algorithms. However, TWSVC uses the hinge loss, which maximizes the shortest distance between clusters and hence suffers from noise sensitivity and low re-sampling stability. In this article, we propose Pinball loss Twin Support Vector Clustering (pinTSVC) as a clustering algorithm. The proposed pinTSVC model incorporates the pinball loss function in the plane clustering formulation. The pinball loss function introduces favorable properties such as noise insensitivity and re-sampling stability. The time complexity of the proposed pinTSVC remains equivalent to that of TWSVC. Extensive numerical experiments on noise-corrupted benchmark UCI and artificial datasets have been performed. Results of the proposed pinTSVC model are compared with TWSVC, Twin Bounded Support Vector Clustering (TBSVC) and Fuzzy c-means clustering (FCM). Detailed and exhaustive comparisons demonstrate the better performance and generalization of the proposed pinTSVC on noise-corrupted datasets. Further experiments and analysis on the performance of the above-mentioned clustering algorithms on structural MRI (sMRI) images taken from the ADNI database, face clustering, and facial expression clustering demonstrate the effectiveness and feasibility of the proposed pinTSVC model. [ABSTRACT FROM AUTHOR]
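The pinball loss the abstract builds on has a standard closed form on the margin shortfall u = 1 − y·f(x): linear with slope 1 on the positive side and slope τ on the negative side, instead of being flat like the hinge. A minimal sketch (variable names are mine), with the hinge loss recovered as the τ = 0 special case:

```python
import numpy as np

def pinball_loss(u, tau=0.5):
    """Pinball loss on the margin shortfall u.

    For u >= 0 it matches the hinge loss; for u < 0 it keeps a slope
    of tau instead of vanishing, which is what gives pinball its noise
    insensitivity and re-sampling stability.
    """
    return np.where(u >= 0, u, -tau * u)

u = np.array([-2.0, -0.5, 0.0, 1.0, 3.0])
hinge = pinball_loss(u, tau=0.0)   # tau = 0 reduces to the hinge loss
pin = pinball_loss(u, tau=0.5)
```

Because the negative branch never goes flat, points deep inside the correct side still contribute to the objective, which is the source of pinball's stability under noisy re-sampling.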
- Published
- 2021
- Full Text
- View/download PDF
6. Universum twin support vector machine with truncated pinball loss.
- Author
-
Kumari, Anuradha and Tanveer, M.
- Subjects
-
SUPPORT vector machines, ALZHEIMER'S disease, SIGNAL classification, SOURCE code, MACHINE performance
- Abstract
For classification problems, the twin support vector machine with pinball loss (Pin-GTSVM) is noise insensitive and performs better than the twin support vector machine (TWSVM). However, it lacks sparsity in comparison to TWSVM. In this article, to maintain a trade-off between the noise insensitivity and sparsity of the model while preserving the theoretical properties of the pinball loss, we propose the universum twin support vector machine with truncated pinball loss (Tpin-UTWSVM). The proposed Tpin-UTWSVM considers universum data, which gives prior information about the distribution of the data and thus improves the generalization performance of the proposed model. Further, the proposed optimization problem is non-convex and non-differentiable, and is solved by the concave–convex procedure. We employ the SOR approach to train the proposed model effectively with minimal training time. We conducted numerical experiments on 19 UCI binary datasets with different noise levels to validate the noise insensitivity of the proposed Tpin-UTWSVM model. We also conducted numerical experiments on electroencephalogram (EEG) signal classification and Alzheimer's disease (AD) detection. The overall experimental outcomes and statistical tests demonstrate the superiority of the proposed Tpin-UTWSVM model in comparison to the baseline models. The source code for the proposed Tpin-UTWSVM is available at https://github.com/mtanveer1/Universum-twin-SVM-with-truncated-pinball-loss. [ABSTRACT FROM AUTHOR]
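As a hedged sketch of the truncation idea: one common way to truncate the pinball loss flattens its negative branch beyond a threshold −s, so points far inside the correct side stop contributing, restoring sparsity while keeping pinball's behaviour near the origin. The exact truncated pinball loss used in Tpin-UTWSVM may differ in detail; the parameter names below are mine.

```python
import numpy as np

def pinball(u, tau=0.5):
    # plain pinball loss on the margin shortfall u
    return np.where(u >= 0, u, -tau * u)

def truncated_pinball(u, tau=0.5, s=2.0):
    """Illustrative truncated pinball loss.

    Identical to pinball on [-s, inf); for u < -s the penalty is
    capped at tau * s, so well-classified points no longer grow the
    loss -- this is what restores sparsity.
    """
    return np.where(u >= 0, u, np.where(u >= -s, -tau * u, tau * s))

vals = truncated_pinball(np.array([1.0, -1.0, -5.0]), tau=0.5, s=2.0)
```

The non-convexity the abstract mentions comes precisely from this cap, which is why a concave–convex style procedure is needed to optimize it.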
- Published
- 2023
- Full Text
- View/download PDF
7. EEG signal classification using universum support vector machine.
- Author
-
Richhariya, B. and Tanveer, M.
- Subjects
-
SUPPORT vector machines, ELECTROENCEPHALOGRAPHY, DIGITAL signal processing, CLASSIFICATION algorithms, DIAGNOSIS of neurological disorders
- Abstract
Support vector machine (SVM) has been used widely for the classification of electroencephalogram (EEG) signals in the diagnosis of neurological disorders such as epilepsy and sleep disorders. SVM shows good generalization performance on high-dimensional data due to its convex optimization problem. The incorporation of prior knowledge about the data leads to a better optimized classifier. Different types of EEG signals provide information about the distribution of EEG data. To include prior information in the classification of EEG signals, we propose a novel machine learning approach based on the universum support vector machine (USVM). In our approach, the universum data points are selected from the EEG dataset itself, namely the interictal EEG signals, which removes the effect of outliers on the generation of universum data. Further, to reduce the computation time, we apply our universum selection approach to the universum twin support vector machine (UTSVM), which has a lower computational cost than traditional SVM. To check the validity of our proposed methods, we use various feature extraction techniques on different datasets consisting of healthy and seizure signals. Several numerical experiments are performed on the generated datasets, and the results of our proposed approach are compared with other baseline methods. Our proposed USVM and UTSVM approaches show better generalization performance than SVM, USVM, Twin SVM (TWSVM) and UTSVM. The proposed UTSVM achieves the highest classification accuracy of 99% on the healthy and seizure EEG signals. [ABSTRACT FROM AUTHOR]
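The key idea above, selecting universum points from the dataset itself (the interictal signals) rather than generating them synthetically, amounts to a simple filtering step. The sketch below assumes invented labels and shapes purely for illustration:

```python
import numpy as np

def select_universum(X, labels, universum_label):
    """Split a dataset into universum points and target-class points.

    Samples carrying `universum_label` (e.g. interictal EEG segments,
    belonging to neither target class) become the universum set; the
    rest are kept for the two-class classification task. Using real
    recorded signals avoids the outliers that synthetic universum
    generation (such as averaging random pairs) can introduce.
    """
    mask = labels == universum_label
    return X[mask], X[~mask], labels[~mask]

# toy EEG-style data: 0 = interictal (universum), 1 = healthy, 2 = seizure
X = np.arange(12.0).reshape(6, 2)
labels = np.array([0, 1, 2, 0, 1, 2])
U, X_cls, y_cls = select_universum(X, labels, universum_label=0)
```

The universum set U then enters the USVM/UTSVM formulation as prior information, while `X_cls`/`y_cls` carry the healthy-versus-seizure task.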
- Published
- 2018
- Full Text
- View/download PDF
8. KNN weighted reduced universum twin SVM for class imbalance learning.
- Author
-
Ganaie, M.A. and Tanveer, M.
- Subjects
-
SUPPORT vector machines, STATISTICAL learning, FETOFETAL transfusion, SOCIAL problems, ATTENTIONAL bias, ALZHEIMER'S disease, HYPERPLANES
- Abstract
In real-world problems, imbalance among data samples poses a major challenge for classification, as the samples of one class dominate. Problems like fault and disease detection involve imbalanced data and hence need attention to avoid bias towards a particular class. Classification models like support vector machines (SVM) become biased towards the majority-class samples and hence misclassify the minority-class samples. SVM also suffers because no prior information about the data is involved in the generation of the hyperplanes. Moreover, local neighbourhood information is ignored in SVM, which treats each sample equally when generating the hyperplanes; however, the data points may be contaminated and may mislead the generation of the hyperplanes. Inspired by the idea of prior data information and local neighbourhood information, we propose the K-nearest neighbour based weighted reduced universum twin SVM for class imbalance learning (KWRUTSVM-CIL). The proposed KWRUTSVM-CIL embodies local neighbourhood information and uses universum data to balance the classes in class imbalance problems. Local neighbourhood information is incorporated via a weight matrix in the objective function. In the proposed KWRUTSVM-CIL model, weight vectors are used in the corresponding constraints of the objective functions to exploit the interclass information. Oversampling and undersampling approaches are followed to balance the data in class imbalance problems. Universum data gives prior information about the data. Twin SVM, universum twin SVM, and reduced universum twin SVM for class imbalance implement the empirical risk minimization principle and thus may lead to overfitting. In contrast, the proposed KWRUTSVM-CIL model embodies a regularization term to maximize the margin and implements the structural risk minimization principle, the cornerstone of statistical learning, which overcomes the issue of overfitting.
Experimental results and statistical analysis signify that the generalization ability of the proposed KWRUTSVM-CIL model is superior to that of other twin SVM based models. As an application, we use the proposed KWRUTSVM-CIL model for the diagnosis of Alzheimer's disease and breast cancer. The proposed KWRUTSVM-CIL model shows better generalization performance than other twin SVM based models on biomedical datasets.
• To incorporate local neighbourhood information, K-nearest-neighbour-based weights are used in the proposed KWRUTSVM-CIL.
• Unlike the RUTSVM-CIL, UTSVM, TSVM and FTWSVM models, which implement the empirical risk minimization principle, the proposed KWRUTSVM-CIL model implements the structural risk minimization principle.
• Similar to RUTSVM-CIL, the proposed KWRUTSVM-CIL model incorporates prior information about the data (universum data) to handle the class imbalance problem.
• The matrices appearing in the Wolfe dual of the proposed KWRUTSVM-CIL are positive definite, whereas the matrices in the Wolfe duals of RUTSVM-CIL, UTSVM, TSVM and FTWSVM are positive semi-definite.
• Experimental results and statistical analysis show the efficacy of the proposed KWRUTSVM-CIL model. As an application, we use it for the classification of Alzheimer's disease and breast cancer subjects.
[ABSTRACT FROM AUTHOR]
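One plausible reading of the K-nearest-neighbour weighting idea is to weight each training sample by how much of its neighbourhood shares its class, so contaminated points influence the hyperplanes less; the exact weighting scheme in KWRUTSVM-CIL may differ. A self-contained sketch with invented names and data:

```python
import numpy as np

def knn_intra_class_weights(X, y, k=3):
    """Weight each sample by the fraction of its k nearest neighbours
    that share its class label.

    Points deep inside a homogeneous cluster get weight close to 1;
    points surrounded by the other class (likely noise or
    contamination) get weight close to 0.
    """
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    np.fill_diagonal(d2, np.inf)          # exclude self-distance
    nn = np.argsort(d2, axis=1)[:, :k]    # indices of k nearest neighbours
    same = y[nn] == y[:, None]            # does each neighbour share the class?
    return same.mean(axis=1)              # fraction in [0, 1]

# two clean, well-separated clusters (illustrative data)
X = np.array([[0.0, 0.0], [0.1, 0.0], [0.0, 0.1],
              [5.0, 5.0], [5.1, 5.0], [5.0, 5.1]])
y = np.array([0, 0, 0, 1, 1, 1])
w = knn_intra_class_weights(X, y, k=2)
```

Such weights would then enter the objective function as the diagonal of a weight matrix, down-weighting suspect samples when the hyperplanes are fitted.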
- Published
- 2022
- Full Text
- View/download PDF
9. A reduced universum twin support vector machine for class imbalance learning.
- Author
-
Richhariya, B. and Tanveer, M.
- Subjects
-
SUPPORT vector machines, DATA distribution
- Abstract
• A universum based algorithm is proposed for class imbalance learning.
• Universum learning is used for the first time to solve the class imbalance problem.
• A reduced kernel is incorporated to cut storage and computation cost.
• The proposed approach is useful for large-scale class-imbalanced datasets.
In most real-world datasets, there is an imbalance in the number of samples belonging to different classes. Various pattern classification problems, such as fault or disease detection, involve class-imbalanced data. The support vector machine (SVM) classifier becomes biased towards the majority class due to class imbalance. Moreover, the existing SVM based techniques for class imbalance carry no information about the distribution of the data. Motivated by the idea of prior information about the data distribution, a reduced universum twin support vector machine for class imbalance learning (RUTSVM-CIL) is proposed in this paper. For the first time, universum learning is incorporated with SVM to solve the problem of class imbalance. Oversampling and undersampling of the data are performed to remove the imbalance between the classes. The universum data points give prior information about the data. To reduce the computation time of our universum based algorithm, we use a small rectangular kernel matrix. The reduced kernel matrix needs less storage space and is thus applicable to large-scale imbalanced datasets. Comprehensive experimentation is performed on various synthetic, real-world and large-scale imbalanced datasets. In comparison to the existing approaches for class imbalance, the proposed RUTSVM-CIL gives better generalization performance on most of the benchmark datasets. Also, the computation cost of RUTSVM-CIL is very low, making it suitable for real-world applications. [ABSTRACT FROM AUTHOR]
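The small rectangular kernel matrix mentioned above can be sketched directly: instead of the full m × m Gram matrix, compute kernel values only against a randomly chosen column subset of the training set, giving an m × r matrix with r ≪ m. Function names and the sampling rate below are illustrative, not from the paper.

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    """Gaussian (RBF) kernel matrix between row sets A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def reduced_kernel(X, rate=0.1, gamma=0.5, seed=0):
    """Rectangular m x r kernel matrix against a random column subset.

    Storing K(X, X_r) with r = rate * m instead of the full m x m
    matrix is what makes reduced-kernel variants cheap in both memory
    and time on large imbalanced datasets.
    """
    rng = np.random.default_rng(seed)
    r = max(1, int(rate * len(X)))
    idx = rng.choice(len(X), size=r, replace=False)
    return rbf_kernel(X, X[idx]), idx

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
K, idx = reduced_kernel(X, rate=0.1)
```

With m = 100 and a 10% rate, storage drops from 100 × 100 to 100 × 10 entries, and every downstream matrix product shrinks accordingly.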
- Published
- 2020
- Full Text
- View/download PDF
10. Inverse free reduced universum twin support vector machine for imbalanced data classification.
- Author
-
Moosaei, Hossein, Ganaie, M.A., Hladík, Milan, and Tanveer, M.
- Subjects
-
SUPPORT vector machines, MACHINE learning, CLASSIFICATION algorithms, MATRIX inversion, LAGRANGIAN functions
- Abstract
Imbalanced datasets are prominent in real-world problems. In such problems, the number of data samples in one class is significantly higher than in the other classes, even though the other classes might be more important. Standard classification algorithms may classify all the data into the majority class, which is a significant drawback of most standard learning algorithms, so imbalanced datasets need to be handled carefully. One of the traditional algorithms, the twin support vector machine (TSVM), performs well on balanced data but poorly on imbalanced datasets. To improve the TSVM algorithm's classification ability on imbalanced datasets, a reduced universum twin support vector machine for class imbalance learning (RUTSVM), driven by the universum twin support vector machine (UTSVM), was recently proposed. Its dual problems and classifiers involve matrix inverse computation, which is one of RUTSVM's key drawbacks. In this paper, we improve RUTSVM and propose an improved reduced universum twin support vector machine for class imbalance learning (IRUTSVM). In the suggested IRUTSVM approach, we offer alternative Lagrangian functions to tackle the primal problems of RUTSVM by inserting one of the terms in the objective function into the constraints. As a result, we obtain a new dual formulation for each optimization problem, so that we need not compute matrix inverses either in the training process or in finding the classifiers. Moreover, smaller rectangular kernel matrices are used to reduce the computational time. Extensive testing is carried out on a variety of synthetic and real-world imbalanced datasets, and the findings show that the IRUTSVM algorithm outperforms the TSVM, UTSVM, and RUTSVM algorithms in terms of generalization performance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
Discovery Service for Jio Institute Digital Library