1,634 results
Search Results
2. Quality design based on kernel trick and Bayesian semiparametric model for multi-response processes with complex correlations.
- Author
Yang, Shijuan, Wang, Jianjun, Cheng, Xiaoying, Wu, Jiawei, and Liu, Jinpei
- Subjects
PRINCIPAL components analysis, EVOLUTIONARY algorithms, RANDOM forest algorithms, LEAST squares
- Abstract
Processes or products are typically complex systems with numerous interrelated procedures and interdependent components. This results in complex relationships between responses and input factors, as well as complex nonlinear correlations among multiple responses. If these two types of complex correlations are not properly handled in quality design, they will affect the prediction accuracy of the response surface model, as well as the accuracy and reliability of the recommended optimal solutions. In this paper, we combine kernel trick-based kernel principal component analysis, a spline-based Bayesian semiparametric additive model, and a normal boundary intersection-based evolutionary algorithm to address these two types of complex correlations. The effectiveness of the proposed method in modeling and optimisation is validated through a simulation study and a case study. The results show that the proposed Bayesian semiparametric additive model describes the process relationships better than least squares regression, random forest regression, and support vector regression, and the proposed multi-objective optimisation method performs well on several indicators mentioned in the paper. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
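The kernel-PCA stage named in the abstract above can be sketched directly in NumPy. This is a minimal illustration, not the paper's implementation: the RBF kernel, its bandwidth `gamma`, and the synthetic data are all assumptions for demonstration.

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=1.0):
    """Kernel PCA with an RBF kernel: build the Gram matrix, centre it
    in feature space, and eigendecompose."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T      # pairwise squared distances
    K = np.exp(-gamma * d2)                           # Gram matrix
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one        # double centring
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]       # leading eigenpairs
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.maximum(vals, 0))        # projected coordinates

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
Z = rbf_kernel_pca(X, n_components=2, gamma=0.5)
```

The nonlinear correlations among responses that the paper targets would enter through the kernel choice; everything downstream (the Bayesian semiparametric model, the evolutionary optimiser) is beyond this sketch.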
3. Digital transformation and sustainable development in higher education in a post-pandemic world.
- Author
Leal Filho, Walter, Lange Salvia, Amanda, Beynaghi, Ali, Fritzen, Barbara, Ulisses, Azeiteiro, Avila, Lucas Veiga, Shulla, Kalterina, Vasconcelos, Claudio R. P., Moggi, Sara, Mifsud, Mark, Anholon, Rosley, Rampasso, Izabela Simon, Kozlova, Valerija, Iliško, Dzintra, Skouloudis, Antonis, and Nikolaou, Ioannis
- Subjects
DIGITAL transformation, DIGITAL technology, SUSTAINABLE development, HIGHER education, UNIVERSITIES & colleges, PRINCIPAL components analysis
- Abstract
Digital technologies are now part of our daily lives, and the speed of their implementation and use has been accelerated as a result of the COVID-19 pandemic. Digital transformation, seen in the past as a problem, is now perceived as an important component in the future of sustainable development (SD), especially at higher education institutions whose operations have been adversely affected by the pandemic in many ways. The purpose of this paper is to analyse the subject matter of digital transformation and how it relates to a SD context. It reports on the results of a worldwide survey at higher education institutions, which identified some areas where the pandemic impacted and/or influenced their activities. The survey received 158 responses and a principal component analysis was performed to model the items associated with digital tools boosting SD, innovative business opportunities and ideas, and needs for improvement at HEIs. The results indicate that most respondents developed digital skills and increased their involvement with e-learning and distance learning; however, more digital training is needed. Findings also support the role played by digital technologies in boosting SD at HEIs, and the role of institutions in promoting innovation through digital tools. Apart from an analysis of the extent to which the COVID-19 pandemic has contributed to digital transformation in an SD context in higher education institutions, the paper provides an assessment of trends and recommendations that may guide future developments in a post-pandemic world. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Evaluation of Chemical Composition of Eucalyptus Wood Extracts after Different Storage Times Using Principal Component Analysis.
- Author
Silverio, Flaviano O., Barbosa, Luiz C. A., Fidencio, Paulo H., Cruz, Mariluze P., Maltha, Celia R. A., and Pilo-Veloso, Dorila
- Subjects
CHEMICAL composition of plants, EUCALYPTUS, PRINCIPAL components analysis, PAPER industry, PULPING, CHEMICAL reduction, LUMBER drying, CHEMOMETRICS, WOOD
- Abstract
Deposits of wood extractives, commonly referred to as pitch, cause significant problems for the pulp and paper industries. The reduction of these extractives is an important strategy used to minimize these problems. The present work studied the effects of wood storage time on the amount and variation of the chemical composition of extracts, before and after alkaline hydrolysis. Fatty acids, sterols, long-chain aliphatic alcohols, and aromatic compounds were the main chemical classes of substances found in all analyzed extractives, before and after hydrolysis. Hierarchical cluster analysis (HCA) and principal component analysis (PCA) were used to confirm the similarity of wood samples using groups directly correlated with the chemical composition. PC1 explains approximately 99% of the total variance, and β-sitosterol was the major compound responsible for the groupings. The exploratory analysis of the extract samples after 20, 40, 60, 100, 140, and 180 days of storage, before and after alkaline hydrolysis, showed a strong influence of β-sitosterol (derived from glucosteroid degradation by water present in the wood, and the compound present in the largest amount in the extracts before and after hydrolysis), octadec-9-enoic acid (which can be oxidized at the double bond, becoming more soluble in water), and 3,4,5-trihydroxybenzoic acid (derived from lignin degradation by water present in the wood or from microorganism attack); these compounds were responsible for differentiating and clustering the storage times. These studies demonstrate that the best period of storage of the wood in the field is 60 days, because the reduction of the main compounds present in the extracts (mainly β-sitosterol and octadec-9-enoic acid) was most significant over that period. [ABSTRACT FROM AUTHOR]
- Published
- 2011
- Full Text
- View/download PDF
5. Rapid Determination of Betulin in Betula platyphylla Outer Bark Using Near-Infrared Spectroscopy.
- Author
Kim, Nanyoung, Yu, Miao, Lee, DongYoung, Hahn, YoungHee, Kim, YoungChoong, Sung, SangHyun, and Kim, SeungHyun
- Subjects
BETULIN, PAPER birch, PLANT extracts, NEAR infrared spectroscopy, PRINCIPAL components analysis, LIQUID chromatography, HARVESTING time
- Abstract
A simple, rapid, and nondestructive method for the determination of betulin in the outer birch bark was developed using near infrared spectroscopy (NIRS). NIRS data of the outer birch bark collected throughout the year was preprocessed and analyzed by principal component analysis, which led to clear discrimination of the samples according to their harvest times. The reference content of betulin, a major constituent of the outer birch bark, was determined using ultra performance liquid chromatography with a diode array detector (UPLC-DAD). The optimized and validated analytical conditions of UPLC-DAD provided better separation and faster analysis time compared to a conventional HPLC method. Betulin content also showed seasonal variation and was higher in the samples collected during the summer season. Partial least squares calibration techniques were employed to estimate the relationship between the NIRS data and betulin contents. The spectral data showed high correlation coefficients (over 0.700) with betulin content. These results indicate that NIRS combined with UPLC can be used to determine the quality and to quantify the betulin content of the outer birch bark. [ABSTRACT FROM AUTHOR]
- Published
- 2013
- Full Text
- View/download PDF
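The partial least squares calibration step described above, relating spectral data to a reference content, can be sketched as a minimal PLS1 (NIPALS) fit. The synthetic "spectra", the wavelengths carrying signal, and the two-component choice are illustrative assumptions, not the paper's data or preprocessing.

```python
import numpy as np

def pls1(X, y, n_components=2):
    """Minimal PLS1 (NIPALS): returns coefficients b such that
    y_hat = (X - X_mean) @ b + y_mean."""
    X = X - X.mean(axis=0)
    y = y - y.mean()
    W, P, q = [], [], []
    for _ in range(n_components):
        w = X.T @ y
        w /= np.linalg.norm(w)          # weight vector
        t = X @ w                       # scores
        p = X.T @ t / (t @ t)           # X loadings
        c = (y @ t) / (t @ t)           # inner regression coefficient
        X = X - np.outer(t, p)          # deflate X
        y = y - c * t                   # deflate y
        W.append(w); P.append(p); q.append(c)
    W, P, q = np.array(W).T, np.array(P).T, np.array(q)
    return W @ np.linalg.inv(P.T @ W) @ q

rng = np.random.default_rng(1)
spectra = rng.normal(size=(40, 100))     # 40 samples x 100 "wavelengths"
content = 2 * spectra[:, 10] + spectra[:, 50] + rng.normal(scale=0.1, size=40)
b = pls1(spectra, content)
pred = (spectra - spectra.mean(axis=0)) @ b + content.mean()
r = np.corrcoef(pred, content)[0, 1]     # calibration correlation
```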
6. FrWT-PPCA-Based R-peak Detection for Improved Management of Healthcare System.
- Author
Gupta, Varun, Mittal, Monika, and Mittal, Vikas
- Subjects
PRINCIPAL components analysis, WAVELET transforms, HEART abnormalities, MEDICAL care, DATABASES
- Abstract
Fourier analysis is well known to provide complete information about the frequencies present in a signal, but in the process time information is lost. A time-frequency representation is therefore required to depict both time and frequency information simultaneously. Hence, in this paper, the fractional wavelet transform (FrWT) is proposed, for the first time, for extracting the features of various datasets in a standard ECG database, combining the advantages of both fractional-domain techniques and wavelets (case-II). Afterwards, Probabilistic Principal Component Analysis (PPCA) is used for detecting R-peaks for diagnosing heart abnormalities in various morphologies of the ECG signal. The proposed technique has been evaluated on the basis of sensitivity (SEN), detection error rate (DER), and positive predictivity (PPR) of the detected ECG beats for the MIT-BIH Arrhythmia database (M/B Ar DB). Although both the FrFT and FrWT techniques exhibit a high degree of robustness, the SEN of 99.99%, DER of 0.026%, and PPR of 99.99% obtained by the latter in case-II are better than the SEN of 99.97%, DER of 0.053%, and PPR of 99.98% obtained by the former in case-I for the M/B Ar DB. In this paper, the average time error (ATE) is also obtained for the considered datasets, further establishing the effectiveness of the proposed technique. These encouraging results suggest that the proposed methodology will go a long way in assisting cardiologists to detect temporal patterns in a wide variety of electrophysiological cases, which is important for improved management of the healthcare system. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
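Probabilistic PCA, the second stage of the pipeline above, has a closed-form maximum-likelihood solution (Tipping and Bishop). The sketch below illustrates that solution on synthetic data; it does not reproduce the paper's ECG processing or R-peak detection logic.

```python
import numpy as np

def ppca(X, q):
    """Closed-form maximum-likelihood PPCA (Tipping & Bishop):
    returns the loading matrix W and the isotropic noise variance."""
    Xc = X - X.mean(axis=0)
    S = Xc.T @ Xc / len(X)                    # sample covariance (d x d)
    vals, vecs = np.linalg.eigh(S)
    vals, vecs = vals[::-1], vecs[:, ::-1]    # descending eigenvalues
    sigma2 = vals[q:].mean()                  # ML noise variance: mean of discarded eigenvalues
    W = vecs[:, :q] * np.sqrt(np.maximum(vals[:q] - sigma2, 0))
    return W, sigma2

rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 2))                          # 2 latent factors
X = latent @ rng.normal(size=(2, 6)) + 0.1 * rng.normal(size=(200, 6))
W, sigma2 = ppca(X, q=2)
```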
7. Returns co-movement and interconnectedness: Evidence from Indonesia banking system.
- Author
Zuhrohtun, Zuhrohtun, Salim, M. Zulkifli, Sunaryo, Kunti, and Astuti, Sri
- Subjects
PRINCIPAL components analysis, GOVERNMENT ownership of banks, SYSTEMIC risk (Finance), FINANCIAL risk, BANKING industry
- Abstract
In this paper, we explore how asset returns can be used as a proxy to detect the interconnectedness of systemic risk in the financial system. Our sample employs a mixture of Indonesian banks' public and prudential data over the 2012–2019 period. Using Principal Component Analysis and Granger causality, we show that the core banks in the network can explain the variance and risk co-movement and reveal shock propagation. Further, the results are also in line with the Basel indicator-based approach to scoring interconnectedness. The dominance of big banks in the centrality measures raises the issue of substitutability. This paper extends existing theories, and their application provides a basis for policy makers to develop supervision frameworks to mitigate systemic risk. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
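The Granger-causality step mentioned above reduces to comparing a restricted and an unrestricted least-squares fit via an F-statistic. A minimal sketch on two synthetic "return" series follows; the lag length and the data-generating process are illustrative assumptions, not the paper's bank data.

```python
import numpy as np

def granger_f(x, y, lags=2):
    """F-statistic for 'x Granger-causes y': OLS of y on its own lags
    (restricted) versus OLS that also includes lags of x (unrestricted)."""
    n = len(y)
    Y = y[lags:]
    Zr = np.column_stack([np.ones(n - lags)]
                         + [y[lags - k:n - k] for k in range(1, lags + 1)])
    Zu = np.column_stack([Zr]
                         + [x[lags - k:n - k] for k in range(1, lags + 1)])
    rss = lambda Z: np.sum((Y - Z @ np.linalg.lstsq(Z, Y, rcond=None)[0]) ** 2)
    rss_r, rss_u = rss(Zr), rss(Zu)
    df1, df2 = lags, (n - lags) - Zu.shape[1]
    return ((rss_r - rss_u) / df1) / (rss_u / df2)

rng = np.random.default_rng(3)
x = rng.normal(size=300)            # "returns" of bank A
y = np.zeros(300)                   # "returns" of bank B, partly driven by A
for t in range(1, 300):
    y[t] = 0.5 * x[t - 1] + 0.3 * y[t - 1] + 0.1 * rng.normal()
f_stat = granger_f(x, y)
```

A large F-statistic relative to the F(df1, df2) critical value is evidence that x helps predict y beyond y's own history.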
8. A proposal to measure the circular economy implementation and sustainable development goals achievement using objectively weighted indices.
- Author
Alfaro Navarro, José-Luis and Andrés Martínez, María-Encarnación
- Subjects
CIRCULAR economy, SUSTAINABLE development, PRINCIPAL components analysis, PERCENTILES, ECONOMIC expansion
- Abstract
Governments, companies and citizens around the world consider it necessary to adopt a new circular economy (CE) model that allows solving the planet's environmental challenges and guaranteeing sustainable economic growth. Europe advocates this philosophy, but there is no widely accepted index to measure CE implementation at a macro level. This paper proposes a new index based on principal component analysis for European Union countries that uses all available information, without losing any to the dimensionality reduction, and considers objective weights based on the percentage of variance that each component retains. Moreover, we develop a disaggregated analysis considering the CE dimensions set out in the 'CE monitoring framework', allowing a more comprehensive analysis than when using a single indicator of CE implementation. This method is also used to build an index of the degree of achievement of the sustainable development goals (SDGs) to see how they relate to the CE, the relationships between CE dimensions, and those between SDGs. The results by geographical areas reveal a higher level of CE implementation in western European and EU-15 countries, with Luxembourg, Austria, Denmark, the Netherlands and Belgium alternately holding the top positions depending on the CE dimension considered. Therefore, the new European countries and the countries in the east must encourage measures to improve the implementation of the circular economy. In addition, there is a positive, strong and significant relationship with SDGs 8, 9 and 11, for both the overall CE implementation index and the disaggregated indices, and a negative one with SDGs 7 and 15. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
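A PCA-based composite index with objective, variance-share weights, of the general kind the abstract describes, can be sketched as follows. The synthetic indicator matrix and the choice to weight every component by its share of explained variance are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def pca_index(X):
    """Composite index: PCA component scores weighted by each
    component's share of explained variance, via the SVD."""
    Z = (X - X.mean(axis=0)) / X.std(axis=0)     # standardise indicators
    U, s, Vt = np.linalg.svd(Z, full_matrices=False)
    var_share = s**2 / np.sum(s**2)              # objective weights
    scores = Z @ Vt.T                            # component scores
    return scores @ var_share                    # weighted aggregation

rng = np.random.default_rng(4)
indicators = rng.normal(size=(27, 9))            # 27 countries x 9 CE indicators
index = pca_index(indicators)
ranking = np.argsort(index)[::-1]                # highest index value first
```

Because all components are retained and weighted by retained variance, no information is discarded by the dimensionality reduction, which is the property the abstract emphasises.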
9. A varying coefficient model with matrix valued covariates.
- Author
Zhang, Hong-Fan
- Subjects
BILIARY liver cirrhosis, PRINCIPAL components analysis, ASYMPTOTIC distribution, TIME measurements, MATRICES (Mathematics)
- Abstract
Modern data are often collected in a matrix form. In this paper, we consider modelling the varying coefficient regression with a matrix valued covariate X and a scalar index variable U. The proposed model simultaneously performs principal component analysis for both the row and column dimensions of the matrix objects, maintaining the matrix structure while achieving substantial dimension reduction. We develop an iterative estimation method for the involved principal parameters and nonparametric functions. Under regularity conditions, the asymptotic distributions of the estimators are derived. In addition, by incorporating the estimation with the adaptive group Lasso and the group SCAD penalties, variables of X in entire rows or columns are selected. The proximal gradient algorithm is further utilised to solve the regularised optimisation problems. The asymptotic properties of the penalised estimators are also studied. Our model and estimation methods are demonstrated by simulated experiments. Real applications to the primary biliary cirrhosis (PBC) data reveal that the effects of the blood measurements on survival time vary with age. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Toward the identification of laggard rural areas: an evolutionary resilience approach.
- Author
Hierro, María and Maza, Adolfo
- Subjects
RURAL geography, PRINCIPAL components analysis
- Abstract
Based on the evolutionary resilience approach and using Principal Component Analysis, this paper proposes a composite weighted index to measure levels of rural resilience and, in addition, to unravel why some rural areas are less resilient than others. Taking the region of Cantabria (northern Spain) at the municipal level as a sort of ‘laboratory’, the overall results unveil poor levels of resilience and suggest that the main factors limiting resilience are related to cultural interest, rural potential, natural endowment, and connectivity. Our findings also reveal considerable spatial heterogeneity across the different domains of resilience. As a specific contribution, the study brings together rural resilience levels and population growth (as population decline is the first direct consequence of a lack of resilience) in order to define different categories of rural territories, among which laggard rural areas are singled out as areas to be targeted for political intervention. When focusing on such areas, a crucial finding of the paper is that some municipalities that are lagging behind have not been included by the Regional Government as being at severe risk of depopulation, meaning that they cannot benefit from close public monitoring and fiscal support. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. Pavement crack detection and classification based on fusion feature of LBP and PCA with SVM.
- Author
Chen, Cheng, Seo, Hyungjoon, Jun, Chang Hyun, and Zhao, Y.
- Subjects
CRACKING of pavements, SUPPORT vector machines, PRINCIPAL components analysis
- Abstract
A new crack detection approach based on local binary patterns (LBP) with a support vector machine (SVM) was proposed in this paper. The proposed algorithm extracts the LBP feature from each frame of video taken from the road. The dimension of the LBP feature space is then reduced by Principal Component Analysis (PCA), and the simplified samples are used to train a Support Vector Machine (SVM) that decides the type of crack. In order to reflect the directional information in detail, the LBP-processed image is divided into nine sub-blocks. In this paper, driving tests were performed 10 times and 12,000 image data were applied to the proposed algorithm. The average accuracy of the proposed algorithm with sub-blocks is 91.91%, which is about 6.6% higher than the algorithm without sub-blocks. The LBP-PCA with SVM applying sub-blocks reflects the directional information of the crack, achieving high accuracies of 89.41% and 88.24% for transverse and longitudinal cracks, respectively. In the performance analysis of the different crack classifiers, the F-Measure, which balances precision and recall, was highest for the alligator crack classifier at 0.7601, and hence its crack detection performance is higher than the others. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
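The LBP-feature-plus-PCA front end of the pipeline above can be sketched in plain NumPy. The SVM classification stage is omitted here, and the frame count, image size, and sub-block scheme are illustrative simplifications rather than the paper's setup.

```python
import numpy as np

def lbp_histogram(img):
    """8-neighbour local binary pattern codes for the interior pixels,
    returned as a normalised 256-bin histogram."""
    c = img[1:-1, 1:-1]
    neighbours = [img[:-2, :-2], img[:-2, 1:-1], img[:-2, 2:],
                  img[1:-1, 2:], img[2:, 2:], img[2:, 1:-1],
                  img[2:, :-2], img[1:-1, :-2]]
    codes = np.zeros_like(c, dtype=np.int32)
    for bit, nb in enumerate(neighbours):
        codes |= (nb >= c).astype(np.int32) << bit   # one bit per neighbour
    hist = np.bincount(codes.ravel(), minlength=256).astype(float)
    return hist / hist.sum()

def pca_reduce(F, k):
    """Project feature vectors onto the top-k principal directions."""
    Fc = F - F.mean(axis=0)
    _, _, Vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ Vt[:k].T

rng = np.random.default_rng(5)
frames = rng.integers(0, 256, size=(20, 32, 32))          # 20 grayscale frames
features = np.array([lbp_histogram(f) for f in frames])   # 20 x 256 LBP features
reduced = pca_reduce(features, k=10)                      # 20 x 10 after PCA
```

The `reduced` matrix is what would be fed to an SVM classifier in the paper's pipeline.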
12. A pseudo principal component analysis method for multi-dimensional open-high-low-close data in candlestick chart.
- Author
Huang, Wenyang, Wang, Huiwen, and Wang, Shanshan
- Subjects
PRINCIPAL components analysis, CANDLESTICKS, MULTIPLE correspondence analysis (Statistics), ECONOMIC impact, STATISTICAL models
- Abstract
As the most widely used data form in the field of finance, open-high-low-close (OHLC) data are continuously collected by all kinds of financial trading systems. This paper puts forward a pseudo-principal component analysis (PCA) for multi-dimensional OHLC data, which can extract their useful information in a comprehensible way for visualization and easy interpretation. Firstly, a novel feature-based representation for OHLC data is proposed, which carries rich and explicit economic implications. Next, we define a full set of numerical characteristics and variance-covariance structures for the feature-based OHLC data. Then, the pseudo-PCA procedure for OHLC data is deduced based on the proposed algebraic operators. Finally, the effectiveness and interpretability of the proposed pseudo-PCA method are verified through finite simulations and three typical empirical experiments. This paper enriches the application scenarios of classical PCA and contributes to the multivariate statistical modeling of symbolic data. The proposed applications can serve as models for related studies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
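The paper's specific feature-based representation of OHLC data is not given in the abstract. The sketch below uses one plausible unconstrained feature map (price level, log range, and the relative positions of open and close within the low-high range) followed by ordinary PCA, purely to illustrate the general idea of transforming constrained OHLC rows before component analysis.

```python
import numpy as np

def ohlc_features(ohlc):
    """Map OHLC rows to unconstrained features (an illustrative choice,
    not the paper's feature map)."""
    O, H, L, C = ohlc.T
    return np.column_stack([
        np.log(L),               # price level
        np.log(H / L),           # relative range
        (O - L) / (H - L),       # open position within [low, high]
        (C - L) / (H - L),       # close position within [low, high]
    ])

def pca_scores(F, k=2):
    """Ordinary PCA scores of the feature matrix via the SVD."""
    Fc = F - F.mean(axis=0)
    _, _, Vt = np.linalg.svd(Fc, full_matrices=False)
    return Fc @ Vt[:k].T

rng = np.random.default_rng(6)
base = 100 + np.cumsum(rng.normal(size=60))              # a synthetic price path
O = base + rng.normal(scale=0.2, size=60)
C = base + rng.normal(scale=0.2, size=60)
H = np.maximum(O, C) + rng.uniform(0.1, 1.0, size=60)    # high >= max(open, close)
L = np.minimum(O, C) - rng.uniform(0.1, 1.0, size=60)    # low <= min(open, close)
ohlc = np.column_stack([O, H, L, C])
scores = pca_scores(ohlc_features(ohlc), k=2)
```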
13. The abnormal traffic detection scheme based on PCA and SSH.
- Author
Wang, Zhenhui, Han, Dezhi, Li, Ming, Liu, Han, and Cui, Mingming
- Subjects
TRAFFIC monitoring, PRINCIPAL components analysis, COMPUTER network security, FEATURE extraction
- Abstract
Network abnormal traffic detection can monitor the network environment in real time by extracting and analysing network traffic characteristics, and plays an important role in network security protection. Existing detection methods cannot fully learn the spatio-temporal characteristics of the data, their classification accuracy is not high, and their detection time and accuracy are susceptible to the influence of redundant data in the sample. To address these problems, this paper proposes a network abnormal detection method (PCSS) integrating principal component analysis (PCA) and the single-stage headless face detector algorithm (SSH). PCSS applies the PCA algorithm in data preprocessing to eliminate the interference of redundant data. At the same time, PCSS also combines feature fusion and SSH to enhance the feature extraction of data with unclear features, effectively improving detection speed and accuracy. Simulation experiments based on the IDS2017 and IDS2012 data sets are carried out in this paper. Experimental results show that PCSS is clearly superior to other detection models in detection speed and accuracy, which provides a new method for efficiently detecting traffic attacks. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
14. Import technology sophistication and high-quality economic development: evidence from city-level data of China.
- Author
Chen, Ming and Wang, Hongbo
- Subjects
ECONOMIC development, IMPORTS, REGIONAL development, SUSTAINABLE development, PRINCIPAL components analysis, ECONOMIC research
- Abstract
This paper adopts five dimensions and 15 indexes of green development, people's life, innovation ability, economic vitality and coordinated development to establish an evaluation system of high-quality economic development. It uses principal component analysis to measure the high-quality economic development of 233 prefecture-level cities from 2003 to 2016, and empirically studies the impact of import sophistication on China's high-quality economic development. The results show that the increase in the sophistication of imported technology can significantly promote the high-quality development of the regional economy, and this effect applies to both imported intermediate and final products. In regions with higher and lower levels of economic development, eastern areas, and regions with high-quality development above the 90% quantile, the increase in imported technology content can significantly drive the high-quality development of the local economy. However, it has a strong negative impact on areas with a high-quality development index below the 10% quantile. The robustness and endogeneity checks support the above viewpoint. Further mechanism analysis shows that final product import competition and intermediate product import spillover play a mediating role in the process of import sophistication affecting high-quality economic development. The conclusion of this paper has important theoretical value and practical significance for the use of import trade to achieve high-quality economic development. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
15. Prioritizing action plans to save resources and better achieve municipal solid waste management KPIs: An urban case study.
- Author
Moreno Solaz, Héctor, Artacho-Ramírez, Miguel-Ángel, Cloquell-Ballester, Víctor-Andrés, and Badenes Catalán, Cristóbal
- Subjects
SOLID waste management, KEY performance indicators (Management), EMPLOYEE savings plans, URBAN studies, PRINCIPAL components analysis, WASTE management
- Abstract
The management of municipal solid waste (MSW) in cities is one of the most complex tasks facing local administrations. For this reason, waste management performance measurement structures are increasingly implemented at local and national levels. These performance structures usually contain strategic objectives and associated action plans, as well as key performance indicators (KPIs) for organizations investing their resources in action plans. This study presents the results of applying a methodology to find a quantitative-based prioritization of MSW action plans for the City Council of Castelló de la Plana in Spain. In doing so, cause-effect relationships between the KPIs have been identified by applying the principal component analysis technique, and from these relationships it was possible to identify those action plans which should be addressed first to manage public services more efficiently. This study can be useful as a tool for local administrations when addressing the actions included in their local waste plans as it can lead to financial savings. Implications: This paper introduces and implements a methodology that uses principal component analysis to analyze real data from waste management KPIs and provide municipal solid waste managers with a decision-making tool for prioritizing action plans. The methodology saves financial resources and time, as well as increasing the probability of reaching the target values of the main performance system KPIs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
16. The influence of SIFs managers' characteristics on fund performance: an empirical study in China.
- Author
Wang, Liang, Liang, Meiqi, Cao, Wenyan, and Jing, Handi
- Subjects
SHARPE ratio, EMPIRICAL research, PRINCIPAL components analysis, PERFORMANCE theory, SECURITIES
- Abstract
This paper adopts principal component analysis to assign weights to three fund performance measures obtained independently from the Sharpe index, Jensen index, and Treynor index. A comprehensive model for the performance of Chinese securities investment funds (SIFs) is thereby derived by combining these three indexes. The empirical study shows that the regression between SIF managers' characteristics and fund performance is better when using this comprehensive measurement model. Moreover, SIF managers' experience and tenure have negative and positive effects on fund performance, respectively. However, their gender, education level, professional qualification holdings, research experience, and overseas experience do not significantly influence fund performance. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
17. Robust estimation for function-on-scalar regression models.
- Author
Miao, Zi and Wang, Lihong
- Subjects
REGRESSION analysis, REGULARIZATION parameter, ESTIMATION theory, PARAMETER estimation, PRINCIPAL components analysis, METEOROLOGICAL stations
- Abstract
For the functional linear models in which the dependent variable is functional and the predictors are scalar, robust regularization for simultaneous variable selection and regression parameter estimation is an important yet challenging issue. In this paper, we propose two types of regularized robust estimation methods. The first estimator adopts the ideas of reproducing kernel Hilbert space, least absolute deviation and group Lasso techniques. Based on the first method, the second estimator applies the pre-whitening technique and estimates the error covariance function by using functional principal component analysis. Simulation studies are conducted to examine the performance of the proposed methods in small sample sizes. The method is also applied to the Canadian weather data set, which consists of the daily average temperature and precipitation observed by 35 meteorological stations across Canada from 1960 to 1994. Numerical simulations and real data analysis show a good performance of the proposed robust methods for function-on-scalar models. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
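The functional-PCA step used above to estimate an error covariance function can be sketched, for curves observed on a common grid, as an eigendecomposition of the discretised sample covariance. The synthetic curves below stand in for the 35 weather-station series; the paper's robust regression and pre-whitening machinery is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(7)
grid = np.linspace(0, 1, 50)
# 35 synthetic "station" curves: two smooth modes plus small noise
curves = (np.outer(rng.normal(size=35), np.sin(2 * np.pi * grid))
          + np.outer(0.5 * rng.normal(size=35), np.cos(2 * np.pi * grid))
          + 0.05 * rng.normal(size=(35, 50)))

mean_curve = curves.mean(axis=0)
centred = curves - mean_curve
cov = centred.T @ centred / len(curves)      # covariance function on the grid
vals, vecs = np.linalg.eigh(cov)
vals, vecs = vals[::-1], vecs[:, ::-1]       # eigenvalues, descending
explained = vals[:2].sum() / vals.sum()      # variance share of 2 eigenfunctions
```

The columns of `vecs` approximate the eigenfunctions; truncating to the leading ones gives the low-rank covariance estimate used for pre-whitening.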
18. Joint L2,p-norm and random walk graph constrained PCA for single-cell RNA-seq data.
- Author
Wang, Tai-Ge, Shang, Jun-Liang, Liu, Jin-Xing, Li, Feng, Yuan, Shasha, and Wang, Juan
- Subjects
RANDOM walks, RANDOM graphs, RNA sequencing, PRINCIPAL components analysis, NUCLEOTIDE sequencing
- Abstract
The development and widespread utilization of high-throughput sequencing technologies in biology has fueled the rapid growth of single-cell RNA sequencing (scRNA-seq) data over the past decade. The development of scRNA-seq technology has significantly expanded researchers' understanding of cellular heterogeneity. Accurate cell type identification is the prerequisite for any research on heterogeneous cell populations. However, due to the high noise and high dimensionality of scRNA-seq data, improving the effectiveness of cell type identification remains a challenge. As an effective dimensionality reduction method, Principal Component Analysis (PCA) is an essential tool for visualizing high-dimensional scRNA-seq data and identifying cell subpopulations. However, traditional PCA has some defects when used in mining the nonlinear manifold structure of the data and usually suffers from over-density of principal components (PCs). Therefore, we present a novel method in this paper called joint L2,p-norm and random walk graph constrained PCA (RWPPCA). RWPPCA aims to retain the data's local information in the process of mapping high-dimensional data to low-dimensional space, to more accurately obtain sparse principal components and to then identify cell types more precisely. Specifically, RWPPCA combines the random walk (RW) algorithm with graph regularization to more accurately determine the local geometric relationships between data points. Moreover, to mitigate the adverse effects of dense PCs, the L2,p-norm is introduced to make the PCs sparser, thus increasing their interpretability. Then, we evaluate the effectiveness of RWPPCA on simulated data and scRNA-seq data. The results show that RWPPCA performs well in cell type identification and outperforms other comparison methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. A novel Index-based quantification approach for port performance measurement: a case from Indian major ports.
- Author
Nayak, Nikesh, Pant, Pushpesh, Sarmah, Sarada Prasad, Jenamani, Mamata, and Sinha, Deepankar
- Subjects
UNITIZED cargo systems, SHIPPING containers, PRINCIPAL components analysis, SOCIOECONOMIC factors, PANEL analysis
- Abstract
This paper develops a unified port performance index (PPI) considering different cargo categories and the multi-dimensional nature of port performance indicators/dimensions. This study has used the quintile method to construct the PPI. Further, the PPI obtained from the quintile method is compared with a weighted unified index (PPI-PCA) obtained using principal component analysis (PCA), which is extensively used for index development in the literature. A pilot index development is demonstrated using secondary panel data for 12 major Indian ports on five significant dimensions, namely, operations, physical infrastructure, technical infrastructure, finance, and socio-economic. Results show that the JNPT port outperforms all other ports under the container cargo category. Likewise, Kandla port in the liquid port category and Paradip port in the other (dry & break bulk) cargo category are on top. Also, qualitatively similar results and insights are obtained with the PPI-PCA. Subsequently, the panel data regression and efficiency analysis are performed to demonstrate the utility of the proposed index. The results affirm that the operations, physical infrastructure, and socio-economic dimensions have a positive and significant impact on port financial performance. The present study operationalises some key unexplored port performance indicators/dimensions that can enable effective decision-making. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
20. An initially robust minimum simplex volume-based method for linear hyperspectral unmixing.
- Author
Li, Yanyan and Tan, Tao
- Subjects
SIMPLEX algorithm, SISAL (Fiber), PRINCIPAL components analysis, NONNEGATIVE matrices, MOMENTS method (Statistics)
- Abstract
Initialization plays an important role in the accuracy of endmember extraction algorithms (EEAs) in linear hyperspectral unmixing (LHU). Random initialization can lead to varying endmembers generated by EEAs. To address this challenge, initialization strategies have been introduced, encompassing vertex component analysis (VCA) and the automatic target generation process (ATGP), among others. These techniques significantly contribute to enhancing the accuracy of EEAs. However, complex initialization is sometimes less preferable, prompting the unexplored question of whether there exists an EEA robust to initialization. This paper focuses on analyzing this issue within the context of minimum simplex volume-based (MV) methods, which have received considerable attention in the past two decades due to their robustness against the absence of pure pixels. MV methods typically formulate LHU as an optimization problem, most of which includes a non-convex volume term. Additionally, many MV methods use VCA as an initialization strategy. Firstly, this paper demonstrates that the variable splitting augmented Lagrangian approach (SISAL), as a representative non-convex MV method, heavily depends on initialization. To our knowledge, the impact of initialization on MV methods has not been thoroughly analyzed before. Furthermore, this paper proposes an initially robust MV method by introducing a new convex MV term. Numerical experiments conducted on simulated and real datasets demonstrate its outstanding performance in accuracy and robustness to initialization. Throughout the experiments the proposed method proves to be the most stable, which is crucial in real scenes where the ground truth is unknown beforehand. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
21. SuperPCA and 2-D compact variational mode decomposition for feature extraction from hyperspectral images.
- Author
-
Zhuo, Renxiong, Guo, Yunfei, Guo, Baofeng, Liu, Baoyang, and Dai, Fan
- Subjects
- *
PRINCIPAL components analysis , *DATA mining , *SUPERPOSITION (Optics) , *DIMENSION reduction (Statistics) - Abstract
The 2-D compact variational mode decomposition (2-D-C-VMD) has potential research significance in data mining of hyperspectral images (HSIs). This paper proposes a feature-concatenation method. First, the 2-D-C-VMD technique decomposes the HSIs into smooth, clear-boundary sub-band images, given its preferred a priori parameter K. Second, the optimal low-frequency component features of the image and the features constructed from the optimal high-frequency components are concatenated end to end, and the concatenated structure forms a new fused dataset. Finally, to prevent redundant band information in the fused data from degrading classification results, a recently developed, simple and practical superpixelwise principal component analysis (SuperPCA) method is applied to effectively reduce its dimensionality. Comparison against using only the optimal low-frequency image features, and against the direct linear superposition fusion employed previously, verifies that the feature-concatenation approach proposed in this paper obtains better recognition results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
22. Agricultural interactive knowledge models: researchers' perceptions about farmers' knowledges and information sources in Spain.
- Author
-
Cruz, J. L., Albisu, L. M., Zamorano, J. P., and Sayadi, S.
- Subjects
AGRICULTURAL technology ,INFORMATION resources ,FARMERS' attitudes ,PRINCIPAL components analysis ,AGRICULTURAL innovations ,SENSORY perception ,FARMERS - Abstract
Agricultural innovation implies sharing information between researchers and farmers. Acknowledging the value of the other partner's knowledge is a preliminary step towards facilitating agricultural interactive models. Traditionally, researchers' knowledges have been dominant and farmers' knowledges have been underestimated. A large number of innovation papers pay attention to farmers' attitudes, interests, perceptions, and the barriers or drivers that change their practices; however, much less information is available about researchers' point of view. The present paper analyses researchers' perceptions of farmers' knowledges and information sources, based on a survey of 156 agricultural researchers, mostly from public institutions in Spain. Descriptive statistics and principal component analysis (PCA) focused on researchers' perceptions of farmers' knowledges and information sources. This paper finds two distinct profiles of researchers according to their perceptions of the relevance of knowledges and information sources for farmers; both profiles, however, share the view of 'own experience' as a highly relevant source of farmers' knowledges. It advises how to promote knowledge sharing according to these differing perceptions. The resulting classification of agricultural researchers enriches the discussion about agricultural interactive knowledge models, with particular focus on multi-actor approaches and the integration of farmers' and researchers' knowledges. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
23. Analysis of adulterated milk based on a long short-term memory network.
- Author
-
Li, Xin and Liu, Jiangping
- Subjects
PRINCIPAL components analysis - Abstract
Taking adulterated milk as the research object, principal component analysis combined with a long short-term memory network was used, with the aim of finding a simple, efficient and rapid detection method for adulterated milk. In this paper, qualitative and quantitative analyses of adulterated milk were carried out based on near-infrared hyperspectral data (400–1000 nm). The experimental results verified the feasibility of using near-infrared hyperspectral technology to identify adulterated milk. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
24. Determinants of adoption of electronic payment by small and medium-sized enterprises (SMEs) in Cameroon.
- Author
-
Kadjie, Christelle Flore, Hikouatcha, Prince, Njamen Kengdo, Arsène Aurelien, and Nchofoung, Tii N.
- Subjects
SMALL business ,PAYMENT systems ,PRINCIPAL components analysis ,PAYMENT ,LOGISTIC regression analysis - Abstract
This paper aims to identify the determinants of the adoption of electronic payment by SMEs in Cameroon, using data collected from 117 SMEs. The methodology involves principal component analysis and ordinal logistic regression. The results are at least twofold. First, the leading electronic payment tools adopted by these companies are mobile money, card and Internet payments. Second, their choice can be explained on the one hand by characteristics of these payment tools, such as convenience and cost of use, and on the other hand by contingency factors such as the integration level and the manager's mastery of ICT. Accordingly, companies should be encouraged to use electronic payment tools, given the driving role these tools could play in providing basic infrastructure and guaranteeing financially secure transactions between economic agents. The original contribution of this paper is also at least twofold. First, methodologically, this study uses a combination of qualitative and quantitative methods together with ordinal logistic regression. Second, while previous studies were limited to mobile money adoption, this study further integrates online and card payments into the analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
25. Big data technology: developments in current research and emerging landscape.
- Author
-
Singh, Nitin
- Subjects
BIG data ,PRINCIPAL components analysis ,CITATION analysis ,THEMATIC analysis ,RESEARCH & development ,INFORMATION resources management ,PRIVATE sector - Abstract
In this study, big data studies (01/2015–06/2018) are reviewed and several highly cited papers are identified, indicating a growing interest in the area of big data. Papers and proceedings from international peer-reviewed journals and ranked conferences were reviewed. We employed principal component analysis together with citation and co-citation analysis to identify the themes of research emanating from these studies. The citation and co-citation analysis reveals the cross-functional nature of big data research, which permeates different business sectors and is influenced by themes in engineering and information management. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
26. AGIM-net based subject-sensitive hashing algorithm for integrity authentication of HRRS images.
- Author
-
Ding, Kaimeng, Zeng, Yue, Wang, Yingying, Lv, Dong, and Yan, Xinyun
- Subjects
IMAGE compression ,PRINCIPAL components analysis ,ALGORITHMS ,DECODING algorithms ,DATA integrity ,REMOTE sensing ,PETRI nets - Abstract
The premise of effective use of high-resolution remote sensing (HRRS) images is that their data integrity and authenticity must be guaranteed. This paper proposes a new subject-sensitive hashing algorithm for the integrity authentication of HRRS images. The algorithm uses AGIM-net (Attention Gate-based improved M-net), proposed in this paper, to extract the subject-sensitive features of HRRS images, and uses a Principal Component Analysis (PCA)-based method to compress and encode the extracted features. AGIM-net is an attention-mechanism-based improvement of U-net: multi-scale inputs are added in the encoder stage to extract rich image features, multi-scale outputs are added in the decoder stage, and features irrelevant to the subject are suppressed through Attention Gates to improve the robustness of the algorithm. Experiments show that the proposed algorithm is more robust than existing algorithms, while its tamper sensitivity and security are essentially equivalent to theirs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
27. Data-driven approach to characterize urban vitality: how spatiotemporal context dynamically defines Seoul's nighttime.
- Author
-
Kim, Young-Long
- Subjects
VITALITY ,PRINCIPAL components analysis ,COLLECTIVE behavior ,HUMAN behavior - Abstract
This study takes a data-driven approach to defining urban nighttime by examining the spatiotemporal dynamics of urban vitality. Using micro-scale spatiotemporal analysis, this paper empirically provides a comprehensive yet granular picture of collective human behaviors in cities. Using Seoul, South Korea as a case study site, it prioritizes the spatiotemporal context in order to mitigate the uncertain contextual effects inherent in such forms of data-driven analysis. Instead of leaving the data re-grouping to the researcher's arbitrary decision, this paper employs functional principal component analysis (FPCA) to systematically transform a set of discrete data into a continuous functional form. FPCA is applied to a 24-hour dataset of pedestrian traffic in Seoul in order to make a data-driven extraction of the principal components that characterize the city's unique patterns of urban vitality. Extracting principal components allows less statistically obvious phenomena to be measured that would otherwise have remained hidden within the data. This approach proved successful in capturing nighttime vitality patterns that are eclipsed by the overwhelming trend of daytime patterns. Additionally, this paper compares differences between regions and seasons to examine what these differences can tell us about the definition of nighttime. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
28. Rapid and ultrasensitive detection of food contaminants using surface-enhanced Raman spectroscopy-based methods.
- Author
-
Guo, Yahui, Girmatsion, Mogos, Li, Hung-Wing, Xie, Yunfei, Yao, Weirong, Qian, He, Abraha, Bereket, and Mahmud, Abdu
- Subjects
POLLUTANTS ,PRINCIPAL components analysis ,MULTIVARIATE analysis ,SERS spectroscopy ,HAZARDOUS substances ,IMPRINTED polymers ,SINGLE molecules ,ADULTERATIONS - Abstract
With the globalization of food and its complicated networking system, a wide range of contaminants is introduced into the food system, whether accidentally, intentionally, or naturally. This situation has made food safety a critical global concern and urged the need for effective technologies capable of detecting food contaminants as efficiently as possible. Surface-enhanced Raman spectroscopy (SERS) has become one of the primary choices for this task, due to its extremely high sensitivity, rapidity, and fingerprinting interpretation capabilities, which enable detection down to the single-molecule level. In this paper, we present a comprehensive review of various novel SERS-based approaches for the direct and indirect detection of single and multiple chemical and microbial contaminants in food, food products and water. The aim is to arouse the interest of researchers by addressing recent SERS-based achievements and developments in the investigation of hazardous chemical and microbial contaminants in edible foods and water. The target contaminants are antibiotics, pesticides, food adulterants, toxins, bacteria, and viruses. Different aspects of SERS-based reports are addressed, including the synthesis and use of various SERS nanostructures for the detection of specific analytes; the coupling of SERS with other analytical tools such as chromatographic methods; the combination of analyte capture and recognition strategies such as molecularly imprinted polymers and aptasensors; and the use of multivariate statistical analyses such as principal component analysis (PCA) to distinguish between results. In addition, we report some strengths and limitations of SERS as well as future viewpoints concerning its application in food safety. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
29. R-Peak Detection in ECG Signal Using Yule–Walker and Principal Component Analysis.
- Author
-
Gupta, Varun and Mittal, Monika
- Subjects
SIGNAL detection ,PRINCIPAL components analysis ,VENTRICULAR tachycardia ,HILBERT-Huang transform ,FEATURE extraction ,ARRHYTHMIA ,NOISE control ,BANDPASS filters - Abstract
Proper diagnosis of the clinical electrocardiogram (ECG) is still a challenge. Minor variations in the attributes of the ECG signal cannot be examined properly by simple visualization; an efficient technique is required to increase the chances of early prediction of disease. R-peak detection is one such important attribute, playing an important role in the detection of arrhythmias (heart diseases). Proper detection of arrhythmias using the R-peak requires two things: long-time recording of the ECG and noise reduction. Long-time recording, in turn, requires proper modeling for extracting features from long data records. In this paper, noise reduction is accomplished using a digital bandpass filter (DBPF), since its filtering characteristics are invariant with drift and temperature, and features are extracted using the Yule–Walker (YW) autoregressive modeling technique, which is well suited to modeling non-stationary signals recorded over long times. The AHA (American Heart Association), Ventricular Tachyarrhythmia and MIT-BIH Arrhythmia databases were investigated, and a total of 18 ECG records were used to implement the proposed methodology in MATLAB R2008b. The feature extraction step finds AR coefficients for the selected model order: the YW method estimates the AR coefficients, and Principal Component Analysis (PCA) is used for R-peak detection. Both normal and abnormal signals were considered during the detection process. PCA without YW yields a sensitivity of 99.73%, a specificity of 99.80%, a detection rate (DR) of 99.73%, and an accuracy (ACC) of 99.66%, whereas the proposed PCA with YW (PCA + YW) yields a sensitivity (SE) of 99.88%, a specificity (SP) of 99.92%, a DR of 99.90%, and an ACC of 99.81%. Suitable comparisons with existing methods are provided for illustration. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
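The pipeline outlined in the abstract above (bandpass filtering for noise reduction, Yule–Walker AR coefficients as features, PCA on the features) can be illustrated with a minimal NumPy/SciPy sketch. The sampling rate, filter band, window length, AR order and toy signal below are assumptions for illustration, not the paper's settings, and no actual R-peak decision logic is included.

```python
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import butter, filtfilt

def yule_walker_ar(x, order):
    """Estimate AR coefficients by solving the Yule-Walker equations
    built from the sample autocorrelation sequence."""
    x = x - x.mean()
    r = np.correlate(x, x, mode="full")[x.size - 1:] / x.size  # lags 0..N-1
    return solve_toeplitz(r[:order], r[1:order + 1])           # AR(order)

fs = 360                                     # sampling rate (Hz), assumed
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) + 0.1 * np.random.randn(t.size)  # toy signal

# 1) noise reduction with a digital bandpass filter (5-15 Hz band assumed)
b, a = butter(2, [5 / (fs / 2), 15 / (fs / 2)], btype="band")
clean = filtfilt(b, a, ecg)

# 2) AR features per one-second window via Yule-Walker
wins = clean[: (clean.size // fs) * fs].reshape(-1, fs)
feats = np.array([yule_walker_ar(w, order=4) for w in wins])

# 3) PCA on the feature matrix: project onto the leading components
feats_c = feats - feats.mean(axis=0)
_, _, vt = np.linalg.svd(feats_c, full_matrices=False)
scores = feats_c @ vt[:2].T                  # first two principal components
print(scores.shape)                          # (n_windows, 2)
```

In the paper's setting the PCA scores would then feed the R-peak decision step; here they simply demonstrate the feature-then-project structure of the method.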
30. A new process capability index for multiple quality characteristics based on principal components.
- Author
-
Dharmasena, L.S. and Zeephongsekul, P.
- Subjects
MANUFACTURING processes ,PRINCIPAL components analysis ,REAL numbers ,MULTIVARIATE analysis ,INDUSTRIAL capacity ,HYPOTHESIS - Abstract
This paper presents a new multivariate process capability index (MPCI) which is based on principal component analysis (PCA) and depends on a parameter that can take any real value. This MPCI generalises several existing PCA-based multivariate indices proposed by other authors, which are recovered at particular values of the parameter. One of the key contributions of this paper is to show that there is a direct correspondence between this MPCI and process yield for a unique value of the parameter. This result is used to establish a relationship with the capability status of the process and to show that, under some mild conditions, the estimator of this MPCI is consistent and converges to a normal distribution. This is then applied to performing tests of statistical hypotheses and determining sample sizes. Several numerical examples are presented to illustrate the procedures and demonstrate how they can be applied to determine the viability and capacity of different manufacturing processes. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF
31. A study on the hydrogeochemical mechanisms controlling groundwater fluoride enrichment in Jaipur: a semi-arid terrain in India.
- Author
-
Saini, Aruna, Kanwar, Priya, Kumar, Suresh, Tembhurne, Sayelli, and Roy, Indranil
- Subjects
FLUORIDES ,GROUNDWATER ,GEOCHEMICAL modeling ,GROUNDWATER flow ,PRINCIPAL components analysis ,AQUIFER pollution ,HIERARCHICAL clustering (Cluster analysis) - Abstract
The main objective of this research paper was to find the major governing factors controlling fluoride enrichment in groundwater resources in the Jaipur region of India. Chemical analysis of the collected water samples revealed that 36% of the groundwater samples exhibit fluoride concentrations above 1.5 mg/L as per BIS 10500 and WHO (2017). An attempt has been made to discuss the occurrence of fluoride, alongside its spatial distribution in the study area, with respect to geology and groundwater flow direction. Chloroalkaline indices, the Gibbs plot, the Piper diagram and various inter-ionic bivariate plots have been applied to recognize the hydrochemical processes and dissolution trends resulting in the high concentration of fluoride in groundwater. Five water types exist in the study area: Ca-HCO3, Na-HCO3, Na-Cl, Ca-Mg-Cl and Ca-Na-HCO3. Due to ion association with excess Cl− emanating from wastewater, Na-HCO3 type water finally changes to Na-Cl type in the aquifer. In the study area, 82% of the water samples enriched in F− (>1 ppm) pertain to the Na-Cl type. Geochemical modelling confirms that reduced Ca2+ ion activity, due to oversaturation of calcite with respect to fluorite, might have created the favourable conditions for dissolution of fluoride-bearing minerals, leading to fluoride enrichment in groundwater. To further assess the extent of natural and anthropogenic processes, the data were subjected to multivariate statistical analysis by performing correlation analysis, principal component analysis and hierarchical cluster analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
32. Assessing method performance for polycyclic aromatic hydrocarbons analysis in sediment using GC-MS: method validation and principal component analysis for quality control.
- Author
-
Molnar Jazić, Jelena, Kragulj Isakovski, Marijana, Maletić, Snežana, Tubić, Aleksandra, Apostolović, Tamara, Beljin, Jelena, and Agbaba, Jasmina
- Subjects
HYDROCARBON analysis ,PRINCIPAL components analysis ,SEDIMENT analysis ,PERSISTENT pollutants ,GAS chromatography/Mass spectrometry (GC-MS) ,POLYCYCLIC aromatic hydrocarbons ,PERYLENE ,QUALITY control - Abstract
Polycyclic aromatic hydrocarbons (PAHs) belong to the persistent organic pollutant class and are ubiquitously present in the environment. As a consequence of their high hydrophobicity, PAHs in aquatic environments tend to rapidly sink down to the bottom sediments which are the most important reservoirs of PAHs in the aquatic environment. This paper presents the validation and further performance evaluation of a method modified in-house for PAHs' sample preparation and analysis in sediment using gas chromatography-mass spectrometry (GC-MS). The method validation covers determining the calibration linear range, method detection and quantitation limits, method recovery and precision, method uncertainty and further internal and external quality control steps. Principal component analysis was used to identify the influence of different variables on method performance and indicated main clustering by PAH molecule size. The results of a two-year quality control programme show that the variations in method performance were not statistically significant in comparison to the parameters set during the validation experiments. Trueness expressed as recovery was very similar for spiked sediment samples and the analysed certified reference material (CRM) (70.6–113%). Expanded uncertainty ranged from 40.4% to 50.9% with higher values noted for the highly hydrophobic PAHs benzo(a)pyrene and benzo(g,h,i)perylene. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
33. Modeling and optimisation of magnetic field assisted electrochemical spark drilling using hybrid technique.
- Author
-
Singh, Roopa, Singh, Dhirendra Kumar, and Singh, Jeeoot
- Subjects
MAGNETIC flux density ,MAGNETIC fields ,GREY relational analysis ,RESPONSE surfaces (Statistics) ,PRINCIPAL components analysis - Abstract
The present paper focuses on the application of a hybrid methodology for multi-objective optimisation (MOO) of an in-house designed and fabricated magnetic field-assisted electrochemical spark drilling (MF-ECSD) process, in which an electromagnetic unit is added to the setup to create magnetic fields of different intensities. The approach combines Taguchi methodology (TM) with response surface methodology (RSM) for modelling, and grey relational analysis (GRA) with principal component analysis (PCA) for MOO. The TM results serve as the core values in RSM to create the second-order response model, which is used to find the optimum levels of the input parameters, namely voltage (V), electrolyte concentration (EC), tool rotational speed (TRS) and magnetic field intensity (MFI). Material removal rate (MRR), machining depth (MD) and overcut (OC) are the responses. PCA is used to calculate the weight associated with each quality feature. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
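As context for the PCA weighting step mentioned above: one common variant in the GRA-PCA literature (an assumption here, not necessarily this paper's exact scheme) derives the weight of each quality characteristic from the squared loadings of the first principal component of the normalised responses. The response matrix below is illustrative only.

```python
import numpy as np

# Toy response matrix: rows = experimental runs, columns = responses
# (MRR, machining depth, overcut) -- values are illustrative only.
Y = np.array([[0.82, 1.10, 0.21],
              [0.95, 1.32, 0.18],
              [0.71, 0.98, 0.25],
              [1.05, 1.41, 0.16],
              [0.88, 1.20, 0.20]])

# Standardise each response, take the eigendecomposition of the
# correlation matrix, and use the squared loadings of the leading
# eigenvector as weights (they sum to 1 by construction).
Z = (Y - Y.mean(axis=0)) / Y.std(axis=0, ddof=1)
corr = np.corrcoef(Z, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)      # ascending eigenvalues
first = eigvecs[:, np.argmax(eigvals)]       # leading eigenvector
weights = first**2                           # unit vector => sums to 1
print(weights.round(3))
```

These weights would then multiply the grey relational coefficients of the corresponding responses before aggregation into a single grey relational grade.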
34. Harnessing machine learning for landscape character management in a shallow relief region of China.
- Author
-
Huang, Tingting, Zhang, Ying, Li, Sha, Griffiths, Geoffrey, Lukac, Martin, Zhao, Haiyue, Yang, Xin, Wang, Jiwei, Liu, Wei, and Zhu, Jianning
- Subjects
MACHINE learning ,LANDSCAPE assessment ,GAUSSIAN mixture models ,PRINCIPAL components analysis ,FIELD research - Abstract
Due to the rapid expansion of human activity in China, landscapes have lost their distinctive and typical characteristics. This paper addresses this issue by proposing a landscape character management framework for the Beijing shallow relief area. The framework utilises machine learning techniques to assess and enhance landscape integrity. The process involves landscape character identification through Principal Component Analysis, Gaussian Mixture Model clustering, and Canny Edge Detection. Additionally, a comprehensive landscape sensitivity evaluation considers both landscape character and visual sensitivity. The study develops five landscape management strategies based on field surveys and employs a Transformer Matrix Process and a multi-expert decision-making mechanism. Extensive validation confirms the framework's effectiveness in improving the recognition accuracy of Landscape Character Types. The findings reveal that over 30% of the landscape characters in the study area require improvement. Importantly, the machine learning techniques employed in this study can be transferred to other regions, facilitating landscape characterisation, evaluation, and management. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
35. Single nucleotide polymorphisms (SNPs) and indels identified from whole-genome re-sequencing of four Chinese donkey breeds.
- Author
-
Chen, Jianxing, Zhang, Shuer, Liu, Shuqin, Dong, Jianbao, Cao, Yanhang, and Sun, Yujiang
- Subjects
DONKEYS ,ANIMAL coloration ,SINGLE nucleotide polymorphisms ,PRINCIPAL components analysis ,NUCLEOTIDE sequencing ,GENETIC variation ,BODY size - Abstract
This paper represents the fundamental report of a survey of genome-wide variation in four Chinese indigenous donkey breeds, Dezhou (DZ), Guangling (GL), North China (NC), and Shandong Little donkey (SDL); the findings will prove useful for the identification of biomarkers that may predict or characterize growth and coat colour patterns. Three genomic regions, in the CYP3A12, TUBGCP5, and GSTA1 genes, were identified as putative selective sweeps in all of the donkey populations studied. Loci of candidate genes that may have contributed to phenotypes in body size (ACSL4, MSI2, ADRA1B, and CDKL5) and coat colour patterns (KITLG and TBX3) were found within strong selection signatures when large and small donkey types, and different coat colours, were compared. The results of the phylogenetic analysis, FST, and principal component analysis (PCA) indicated that the populations do not clearly separate from one another, showing no obvious population structure. We can conclude from the population history that the formation processes of DZ on the one hand and NC, GL, and SDL on the other are completely different. The genetic variants discovered here provide a rich resource to help identify potential genomic markers and their associated molecular mechanisms that impact economically important traits for Chinese donkey breeding programs. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. Prospects of successful blended pedagogies in South Africa: Planning, governance and infrastructure considerations.
- Author
-
Ramoroka, Tlou Millicent
- Subjects
DEVELOPING countries ,INFORMATION economy ,PRINCIPAL components analysis ,HUMAN Development Index ,INTERNATIONAL competition ,EDUCATIONAL technology - Abstract
The purpose of this paper is to evaluate South Africa's implementation of educational Information and Communication Technology (ICT), led by the Gauteng and Western Cape Provinces, for participation in the global knowledge economy. These two provinces are at the forefront of educational ICT implementation aimed at preparing learners for participation in the global knowledge economy and national development. The paper uses Principal Component Analysis (PCA) to examine South Africa in comparison with fourteen developing countries, and establishes that its approach to implementing educational technology is neither appropriate, sustainable nor effective for a developing country. The experiences of developing countries such as Vietnam, Zambia and Kenya, which are in the medium and low Human Development Index (HDI) categories, show that the national technological cultures of their people have not evolved into what is characterised as 'Net Natives', one of the primary driving forces for the adoption of blended pedagogies as an approach to implementing educational technology to enhance participation in the global knowledge economy. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
37. An active-set proximal quasi-Newton algorithm for ℓ1-regularized minimization over a sphere constraint.
- Author
-
Shen, Chungen, Mi, Ling, and Zhang, Lei-Hong
- Subjects
QUASI-Newton methods ,PRINCIPAL components analysis ,SPHERES ,RIEMANNIAN manifolds ,QUADRATIC programming ,IMAGE processing - Abstract
The ℓ1-regularized minimization has been widely used in many data science applications, and certain specially constrained ℓ1-regularized minimizations have also been proposed in recent applications. In this paper, we consider a sphere-constrained ℓ1-regularized minimization, which can arise in image processing, signal recognition and sparse principal component analysis. Viewing the sphere as a simple Riemannian manifold, manifold-based methods for non-smooth minimization can be applied to such a problem, but may still converge slowly in some situations. The objective of this paper is to propose a new and efficient active-set proximal quasi-Newton method for this problem. The idea is to speed up convergence by separately handling the convergence of the active and inactive variables. In particular, our method invokes a procedure to effectively estimate the active and inactive variables, and then designs search directions based on proximal gradients and quasi-Newton directions to efficiently treat the convergence of the active and inactive variables, respectively. We show that under some mild conditions global convergence is guaranteed, and a complexity analysis is carried out to reveal the computational efficiency. Numerical experiments on ℓ1-regularized quadratic programming and sparse principal component analysis, on both synthetic and real data, demonstrate its robustness and efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
38. Improved folded-PCA for efficient remote sensing hyperspectral image classification.
- Author
-
Uddin, Md. Palash, Mamun, Md. Al, Hossain, Md. Ali, and Afjal, Masud Ibn
- Subjects
IMAGE recognition (Computer vision) ,FEATURE extraction ,PRINCIPAL components analysis ,FEATURE selection ,REMOTE sensing ,HYPERSPECTRAL imaging systems ,AGRICULTURE - Abstract
Hyperspectral images (HSIs) contain notable information about land objects, acquired over an immense set of narrow and contiguous spectral bands. Feature extraction (FE) and feature selection (FS), as dimensionality (band) reduction strategies, are performed to enhance HSI classification results. Principal component analysis (PCA) is frequently exploited for FE of HSIs; however, it is often unable to extract local and subtle HSI structures. To this end, segmented-PCA (SPCA), spectrally segmented-PCA (SSPCA) and folded-PCA (FPCA) have been presented for local and useful FE from HSIs. In this paper, we propose two FE methods called segmented-FPCA (SFPCA) and spectrally segmented-FPCA (SSFPCA). SFPCA combines SPCA and FPCA, while SSFPCA combines SSPCA and FPCA. In particular, SFPCA and SSFPCA apply FPCA to highly correlated and spectrally grouped HSI bands, respectively. For extended comparison, we also consider the nonlinear methods kernel PCA (KPCA) and kernel entropy component analysis (KECA). For the agricultural Indian Pines and urban Washington DC Mall HSIs used in the experiments, the results show that SFPCA (95.6262% for the agricultural HSI and 97.4782% for the urban HSI) and SSFPCA (96.3221% for the agricultural HSI and 98.0116% for the urban HSI) outperform the conventional methods. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
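The segment-then-PCA idea behind SPCA (and, with spectral grouping, SSPCA) described in the abstract above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the toy data and band groupings are hypothetical, and the "folding" step that distinguishes FPCA is omitted.

```python
import numpy as np

def segmented_pca(X, segments, k):
    """Apply PCA separately to each spectral band segment and
    concatenate the leading k components per segment (SPCA sketch)."""
    feats = []
    for bands in segments:
        Xs = X[:, bands] - X[:, bands].mean(axis=0)   # center the segment
        _, _, vt = np.linalg.svd(Xs, full_matrices=False)
        feats.append(Xs @ vt[:k].T)                   # project onto top-k PCs
    return np.hstack(feats)

rng = np.random.default_rng(1)
X = rng.standard_normal((500, 100))          # 500 pixels x 100 bands (toy)
segs = [list(range(0, 40)),                  # assumed band groupings
        list(range(40, 70)),
        list(range(70, 100))]
Z = segmented_pca(X, segs, k=3)
print(Z.shape)                               # (500, 9): 3 PCs per segment
```

Grouping by correlated bands (rather than the arbitrary splits above) is what lets segmented variants capture the local spectral structure that global PCA misses.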
39. Bivariate kernel density estimation for environmental contours at two offshore sites.
- Author
-
Wang, Yingguang
- Subjects
PROBABILITY density function ,PRINCIPAL components analysis ,BIVARIATE analysis ,OCEAN waves - Abstract
This paper proposes a novel environmental contour line approach based on measured ocean wave data at two offshore sites. To implement the approach, we propose bivariate kernel density estimation with rigorous bandwidth selection based on the Scott–Tapia–Thompson estimation method. The environmental contours obtained with the proposed approach have been compared with those obtained using the traditional inverse first-order reliability method (IFORM) with the Rosenblatt transformation or principal component analysis, and the effectiveness of the proposed approach has been clearly substantiated. The research results demonstrate that the proposed approach can be utilised as an effective tool for generating environmental contours, provided one has sufficient data extending beyond the return period of interest. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
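A bivariate-KDE contour of the kind described above can be sketched with SciPy's `gaussian_kde`. Note the hedges: this sketch uses Scott's rule, not the Scott–Tapia–Thompson bandwidth selection the paper proposes; the wave-parameter pairs are synthetic; and the target probability mass is chosen arbitrarily for illustration.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)
# synthetic (significant wave height, peak period) pairs -- illustrative only
hs = rng.gamma(shape=2.0, scale=1.5, size=2000)
tp = 4.0 + 1.2 * np.sqrt(hs) + 0.3 * rng.standard_normal(2000)

kde = gaussian_kde(np.vstack([hs, tp]), bw_method="scott")  # Scott's rule

# evaluate the joint density on a grid
xg, yg = np.mgrid[0:12:100j, 2:12:100j]
dens = kde(np.vstack([xg.ravel(), yg.ravel()])).reshape(xg.shape)

# density level enclosing a target probability mass; the contour at this
# level is a simple grid-based approximation of an environmental contour
target = 0.95
cell = (xg[1, 0] - xg[0, 0]) * (yg[0, 1] - yg[0, 0])
levels = np.sort(dens.ravel())[::-1]
mass = np.cumsum(levels) * cell              # mass enclosed above each level
idx = min(int(np.searchsorted(mass, target)), levels.size - 1)
level = levels[idx]
print(level)
```

Plotting `dens` with a single contour at `level` (e.g. `matplotlib.pyplot.contour(xg, yg, dens, levels=[level])`) then yields the contour line; a production method would tie `target` to the return period of interest.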
40. Study on egg sorting model based on visible-near infrared spectroscopy.
- Author
-
Han, Xiaoping, Liu, Yan-Hong, Zhang, Xuyuan, Zhang, Zhiyong, and Yang, Hua
- Subjects
EGGSHELLS ,PRINCIPAL components analysis ,NONDESTRUCTIVE testing ,HENS ,EGGS - Abstract
To realize the automatic sorting of eggs, sorting models are established in this paper using the visible-near infrared spectroscopy technique, taking eggshell colour, eggshell integrity, and feeding mode as sorting indexes. A variety of methods are selected to remove noise and systematic error by preprocessing the spectral information. The backpropagation neural network (BP), Principal Component Analysis (PCA) coupled with BP, and the Soft Independent Modeling of Class Analogy (SIMCA) sorting method are used to identify eggshell colour (white, pink, green), eggshell integrity (intact, cracked) and laying hen feeding mode (caged and cage-free) from their characteristic bands, respectively. The prediction correlation coefficient (Rv), the root mean square error of prediction (RMSEP), the prediction standard error (SEP), the recognition rate (Rr1) and the rejection rate (Rr2) are used to evaluate the established models. The results show that the established classification models have high prediction accuracy and small errors. Non-destructive testing (NDT) technology has great potential for large-scale intelligent laying hen farms. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
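The PCA-coupled-with-BP combination in the egg-sorting abstract above is a standard pattern: compress the spectra with PCA, then classify with a backpropagation (multilayer perceptron) network. A scikit-learn sketch on invented "spectra" (all class structure, dimensions, and hyperparameters below are illustrative, not the paper's):

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# synthetic stand-in for preprocessed spectra with 3 "shell colour" classes
rng = np.random.default_rng(2)
n, bands = 150, 60
y = np.repeat([0, 1, 2], n // 3)
X = rng.normal(size=(n, bands)) + y[:, None] * np.linspace(0, 1, bands)

# PCA compresses the spectra before a backpropagation (MLP) classifier,
# mirroring the PCA-BP combination described in the abstract
model = make_pipeline(
    StandardScaler(),
    PCA(n_components=10),
    MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0),
)
model.fit(X, y)
acc = model.score(X, y)
print(acc)
```

In practice the evaluation would use held-out eggs and the Rv/RMSEP/SEP metrics the abstract lists, rather than training accuracy as printed here.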
41. Predicting Diabetes Mellitus Using Modified Support Vector Machine with Cloud Security.
- Author
-
Thenappan, S., Rajkumar, M. Valan, and Manoharan, P. S.
- Subjects
SUPPORT vector machines ,MACHINE learning ,DIABETES ,PRINCIPAL components analysis ,DATA mining ,HONEYBEES ,CLOUD storage - Abstract
Diabetes mellitus is one of the major diseases of concern, causing a large number of deaths every year. It is a chronic disease caused by an increase in blood sugar. If diabetes remains unidentified and untreated, it creates more complications, so early prediction of diabetes can reduce the fatality rate. Data mining concepts assist in diagnosing diabetes, and various research studies have presented data mining algorithms for early prediction and disease diagnosis, but still with a lack of accuracy. At the same time, mining diabetes data in a secure manner is a critical issue. To address this issue, this paper designs a new model for early prediction of diabetes with high accuracy. This research explores enhanced principal component analysis for efficient feature extraction from the dataset. To achieve the highest classification accuracy, we propose a machine learning algorithm, namely the modified support vector machine (MSVM), which is used to detect diabetes at an early stage. The main contribution of this research is mining the patient's disease results with cloud security; for this security purpose, a honey bee encryption and decryption algorithm is used. The performance of the proposed method is evaluated on measures of accuracy, sensitivity, specificity, precision, and negative predictive value. The results show that the proposed MSVM classifier outperforms with the highest accuracy of 97.13%. We have compared the proposed method with existing methods to demonstrate its better performance. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
42. Thyroid Disorder Diagnosis by Optimal Convolutional Neuron based CNN Architecture.
- Author
-
Namdeo, Rajole Bhausaheb and Janardan, Gond Vitthal
- Subjects
THYROID diseases ,FEATURE extraction ,CONVOLUTIONAL neural networks ,PRINCIPAL components analysis ,SEARCH algorithms ,THYROID gland - Abstract
The diagnosis of thyroid disease via appropriate interpretation of thyroid data is a vital classification issue, yet only a few contributions have been made so far towards automatic diagnosis. To address this, the paper proposes a new thyroid diagnosis model with two phases: feature extraction and classification. In the first phase, two sorts of features are extracted: image features such as neighbourhood-based and gradient features, while Principal Component Analysis (PCA) is used to extract the data features. Subsequently, two classification processes are performed. Specifically, a Convolutional Neural Network (CNN) is used for image classification by extracting deep features, and a Neural Network (NN) is used for classifying the disease, taking both the image and data features as input. Finally, both classified results (CNN and NN) are combined to increase the diagnostic accuracy rate. Further, as the main aim of this work is to increase the accuracy rate, this paper triggers the optimisation concept: the convolutional layer of the CNN is optimally selected, and the features given to the NN classifier are likewise optimally selected. For these optimisations, a new modified algorithm is proposed, namely Worst Fitness-based Cuckoo Search (WF-CS), a modified form of the Cuckoo Search Algorithm (CS). Finally, the performance of the proposed WF-CS is compared with other conventional methods, namely conventional CS, Genetic Algorithm (GA), FireFly (FF), Artificial Bee Colony (ABC), and Particle Swarm Optimisation (PSO), proving the superiority of the proposed work in detecting the presence of thyroid disease. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
43. A Lagrange–Newton algorithm for tensor sparse principal component analysis.
- Author
-
Li, Shuai, Luo, Ziyan, and Chen, Yang
- Subjects
- *
PRINCIPAL components analysis , *LAGRANGE equations , *CALCULUS of tensors , *NONLINEAR programming , *MATHEMATICAL models - Abstract
This paper is concerned with tensor sparse principal component analysis (TSPCA), which obtains principal components that are linear combinations of a small subset of the original features for tensorial data. The core mathematical model can be formulated as a nonsmooth nonconvex optimization problem with a polynomial objective function, a sparsity constraint and a unit Euclidean spherical constraint. By employing tools from tensor analysis, along with the variational properties of the involved ℓ0-norm, the optimality condition of TSPCA is analysed in terms of stationary points. To resolve the problem, we reformulate the stationarity conditions into the Lagrange stationary equation system via the property of the projection operator onto the sparsity constraint set. With special emphasis on the Jacobian nonsingularity of the corresponding nonlinear system, we propose the Lagrange–Newton algorithm for pursuing the stationary point, which serves as a promising approximation of the optimal solution to TSPCA. The locally quadratic convergence rate is also established under mild conditions. Numerical experiments illustrate the effectiveness of our proposed TSPCA approach in terms of solution accuracy as well as computation time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
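The projection operator onto the sparsity constraint set that the TSPCA abstract above leans on has a simple closed form in the vector case: keep the s largest-magnitude entries and renormalise onto the unit sphere. A small numpy sketch (the tensor version and the Lagrange–Newton system itself are beyond this illustration):

```python
import numpy as np

def project_sparse_sphere(x, s):
    """Project x onto {v : ||v||_0 <= s, ||v||_2 = 1}: keep the s
    largest-magnitude entries, zero the rest, and renormalise.
    This is the projection used to rewrite stationarity conditions
    for sparsity-constrained problems like TSPCA."""
    idx = np.argsort(np.abs(x))[::-1][:s]
    v = np.zeros_like(x, dtype=float)
    v[idx] = x[idx]
    return v / np.linalg.norm(v)

x = np.array([0.1, -3.0, 0.5, 2.0, -0.2])
v = project_sparse_sphere(x, s=2)
print(np.count_nonzero(v), float(np.linalg.norm(v)))  # 2 1.0
```

The hard-thresholding step is what makes the feasible set nonconvex; the abstract's Jacobian-nonsingularity analysis is what lets a Newton-type method work despite that.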
44. A longitudinal study of the influence of air pollutants on children: a robust multivariate approach.
- Author
-
Meneghel Danilevicz, Ian, Bondon, Pascal, Anselmo Reisen, Valdério, and Sarquis Serpa, Faradiba
- Subjects
- *
AIR pollutants , *PARTICULATE matter , *PRINCIPAL components analysis , *AIR pollution , *STATISTICAL association - Abstract
This paper aims to evaluate the statistical association between exposure to air pollution and forced expiratory volume in the first second (FEV1) in both asthmatic and non-asthmatic children and teenagers, where the response variable FEV1 was repeatedly measured on a monthly basis, characterizing a longitudinal experiment. Due to the nature of the data, a robust linear mixed model (RLMM), combined with a robust principal component analysis (RPCA), is proposed to handle the multicollinearity among the covariates and the impact of extreme observations (high levels of air contaminants) on the estimates. The Huber and Tukey loss functions are considered to obtain robust estimators of the parameters in the linear mixed model (LMM). A finite-sample investigation is conducted under the scenario where the covariates follow linear time series models with and without additive outliers (AO). The impact of time-correlation and outliers on the estimates of the fixed-effect parameters in the LMM is investigated. In the real data analysis, the robust modelling strategy showed that the RPCA yields three principal components (PCs), mainly related to relative humidity (Hmd), particulate matter with a diameter smaller than 10 μm (PM10) and particulate matter with a diameter smaller than 2.5 μm (PM2.5). [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
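The Huber-loss estimation mentioned in the abstract above can be illustrated with iteratively reweighted least squares: observations whose residuals exceed a cutoff are down-weighted, so additive outliers lose their pull on the estimates. This is a fixed-effects-only sketch with invented data, not the paper's full robust linear mixed model:

```python
import numpy as np

def huber_irls(X, y, c=1.345, iters=20):
    """Huber M-estimation of regression coefficients via iteratively
    reweighted least squares. c = 1.345 is the usual tuning constant
    for 95% efficiency under normal errors."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= c, 1.0, c / u)            # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta

rng = np.random.default_rng(5)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = 2.0 + 3.0 * X[:, 1] + rng.normal(0.0, 0.1, 100)
y[:5] += 20.0                                      # additive outliers (AO)
beta = huber_irls(X, y)
print(beta)
```

Ordinary least squares on the same data would drag the intercept noticeably towards the contaminated points; the Huber weights shrink their influence to near zero, which is the behaviour the paper's simulation study probes under time-correlated covariates.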
45. Detection of early bruises in plum using hyperspectral imaging combination with machine learning algorithm.
- Author
-
Qiu, Zouquan, Meng, Qinghua, Wu, Zhefeng, Pei, Shiying, Ni, Chunyu, Chang, Hongjuan, Sang, Liting, Yao, Jiawei, Fang, Juncheng, Chu, Jiahui, Ma, Yuwen, Huang, Yuqing, and Li, Yu
- Subjects
- *
HYPERSPECTRAL imaging systems , *MACHINE learning , *MULTIPLE scattering (Physics) , *PRINCIPAL components analysis , *GRAYSCALE model , *ADAPTIVE sampling (Statistics) , *SUPPORT vector machines - Abstract
Early detection of plum bruising is important in the online postharvest quality sorting process. This paper explores the rapid detection of bruises in plums at five stages (1, 3, 6, 24, and 48 h after bruising) using a hyperspectral imaging system. Spectral preprocessing was performed using five methods (unit vector normalization, multiplicative scattering correction, standard normal variate, detrending, and baseline). A support vector machine was established to discriminate the spectral samples of healthy and bruised plums at stage 1 on the full wavelengths. The results indicated that the spectral data pretreated by multiplicative scattering correction yielded better results. The characteristic wavelengths of the spectra were selected by six algorithms (random frog, competitive adaptive reweighted sampling, variable combination population analysis, successive projection algorithm, uninformative variable elimination and bootstrapping soft shrinkage). Subsequently, support vector machines were established at stage 1 based on these selected characteristic wavelengths. Comparing the results of the models, the support vector machine based on the characteristic wavelengths selected by competitive adaptive reweighted sampling generated a satisfactory effect, with an accuracy of 95% for the calibration set and 98% for the prediction set; this model was selected for bruise detection in plums at the remaining stages. The characteristic-wavelength grayscale images selected based on the model were used to successfully visualize the plum bruise regions in all five stages using minimum noise fraction transformation and the principal component analysis method. The results showed that the hyperspectral imaging technique combined with a machine learning algorithm can be used to identify plum bruises at different stages. This study contributes to the development of an online detection system for bruises in plums. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
46. Effect of Different Approaches of Nutrient Application on Soil Quality Index Under Maize-Wheat Cropping System in Mollisol Region of Uttarakhand.
- Author
-
Pandey, Varsha, Srivastava, Ajaya, Singh, Veer, Pachauri, S. P., Bhatnagar, Amit, Kumar, Deepak, and Bahadur, Raj
- Subjects
- *
CROPPING systems , *SOIL fertility , *SOIL quality , *ORGANIC fertilizers , *PRINCIPAL components analysis - Abstract
A field experiment was conducted at GBPUA&T, Pantnagar, to study the effect of different approaches of nutrient application on soil fertility and soil quality under a maize-wheat cropping system. Nine treatment combinations were compared, namely Recommended Doses of Fertilizers (RDF), Soil Test Crop Response (STCR), and various combinations of organic and inorganic fertilizers. This paper aims to develop a Soil Quality Index (SQI) based on a Minimum Data Set (MDS) using Principal Component Analysis (PCA). Different indicators were employed to formulate the SQI, derived from surface soil layer measurements (0–15 cm). Each MDS indicator was then converted into a dimensionless score using a linear scoring function and then integrated into the SQI. Results showed that the key soil quality indicators identified as the MDS using PCA under the maize-wheat cropping system were water holding capacity, organic carbon, available N and dehydrogenase activity. These soil quality indicators were found to be best for monitoring soil health status. After rabi wheat 2019–2020, SQI varied from 1.22 to 2.21 across the treatments, whereas after rabi wheat 2020–2021, SQI varied from 1.13 to 2.23 across the treatments. Among the different approaches of nutrient application, STCR-based use of fertilizers along with FYM (T4) helped in maintaining better soil physical, chemical and biological properties and ultimately sustaining soil quality, followed by the treatment receiving 75% STCR dose of N (inorganic mode) + full P and K (T5). Integration of organics with inorganic fertilizers maintained soil quality and environmental health, and reduced dependency on chemical fertilizers. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
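The indicator-scoring step in the SQI abstract above (rescale each MDS indicator to a dimensionless score, then combine) can be sketched numerically. All values, ranges, and weights below are invented for illustration; the study derives its weights from PCA and its ranges from the measured plots:

```python
# hypothetical MDS indicator readings for one soil sample, with assumed
# observed minima/maxima across plots and assumed PCA-derived weights
indicators = {  # name: (value, min, max, weight)
    "water_holding_capacity": (42.0, 30.0, 55.0, 0.30),
    "organic_carbon":         (0.85, 0.40, 1.20, 0.30),
    "available_N":            (210.0, 150.0, 280.0, 0.25),
    "dehydrogenase_activity": (38.0, 20.0, 60.0, 0.15),
}

# "more is better" linear scoring: rescale each indicator to [0, 1],
# then accumulate the weighted scores into the SQI
sqi = 0.0
for name, (value, lo, hi, weight) in indicators.items():
    score = (value - lo) / (hi - lo)
    sqi += weight * score
print(round(sqi, 3))  # 0.496
```

Indicators where less is better (e.g. a toxicity measure) would use the mirrored scoring `(hi - value) / (hi - lo)` before weighting.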
47. A customized inertial proximal alternating minimization for SVD-free robust principal component analysis.
- Author
-
Wang, Qingsong, Han, Deren, and Zhang, Wenxing
- Subjects
- *
OPTIMIZATION algorithms , *PRINCIPAL components analysis , *SINGULAR value decomposition , *MATRIX decomposition , *MATHEMATICS - Abstract
Robust principal component analysis (RPCA) is devoted to tackling grossly corrupted datasets with noise. However, the performance of RPCA is usually circumscribed by the lack of efficiency of singular value decomposition (SVD), which rules out its potential applications to many large-scale real-world problems. In this paper, we develop a nonconvex optimization algorithm customized to SVD-free RPCA models. The proposed algorithm, which is built upon proximal alternating linearized minimization Bolte et al. [Proximal alternating linearized minimization for nonconvex and nonsmooth problems. Math Program. 2014;146(1–2):459–494], can reduce computational efforts by partially linearizing data fidelity and increase efficiency by leveraging inertial techniques. Under the Kurdyka-Łojasiewicz assumption on the objective function and some mild premises on stepsizes, the sequence produced by the proposed algorithm converges globally to a critical point of SVD-free RPCA models. Numerical simulations on synthetic and real datasets demonstrate the compelling performance of the proposed algorithm. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
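The SVD-free idea behind the RPCA abstract above is to parameterise the low-rank part as a product of thin factors, L = UVᵀ, so that no singular value decomposition is ever computed. A toy alternating scheme on synthetic data (this is the factorisation idea only, not the paper's inertial proximal algorithm, and `lam`, `rank`, and the data are illustrative):

```python
import numpy as np

def rpca_svd_free(M, rank=2, lam=0.1, iters=50):
    """Toy SVD-free RPCA sketch: model M ~ U @ V.T + S, with
    alternating least-squares updates for the factors U, V (no SVD)
    and soft-thresholding for the sparse part S."""
    m, n = M.shape
    rng = np.random.default_rng(0)
    U = rng.normal(size=(m, rank))
    V = rng.normal(size=(n, rank))
    S = np.zeros_like(M)
    for _ in range(iters):
        R = M - S
        U = R @ V @ np.linalg.pinv(V.T @ V)      # least-squares in U
        V = R.T @ U @ np.linalg.pinv(U.T @ U)    # least-squares in V
        E = M - U @ V.T
        S = np.sign(E) * np.maximum(np.abs(E) - lam, 0.0)  # soft-threshold
    return U @ V.T, S

rng = np.random.default_rng(3)
L0 = rng.normal(size=(40, 2)) @ rng.normal(size=(2, 30))   # true low-rank part
S0 = np.zeros((40, 30))
S0[rng.random((40, 30)) < 0.05] = 5.0                      # sparse corruptions
L, S = rpca_svd_free(L0 + S0, rank=2)
rel_err = float(np.linalg.norm(L - L0) / np.linalg.norm(L0))
print(rel_err)
```

Each factor update costs only a small `rank × rank` solve instead of a full SVD, which is exactly the scalability argument the abstract makes for large-scale problems.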
48. A novel two-way functional linear model with applications in human mortality data analysis.
- Author
-
Yan, Xingyu, Yu, Jiaqian, Ding, Weiyong, Wang, Hao, and Zhao, Peng
- Subjects
- *
MORTALITY , *DATA analysis , *PRINCIPAL components analysis , *LEAST squares , *FUNCTIONAL analysis - Abstract
Recently, two-way or longitudinal functional data analysis has attracted much attention in many fields. However, little is known about how to appropriately characterize the association between a two-way functional predictor and a scalar response. Motivated by a mortality study, in this paper we propose a novel two-way functional linear model, where the response is a scalar and the functional predictor is a two-way trajectory. The model is intuitive and interpretable, and naturally captures the relationship between each way of the two-way functional predictor and the scalar response. Further, we develop a new method to estimate the regression functions in the framework of weak separability. The main technical tools for the construction of the regression functions are product functional principal component analysis and an iterative least squares procedure. The solid performance of our method is demonstrated in extensive simulation studies. We also analyze a mortality dataset to illustrate the usefulness of the proposed procedure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
49. A method for predicting photovoltaic output power based on PCC-GRA-PCA meteorological elements dimensionality reduction method.
- Author
-
Yang, Lingsheng, Cui, Xiangyu, and Li, Wei
- Subjects
DIMENSION reduction (Statistics) ,PEARSON correlation (Statistics) ,PRINCIPAL components analysis ,PHOTOVOLTAIC power generation ,DATA reduction ,PHOTOVOLTAIC power systems - Abstract
Photovoltaic (PV) power generation forecasting models require a large amount of meteorological data, which may include irrelevant and redundant information; as the volume of data increases, the dataset is likely to contain ever more of it. This paper proposes a dimensionality reduction approach based on the PCC-GRA-PCA method, which aims to simplify the model and reduce computational complexity. Firstly, the method analyzes the feature importance of the various meteorological elements using the Pearson Correlation Coefficient (PCC) and Grey Relation Analysis (GRA), achieving a preliminary dimension reduction by selecting the most relevant features. Next, the data is processed using Principal Component Analysis (PCA) to achieve a secondary dimension reduction of the meteorological data through feature transformation. Finally, a photovoltaic power prediction model is established using the OVMD-tSSA-LSSVM algorithm. After analysis, it was found that the prediction model showed improvements in R2, MAE, RMSE, and MAPE after PCC-GRA-PCA dimensionality reduction compared to the prediction model before dimensionality reduction, as well as to the prediction models after LDA and PCA dimensionality reduction. This demonstrates the effectiveness of reducing data dimensionality. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
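The two-stage reduction in the abstract above — a correlation-based filter first, then PCA on what survives — can be sketched in numpy. The GRA stage is omitted here for brevity, and the PV/weather data, the 0.5 correlation threshold, and the component count are all invented:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200
power = rng.normal(size=n)                     # stand-in PV output series
feats = np.column_stack([
    power + rng.normal(0.0, 0.3, n),           # irradiance-like, relevant
    power + rng.normal(0.0, 0.5, n),           # temperature-like, relevant
    rng.normal(size=n),                        # irrelevant element
    rng.normal(size=n),                        # irrelevant element
])

# stage 1 (PCC filter): keep features strongly correlated with the target
pcc = np.array([np.corrcoef(feats[:, j], power)[0, 1]
                for j in range(feats.shape[1])])
kept = feats[:, np.abs(pcc) > 0.5]

# stage 2 (PCA): transform the retained features for a second reduction
Xc = kept - kept.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ Vt[:2].T                              # first two principal components
print(kept.shape[1], Z.shape)
```

The filter stage discards features with no linear relationship to PV output before PCA ever runs, which is what lets the combined scheme beat plain PCA in the abstract's comparison.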
50. A multidimensional assessment of global flourishing: Differential rankings of 145 Countries on 38 wellbeing indicators in the Gallup World Poll, with an accompanying principal components analyses of the structure of flourishing.
- Author
-
Lomas, Tim, Padgett, R. Noah, Lai, Alden Yuanhong, Pawelski, James O., and VanderWeele, Tyler J.
- Subjects
- *
PRINCIPAL components analysis , *WELL-being - Abstract
For over ten years the World Happiness Report has influentially ranked nations on self-reported life evaluation as measured by the Gallup World Poll. Inspired by this endeavour, this paper aims to broaden our understanding of global flourishing by assessing an expansive battery of 38 items relating to wellbeing in the World Poll, encompassing 386,654 people in 145 countries over three years (2020–2022). The variation in the respective placing of countries across different items reveals a complex picture of flourishing, with many nations ranking highly on certain metrics but faring poorly on others. Additionally, principal components analyses of the items produced a conceptualization of flourishing featuring numerous dimensions (with both a three- and six-factor solution being viable). Together, these findings paint a nuanced picture of both the multifaceted nature of flourishing and its complex manifestations around the world. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF