4,111 results for "INFORMATION ENTROPY"
Search Results
2. Exploring the interconnections between total cloud water content and water vapor mixing ratio with other cloud microphysical variables in northward-moving typhoon precipitation via information entropy: A hybrid causal analysis approach using wavelet coherence and Liang–Kleeman information flow
- Author
- Wu, Xianghua, Ren, Miaomiao, Zhou, Linyi, Li, Yashao, Chen, Jinghua, Li, Wanting, Yang, Kai, and Wang, Weiwei
- Published
- 2025
3. Assessing geological structure uncertainties in groundwater models using transition probability-based realizations
- Author
- Huang, Shiqi, Hu, Litang, Li, Binghua, Wu, Xia, Gan, Lin, Sun, Jianchong, and Zhang, Menglin
- Published
- 2025
4. Optimal scale combination selection based on genetic algorithm in generalized multi-scale decision systems for classification
- Author
- Yang, Ying, Zhang, Qinghua, Zhao, Fan, Cheng, Yunlong, Xie, Qin, and Wang, Guoyin
- Published
- 2025
5. TIEOD: Three-way concept-based information entropy for outlier detection
- Author
- Hu, Qian, Zhang, Jun, Mi, Jusheng, Yuan, Zhong, and Li, Meizheng
- Published
- 2025
6. Natural low-illumination image enhancement based on dual-channel prior information
- Author
- Wang, Lingyun
- Published
- 2024
7. Entropy-metric estimation of the small data models with stochastic parameters
- Author
- Kovtun, Viacheslav, Altameem, Torki, Al-Maitah, Mohammed, and Kempa, Wojciech
- Published
- 2024
8. Defining multiple layers of intratumor heterogeneity based on variations of perturbations in multi-omics profiling
- Author
- Ai, Hongjing, Song, Dandan, and Wang, Xiaosheng
- Published
- 2023
9. Decision Trees
- Author
- Liu, Zhen "Leo"
- Published
- 2025
10. ORA-Trans: Object Region Attention Transformer Based on Key Tokens Selector with Structure Feature Modeling for Fine-Grained Visual Classification
- Author
- Xia, Yulong and Zhang, Jianwei
- Published
- 2025
11. A Quality Assessment Method of Few-Shot Datasets Based on the Fusion of Quantity and Quality
- Author
- Zhang, Zhengchao, Zhou, Lianke, Sun, Junzheng, and Wang, Nianbin
- Published
- 2025
12. WPG-CAM: A Novel Weighted Feature Fusion CAM Method Based on Information Entropy Using Pooling and Gaussian Upsampling
- Author
- Zhang, Feifei, Xiang, Xiaohong, Zhang, Fuyuan, and Zhao, Jun
- Published
- 2025
13. Complexity of Molecular Ensembles with Basak’s Indices: Applying Structural Information Content
- Author
- Sabirov, Denis, Zimina, Alexandra, and Shepelevich, Igor
- Published
- 2025
14. Nowcasting Earthquakes With Stochastic Simulations: Information Entropy of Earthquake Catalogs
- Author
- Rundle, John B., Baughman, Ian, and Zhang, Tianjian
- Subjects
- Earth Sciences, Geophysics, Machine Learning and Artificial Intelligence, earthquakes, nowcasting, information entropy, Earth sciences, Environmental sciences, Physical sciences
- Abstract
Earthquake nowcasting has been proposed as a means of tracking the change in large earthquake potential in a seismically active area. The method was developed using observable seismic data, in which probabilities of future large earthquakes can be computed using Receiver Operating Characteristic methods. Furthermore, analysis of the Shannon information content of earthquake catalogs has been used to show that there is information contained in the catalogs and that it can vary in time. An important question therefore remains: where does the information originate? In this paper, we examine this question using stochastic simulations of earthquake catalogs. Our catalog simulations are computed using an Earthquake Rescaled Aftershock Seismicity ("ERAS") stochastic model. This model is similar in many ways to other stochastic seismicity simulations, but has the advantage of having only two free parameters to be set: one for the aftershock (Omori-Utsu) time decay, and one for the aftershock spatial migration away from the epicenter. Generating a simulation catalog and fitting the two parameters to an observed catalog such as California's takes only a few minutes of wall-clock time. While clustering can arise from random, Poisson statistics, we show that significant information in the simulation catalogs arises from the "non-Poisson" power-law aftershock clustering, implying that the practice of de-clustering observed catalogs may remove information that would otherwise be useful in forecasting and nowcasting. We also show that the nowcasting method provides similar results with the ERAS model as it does with observed seismicity.
- Published
- 2024
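The Shannon information content of a catalog, as analyzed in entry 14, can be computed directly from the empirical magnitude distribution. A minimal NumPy sketch under illustrative assumptions (0.1-magnitude bins and a synthetic Gutenberg-Richter-like catalog; this is not the authors' ERAS model):

```python
import numpy as np

def catalog_entropy(magnitudes, bin_width=0.1):
    """Shannon entropy (in bits) of a catalog's magnitude histogram."""
    bins = np.arange(magnitudes.min(), magnitudes.max() + bin_width, bin_width)
    counts, _ = np.histogram(magnitudes, bins=bins)
    p = counts[counts > 0] / counts.sum()   # empirical bin probabilities
    return -np.sum(p * np.log2(p))

# Illustrative synthetic catalog: Gutenberg-Richter-like magnitudes (b ~ 1).
rng = np.random.default_rng(0)
mags = 2.5 + rng.exponential(scale=1.0 / np.log(10), size=10_000)
print(f"catalog entropy: {catalog_entropy(mags):.2f} bits")
```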
15. A high-altitude geomagnetic matching area selection approach based on geomagnetic information entropy and geomagnetic direction entropy.
- Author
- Han, Yongqiang and Liang, Ruirui
- Subjects
- ENTROPY (Information theory), GEOMAGNETISM, ENTROPY, ALTITUDES
- Abstract
Aiming at the obvious trend of the contours on high-altitude geomagnetic maps, this paper presents a high-altitude geomagnetic matching area selection method that combines geomagnetic information entropy and geomagnetic direction entropy. The method first uses geomagnetic information entropy to select regions with rich geomagnetic information, and then determines the direction of the flight trajectory according to geomagnetic direction entropy to obtain the optimal matching trajectory. Finally, the matching localization results of three flight trajectories in different directions on a geomagnetic map at an altitude of 30,000 m are compared and analyzed using semi-physical simulation. The experimental results show that the flight navigation error along the trajectory with small information entropy is small; its positioning error is 12.7% of the positioning error along the maximum information entropy direction. Selecting the flight trajectory according to the value of the geomagnetic direction entropy can greatly improve the precision and reliability of geomagnetic matching localization. The method can provide a basis for path planning in geomagnetic matching navigation. [ABSTRACT FROM AUTHOR]
- Published
- 2025
16. Key Node Identification Method Based on Multilayer Neighbor Node Gravity and Information Entropy.
- Author
- Fu, Lidong, Ma, Xin, Dou, Zengfa, Bai, Yun, and Zhao, Xi
- Subjects
- CUMULATIVE distribution function, ENTROPY (Information theory), INFORMATION networks, STATISTICAL correlation, GRAVITY
- Abstract
In the field of complex network analysis, accurately identifying key nodes is crucial for understanding and controlling information propagation. Although several local centrality methods have been proposed, their accuracy may be compromised if interactions between nodes and their neighbors are not fully considered. To address this issue, this paper proposes a key node identification method based on multilayer neighbor node gravity and information entropy (MNNGE). The method works as follows: First, the relative gravity of the nodes is calculated based on their weights. Second, the direct gravity of the nodes is calculated by considering the attributes of neighboring nodes, thus capturing interactions within local triangular structures. Finally, the centrality of the nodes is obtained by aggregating the relative and direct gravity of multilayer neighbor nodes using information entropy. To validate the effectiveness of the MNNGE method, we conducted experiments on various real-world network datasets, using evaluation metrics such as the susceptible-infected-recovered (SIR) model, Kendall τ correlation coefficient, Jaccard similarity coefficient, monotonicity, and complementary cumulative distribution function. Our results demonstrate that MNNGE can identify key nodes more accurately than other methods, without requiring parameter settings, and is suitable for large-scale complex networks. [ABSTRACT FROM AUTHOR]
- Published
- 2024
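Entry 16's MNNGE combines gravity-style node scores with entropy-based aggregation over multilayer neighbors. A simplified networkx sketch of those two ingredients; the layer-entropy weighting below is a generic stand-in, not the paper's exact formulation:

```python
import numpy as np
import networkx as nx

def gravity_entropy_centrality(G, radius=2):
    """Generic gravity centrality: sum k_i*k_j/d_ij^2 over nodes within
    `radius` hops, with per-layer contributions aggregated via the Shannon
    entropy of the layer distribution (a simplification of the MNNGE idea)."""
    deg = dict(G.degree())
    scores = {}
    for i in G:
        lengths = nx.single_source_shortest_path_length(G, i, cutoff=radius)
        layers = np.zeros(radius)
        for j, d in lengths.items():
            if d > 0:                       # skip the node itself
                layers[d - 1] += deg[i] * deg[j] / d**2
        total = layers.sum()
        p = layers / total if total > 0 else layers
        h = -np.sum(p[p > 0] * np.log(p[p > 0]))  # balanced layers score higher
        scores[i] = total * (1 + h)
    return scores

print(gravity_entropy_centrality(nx.karate_club_graph()))
```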
17. General Adaptable Design and Evaluation Using Markov Processes.
- Author
- Sun, Zhilin, Wang, Kaifeng, and Gu, Peihua
- Subjects
- MARKOV processes, ENTROPY (Information theory), INFORMATION theory, SATISFACTION, EVALUATION methodology
- Abstract
Facing the challenges posed by increasingly complex, dynamic, and unforeseen requirements, the design process is grappling with the critical issue of ensuring sustained product satisfaction amid changing demands. This paper introduces an approach for evaluating design adaptability, considering potential future requirements. Entropy serves as a crucial indicator to quantify design effort and the Markov process is employed to simulate potential requirement changes. The information contents of design requirements and design solutions are defined based on information entropy theory, and the design adaptability of a design candidate is evaluated by calculating the extra design effort for satisfying the design requirements, which is the difference in information content between the design candidate and design requirements. Moreover, a simulation method for requirement evolution is proposed, which integrates information entropy theory and the Markov process to accommodate potential future requirements. The general design adaptability of design solutions is then calculated based on conditional entropy, taking into account the evolving design requirements. Finally, the effectiveness of the proposed approach is validated through a case study involving the design and evaluation of a hybrid additive manufacturing device. [ABSTRACT FROM AUTHOR]
- Published
- 2024
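The quantities entry 17 builds on follow standard Shannon definitions. As a hedged illustration only (the abstract does not give the paper's exact formulas), the information content and the "extra design effort" can be read as:

```latex
% Generic Shannon relations consistent with the abstract (illustrative,
% not the paper's exact formulas)
I(X) = -\sum_{i} p_i \log_2 p_i , \qquad
\Delta I = I(S) - I(R) \quad \text{(extra design effort of solution $S$ against requirements $R$)}
```

with future requirements simulated as a Markov chain over requirement states, and general design adaptability evaluated through the conditional entropy of the solution given the evolving requirements.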
18. A Novel Non-Unit Protection Method for MMC-HVDC Transmission Lines Based on the Ratio of Line-Mode Voltage Second Derivative †.
- Author
- Wang, Yanting, Ouyang, Jiayuan, Shi, Zhaoyuan, and Fan, Shunyue
- Subjects
- ELECTRIC lines, TIME-domain analysis, ENTROPY (Information theory), SYSTEM safety, RENEWABLE energy sources
- Abstract
The modular multilevel converter (MMC) high-voltage direct current (HVDC) transmission technology is essential for overcoming the challenges of large-scale renewable energy integration. Line protection is critical for ensuring system safety. However, existing protection methods for MMC-HVDC transmission lines face difficulties in withstanding both high resistance and noise interference, frequently leading to failures in detecting internal high-resistance faults or triggering false operations due to noise. This paper first derives the theoretical expression of the line-mode voltage through analytical methods. By analyzing the second derivative of the line-mode voltage under different fault conditions, this paper constructs a criterion based on the ratio of the integrals of the positive and negative components of the second derivative of the line-mode voltage. This criterion enables effective fault discrimination by utilizing the characteristic differences in the second-derivative waveform. The proposed criterion allows for precise fault identification, requiring only a 0.5 ms time window to detect faults. Additionally, this criterion is highly resistant to transition resistance, remaining unaffected by resistances up to 500 Ω. Moreover, an entropy-based auxiliary criterion is introduced to prevent false operations caused by noise interference. Simulation results using PSCAD/EMTDC demonstrate that the proposed protection scheme can swiftly and reliably detect faults, with a detection time of 0.5 ms and robust performance against both high transition resistance and noise interference. [ABSTRACT FROM AUTHOR]
- Published
- 2024
19. An improved image enhancement algorithm: Radial contrast-limited adaptive histogram equalization.
- Author
- Hu, Chunsheng, Li, Hao, Ma, Teng, Zeng, Cailian, and Ji, Xiaoli
- Subjects
- IMAGE intensifiers, HISTOGRAMS, IMAGE processing, ENTROPY (Information theory), CONTRAST effect, IMAGE segmentation
- Abstract
A commonly used method in image enhancement is Contrast-Limited Adaptive Histogram Equalization (CLAHE), which is simple and fast. However, this algorithm segments the image into multiple rectangular areas before processing; it works poorly on images with weak radial contrast, is prone to local loss of image information, and degrades the characteristics of the whole image. In this paper, we propose a histogram equalization method based on radial ring segmentation, aimed at enhancing images whose radial contrast is not obvious. Compared with the classical HE and CLAHE algorithms, the proposed algorithm effectively enhances the information entropy of the processed images by an average of 12%. As a result, the images carry more information and exhibit richer texture features. Moreover, the contrast enhancement effect is improved, and the brightness of the images is more moderate. [ABSTRACT FROM AUTHOR]
- Published
- 2024
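The entropy gain reported in entry 19 is easy to measure for the baseline methods: OpenCV ships classical HE and standard rectangular-tile CLAHE, though not the paper's radial-ring variant. A sketch, with the input file name as a placeholder:

```python
import cv2
import numpy as np

def image_entropy(img):
    """Shannon entropy (bits) of an 8-bit grayscale histogram."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    p = hist[hist > 0] / hist.sum()
    return -np.sum(p * np.log2(p))

img = cv2.imread("input.png", cv2.IMREAD_GRAYSCALE)   # placeholder path
he = cv2.equalizeHist(img)                            # classical HE
clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8)).apply(img)

for name, out in [("original", img), ("HE", he), ("CLAHE", clahe)]:
    print(f"{name}: {image_entropy(out):.3f} bits")
```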
20. Information Entropy Evaluation Method for the Disassemblability of Smartphones.
- Author
- Huang, Haihong, Xue, Yuhao, Zhu, Libin, and Cui, Chuangchuang
- Abstract
Smartphones are a vast category of consumer electronic products. Design For Disassembly (DFD) is a methodology that can effectively reduce the disassembly, recycling, and maintenance costs of smartphones. However, due to the diverse constraints posed by the connections among smartphone components, the variability of disassembling tools makes it challenging to objectively characterize the disassemblability of the smartphone. Therefore, information entropy is introduced to characterize the complex state of the system. Disassemblability can be expressed by calculating the operating time of the disassembling tool through information entropy. After multiplying by a constrained quantity factor, the Improved Disassembling Tool Entropy (IDTE) responds to changes in disassemblability as the structural level changes. According to the structure and recycling direction, the disassembling level of smartphones can be divided into module level, part level, and hybrid part-module level. Based on the Maynard Operational Sequencing Technique (MOST), the basic unit of operating time of the disassembling tool is calculated. For the hybrid selective Disassembly Sequence Planning (DSP) of the part and module levels, the Improved Double Genetic Algorithm (IDGA) model is established to compute the optimal disassembly sequence corresponding to each program level. This model calculates the IDTE corresponding to the optimal disassembly sequences at each disassembling level. The validity of IDTE was verified by a coupled comparison test between IDTE and theoretical disassembly time. Finally, an analysis of the disassemblability variations was conducted for two different models of smartphones based on their structure. [ABSTRACT FROM AUTHOR]
- Published
- 2024
21. Enhanced EDAS technique for quality evaluation of online games interface design based on three types of probabilistic linguistic similarity measures.
- Author
- Wei, Kang and Huang, Yuhan
- Subjects
- GROUP decision making, VIDEO games, ENTROPY (Information theory), DECISION making, AESTHETICS
- Abstract
The highest level of interface design for online games is to achieve invisibility: the interface is so natural and easy to use that proficient players fully immerse themselves in the world of the game and no longer feel its existence. Excellent online game interface design is a user requirement that runs through the entire design process. This is not to say that the usability of online game software can override all other factors; all great designs seek a balance and harmony between artistic beauty, reliability, security, usability, cost, and performance. The quality evaluation of online games interface design is a multiple-attribute group decision-making (MAGDM) problem. Recently, the Evaluation Based on Distance from Average Solution (EDAS) technique, the cosine similarity measure (CSM), the Dice similarity measure (DSM), the Jaccard similarity measure (JSM), and the entropy technique have each been employed to cope with MAGDM issues. Probabilistic linguistic term sets (PLTSs) are employed as a tool for conveying uncertain information during the quality evaluation of online games interface design. In this paper, the EDAS technique is extended to PLTSs, and the probabilistic linguistic EDAS (PL-EDAS) technique based on the CSM, DSM, and JSM is constructed to manage the MAGDM issue. The information entropy technique is employed to derive the weight values based on the CSM, DSM, and JSM techniques under PLTSs. Finally, the quality evaluation of online games interface design is used to demonstrate the PL-EDAS technique, together with comparative analyses. The main research contributions of this work are: (1) the information entropy technique is constructed to derive the attribute weight values based on the CSM, DSM, and JSM techniques; (2) the PL-EDAS technique is constructed under PLTSs based on the CSM, DSM, and JSM techniques; (3) an example of quality evaluation of online games interface design is employed to verify the PL-EDAS technique, along with several comparative decision analyses. [ABSTRACT FROM AUTHOR]
- Published
- 2024
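The information entropy technique used for attribute weights in entry 21 is, in its crisp form, the classical entropy weight method. A minimal NumPy sketch on a plain numeric decision matrix (the probabilistic linguistic and similarity-measure layers of PL-EDAS are omitted):

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: criteria with more dispersion (lower entropy)
    receive larger weights. X: alternatives x criteria, benefit values >= 0."""
    P = X / X.sum(axis=0)                    # column-normalize to probabilities
    k = 1.0 / np.log(X.shape[0])
    logs = np.where(P > 0, np.log(P), 0.0)   # 0*log(0) treated as 0
    e = -k * (P * logs).sum(axis=0)          # entropy per criterion, in [0, 1]
    d = 1.0 - e                              # degree of divergence
    return d / d.sum()

X = np.array([[7., 9., 8.], [6., 7., 9.], [8., 6., 7.], [7., 8., 6.]])
print(entropy_weights(X))                    # weights sum to 1
```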
22. Exploring the Multiplication of Resonant Modes in Off-Center-Driven Chladni Plates from Maximum Entropy States.
- Author
- Lin, Song-Qing, Hsu, Yu-Hsin, Su, Kuan-Wei, Liang, Hsing-Chih, and Chen, Yung-Fu
- Subjects
- GREEN'S functions, DISPERSION relations, ENTROPY (Information theory), WAVENUMBER, ENTROPY, HELMHOLTZ equation
- Abstract
In this study, the resonant characteristics of the off-center-driven Chladni plates were systematically investigated for the square and equilateral triangle shapes. Experimental results reveal that the number of the resonant modes is considerably increased for the plates under the off-center-driving in comparison to the on-center-driving. The Green's functions derived from the nonhomogeneous Helmholtz equation are exploited to numerically analyze the information entropy distribution and the resonant nodal-line patterns. The experimental resonant modes are clearly confirmed to be in good agreement with the maximum entropy states in the Green's functions. Furthermore, the information entropy distribution of the Green's functions can be used to reveal that more eigenmodes can be triggered in the plate under the off-center-driving than the on-center-driving. By using the multiplication of the resonant modes in the off-center-driving, the dispersion relation between the experimental frequency and the theoretical wave number can be deduced with more accuracy. It is found that the deduced dispersion relations agree quite well with the Kirchhoff–Love plate theory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
23. A Small-Scale Object Detection Algorithm in Intelligent Transportation Scenarios.
- Author
- Song, Junzi, Han, Chunyan, and Wu, Chenni
- Subjects
- OBJECT recognition (Computer vision), K-means clustering, ENTROPY (Information theory), ALGORITHMS, PYRAMIDS
- Abstract
In response to the poor detection ability of object detection models for small-scale targets in intelligent transportation scenarios, a fusion method is proposed to enhance the features of small-scale targets, starting from feature utilization and fusion methods. The algorithm is based on the YOLOv4-tiny framework and enhances the utilization of shallow and mid-level features on the basis of the Feature Pyramid Network (FPN), improving the detection accuracy of small and medium-sized targets. Since intelligent traffic scene images have cluttered backgrounds with considerable redundant information, the Convolutional Block Attention Module (CBAM) is used to improve the model's attention to traffic targets. To address data imbalance and prior bounding-box adaptation in custom traffic data sets that extend the traffic images in COCO and VOC, we propose a Copy-Paste method with an improved generation strategy and a K-means algorithm with improved distance measurement to enhance the model's detection ability for the corresponding categories. Comparative experiments were conducted on a customized 260,000-image traffic data set containing public traffic images, and the results showed that, compared to YOLOv4-tiny, the proposed algorithm improved mAP by 4.9% while still ensuring the real-time performance of the model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
24. Text-Enhanced Graph Attention Hashing for Cross-Modal Retrieval.
- Author
- Zou, Qiang, Cheng, Shuli, Du, Anyu, and Chen, Jiayi
- Subjects
- TRANSFORMER models, FEATURE extraction, ENTROPY (Information theory), INFORMATION networks, ANNOTATIONS
- Abstract
Deep hashing technology, known for its low-cost storage and rapid retrieval, has become a focal point in cross-modal retrieval research as multimodal data continue to grow. However, existing supervised methods often overlook noisy labels and multiscale features in different modal datasets, leading to higher information entropy in the generated hash codes and features, which reduces retrieval performance. The variation in text annotation information across datasets further increases the information entropy during text feature extraction, resulting in suboptimal outcomes. Consequently, reducing the information entropy in text feature extraction, supplementing text feature information, and enhancing the retrieval efficiency of large-scale media data are critical challenges in cross-modal retrieval research. To tackle these, this paper introduces the Text-Enhanced Graph Attention Hashing for Cross-Modal Retrieval (TEGAH) framework. TEGAH incorporates a deep text feature extraction network and a multiscale label region fusion network to minimize information entropy and optimize feature extraction. Additionally, a Graph-Attention-based modal feature fusion network is designed to efficiently integrate multimodal information, enhance the affinity of the network for different modes, and retain more semantic information. Extensive experiments on three multilabel datasets demonstrate that the TEGAH framework significantly outperforms state-of-the-art cross-modal hashing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
25. Investigation of Laser Ablation Quality Based upon Entropy Analysis of Data Science.
- Author
- Tsai, Chien-Chung and Yiu, Tung-Hon
- Subjects
- ENERGY levels (Quantum mechanics), LASER ablation, DATA science, ENTROPY (Information theory), MANUFACTURING processes
- Abstract
Laser ablation is a vital material removal technique, but current methods lack a data-driven approach to assess quality. This study proposes a novel method, employing information entropy, a concept from data science, to evaluate laser ablation quality. By analyzing the randomness associated with the ablation process through the distribution of a probability value (reb), we quantify the uncertainty (entropy) of the ablation. Our research reveals that higher energy levels lead to lower entropy, signifying a more controlled and predictable ablation process. Furthermore, using an interval time closer to the baseline value improves the ablation consistency. Additionally, the analysis suggests that the energy level has a stronger correlation with entropy than the baseline interval time (bit). The entropy decreased by 6.32 from 12.94 at 0.258 mJ to 6.62 at 0.378 mJ, while the change due to the bit was only 2.12 (from 10.84 at bit/2 to 8.72 at bit). This indicates that energy is a more dominant factor for predicting ablation quality. Overall, this work demonstrates the feasibility of information entropy analysis for evaluating laser ablation, paving the way for optimizing laser parameters and achieving a more precise material removal process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
26. Entropy‐based hybrid sampling (EHS) method to handle class overlap in highly imbalanced dataset.
- Author
- Kumar, Anil, Singh, Dinesh, and Yadav, Rama Shankar
- Subjects
- RESEARCH personnel, ENTROPY (Information theory), MINORITIES
- Abstract
Class imbalance and class overlap create difficulties in the training phase of standard machine learning algorithms, which do not perform well on minority classes, especially when there is high class imbalance and significant class overlap. Researchers have recently observed that the joint effects of class overlap and imbalance are more harmful than their direct impact. To handle these problems, many methods have been proposed in past years, broadly categorized as data-level, algorithm-level, ensemble learning, and hybrid methods. Existing data-level methods often suffer from problems such as information loss and overfitting. To overcome these problems, we introduce a novel entropy-based hybrid sampling (EHS) method to handle class overlap in highly imbalanced datasets. The EHS eliminates less informative majority instances from the overlap region during the undersampling phase and regenerates highly informative synthetic minority instances near the borderline in the oversampling phase. The proposed EHS achieved significant improvement in F1-score, G-mean, and AUC performance metrics with DT, NB, and SVM classifiers compared to well-established state-of-the-art methods. Classifier performance was tested on 28 datasets with extreme ranges of imbalance and overlap. [ABSTRACT FROM AUTHOR]
- Published
- 2024
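A simplified sketch of the undersampling half of entry 26's EHS idea: score each majority instance by the Shannon entropy of the class labels among its k nearest neighbors and drop the most ambiguous ones, i.e. those in the overlap region. The scoring rule and drop fraction are illustrative assumptions, and the oversampling phase is not shown:

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

def entropy_undersample(X, y, majority=0, k=5, drop_frac=0.2):
    """Drop the majority instances whose k-NN label distribution has the
    highest entropy, i.e. those sitting in the class-overlap region."""
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X)
    maj = np.where(y == majority)[0]
    ent = np.empty(len(maj))
    for n, i in enumerate(maj):
        labels = y[idx[i, 1:]]                     # skip the point itself
        p = np.bincount(labels, minlength=2) / k
        ent[n] = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    keep = np.ones(len(y), dtype=bool)
    drop = maj[np.argsort(ent)[::-1][: int(drop_frac * len(maj))]]
    keep[drop] = False
    return X[keep], y[keep]

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(1, 1, (40, 2))])
y = np.array([0] * 200 + [1] * 40)
Xr, yr = entropy_undersample(X, y)
print(len(y), "->", len(yr))
```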
27. Fuzzy Dynamic Responses of Train–Bridge Coupled System Based on Information Entropy.
- Author
- Xiang, Ping, Zeng, Yingying, Jiang, Lizhong, Zhao, Han, Hu, Huifang, Zhang, Peng, and Liu, Xiaochun
- Subjects
- ENTROPY (Information theory), FIX-point estimation, RANDOM variables, SYSTEM dynamics, FUZZY systems
- Abstract
In the analysis of a train–bridge coupled system, fuzzy uncertainty is a factor that must be considered in the prediction of coupled vibration response, but it has not been considered so far. In this work, the concept of information entropy is used to unify the fuzzy uncertainty and random variables into the train–bridge coupled system, and the fuzzy random train–bridge coupled system is established. The fuzzy dynamic response of trains and bridges with fuzzy parameters of the bridge structures and the mass of the carriage were studied, and the mean and variance of the response quantities were calculated using the new point estimation method (NPEM). The combined effect of the fuzziness is considered and the fuzzy value of the system dynamics is obtained. The feasibility of applying this method to train–bridge problems was verified. The calculation results indicated that the maximum amplitude of the fuzzy vertical displacement of the bridge exceeded the conventional vertical displacement by 25.57%, and the maximum amplitude of the fuzzy vertical acceleration of the train exceeded the conventional vertical acceleration by 23.42%. Obviously, in this case, the traditional deterministic calculation method cannot comprehensively and accurately analyze the dynamic response of the train–bridge system. The method in this paper can provide theoretical guidance for evaluating the safety of bridge structures and running safety research in the future. [ABSTRACT FROM AUTHOR]
- Published
- 2024
28. Analysis of the spatio-temporal evolution of water consumption and water utilization structure in the Pearl River District.
- Author
- Xue, Jiao, Liu, Xiyan, Xu, Zheng, and Han, Yaxin
- Published
- 2024
29. Quantifying the Emergence of Basic Research Capabilities in Cluster Enterprises: An Analytical Framework Based on Information Entropy.
- Author
- Zhang, Hongsi, He, Zhongbing, and Zheng, Wenjiang
- Subjects
- SYSTEMS theory, TAX incentives, GOVERNMENT business enterprises, CHEMICAL energy, SUBSIDIES
- Abstract
This study looks at how basic research capabilities develop within enterprise clusters, focusing on the complex and adaptive nature of these systems. It builds a conceptual model using systems theory and applies information entropy to measure how much these capabilities have emerged, an innovative application that offers a novel framework for assessing research development. To dive deeper, China Pingmei Shenma Group (Henan, China) was used as a case study to gather empirical data. This case, focused on a state-owned enterprise cluster in China's coal-based energy and chemical industries, highlights the key factors that influence research capability growth, including support from external systems, how internal resources are used, and their renewal over time. From 2017 to 2022, the study tracked how the organization of research capabilities evolved by following changes in entropy, revealing a process of research development driven by both internal and external forces. The methodology involves measuring system entropy to evaluate the degree of orderliness and innovation performance, incorporating entropy generation and exchange metrics, which allows for a more precise understanding of system emergence and complexity. The interactions within the system, such as knowledge exchange, research collaboration, and external input from government subsidies or tax incentives, are modeled to track how they influence the system's overall entropy. This study finds that the ability of an enterprise cluster to bring in external resources and reduce internal inefficiencies is critical for enhancing research capabilities. The model can help policymakers and enterprises in strategic decision-making, particularly in industries undergoing technological transformation, and provides practical insights for improving research collaboration and innovation in enterprise clusters, especially in rapidly evolving industries like energy and chemicals. [ABSTRACT FROM AUTHOR]
- Published
- 2024
30. Spatio-Temporal Evolution of Water Consumption and Water Utilization Structure in Pearl River District
- Author
- Xue, Jiao, Liu, Xiyan, Xu, Zheng, and Han, Yaxin
- Subjects
- water consumption, water utilization structure, information entropy, gravity center, spatio-temporal variations, Pearl River District, River, lake, and water-supply engineering (General), TC401-506
- Abstract
Research on the spatio-temporal evolution characteristics of water consumption and water utilization structure is significant for the integrated management and optimal allocation of water resources. Based on water consumption data from 1980 to 2019, the spatio-temporal evolution characteristics of water consumption and water utilization structure in the Pearl River District were studied by introducing information entropy and gravity center models. The results show that water consumption in the Pearl River District increases first and then decreases slightly. The gravity center of water consumption moves first to the east and then to the west. Agricultural water consumption is greatly reduced, and its gravity center moves to the west. Water consumption of the other sectors increases overall, and their gravity centers move to the east. These variations are consistent with the economic and social development of the Pearl River District. The variations of water utilization structure in the Pearl River District can be divided into two stages, before 2010 and after 2010. In the first stage, the information entropy and equilibrium of the water utilization structure increase significantly. The information entropy gravity center of the water utilization structure moves eastward gradually, and the variation of water utilization structure in the eastern region is greater than that in the western region. In the second stage, the information entropy and equilibrium of the water utilization structure tend to be stable. The information entropy gravity center of the water utilization structure moves slightly to the west. The water utilization structure is stable, and the change in the western region is larger than that in the eastern region. Among population, GDP, industrial added value, and actual irrigated area of farmland, population and actual irrigated area of farmland are the main driving factors for changes in water consumption and water utilization structure. The results are helpful for an overall understanding of the variations of water consumption and water utilization structure in the Pearl River District over the past 40 years, and can provide technological support for the integrated management and optimal allocation of water resources.
- Published
- 2024
31. A case study on entropy-aware block-based linear transforms for lossless image compression
- Author
- Žalik, Borut, Podgorelec, David, Kolingerová, Ivana, Strnad, Damjan, and Kohek, Štefan
- Subjects
- Computer science, Information entropy, Prediction, Inverse distance transform, String transformations, Medicine, Science
- Abstract
Data compression algorithms tend to reduce information entropy, which is crucial, especially in the case of images, as they are data intensive. In this regard, lossless image data compression is especially challenging. Many popular lossless compression methods incorporate predictions and various types of pixel transformations, in order to reduce the information entropy of an image. In this paper, a block optimisation programming framework Φ is introduced to support various experiments on raster images, divided into blocks of pixels. Eleven methods were implemented within Φ, including prediction methods, string transformation methods, and inverse distance weighting, as a representative of interpolation methods. Thirty-two different greyscale raster images with varying resolutions and contents were used in the experiments. It was shown that Φ reduces information entropy better than the popular JPEG LS and CALIC predictors. The additional information associated with each block in Φ is then evaluated. It was confirmed that, despite this additional cost, the estimated size in bytes is smaller in comparison to the sizes achieved by the JPEG LS and CALIC predictors.
- Published
- 2024
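The core move of entry 31's framework Φ, trying candidate transforms per pixel block and keeping whichever minimizes the block's empirical entropy, can be sketched briefly. Only two candidate transforms are shown; the eleven methods and the per-block side-information accounting of Φ are omitted:

```python
import numpy as np

def block_entropy(b):
    """Empirical Shannon entropy (bits per symbol) of a block's values."""
    _, counts = np.unique(b, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def best_transform(block):
    """Pick the per-block transform with minimal empirical entropy."""
    candidates = {
        "identity": block.astype(np.int16),
        # horizontal prediction: encode left-neighbor differences instead
        "h-diff": np.diff(block.astype(np.int16), axis=1,
                          prepend=block[:, :1].astype(np.int16)),
    }
    return min(candidates.items(), key=lambda kv: block_entropy(kv[1]))

rng = np.random.default_rng(1)
smooth = np.cumsum(rng.integers(-2, 3, (16, 16)), axis=1) + 128
name, transformed = best_transform(smooth.astype(np.uint8))
print(name, round(block_entropy(transformed), 3))  # h-diff usually wins on smooth blocks
```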
32. Diffusion at the interface of laser welded polyamide-6.6 and aluminum assemblies
- Author
- Hirchenhahn, P., Al-Sayyad, A., Bardon, J., Plapper, P., and Houssiau, L.
- Subjects
- Diffusion, Interface, Laser welding, ToF-SIMS, Information entropy, Mining engineering. Metallurgy, TN1-997
- Abstract
Polymer/metal assemblies are widely used in industry, especially the automotive industry, to obtain more cost-efficient and lightweight structures. Although they present many advantages, their assembly remains challenging. Laser welding is an effective solution: it is fast, offers high design freedom, and does not require any interstitial material. Furthermore, surface pretreatment can tune the mechanical resistance. However, the root causes of adhesion remain partly unknown. The existence of chemical bonding at the interface has already been established, but other adhesion phenomena, such as diffusion, remain to be investigated. The aim of this study is to investigate the existence of diffusion at the interface between the polymer and the metal after laser welding. A common material combination was used: polyamide-6.6 and aluminum. A thin film of polyamide-6.6 was deposited on mirror-polished aluminum, and two depth profiles, outside and inside the weld, were acquired by ToF-SIMS and compared. The results show that diffusion of aluminum occurs at the interface with polyamide-6.6, with a diffusion length of approximately 22 nm.
- Published
- 2024
33. How popular is a topic on social media? A multi-criteria decision-making framework based on user engagement.
- Author
- Güner, Samet, Cebeci, Halil Ibrahim, and Aydemir, Emrah
- Subjects
- MULTIPLE criteria decision making, MULTI-objective optimization, CITIZENS, ENTROPY (Information theory)
- Abstract
Purpose: Social media is widely used to capture citizens' opinions and topics deemed important. The importance or interest social media users attribute to a topic is traditionally measured by tweet frequency. This approach is practical but overlooks other user engagement tools such as retweets, likes, quotes, and replies. As a result, it may lead to a misinterpretation of social media signals. This paper aims to propose a method that considers all user engagement indicators and ranks the topics based on the interest attributed by social media users. Design/methodology/approach: A multi-criteria decision-making framework was proposed, which calculates the relative importance of user engagement tools using objective (information entropy) and subjective (Bayesian Best-Worst Method) methods. The results of the two methods are aggregated with a combinative method. Then, topics are ranked based on their user engagement levels using Multi-Objective Optimization by Ratio Analysis. Findings: The proposed approach was used to determine citizens' priorities in transport policy, and the findings are compared with those obtained solely based on tweet frequency. The results revealed that the proposed multi-criteria decision-making framework generated more comprehensive and robust results. Practical implications: The proposed method provides a systematic way to interpret social media signals and guide institutions in making better policies, hence ensuring that the demands of users/society are properly addressed. Originality/value: This study presents a systematic method to prioritize user preferences in social media. It is the first in the literature to discuss the necessity of considering all user engagement indicators and proposes a reliable method that calculates their relative importance. [ABSTRACT FROM AUTHOR]
- Published
- 2025
34. Combined Group Decision-Making Framework for Employment Quality Evaluation in Higher Education Institutions under Single-Valued Neutrosophic Sets.
- Author
- Zhao, Hongzhi and Zhang, Fengqiang
- Subjects
- EMPLOYMENT statistics, UNIVERSITIES & colleges, GROUP decision making, CAREER development, LABOR demand
- Abstract
The evaluation of employment quality in higher education institutions refers to assessing graduates' performance and employment status through multiple dimensions, mainly including employment rate, job satisfaction, salary levels, industry distribution, and career development prospects. This evaluation not only reflects the educational quality and social recognition of the institution but also provides valuable insights for optimizing teaching and training programs, helping graduates better meet the demands of the labor market. The employment quality evaluation in higher education institutions is a multi-attribute group decision-making (MAGDM) problem. Recently, the Exponential TODIM (ExpTODIM) and VIKOR methods have been applied to address MAGDM challenges. Single-valued neutrosophic sets (SVNSs) are employed as a tool to represent uncertain data in the employment quality evaluation in higher education institutions. In this paper, we propose the single-valued neutrosophic number Exponential TODIM-VIKOR (SVNN-ExpTODIM-VIKOR) method to solve MAGDM problems under SVNSs. Finally, a numerical case study is presented to validate the effectiveness of the proposed method in evaluating the employment quality in higher education institutions. [ABSTRACT FROM AUTHOR]
- Published
- 2025
35. Enhanced CoCoSo Framework for Computer Network Security Evaluation Through Utilizing the Type-2 Neutrosophic Multi-Attribute Group Decision-Making.
- Author
- Wang, Ziqiao, Cai, Xiaomu, and Yin, Zhefeng
- Subjects
- GROUP decision making, ENTROPY (Information theory), SECURITY systems, CONFIDENTIAL communications
- Abstract
Computer network security (CNS) evaluation is a comprehensive analysis and assessment of a network system's security to identify potential vulnerabilities and threats, ensuring the system complies with relevant security standards. The evaluation includes the effectiveness of technical measures such as firewalls, intrusion detection, access control, and encryption, to ensure the confidentiality, integrity, and availability of data, ultimately enhancing overall security defenses. The evaluation of CNS is a multi-attribute group decision-making (MAGDM) problem. Recently, both the CoCoSo method and the information entropy approach have been applied to solve MAGDM challenges. Type-2 Neutrosophic Sets (T2NSs) are utilized to represent uncertain data during the network security evaluation process. In this study, the CoCoSo method is adapted for MAGDM with T2NSs. Furthermore, the type-2 Neutrosophic Number with CoCoSo (T2NN-CoCoSo) approach based on T2NN is developed for MAGDM. Finally, a numerical example is provided to demonstrate the application of the T2NN-CoCoSo approach in CNS evaluation. The key contributions of this study include: (1) the development of a MAGDM method using the T2NN-CoCoSo approach with T2NSs, and (2) the proposal of a novel MAGDM approach for CNS evaluation using the T2NN-CoCoSo method. [ABSTRACT FROM AUTHOR]
- Published
- 2025
36. An efficient Model for Satisfaction Evaluation of College Students Online Ideological and Political Education with Single-Valued Neutrosophic Numbers.
- Author
- Shi, Kelei
- Subjects
- STUDENT attitudes, SATISFACTION, GROUP decision making, ENTROPY (Information theory), DATA mining
- Abstract
Under the background of big data, the evaluation of college students' satisfaction with online ideological and political education (IAPE) is primarily achieved through data mining and analysis techniques, allowing for a more comprehensive and accurate reflection of students' attitudes and feedback. By collecting data through online surveys, social media interactions, and other channels, educators can adjust the content and methods of teaching in real time to better meet students' needs and improve the effectiveness of and satisfaction with IAPE. The satisfaction evaluation of college students' online IAPE in the context of big data is a multi-attribute group decision-making (MAGDM) problem. Recently, the VIKOR method has been applied to address MAGDM challenges. Single-valued neutrosophic sets (SVNSs) are employed as a tool to represent uncertain data in the satisfaction evaluation of college students' online IAPE within the big data context. In this paper, we propose the single-valued neutrosophic number VIKOR (SVNN-VIKOR) method to solve MAGDM problems under SVNSs. Finally, a numerical case study is presented to validate the effectiveness of the proposed method in evaluating the satisfaction of college students' online IAPE in the context of big data. [ABSTRACT FROM AUTHOR]
- Published
- 2025
37. Research on diagnosis for vibration faults in steam turbines using IRF algorithm
- Author
- Li, Wei, Wu, Yifan, Mao, Jingyu, Chang, Zengjun, Li, Zhongbo, and Wang, Fangzhou
- Subjects
- steam turbine, vibration fault diagnosis, IRF, AHP, information entropy, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
The random forest algorithm, known for its robustness against noise and powerful computational capabilities, is widely employed in diagnosing vibration faults in rotating machinery. However, when applied in industrial settings, the algorithm encounters challenges such as limited sample sizes, the inability to integrate prior knowledge, and comparatively lower accuracy. To tackle these issues, a method for diagnosing vibration faults in steam turbines using an improved random forest (IRF) algorithm is proposed. This approach incorporates prior knowledge to optimize decision trees, utilizing analytical hierarchy process (AHP) and information entropy. Genuine operational datasets from the data center of a million-kW thermal power plant are utilized to validate the efficacy and reliability of the proposed method. Computational findings indicate that, in comparison to the traditional random forest algorithm, IRF achieves higher accuracy and a reduced miss rate, with a 33% decrease in the number of decision trees. Moreover, the operational time is slashed to just 11.4% of that taken by the traditional random forest algorithm. These results suggest that IRF holds significant practical value for real-time, precise vibration fault diagnosis in thermal power units.
- Published
- 2024
38. Exploring Digital Mobility in China’s Tourism Metaverse.
- Author
- Xu, Jusi, Peng, Kang-Lin, and Weltmann, Dan
- Subjects
- SHARED virtual environments, ENTROPY (Information theory), INFORMATION storage & retrieval systems, DATA analysis, TOURISM
- Abstract
This study aims to explore digital mobility in the tourism metaverse through dual factors of information entropy and media functionality moderated by structural differentiation, including China’s institutional governance of information systems. The research model was constructed through influential factors to determine the digital mobility of metaverse systems using quantitative data analysis methods. Results showed that information entropy negatively influenced digital mobility. However, media functionality could be harnessed to reduce information entropy and thus enhance digital mobility in China’s tourism metaverse systems. Structural differentiation had significant moderation and mediation effects in the research model that influenced the digital mobility of Chinese tourists. [ABSTRACT FROM AUTHOR]
- Published
- 2024
39. The Interlink between Stakeholder Influence and Sustainable Practices: A Case Study of Thai Agriculture Enterprise.
- Author
- Onbhuddha, Ruethai, Ma, Bingying, Chindavijak, Chavatip, and Ogata, Seiichi
- Abstract
Nowadays, agriculture businesses have been significantly impacted by rapid global changes, compelling the agro-industry to adopt sustainable development practices to remain resilient. Moreover, the application of stakeholder theory has become essential in business management for achieving inclusive growth and fulfilling sustainable business. Understanding the interlink between stakeholder pressure and the motivation to transform an enterprise's practices toward sustainable development is imperative. Therefore, this study aims to evaluate the direct pressure of stakeholder groups on sustainable practices in agriculture enterprises in Thailand through a questionnaire survey. This paper focuses on the influence of primary and secondary stakeholders and evaluates the weighting of sustainability practices. The survey was conducted on employees who work in enterprises that apply Thailand's Sufficiency Economy Philosophy (SEP). The research adopted regression and information entropy methods for the analysis of the results. The results showed that employees, shareholders, and competitors are the significant stakeholder groups that drive sustainable capital covering economic, natural, social, and human capital. Finally, stakeholder management is an outstanding practice in SEP-oriented enterprises, and human capital is the highest priority for making this alternative pathway to enterprise sustainability successful. [ABSTRACT FROM AUTHOR]
- Published
- 2024
40. On the Combinatorial Acceptability Entropy Consensus Metric for Multi-Criteria Group Decisions.
- Author
- Goers, Jana and Horton, Graham
- Subjects
- GROUP decision making, MULTIPLE criteria decision making, ENTROPY (Information theory), ENTROPY, ALGORITHMS
- Abstract
In group decisions, achieving consensus is important, because it increases commitment to the result. For cooperative groups, Combinatorial Multicriteria Acceptability Analysis (CMAA) is a group decision framework that can achieve consensus efficiently. It is based on a novel Combinatorial Acceptability Entropy (CAE) consensus metric. As an output measure, the CAE metric is unique in its ability to identify the evaluations that have the greatest impact on consensus and to prevent premature consensus. This paper is intended to complement the original CMAA publication by providing additional insights into the CAE consensus metric. The design requirements for the CAE algorithm are presented, and it is shown how these requirements follow from the properties of cooperative decisions. The CAE-based consensus-building algorithm is contrasted both qualitatively and quantitatively with a representative example of the conventional input distance and input averaging approach to multi-criteria consensus-building. A simulation experiment illustrates the ability of the CAE-based algorithm to converge quickly to the correct decision as defined for cooperative decisions. The metric is able to meet a new, more stringent definition of hard consensus. The CAE approach highlights the need to distinguish between competitive and cooperative group decisions. Attention in the literature has been paid almost exclusively to the former type; the CAE approach demonstrates the greater efficiency and effectiveness that can be achieved with an approach that is designed specifically for the latter. [ABSTRACT FROM AUTHOR]
- Published
- 2024
41. A Novel Hypersonic Target Trajectory Estimation Method Based on Long Short-Term Memory and a Multi-Head Attention Mechanism.
- Author
- Xu, Yue, Pan, Quan, Wang, Zengfu, and Hu, Baoquan
- Subjects
- HYPERSONIC aerodynamics, PROCESS capability, ENTROPY (Information theory), INFORMATION theory, FEATURE selection
- Abstract
To address the complex maneuvering characteristics of hypersonic targets in adjacent space, this paper proposes an LSTM trajectory estimation method combined with the attention mechanism and optimizes the model from the information-theoretic perspective. The method captures the target dynamics by using the temporal processing capability of LSTM, and at the same time improves the efficiency of information utilization through the attention mechanism to achieve accurate prediction. First, a target dynamics model is constructed to clarify the motion behavior parameters. Subsequently, an LSTM model incorporating the attention mechanism is designed, which enables the model to automatically focus on key information fragments in the historical trajectory. In model training, information redundancy is reduced, and information validity is improved through feature selection and data preprocessing. Eventually, the model achieves accurate prediction of hypersonic target trajectories with limited computational resources. The experimental results show that the method performs well in complex dynamic environments with improved prediction accuracy and robustness, reflecting the potential of information theory principles in optimizing the trajectory prediction model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
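A minimal PyTorch sketch of the architecture entry 41 describes: an LSTM over the historical trajectory followed by multi-head attention that lets the final step attend to key fragments of the history. The layer sizes, six-dimensional state, and 3-D position output are illustrative assumptions:

```python
import torch
import torch.nn as nn

class AttnLSTMTracker(nn.Module):
    """LSTM encoder over past trajectory states plus multi-head attention
    that lets the model focus on key information fragments in the history."""
    def __init__(self, state_dim=6, hidden=128, heads=4):
        super().__init__()
        self.lstm = nn.LSTM(state_dim, hidden, num_layers=2, batch_first=True)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 3)           # predict next 3-D position

    def forward(self, x):                          # x: (batch, time, state_dim)
        h, _ = self.lstm(x)
        ctx, _ = self.attn(h[:, -1:], h, h)        # last step queries the history
        return self.head(ctx.squeeze(1))

model = AttnLSTMTracker()
traj = torch.randn(8, 50, 6)                       # 8 histories, 50 time steps
print(model(traj).shape)                           # torch.Size([8, 3])
```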
42. Information entropy estimation-assisted domain adaptation for land cover classification from multi-source remote sensing images.
- Author
- Wang, Dingpan, Dong, Xiaohuan, Huang, Lingyong, Wang, Xiaohua, Li, Qingjun, and Ji, Shunping
- Subjects
- LAND cover, REMOTE sensing, IMAGE analysis, ENTROPY (Information theory), LEARNING ability
- Abstract
Objectives: In the study of land cover classification from multi-source remote sensing images, domain adaptation methods can align source and target images or the features extracted from them, improving the generalization ability of deep learning models; they therefore play an important role in intelligent remote sensing image interpretation. Methods: A self-training domain adaptation method based on information entropy uncertainty estimation for pseudo-label correction is proposed; its core is an entropy uncertainty loss function for land cover classification across remote sensing images from different sources. First, a land cover classification model is pretrained on the source-domain training set with ground truth and applied to the target-domain images, which have no ground-truth labels, to generate pseudo-labels. Then, the pseudo-labels are used to further train the model; the information entropy of the prediction result is calculated and used as the uncertainty estimate of the pseudo-labels, which are further corrected through self-training, yielding classification model weights better suited to the target-domain dataset. Finally, cross-domain classification experiments were conducted on three datasets: the WHU building change detection dataset, the ISPRS 2D semantic labeling contest dataset, and the Wuhan land cover classification dataset. Results: The experimental results show that: (1) the proposed method improved the mean intersection over union (mIoU) and overall accuracy (OA) of the semantic segmentation network by 0.3%-3.1% and 1.2%-4.5%, respectively; (2) compared with the traditional self-training method, the proposed method improves mIoU and OA by 0.1%-1.5%; (3) compared with the most recent uncertainty estimation method based on Kullback-Leibler divergence, the proposed method improves mIoU and OA by about 0.6% on average. Conclusions: The proposed method can further improve the performance of a trained segmentation model on target-domain images without requiring target labels, and it introduces no additional modules or parameters into the existing segmentation model. [ABSTRACT FROM AUTHOR]
- Published
- 2024
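The self-training step of entry 42, using prediction entropy as the uncertainty of pseudo-labels and correcting (here, masking) the uncertain ones, reduces to a few lines. The normalized-entropy threshold is an illustrative assumption:

```python
import numpy as np

def filter_pseudo_labels(probs, max_entropy=0.5):
    """probs: (H, W, C) softmax output on a target-domain image.
    Returns pseudo-labels with uncertain pixels masked out as -1."""
    ent = -np.sum(np.where(probs > 0, probs * np.log(probs), 0.0), axis=-1)
    ent /= np.log(probs.shape[-1])              # normalize entropy to [0, 1]
    labels = probs.argmax(axis=-1)
    labels[ent > max_entropy] = -1              # ignore_index when retraining
    return labels

probs = np.random.dirichlet(np.ones(6), size=(256, 256))  # mock softmax map
pseudo = filter_pseudo_labels(probs)
print((pseudo == -1).mean(), "of pixels masked as uncertain")
```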
43. Designing a novel image encryption algorithm based on a 2D-SCLC hyperchaotic map.
- Author
- Chen, Qianqian and Lv, Xiaoying
- Subjects
- ENTROPY (Information theory), CRYPTOGRAPHY, ALGORITHMS, IMAGE encryption, DESIGN
- Abstract
Owing to their initial-value sensitivity and high pseudo-randomness, chaotic systems are often applied as Pseudo-Random Number Generators (PRNGs) in related fields. Especially in chaos-based cryptography, the Pseudo-Random Numbers (PRNs) produced by a PRNG are often used as interference in the design of encryption algorithms. However, poor chaotic performance of a PRNG, such as discontinuous chaotic output and low complexity, may negatively affect its applications. In this study, a novel 2D sine–cosine-logistic coupling (2D-SCLC) hyperchaotic map is proposed. In comparison with several existing chaotic maps, the proposed system has better chaotic performance, and applying it in the design of image encryption algorithms can effectively improve their security. Further, the PRNs generated by 2D-SCLC are applied to a new image encryption algorithm. Security tests show that the NPCR and UACI values reach 99.61% and 33.44%, close to the ideal values, and that the information entropy is 7.9996. These tests indicate that the algorithm has high security. This research provides theoretical guidance for related fields based on chaotic cryptography. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
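NPCR, UACI, and image entropy are standard, well-defined cipher-image metrics, so the reported 99.61%, 33.44%, and 7.9996 figures can be made concrete with a short sketch; the 2D-SCLC map itself is not specified in the abstract and is therefore not reproduced here:

```python
import numpy as np

def npcr_uaci(c1, c2):
    """NPCR and UACI between two 8-bit cipher images of equal shape."""
    c1 = c1.astype(np.float64)
    c2 = c2.astype(np.float64)
    npcr = np.mean(c1 != c2) * 100.0                 # % of differing pixels
    uaci = np.mean(np.abs(c1 - c2) / 255.0) * 100.0  # mean intensity change
    return npcr, uaci

def image_entropy(img):
    """Shannon entropy (bits/pixel) of an 8-bit grayscale image."""
    hist = np.bincount(img.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Two independent uniformly random images approach the ideal values:
# NPCR ~ 99.61%, UACI ~ 33.46%, entropy ~ 8 bits.
rng = np.random.default_rng(1)
a = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
b = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)
print(npcr_uaci(a, b), image_entropy(a))
```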
44. Research on steam turbine vibration fault diagnosis based on an improved random forest algorithm.
- Author
-
李 蔚, 吴懿范, 毛静宇, 常增军, 李仲博, and 王方舟
- Subjects
RANDOM forest algorithms ,ANALYTIC hierarchy process ,DECISION trees ,FAULT diagnosis ,STEAM-turbines - Abstract
Copyright of Zhejiang Electric Power is the property of Zhejiang Electric Power Editorial Office and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
45. An online monitoring method for diamond roller wear based on a built-in acoustic emission device.
- Author
-
于光宁, 史慧楠, 迟杰, and 迟玉伦
- Abstract
To address the difficulty of accurately judging diamond roller dressing wear online, a built-in acoustic emission device for online monitoring of diamond roller wear was designed and developed. Based on Shannon information entropy theory and a wavelet packet model, a method for processing the acoustic emission signals of diamond wheel dressing was proposed. The method calculates the information entropy of the wavelet packet coefficients at each layer, and the optimal number of wavelet packet decomposition layers is determined from how the information entropy varies. The wavelet packet decomposition feature parameters are then reduced in dimension using principal component analysis, and the feature parameter that best characterizes diamond roller wear is extracted. Finally, a PSO-SVM (particle swarm optimization support vector machine) model based on this feature parameter was established, and the wear state of the diamond rollers was monitored experimentally. The results show that the PSO-SVM model has the highest classification accuracy, with an average correct rate above 95.24%, and its effectiveness is verified by a large number of experiments. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
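A minimal sketch of the wavelet-packet entropy step described above, using PyWavelets; the abstract does not fix how coefficient probabilities are defined, so the common energy-based normalization is assumed (the PCA and PSO-SVM stages are omitted for brevity):

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_packet_entropy(signal, wavelet="db4", max_level=5):
    """Shannon entropy of normalized wavelet-packet energies per level."""
    wp = pywt.WaveletPacket(data=signal, wavelet=wavelet, maxlevel=max_level)
    entropies = []
    for level in range(1, max_level + 1):
        nodes = wp.get_level(level, order="natural")
        energies = np.array([np.sum(node.data ** 2) for node in nodes])
        p = energies / energies.sum()
        p = p[p > 0]
        entropies.append(-np.sum(p * np.log(p)))
    return entropies

# Example: a decaying burst resembling an acoustic-emission hit.
t = np.linspace(0, 1, 4096)
x = np.sin(2 * np.pi * 150 * t) * np.exp(-5 * t) + 0.1 * np.random.randn(t.size)
for lvl, h in enumerate(wavelet_packet_entropy(x), start=1):
    print(f"level {lvl}: entropy = {h:.4f}")
```

The level at which the entropy curve changes behavior would then guide the choice of decomposition depth, as the abstract describes.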
46. IEA-DP: Information Entropy-driven Adaptive Differential Privacy Protection Scheme for social networks.
- Author
-
Zhang, Jing, Si, Kunliang, Zeng, Zuanyang, Li, Tongxin, and Ye, Xiucai
- Subjects
- *
DATA privacy , *SOCIAL networks , *ENTROPY (Information theory) , *DATA protection , *DATA release , *PARALLEL processing - Abstract
As social networks become ever more intertwined with daily life, the amount of accumulated personal privacy information is steadily mounting, and its exposure could have disastrous consequences. Current graph data protection algorithms pay insufficient attention to the characteristics of social users while incurring significant time and space overhead, and their noise-addition strategies lack adaptability, which in turn reduces data availability. To address these issues, a novel approach called the Information Entropy-driven Adaptive Differential Privacy Protection Scheme (IEA-DP) is presented for the release of social data in this study. The proposed solution first designs the InfomapMerge algorithm to divide the data into communities based on the characteristics of social networks, thereby enabling parallel processing and mitigating the time and space overhead. Subsequently, the Adaptive Edge Modification Algorithm (AEMA) is proposed to optimize the noise-addition process by adaptively adding noise based on node importance scores, which effectively reduces the amount of added noise and increases data availability. Finally, experimental results on six public datasets demonstrate that the IEA-DP scheme strikes a desirable balance between data availability and privacy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
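The AEMA scoring rule is not given in the abstract, so the following is only an illustrative sketch of importance-adaptive noise: each node's degree receives Laplace noise whose scale shrinks as the node's share of the privacy budget grows, with degree centrality standing in for the node-importance score:

```python
import numpy as np
import networkx as nx

def adaptive_degree_noise(G, epsilon=1.0):
    """Perturb node degrees with Laplace noise adapted to node importance
    (degree centrality is an illustrative proxy, not the paper's score)."""
    importance = nx.degree_centrality(G)
    total = sum(importance.values())
    rng = np.random.default_rng(42)
    noisy = {}
    for v, d in G.degree():
        # Allocate a larger share of the budget to important nodes,
        # so their statistics receive proportionally less noise.
        eps_v = epsilon * importance[v] / total
        scale = 1.0 / eps_v   # assumes degree-query sensitivity of 1
        noisy[v] = d + rng.laplace(loc=0.0, scale=scale)
    return noisy

G = nx.karate_club_graph()
print(list(adaptive_degree_noise(G).items())[:5])
```

The per-node budget shares sum to the overall epsilon, mirroring the idea of spending the privacy budget where data availability matters most.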
47. Research on Critical Quality Feature Recognition and Quality Prediction Method of Machining Based on Information Entropy and XGBoost Hyperparameter Optimization.
- Author
-
Qu, Dongyue, Gu, Chaoyun, Zhang, Hao, Liang, Wenchao, Zhang, Yuting, and Zhan, Yong
- Subjects
ENTROPY (Information theory) ,MARINE engines ,MANUFACTURING processes ,INFORMATION theory ,PREDICTION models - Abstract
To address the problem of predicting machining quality for critical features in the manufacturing process of mechanical products, a method that combines information entropy and XGBoost (version 2.1.1) hyperparameter optimization is proposed. Initially, machining data of mechanical products are analyzed using information entropy theory to identify critical quality characteristics. Subsequently, a quality prediction model for these critical features is established using the XGBoost machine learning framework, and its hyperparameters are tuned through Bayesian optimization. The method is applied in a case study of a medium-speed marine diesel engine piston: after the critical quality characteristics in its machining process are identified, their machining quality is predicted, and the results are compared with those of a machine learning model without hyperparameter optimization. The findings demonstrate that the proposed method effectively predicts the machining quality of mechanical products. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
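A minimal sketch of the two stages described above, with two explicit substitutions: histogram entropy stands in for the paper's unspecified information-entropy criterion, and a small manual parameter sweep stands in for Bayesian hyperparameter optimization:

```python
import numpy as np
from xgboost import XGBRegressor

def feature_entropy(x, bins=10):
    """Shannon entropy of one feature after histogram discretization."""
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(7)
X = rng.normal(size=(200, 8))          # 8 candidate quality features
y = X[:, 2] * 1.5 - X[:, 5] + 0.1 * rng.normal(size=200)

# Rank features by entropy and keep the most informative half
# (an illustrative screening rule; the paper's criterion is not given).
scores = np.array([feature_entropy(X[:, j]) for j in range(X.shape[1])])
keep = np.argsort(scores)[-4:]

# Stand-in for Bayesian hyperparameter search: a small manual sweep.
best, best_err = None, np.inf
for depth in (2, 4, 6):
    model = XGBRegressor(n_estimators=200, max_depth=depth, learning_rate=0.1)
    model.fit(X[:150][:, keep], y[:150])
    err = np.mean((model.predict(X[150:][:, keep]) - y[150:]) ** 2)
    if err < best_err:
        best, best_err = depth, err
print(f"best max_depth={best}, holdout MSE={best_err:.4f}")
```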
48. Exploring the Diversity of Nuclear Density through Information Entropy.
- Author
-
Ma, Wei-Hu and Ma, Yu-Gang
- Subjects
- *
NUCLEAR structure , *ENTROPY (Information theory) , *NUCLEAR density , *CLUSTER theory (Nuclear physics) , *PARTICLES (Nuclear physics) - Abstract
This study explores the role of information entropy in understanding nuclear density distributions, including both stable configurations and non-traditional structures such as neutron halos and α-clustering. By quantifying the uncertainty and disorder inherent in nucleon distributions in nuclear many-body systems, information entropy provides a macroscopic measure of the physical properties of the system. A more dispersed and disordered density distribution results in a higher value of information entropy. This intrinsic relationship between information entropy and system complexity allows us to quantify uncertainty and disorder in nuclear structures by analyzing various geometric parameters such as nuclear radius, diffuseness, neutron skin, and cluster structural features. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
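The link between dispersion and entropy can be checked numerically with the standard Woods-Saxon profile, ρ(r) = ρ₀ / (1 + exp((r − R)/a)); treating the normalized density as a probability distribution (a generic choice, not necessarily the paper's prescription), the entropy grows with the surface diffuseness a:

```python
import numpy as np

def woods_saxon_entropy(R=4.0, a=0.5, r_max=20.0, n=4000):
    """Entropy S = -integral of p(r) ln p(r) 4*pi*r^2 dr for a normalized
    Woods-Saxon density (generic illustration, not the paper's exact form)."""
    r = np.linspace(1e-6, r_max, n)
    dr = r[1] - r[0]
    rho = 1.0 / (1.0 + np.exp((r - R) / a))
    shell = 4.0 * np.pi * r**2
    p = rho / np.sum(rho * shell * dr)   # normalize to a probability density
    return -np.sum(p * np.log(p) * shell * dr)

# A more diffuse surface (larger a) spreads the distribution out and
# raises the entropy, matching the abstract's qualitative claim.
for a in (0.3, 0.5, 0.7):
    print(f"a = {a:.1f} fm -> S = {woods_saxon_entropy(a=a):.4f}")
```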
49. Electroencephalography Emotion Recognition Based on Rhythm Information Entropy Extraction.
- Author
-
Liu, Zhen-Tao, Xu, Xin, She, Jinhua, Yang, Zhaohui, and Chen, Dan
- Subjects
- *
BRAIN waves , *EMOTION recognition , *ENTROPY (Information theory) , *CENTRAL nervous system , *DIFFERENTIAL entropy - Abstract
Electroencephalography (EEG) is a physiological signal directly generated by the central nervous system. Brain rhythms are closely related to a person's emotional state and are widely used for EEG emotion recognition, yet the rhythm specificity between different brain channels has seldom been explored in previous studies. In this paper, the rhythm specificity of brain channels is studied to improve the accuracy of EEG emotion recognition. Variational mode decomposition is used to decompose rhythm signals and enhance features, and two kinds of information entropy, i.e., differential entropy (DE) and dispersion entropy (DispEn), are extracted. The rhythm that yields the best single-channel emotion recognition result is selected as the representative rhythm, and a remove-one method is employed to obtain the rhythm information entropy features. In the experiment, the DEAP database was used for EEG emotion recognition in valence-arousal space. The results show that the best accuracy of rhythm DE feature classification in the valence dimension is 77.04%, and the best accuracy of rhythm DispEn feature classification in the arousal dimension is 79.25%. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
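Under the Gaussian assumption common in EEG work, differential entropy reduces to DE = ½ ln(2πeσ²); the sketch below computes per-rhythm DE features with a Butterworth band-pass at the DEAP sampling rate of 128 Hz (VMD and DispEn are omitted, and the band edges and synthetic signal are illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt

BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def differential_entropy(x):
    """DE of a signal under the usual Gaussian assumption:
    DE = 0.5 * ln(2 * pi * e * sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * np.var(x))

def band_de_features(eeg, fs=128):
    """Band-pass each rhythm, then compute its differential entropy."""
    feats = {}
    for name, (lo, hi) in BANDS.items():
        b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
        feats[name] = differential_entropy(filtfilt(b, a, eeg))
    return feats

# Synthetic single-channel "EEG": alpha-dominant oscillation plus noise.
fs = 128
t = np.arange(0, 4, 1 / fs)
eeg = np.sin(2 * np.pi * 10 * t) + 0.3 * np.random.randn(t.size)
print(band_de_features(eeg, fs))
```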
50. Local entropy-based feature-preserving simplification and evaluation for large field point cloud.
- Author
-
Wang, Zhicheng and Yang, Huijun
- Subjects
- *
POINT cloud , *CROP growth , *POINT set theory , *SURFACE area , *PROBLEM solving , *COMPUTATIONAL neuroscience - Abstract
With the development of point cloud-based telemetry technology in recent years, point cloud data of large field scenes acquired by various sensors have been applied to farmland boundary division, crop growth monitoring, area surveying, etc. However, large field point clouds consume huge amounts of computational resources in subsequent transmission, storage, and processing, which makes appropriate simplification all the more important. In light of the limitations of existing point cloud simplification algorithms in large field scenes, we propose a novel feature-preserving simplification algorithm for large field point cloud data. By introducing the average local entropy as the threshold for area division, our algorithm effectively solves the problem of fuzzy boundary division while preserving field features and reducing simplification errors. Because current simplification algorithms are evaluated mainly qualitatively, a quantitative evaluation index for point cloud simplification is also proposed based on the point-average local entropy, taking both model retention and simplification efficiency into account. Finally, comparative experiments are performed on four sets of point clouds. Compared with six typical algorithms, the proposed algorithm increases the local entropy by 0.029%, 0.146%, and 0.088% on our datasets and by 0.029% on the public dataset, and the index accurately evaluates the simplification effect. The surface area change rate is also used to further evaluate performance; the proposed algorithm's value is lower than the others', verifying its advantages in feature preservation and large field simplification. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
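The abstract does not give the paper's exact local-entropy definition; a common stand-in is the eigenentropy of the k-nearest-neighbor covariance, sketched below together with an illustrative keep-above-average simplification rule:

```python
import numpy as np
from scipy.spatial import cKDTree

def local_entropy(points, k=16):
    """Per-point eigenentropy from the k-NN covariance eigenvalues
    (a common local-entropy measure; the paper's exact definition is
    not given in the abstract, so this is an illustrative stand-in)."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    out = np.empty(len(points))
    for i, nb in enumerate(idx):
        cov = np.cov(points[nb].T)
        lam = np.clip(np.linalg.eigvalsh(cov), 1e-12, None)
        p = lam / lam.sum()
        out[i] = -np.sum(p * np.log(p))
    return out

# Keep points whose local entropy exceeds the cloud average, so that
# feature-rich regions (edges, boundaries) survive simplification.
rng = np.random.default_rng(3)
pts = rng.uniform(size=(2000, 3))
ent = local_entropy(pts)
kept = pts[ent > ent.mean()]
print(f"kept {len(kept)} of {len(pts)} points")
```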