302 results for "INFORMATION ENTROPY"
Search Results
2. Exploring the Multiplication of Resonant Modes in Off-Center-Driven Chladni Plates from Maximum Entropy States
- Author
- Song-Qing Lin, Yu-Hsin Hsu, Kuan-Wei Su, Hsing-Chih Liang, and Yung-Fu Chen
- Subjects
- Chladni patterns, vibrating plates, information entropy, geometry, Green’s function, Mathematics, QA1-939
- Abstract
In this study, the resonant characteristics of off-center-driven Chladni plates were systematically investigated for square and equilateral-triangle shapes. Experimental results reveal that the number of resonant modes increases considerably for plates under off-center driving compared with on-center driving. Green’s functions derived from the nonhomogeneous Helmholtz equation are exploited to numerically analyze the information entropy distribution and the resonant nodal-line patterns. The experimental resonant modes are confirmed to be in good agreement with the maximum entropy states of the Green’s functions. Furthermore, the information entropy distribution of the Green’s functions reveals that more eigenmodes can be triggered in the plate under off-center driving than under on-center driving. By exploiting the multiplication of resonant modes under off-center driving, the dispersion relation between the experimental frequency and the theoretical wave number can be deduced more accurately. The deduced dispersion relations agree well with the Kirchhoff–Love plate theory.
- Published
- 2024
- Full Text
- View/download PDF
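As a concrete illustration of the entropy measure this abstract works with (a minimal NumPy sketch under our own assumptions, not code from the paper): given a plate response sampled on a grid, normalize the squared amplitudes into a probability distribution and take its Shannon entropy. A uniform response attains the maximum entropy ln(N); a response concentrated at one point has entropy zero.

```python
import numpy as np

def information_entropy(u):
    """Shannon entropy of a normalized intensity distribution.

    u: real or complex response amplitudes sampled on a grid
    (e.g., a Green's-function response of the plate).
    """
    p = np.abs(u) ** 2
    p = p / p.sum()            # normalize to a probability distribution
    p = p[p > 0]               # treat 0 * log(0) as 0
    return -np.sum(p * np.log(p))

N = 100
S_uniform = information_entropy(np.ones(N))   # maximum entropy ln(N)
S_peaked = information_entropy(np.eye(N)[0])  # fully concentrated: entropy 0
```

The "maximum entropy states" in the abstract are the driving frequencies at which this quantity peaks.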
3. Quantifying the Emergence of Basic Research Capabilities in Cluster Enterprises: An Analytical Framework Based on Information Entropy
- Author
- Hongsi Zhang, Zhongbing He, and Wenjiang Zheng
- Subjects
- systems theory, information entropy, corporate basic research, cluster enterprises, innovation systems, emergence, Systems engineering, TA168, Technology (General), T1-995
- Abstract
This study examines how basic research capabilities develop within enterprise clusters, focusing on the complex, adaptive nature of these systems. It builds a conceptual model grounded in systems theory and applies information entropy to quantify how far these capabilities have emerged, offering a novel framework for assessing research development. China Pingmei Shenma Group (Henan, China), a state-owned enterprise cluster in China’s coal-based energy and chemical industries, serves as a case study for gathering empirical data and highlights the key factors that influence research capability growth: support from external systems, how internal resources are used, and their renewal over time. From 2017 to 2022, the study tracked how the organization of research capabilities evolved by monitoring changes in entropy, revealing a development process driven by both internal and external forces. The methodology measures system entropy to evaluate the degree of orderliness and innovation performance, incorporating entropy generation and exchange metrics for a more precise understanding of system emergence and complexity. Interactions within the system, such as knowledge exchange, research collaboration, and external input from government subsidies or tax incentives, are modeled to track how they influence the system’s overall entropy. The study finds that an enterprise cluster’s ability to bring in external resources and reduce internal inefficiencies is critical for enhancing research capabilities. The model can support strategic decision-making by policymakers and enterprises, particularly in industries undergoing technological transformation, and provides practical insights for improving research collaboration and innovation in rapidly evolving industries such as energy and chemicals.
- Published
- 2024
- Full Text
- View/download PDF
4. Text-Enhanced Graph Attention Hashing for Cross-Modal Retrieval
- Author
- Qiang Zou, Shuli Cheng, Anyu Du, and Jiayi Chen
- Subjects
- cross-modal hashing, graph attention, feature fusion, vision transformer, information entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Deep hashing technology, known for its low-cost storage and rapid retrieval, has become a focal point in cross-modal retrieval research as multimodal data continue to grow. However, existing supervised methods often overlook noisy labels and multiscale features in different modal datasets, leading to higher information entropy in the generated hash codes and features, which reduces retrieval performance. Variation in text annotation across datasets further increases the information entropy during text feature extraction, resulting in suboptimal outcomes. Consequently, reducing the information entropy of text feature extraction, supplementing text feature information, and enhancing the retrieval efficiency of large-scale media data are critical challenges in cross-modal retrieval research. To tackle these, this paper introduces the Text-Enhanced Graph Attention Hashing for Cross-Modal Retrieval (TEGAH) framework. TEGAH incorporates a deep text feature extraction network and a multiscale label region fusion network to minimize information entropy and optimize feature extraction. Additionally, a graph-attention-based modal feature fusion network is designed to efficiently integrate multimodal information, enhance the network’s affinity for different modalities, and retain more semantic information. Extensive experiments on three multilabel datasets demonstrate that TEGAH significantly outperforms state-of-the-art cross-modal hashing methods.
- Published
- 2024
- Full Text
- View/download PDF
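The "information entropy of hash codes" idea above can be made concrete with a small sketch (our own illustration, not the TEGAH implementation): for binary codes, the average per-bit Shannon entropy is maximal when each bit is balanced across the dataset and zero when bits collapse to a constant, which is one way such methods diagnose code quality.

```python
import numpy as np

def mean_bit_entropy(codes):
    """Average per-bit Shannon entropy (in bits) of binary hash codes.

    codes: (n_samples, n_bits) array of 0/1 values. Balanced bits carry
    close to 1 bit of information each; collapsed bits carry 0.
    """
    p = codes.mean(axis=0)              # probability of a 1 per bit position
    p = np.clip(p, 1e-12, 1 - 1e-12)    # avoid log(0)
    h = -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return h.mean()

rng = np.random.default_rng(0)
balanced = rng.integers(0, 2, size=(1000, 16))  # roughly balanced bits
collapsed = np.zeros((1000, 16), dtype=int)     # every code identical
H_bal = mean_bit_entropy(balanced)
H_col = mean_bit_entropy(collapsed)
```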
5. Investigation of Laser Ablation Quality Based upon Entropy Analysis of Data Science
- Author
- Chien-Chung Tsai and Tung-Hon Yiu
- Subjects
- laser ablation, information entropy, data science, ablation quality, material removal, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Laser ablation is a vital material removal technique, but current methods lack a data-driven approach to assess quality. This study proposes a novel method, employing information entropy, a concept from data science, to evaluate laser ablation quality. By analyzing the randomness associated with the ablation process through the distribution of a probability value (reb), we quantify the uncertainty (entropy) of the ablation. Our research reveals that higher energy levels lead to lower entropy, signifying a more controlled and predictable ablation process. Furthermore, using an interval time closer to the baseline value improves the ablation consistency. Additionally, the analysis suggests that the energy level has a stronger correlation with entropy than the baseline interval time (bit). The entropy decreased by 6.32 from 12.94 at 0.258 mJ to 6.62 at 0.378 mJ, while the change due to the bit was only 2.12 (from 10.84 at bit/2 to 8.72 at bit). This indicates that energy is a more dominant factor for predicting ablation quality. Overall, this work demonstrates the feasibility of information entropy analysis for evaluating laser ablation, paving the way for optimizing laser parameters and achieving a more precise material removal process.
- Published
- 2024
- Full Text
- View/download PDF
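The entropy-of-measurements idea in this abstract can be sketched with a histogram estimator (a generic illustration under our own assumptions; the paper's probability value reb and exact binning are not specified here): tightly clustered ablation measurements yield low entropy (a controlled process), while scattered measurements yield high entropy.

```python
import numpy as np

def histogram_entropy(samples, bins=32, value_range=(0.0, 2.0)):
    """Shannon entropy (in bits) of the empirical distribution of samples,
    estimated over a fixed value range so spreads are comparable."""
    counts, _ = np.histogram(samples, bins=bins, range=value_range)
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
controlled = rng.normal(1.0, 0.01, 5000)  # low spread: predictable ablation
scattered = rng.normal(1.0, 0.50, 5000)   # high spread: erratic ablation
H_lo = histogram_entropy(controlled)
H_hi = histogram_entropy(scattered)
```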
6. A Small-Scale Object Detection Algorithm in Intelligent Transportation Scenarios
- Author
- Junzi Song, Chunyan Han, and Chenni Wu
- Subjects
- intelligent transportation, small object detection, YOLOv4 tiny, feature pyramid, information entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
In response to the poor detection ability of object detection models for small-scale targets in intelligent transportation scenarios, a fusion method is proposed to enhance the features of small-scale targets, starting from feature utilization and fusion strategies. The algorithm is based on the YOLOv4-tiny framework and enhances the utilization of shallow and mid-level features on top of the Feature Pyramid Network (FPN), improving detection accuracy for small and medium-sized targets. Because intelligent-traffic scene images have cluttered backgrounds and considerable redundant information, the Convolutional Block Attention Module (CBAM) is used to sharpen the model’s attention on traffic targets. To address data imbalance and prior bounding-box adaptation in custom traffic datasets that expand traffic images from COCO and VOC, we propose a Copy-Paste method with an improved generation scheme and a K-means algorithm with an improved distance measure to strengthen the model’s detection of the corresponding categories. Comparative experiments on a customized 260,000-image traffic dataset containing public traffic images show that, compared with YOLOv4-tiny, the proposed algorithm improves mAP by 4.9% while preserving the model’s real-time performance.
- Published
- 2024
- Full Text
- View/download PDF
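The "K-means with improved distance measurement" mentioned above is commonly realized, in the YOLO line of work, by clustering box shapes with a 1 − IoU distance instead of Euclidean distance; a minimal sketch of that known technique (our own illustration, not the paper's exact variant):

```python
import numpy as np

def iou_wh(boxes, anchors):
    """IoU between (w, h) pairs, as if boxes and anchors share a corner."""
    inter = np.minimum(boxes[:, None, 0], anchors[None, :, 0]) * \
            np.minimum(boxes[:, None, 1], anchors[None, :, 1])
    union = boxes[:, 0] * boxes[:, 1]
    union = union[:, None] + anchors[:, 0] * anchors[:, 1] - inter
    return inter / union

def kmeans_anchors(boxes, k=3, iters=100, seed=0):
    """K-means on box shapes using a 1 - IoU distance."""
    rng = np.random.default_rng(seed)
    anchors = boxes[rng.choice(len(boxes), k, replace=False)]
    for _ in range(iters):
        assign = np.argmax(iou_wh(boxes, anchors), axis=1)  # min (1 - IoU)
        for j in range(k):
            if np.any(assign == j):
                anchors[j] = np.median(boxes[assign == j], axis=0)
    return anchors

rng = np.random.default_rng(2)
small = rng.uniform(8, 16, (200, 2))    # cluster of small boxes (w, h)
large = rng.uniform(64, 128, (200, 2))  # cluster of large boxes
anchors = kmeans_anchors(np.vstack([small, large]), k=2)
```

Using IoU as the distance makes the prior boxes scale-aware, which is exactly what helps small-target categories.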
7. A Novel Hypersonic Target Trajectory Estimation Method Based on Long Short-Term Memory and a Multi-Head Attention Mechanism
- Author
- Yue Xu, Quan Pan, Zengfu Wang, and Baoquan Hu
- Subjects
- near space, hypersonic speed, target tracking, information entropy, information optimization, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
To address the complex maneuvering characteristics of hypersonic targets in near space, this paper proposes an LSTM trajectory estimation method combined with an attention mechanism and optimizes the model from an information-theoretic perspective. The method captures the target dynamics using the temporal processing capability of LSTM while improving the efficiency of information utilization through the attention mechanism to achieve accurate prediction. First, a target dynamics model is constructed to clarify the motion behavior parameters. Subsequently, an LSTM model incorporating the attention mechanism is designed, enabling the model to automatically focus on key information fragments in the historical trajectory. During training, information redundancy is reduced and information validity is improved through feature selection and data preprocessing. Ultimately, the model achieves accurate prediction of hypersonic target trajectories with limited computational resources. Experimental results show that the method performs well in complex dynamic environments, with improved prediction accuracy and robustness, reflecting the potential of information-theoretic principles in optimizing trajectory prediction models.
- Published
- 2024
- Full Text
- View/download PDF
8. Research on Critical Quality Feature Recognition and Quality Prediction Method of Machining Based on Information Entropy and XGBoost Hyperparameter Optimization
- Author
- Dongyue Qu, Chaoyun Gu, Hao Zhang, Wenchao Liang, Yuting Zhang, and Yong Zhan
- Subjects
- critical quality characteristics, machining quality prediction, information entropy, XGBoost, Bayesian optimization, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
To address the problem of predicting machining quality for critical features in the manufacturing process of mechanical products, a method that combines information entropy and XGBoost (version 2.1.1) hyperparameter optimization is proposed. Initially, machining data of mechanical products are analyzed based on information entropy theory to identify critical quality characteristics. Subsequently, a quality prediction model for these critical features is established using the XGBoost machine learning framework. The model’s hyperparameters are then optimized through Bayesian optimization. This method is applied as a case study to a medium-speed marine diesel engine piston. After the critical quality characteristics in the machining process are identified, the machining quality of these vital characteristics is predicted, and the results are compared with those obtained from a machine learning model without hyperparameter optimization. The findings demonstrate that the proposed method effectively predicts the machining quality of mechanical products.
- Published
- 2024
- Full Text
- View/download PDF
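One standard way to "identify critical quality characteristics based on information entropy theory," as the abstract describes, is the entropy weight method; the following sketch illustrates that general technique (our own assumption about the specific scheme, not the paper's code): features whose values are spread unevenly across samples have low normalized entropy and receive high weight.

```python
import numpy as np

def entropy_weights(X):
    """Entropy weight method: rank features by how unevenly their values
    are distributed across samples (low entropy -> more discriminative).

    X: (n_samples, n_features) matrix of positive quality measurements.
    """
    n = X.shape[0]
    P = X / X.sum(axis=0)                           # column-wise proportions
    P = np.clip(P, 1e-12, 1.0)
    E = -(P * np.log(P)).sum(axis=0) / np.log(n)    # normalized entropy in [0, 1]
    w = (1.0 - E) / (1.0 - E).sum()                 # low entropy -> high weight
    return w

rng = np.random.default_rng(3)
constant_feat = np.full(100, 5.0)            # identical for every sample: no information
varying_feat = rng.uniform(0.1, 10.0, 100)   # discriminates between samples
w = entropy_weights(np.column_stack([constant_feat, varying_feat]))
```

The highest-weight characteristics would then be the ones fed to the XGBoost prediction model.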
9. Exploring the Diversity of Nuclear Density through Information Entropy
- Author
- Wei-Hu Ma and Yu-Gang Ma
- Subjects
- information entropy, nuclear density distribution, halo nuclei, nuclear clustering, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
This study explores the role of information entropy in understanding nuclear density distributions, including both stable configurations and non-traditional structures such as neutron halos and α-clustering. By quantifying the uncertainty and disorder inherent in nucleon distributions in nuclear many-body systems, information entropy provides a macroscopic measure of the physical properties of the system. A more dispersed and disordered density distribution results in a higher value of information entropy. This intrinsic relationship between information entropy and system complexity allows us to quantify uncertainty and disorder in nuclear structures by analyzing various geometric parameters such as nuclear radius, diffuseness, neutron skin, and cluster structural features.
- Published
- 2024
- Full Text
- View/download PDF
10. Research on Optimizing Low-Saturation Intersection Signals with Consideration for Both Efficiency and Fairness
- Author
- Lingxiang Zhu, Lujing Yu, and Liang Zou
- Subjects
- information entropy, conversion rate, average vehicle delay, saturation, sensitivity analysis, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
In response to the fairness issue arising from the unequal delay of vehicles in different phases at intersections and considering the actual situation of small and variable delays for vehicles in low-saturation intersection phases, this paper proposes the concept of “sacrificing efficiency for fairness”. Firstly, the universality of unfair delay phenomena at intersection phases is explained, especially at low-saturation intersections where the fluctuation in phase delays is 1.87 times higher than at other intersections. Then, a fairness evaluation index is constructed using information entropy, and the feasibility of the proposed approach is demonstrated. Subsequently, a signal optimization model that balances efficiency and fairness is proposed. Finally, the proposed model is validated through case studies, showing that it not only simultaneously considers efficiency and fairness but also has minimal impact on efficiency. Moreover, the changes to timing schemes in the efficiency model are much smaller compared to the model that only considers fairness. Sensitivity analysis reveals that the model performs better under low-saturation intersection conditions.
- Published
- 2024
- Full Text
- View/download PDF
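The entropy-based fairness index described in this abstract can be sketched as follows (a minimal illustration under our own assumptions about the exact normalization): treat each phase's share of total delay as a probability and normalize the Shannon entropy by its maximum, so that 1 means all phases experience equal delay.

```python
import numpy as np

def delay_fairness(delays):
    """Fairness index in [0, 1] from the information entropy of the
    per-phase delay shares; 1 means all phases are delayed equally."""
    delays = np.asarray(delays, dtype=float)
    p = delays / delays.sum()
    p = p[p > 0]
    H = -np.sum(p * np.log(p))
    return H / np.log(len(delays))   # normalize by the maximum entropy

F_fair = delay_fairness([30.0, 30.0, 30.0, 30.0])   # equal phase delays
F_unfair = delay_fairness([5.0, 5.0, 5.0, 85.0])    # one phase starved
```

A signal optimization model can then trade a small efficiency loss for an increase in this index.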
11. Intelligent Fault Diagnosis Method for Rotating Machinery Based on Recurrence Binary Plot and DSD-CNN
- Author
- Yuxin Shi, Hongwei Wang, Wenlei Sun, and Ruoyang Bai
- Subjects
- fault diagnosis, rotating machinery, information entropy, recurrence binary plot, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
To tackle the issue of the traditional intelligent diagnostic algorithm’s insufficient utilization of correlation characteristics within the time series of fault signals and to meet the challenges of accuracy and computational complexity in rotating machinery fault diagnosis, a novel approach based on a recurrence binary plot (RBP) and a lightweight, deep, separable, dilated convolutional neural network (DSD-CNN) is proposed. Firstly, a recursive encoding method is used to convert the fault vibration signals of rotating machinery into two-dimensional texture images, extracting feature information from the internal structure of the fault signals as the input for the model. Subsequently, leveraging the excellent feature extraction capabilities of a lightweight convolutional neural network embedded with attention modules, the fault diagnosis of rotating machinery is carried out. The experimental results using different datasets demonstrate that the proposed model achieves excellent diagnostic accuracy and computational efficiency. Additionally, compared with other representative fault diagnosis methods, this model shows better anti-noise performance under different noise test data, and it provides a reliable and efficient reference solution for rotating machinery fault-classification tasks.
- Published
- 2024
- Full Text
- View/download PDF
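The recurrence-plot encoding that turns a 1-D vibration signal into a 2-D binary texture image can be sketched as below (a generic thresholded recurrence plot under our own assumptions; the paper's exact encoding may differ): embed the signal in delay space and mark pairs of states whose distance falls below a threshold.

```python
import numpy as np

def recurrence_binary_plot(signal, dim=3, delay=1, eps=None):
    """Binary recurrence plot of a 1-D signal.

    Embeds the signal in `dim`-dimensional delay space and marks pairs
    of states closer than `eps` (default: 20% of the maximum pairwise
    distance), yielding a 2-D texture image.
    """
    n = len(signal) - (dim - 1) * delay
    states = np.stack([signal[i:i + n]
                       for i in range(0, dim * delay, delay)], axis=1)
    dists = np.linalg.norm(states[:, None, :] - states[None, :, :], axis=-1)
    if eps is None:
        eps = 0.2 * dists.max()
    return (dists < eps).astype(np.uint8)

t = np.linspace(0, 8 * np.pi, 200)
rbp = recurrence_binary_plot(np.sin(t))  # periodic signal -> diagonal texture
```

The resulting binary image is what the CNN consumes as input.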
12. Infrared and Harsh Light Visible Image Fusion Using an Environmental Light Perception Network
- Author
- Aiyun Yan, Shang Gao, Zhenlin Lu, Shuowei Jin, and Jingrong Chen
- Subjects
- image fusion, harsh light environment aware, information entropy, high-level vision tasks, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
The complementary combination of salient target objects in infrared images and rich texture details in visible images can effectively enhance the information entropy of fused images, providing substantial assistance for downstream high-level vision tasks such as nighttime intelligent driving. However, mainstream fusion algorithms lack specific research on the contradiction between the low information entropy and high pixel intensity of visible images in harsh-light nighttime road environments. As a result, fusion algorithms that perform well under normal conditions produce, under harsh-light interference, low-information-entropy fusion images whose information distribution resembles that of the visible images. In response, we designed an image fusion network resilient to harsh-light interference, incorporating entropy and information-theoretic principles to enhance robustness and information retention. Specifically, an edge feature extraction module extracts key edge features of salient targets to optimize the fusion information entropy. Additionally, a harsh light environment aware (HLEA) module avoids the decrease in fusion image quality caused by the contradiction between low information entropy and high pixel intensity, based on the information distribution characteristics of harsh-light visible images. Finally, an edge-guided hierarchical fusion (EGHF) module achieves robust feature fusion, minimizing irrelevant noise entropy and maximizing useful information entropy. Extensive experiments demonstrate that, compared with other advanced algorithms, the proposed method’s fusion results contain more useful information and offer significant advantages in high-level vision tasks under harsh nighttime lighting conditions.
- Published
- 2024
- Full Text
- View/download PDF
13. Unruh Entropy of a Schwarzschild Black Hole
- Author
- Maksym Teslyk, Olena Teslyk, Larissa Bravina, and Evgeny Zabrodin
- Subjects
- Unruh effect, black hole, information entropy, entanglement entropy, Nuclear and particle physics. Atomic energy. Radioactivity, QC770-798
- Abstract
The entropy produced by Unruh radiation is estimated and compared to the entropy of a Schwarzschild black hole. We simulate a spherical system of mass M by a set of Unruh horizons and estimate the total entropy of the outgoing radiation. Dependence on the mass and spin of the emitted particles is taken into account. The obtained results can be easily extended to any other intrinsic degrees of freedom of outgoing particles. The ratio of Unruh entropy to the Schwarzschild black hole entropy is derived in exact analytical form. For large black holes, this ratio exhibits high susceptibility to quantum numbers, e.g., spin s, of emitted quanta and varies from 0% for s=0 to 19.0% for s=5/2.
- Published
- 2023
- Full Text
- View/download PDF
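For context, the Schwarzschild black-hole entropy against which the Unruh radiation entropy is compared is the standard Bekenstein–Hawking expression (a textbook formula, not derived in the abstract):

```latex
S_{\mathrm{BH}} \;=\; \frac{k_B c^3 A}{4 G \hbar}
\;=\; \frac{4\pi k_B G M^2}{\hbar c},
\qquad A = 4\pi r_s^2, \quad r_s = \frac{2GM}{c^2},
```

so the ratio reported in the abstract is the Unruh-radiation entropy divided by this \(S_{\mathrm{BH}}\).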
14. Next-Generation Blockchain Technology: The Entropic Blockchain
- Author
- Melvin M. Vopson, Serban G. Lepadatu, Anna Vopson, and Szymon Łukaszyk
- Subjects
- information theory, information entropy, blockchain, entropic barcoding, Technology, Engineering (General). Civil engineering (General), TA1-2040, Biology (General), QH301-705.5, Physics, QC1-999, Chemistry, QD1-999
- Abstract
The storage, transmission, and processing of data become significant problems when large digital data files or databases are involved, as in decentralized online global databases such as blockchains. Here, we propose a novel method that allows for the scalability of digital assets, including blockchain databases, in the download, validation, and confidentiality processes by developing a lightweight blockchain technology called the Entropic Blockchain. This is a computer-implemented mathematical method for generating an information-entropic numerical barcode representation of a digital asset. Using this technique, a 1–2 MB block of digital data can be represented by a few bytes, significantly reducing the size of a blockchain. The entropic barcode file can be utilized on its own or as an optically machine-readable entropic barcode for secure data transmission, processing, labeling, identification, and one-way encryption, as well as for compression, validation, and digital tamper-proof checks. The mathematics of this process and all the steps involved in its implementation are discussed in detail in this article.
- Published
- 2024
- Full Text
- View/download PDF
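The general idea of condensing a data block into a short entropy-based signature can be sketched as follows. This is an illustrative stand-in for the paper's method, which is not fully specified in the abstract; it only shows the entropy-per-segment concept: slice a block of bytes into segments and record each segment's Shannon entropy in bits per byte.

```python
import numpy as np

def entropy_barcode(data, n_segments=8):
    """Condense a block of bytes into a short 'entropic barcode':
    the Shannon entropy (bits/byte) of each of n_segments slices."""
    arr = np.frombuffer(bytes(data), dtype=np.uint8)
    barcode = []
    for seg in np.array_split(arr, n_segments):
        counts = np.bincount(seg, minlength=256)
        p = counts[counts > 0] / len(seg)
        barcode.append(-np.sum(p * np.log2(p)))
    return np.array(barcode)

# Varied half (maximum-entropy byte values) followed by a constant half.
block = bytes(range(256)) * 64 + b"\x00" * 4096
code = entropy_barcode(block)
```

Equal blocks produce equal barcodes, so the signature can serve as a cheap tamper check, at the cost of collisions that a cryptographic hash would not have.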
15. ESE-YOLOv8: A Novel Object Detection Algorithm for Safety Belt Detection during Working at Heights
- Author
- Qirui Zhou, Dandan Liu, and Kang An
- Subjects
- YOLOv8, object detection, attention mechanism, safety belt detection, information entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
To address the challenges of supervising workers who must wear safety belts while working at heights, this study proposes replacing manual supervision with an object detection model. A novel model, ESE-YOLOv8, is introduced. The integration of the Efficient Multi-Scale Attention (EMA) mechanism enhances information entropy through cross-channel interaction and encodes spatial information into the channels, enabling the model to obtain rich and significant information during feature extraction. Employing GSConv to reconstruct the neck into a slim-neck configuration reduces the neck’s computational load without loss of information entropy, allowing the attention mechanism to function more effectively and thereby improving accuracy. During training, the Efficient Intersection over Union (EIoU) regression loss function further refines the model’s object localization. Experimental results demonstrate that ESE-YOLOv8 achieves an average precision of 92.7% at an IoU threshold of 50% and an average precision of 75.7% over the IoU threshold range of 50% to 95%. These results surpass the baseline model and the widely used YOLOv5, and are competitive with state-of-the-art models. Ablation experiments further confirm the effectiveness of the model’s enhancements.
- Published
- 2024
- Full Text
- View/download PDF
16. Evaluating the Attraction of Scenic Spots Based on Tourism Trajectory Entropy
- Author
- Qiuhua Huang, Linyuan Xia, Qianxia Li, and Yixiong Xia
- Subjects
- tourism trajectory, scenic spot, information entropy, attraction evaluation, experience index, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
With the development of positioning technology and the widespread adoption of mobile positioning terminals, trajectory data have become increasingly easy to acquire, and mining information about scenic spots and tourists from such data has become correspondingly convenient. This study used normalized information entropy to evaluate the attraction of scenic spots and the experience index of tourists. Tourists and scenic spots were chosen as the probability variables for calculating information entropy, and the probability value of each variable was computed accordingly. Scenic spots of the same type compete with one another, but when spots lie relatively close together (less than 8 km apart), a strong cooperative relationship can be established. Scenic spots at various levels of attraction can generally be classified as cultural heritage, natural landscape, or leisure and entertainment; those with higher attraction are usually higher A-level sites with convenient transportation. A considerable number of tourists avoid crowded destinations and instead choose spots that interest them, guided by personal preferences and the flexibility of free independent travel.
- Published
- 2024
- Full Text
- View/download PDF
17. Coil Parameter Optimization Method for Wireless Power Transfer System Based on Crowding Distance Division and Adaptive Genetic Operators
- Author
- Hua Zhang, Xin Sui, Peng Sui, Lili Wei, Yuanchun Huang, Zhenglong Yang, and Haidong Yang
- Subjects
- MCR-WPT, parameter optimization of magnetic coupling coil, NSGA-II algorithm, crowding distance calculation, adaptive operator, information entropy, Technology
- Abstract
In a Magnetically Coupled Resonant Wireless Power Transfer (MCR-WPT) system, the magnetic coupling coil is one of the key factors that determine the system’s output power, transmission efficiency, anti-offset capability, and so on. This article proposes a coil parameter optimization method for a wireless power transfer system based on crowding distance division and adaptive genetic operators. By optimizing design decision variables such as the numbers of transmitting and receiving coil turns, the spacings between turns, the inner radii of the transmitting and receiving coils, and the vertical coil distance, the best transmission performance can be achieved. This study improves the NSGA-II algorithm by proposing a genetic operator for average- and high-crowding populations based on adaptive operators, as well as a genetic operator for low-crowding populations based on information entropy. These improvements avoid problems inherent in traditional genetic operators, such as fixed genetic proportions; they make the algorithm less likely to fall into a local optimum and show better convergence on the ZDT1–ZDT3 test functions. The optimization method is independent of commercial software such as ANSYS Maxwell 2021 R1 and significantly improves calculation speed compared with traditional simulation software.
- Published
- 2024
- Full Text
- View/download PDF
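The crowding distance calculation that the improved NSGA-II builds on is standard and can be sketched directly (the paper's crowding-division and adaptive-operator logic on top of it is not reproduced here): boundary solutions on a front get infinite distance, and interior solutions get the normalized size of the cuboid spanned by their nearest neighbors in each objective.

```python
import numpy as np

def crowding_distance(objectives):
    """NSGA-II crowding distance for one non-dominated front.

    objectives: (n, m) matrix of m objective values for n solutions.
    """
    n, m = objectives.shape
    dist = np.zeros(n)
    for j in range(m):
        order = np.argsort(objectives[:, j])
        span = objectives[order[-1], j] - objectives[order[0], j]
        dist[order[0]] = dist[order[-1]] = np.inf  # boundary solutions
        if span == 0:
            continue
        for k in range(1, n - 1):
            # size of the neighbor-spanned interval, normalized by the span
            dist[order[k]] += (objectives[order[k + 1], j] -
                               objectives[order[k - 1], j]) / span
    return dist

# Three points on a front: the endpoints are boundaries, the middle
# point accumulates one normalized interval per objective.
front = np.array([[0.0, 1.0], [0.5, 0.5], [1.0, 0.0]])
d = crowding_distance(front)
```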
18. Avionics Module Fault Diagnosis Algorithm Based on Hybrid Attention Adaptive Multi-Scale Temporal Convolution Network
- Author
- Qiliang Du, Mingde Sheng, Lubin Yu, Zhenwei Zhou, Lianfang Tian, and Shilie He
- Subjects
- avionics module, fault diagnosis, adaptive convolution, attention mechanism, information entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Since the reliability of the avionics module is crucial for aircraft safety, its fault diagnosis and health management are particularly significant. While deep-learning-based prognostics and health management (PHM) methods achieve highly accurate fault diagnosis, they suffer from inefficient data feature extraction and insufficient generalization capability, compounded by a lack of avionics module fault data. Consequently, this study first employs fault injection to simulate various fault types of the avionics module and performs data enhancement to construct the P2020 communications processor fault dataset. Subsequently, a multichannel fault diagnosis method, the Hybrid Attention Adaptive Multi-scale Temporal Convolution Network (HAAMTCN), is proposed for the integrated functional circuit module of the avionics module; it adaptively constructs the optimal convolutional kernel size to efficiently extract features from avionics module fault signals with large information entropy. Further, the combined use of the Interaction Channel Attention (ICA) module and the Hierarchical Block Temporal Attention (HBTA) module leads the HAAMTCN to pay more attention to critical information in the channel and time-step dimensions. Experimental results show that the HAAMTCN achieves an accuracy of 99.64% in the avionics module fault classification task, outperforming existing methods.
- Published
- 2024
- Full Text
- View/download PDF
19. Information Entropy Analysis of a PIV Image Based on Wavelet Decomposition and Reconstruction
- Author
- Zhiwu Ke, Wei Zheng, Xiaoyu Wang, and Mei Lin
- Subjects
- particle image velocimetry, image processing, wavelet decomposition and reconstruction, information entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
In particle image velocimetry (PIV) experiments, background noise inevitably enters particle images during capture or transmission, blurring the image, reducing its information entropy, and ultimately making the obtained flow field inaccurate. Taking a low-quality original particle image as the research object, a frequency-domain processing method based on wavelet decomposition and reconstruction was applied to pre-process the particle image, and information entropy analysis was used to evaluate the effect of the processing. The results showed that useful high-frequency particle information representing image details was effectively extracted and enhanced, while the background noise was significantly weakened. Information entropy analysis further revealed that, compared with the unprocessed original, the reconstructed particle image retained more effective particle details and had higher information entropy. Based on the reconstructed particle images, a more accurate flow field can be obtained within a lower error range.
- Published
- 2024
- Full Text
- View/download PDF
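The decomposition–threshold–reconstruction pipeline described above can be sketched with a single-level 2-D Haar transform (our own minimal illustration; the paper's wavelet family, levels, and thresholding rule are not specified in the abstract): split the image into an approximation band and three detail bands, shrink small detail coefficients toward zero, and invert.

```python
import numpy as np

def haar2d(img):
    """One level of the 2-D Haar wavelet transform (even image sides)."""
    a = (img[0::2] + img[1::2]) / 2      # row averages
    d = (img[0::2] - img[1::2]) / 2      # row details
    LL = (a[:, 0::2] + a[:, 1::2]) / 2
    LH = (a[:, 0::2] - a[:, 1::2]) / 2
    HL = (d[:, 0::2] + d[:, 1::2]) / 2
    HH = (d[:, 0::2] - d[:, 1::2]) / 2
    return LL, LH, HL, HH

def ihaar2d(LL, LH, HL, HH):
    """Exact inverse of haar2d."""
    h, w = LL.shape
    a = np.empty((h, 2 * w)); d = np.empty((h, 2 * w))
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    img = np.empty((2 * h, 2 * w))
    img[0::2], img[1::2] = a + d, a - d
    return img

# Shrink small detail coefficients (mostly noise), then reconstruct.
rng = np.random.default_rng(4)
clean = np.outer(np.sin(np.linspace(0, np.pi, 64)),
                 np.sin(np.linspace(0, np.pi, 64)))
noisy = clean + rng.normal(0, 0.1, clean.shape)
LL, LH, HL, HH = haar2d(noisy)
thr = 0.1
LH, HL, HH = [np.where(np.abs(c) > thr, c, 0.0) for c in (LH, HL, HH)]
denoised = ihaar2d(LL, LH, HL, HH)
```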
20. Linear System Identification-Oriented Optimal Tampering Attack Strategy and Implementation Based on Information Entropy with Multiple Binary Observations
- Author
- Zhongwei Bai, Peng Yu, Yan Liu, and Jin Guo
- Subjects
- linear system identification, information entropy, data tampering attack, multiple binary observation, Industrial engineering. Management engineering, T55.4-60.8, Electronic computers. Computer science, QA75.5-76.95
- Abstract
With the rapid development of computer, communication, and control technologies, cyber-physical systems (CPSs) have been widely deployed. However, the massive information interactions in CPSs increase the amount of data transmitted over the network, and once this communication is attacked, the security and stability of the system are seriously affected. In this paper, for the data tampering attack on a linear system with multiple binary observations, where the defender’s estimation algorithm is unknown, an optimization index is constructed from the attacker’s point of view based on information entropy, and the problem is modeled accordingly. For this multi-parameter optimization problem with energy constraints, particle swarm optimization (PSO) is used to obtain the optimal set of data tampering attack solutions, and an estimation method is given for the unknown parameters. To improve real-time online implementation, a BP neural network is designed. Finally, the validity of the conclusions is verified through numerical simulation. The results show that the attacker can construct effective metrics based on information entropy without knowledge of the defender’s discrimination algorithm, and that the optimal attack strategy implementation based on PSO and the BP network is also effective.
- Published
- 2024
- Full Text
- View/download PDF
21. Trajectory Compression with Spatio-Temporal Semantic Constraints
- Author
-
Yan Zhou, Yunhan Zhang, Fangfang Zhang, Yeting Zhang, and Xiaodi Wang
- Subjects
spatio-temporal trajectory ,trajectory compression ,geometric similarity ,semantic similarity ,information entropy ,Geography (General) ,G1-922 - Abstract
Most trajectory compression methods focus primarily on geometric similarity between compressed and original trajectories; because they ignore semantic information, their compression results lack explainability. This paper proposes a trajectory compression method with spatio-temporal semantic constraints. It constructs a new trajectory distance measurement model integrating both semantic and spatio-temporal features, quantifying semantic features with information entropy and measuring spatio-temporal features with the synchronous Euclidean distance. The compression principle is to retain the feature points with the maximum spatio-temporal semantic distance from the original trajectory until the target compression rate is reached. Experimental results show that all compared methods perform similarly in maintaining geometric similarity, but our method significantly outperforms the DP, TD-TR, and CascadeSync methods in preserving semantic similarity. This indicates that our method considers both geometric and semantic features during compression, making the compressed trajectory more interpretable.
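A hedged sketch of the retention principle: a TD-TR-style recursion that keeps the point of maximum synchronized Euclidean distance (SED). The paper's semantic term (information entropy) is omitted; `sed` and `compress` are illustrative names, not the authors' code:

```python
def sed(p, a, b):
    """Synchronized Euclidean distance of point p=(x, y, t) from segment a-b."""
    (xa, ya, ta), (xb, yb, tb), (xp, yp, tp) = a, b, p
    if tb == ta:
        return ((xp - xa) ** 2 + (yp - ya) ** 2) ** 0.5
    r = (tp - ta) / (tb - ta)                 # time-synchronized interpolation
    xs, ys = xa + r * (xb - xa), ya + r * (yb - ya)
    return ((xp - xs) ** 2 + (yp - ys) ** 2) ** 0.5

def compress(traj, tol):
    """TD-TR-style compression: recursively keep points whose SED exceeds tol."""
    if len(traj) < 3:
        return traj[:]
    i, dmax = 0, -1.0
    for k in range(1, len(traj) - 1):
        d = sed(traj[k], traj[0], traj[-1])
        if d > dmax:
            i, dmax = k, d
    if dmax > tol:
        left = compress(traj[: i + 1], tol)
        right = compress(traj[i:], tol)
        return left[:-1] + right              # avoid duplicating the split point
    return [traj[0], traj[-1]]
```

The proposed method would add a semantic distance term (weighted by information entropy) to the SED score before ranking points.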
- Published
- 2024
- Full Text
- View/download PDF
22. Vesicle Morphogenesis in Amphiphilic Triblock Copolymer Solutions
- Author
-
Senyuan Liu, Mohammad Sadegh Samie, and Radhakrishna Sureshkumar
- Subjects
triblock copolymer ,micelle ,vesicle ,polymersome ,molecular dynamics ,information entropy ,Chemistry ,QD1-999 - Abstract
Coarse-grained molecular dynamics simulations are employed to investigate the spatiotemporal evolution of vesicles (polymersomes) through the self-assembly of randomly distributed amphiphilic BAB triblock copolymers with hydrophilic A and hydrophobic B blocks in an aqueous solution. The vesiculation pathway consists of several intermediate structures, such as an interconnected network of copolymer aggregates, a cage of cylindrical micelles, and a lamellar cage. The cage-to-vesicle transition occurs at a constant aggregation number and practically eliminates the hydrophobic interfacial area between the B block and solvent. Molecular reorganization underlying the sequence of morphology transitions from a cage-like aggregate to a vesicle is nearly isentropic. The end-to-end distances of isolated copolymer chains in solution and those within a vesicular assembly follow lognormal probability distributions. This can be attributed to the preponderance of folded chain configurations in which the two hydrophobic end groups of a given chain stay close to each other. However, the probability distribution of end-to-end distances is broader for chains within the vesicle as compared with that of a single chain. This is due to the swelling of the folded configurations within the hydrophobic bilayer. Increasing the hydrophobicity of the B block reduces the vesiculation time without qualitatively altering the self-assembly pathway.
- Published
- 2024
- Full Text
- View/download PDF
23. Global Semantic-Sense Aggregation Network for Salient Object Detection in Remote Sensing Images
- Author
-
Hongli Li, Xuhui Chen, Wei Yang, Jian Huang, Kaimin Sun, Ying Wang, Andong Huang, and Liye Mei
- Subjects
salient object detection ,remote sensing image ,semantic interaction ,semantic perception ,information entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Salient object detection (SOD) aims to accurately identify significant geographical objects in remote sensing images (RSI), providing reliable support and guidance for extensive geographical information analyses and decisions. However, SOD in RSI faces numerous challenges, including shadow interference, inter-class feature confusion, and unclear target edge contours. Therefore, we designed an effective Global Semantic-aware Aggregation Network (GSANet) to aggregate salient information in RSI. GSANet computes the information entropy of different regions, prioritizing areas with high information entropy as potential target regions, thereby achieving precise localization and semantic understanding of salient objects in remote sensing imagery. Specifically, we proposed a Semantic Detail Embedding Module (SDEM), which explores the potential connections among multi-level features, adaptively fusing shallow texture details with deep semantic features, efficiently aggregating the information entropy of salient regions, and enhancing the information content of salient targets. Additionally, we proposed a Semantic Perception Fusion Module (SPFM) to analyze the mapping relationships between contextual information and local details, enhancing the perceptual capability for salient objects while suppressing irrelevant information entropy, thereby addressing the semantic dilution of salient objects during up-sampling. The experimental results on two publicly available datasets, ORSSD and EORSSD, demonstrated the outstanding performance of our method, which achieved 93.91% Sα, 98.36% Eξ, and 89.37% Fβ on the EORSSD dataset.
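A toy illustration (not GSANet itself) of entropy-based region prioritization: split a greyscale image into tiles and rank them by Shannon entropy, so high-entropy tiles are treated as candidate target regions. Function names and the tile size are illustrative assumptions:

```python
import math
from collections import Counter

def tile_entropy(pixels, i0, j0, size):
    """Shannon entropy (bits) of a size x size tile anchored at (i0, j0)."""
    vals = [pixels[i][j] for i in range(i0, i0 + size)
                         for j in range(j0, j0 + size)]
    n = len(vals)
    counts = Counter(vals)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def rank_tiles(pixels, size):
    """Return tile origins sorted by descending entropy (most informative first)."""
    h, w = len(pixels), len(pixels[0])
    tiles = [(i, j) for i in range(0, h - size + 1, size)
                    for j in range(0, w - size + 1, size)]
    return sorted(tiles, key=lambda t: -tile_entropy(pixels, t[0], t[1], size))
```

In GSANet this prioritization is learned within the network rather than computed over fixed tiles.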
- Published
- 2024
- Full Text
- View/download PDF
24. Analysis of Vibration Characteristics of Bridge Structures under Seismic Excitation
- Author
-
Ling’ai Li and Shengxiang Huang
- Subjects
seismic wave ,vibration response ,time-frequency analysis ,singular value decomposition ,information entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Bridges may undergo structural vibration responses when exposed to seismic waves. An analysis of structural vibration characteristics is essential for evaluating the safety and stability of a bridge. In this paper, a signal time-frequency feature extraction method (NTFT-ESVD) integrating standard time-frequency transformation, singular value decomposition, and information entropy is proposed to analyze the vibration characteristics of structures under seismic excitation. First, the experiment simulates the response signal of the structure when exposed to seismic waves. The results of the time-frequency analysis indicate a maximum relative error of only 1% in frequency detection, and the maximum relative errors in amplitude and time parameters are 5.9% and 6%, respectively. These simulation results demonstrate the reliability of the NTFT-ESVD method in extracting the time-frequency characteristics of the signal and its suitability for analyzing the seismic response of the structure. Then, a real seismic wave event of the Su-Tong Yangtze River Bridge during the Hengchun earthquake in Taiwan (2006) is analyzed. The results show that the seismic waves only have a short-term impact on the bridge, with the maximum amplitude of the vibration response no greater than 1 cm, and the maximum vibration frequency no greater than 0.2 Hz in the three-dimensional direction, indicating that the earthquake in Hengchun will not have any serious impact on the stability and security of the Su-Tong Yangtze River Bridge. Additionally, the reliability of determining the arrival time of seismic waves by extracting the time-frequency information from structural vibration response signals is validated by comparing it with results from seismic stations (SSE/WHN/QZN) at similar epicenter distances published by the USGS. 
The results of the case study show that the combination of dynamic GNSS monitoring technology and time-frequency analysis can be used to analyze the impact of seismic waves on the bridge, which is of great help to managers in assessing structural seismic damage.
- Published
- 2024
- Full Text
- View/download PDF
25. An Energy-Based Complex Brain Network Model—Part 1: Local Electrophysiological Dynamics
- Author
-
Chun-Lin Yang, Nandan Shettigar, and C. Steve Suh
- Subjects
complex networks ,real-life complex network modeling ,dynamical complex networks ,neuronal brain network dynamics ,information entropy ,statistical mechanics ,Thermodynamics ,QC310.15-319 ,Biochemistry ,QD415-436 - Abstract
The human brain is a complex network of connected neurons whose dynamics are difficult to describe. Brain dynamics are the global manifestation of individual neuron dynamics and the synaptic coupling between neurons. Membrane potential is a function of synaptic dynamics and electrophysiological coupling, with the parameters of postsynaptic potential, action potential, and ion pump dynamics. By modelling synaptic dynamics using physical laws and the time evolution of membrane potential using energy, neuron dynamics can be described. This local depiction can be scaled up to describe mesoscopic and macroscopic hierarchical complexity in the brain. Modelling results are favorably compared with physiological observation and physically acquired action potential profiles as reported in the literature.
- Published
- 2023
- Full Text
- View/download PDF
26. Underwriter Discourse, IPO Profit Distribution, and Audit Quality: An Entropy Study from the Perspective of an Underwriter–Auditor Network
- Author
-
Songling Yang, Yafei Tai, Yu Cao, Yunzhu Chen, and Qiuyue Zhang
- Subjects
information entropy ,complex networks ,underwriters ,profit distribution ,audit quality ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Underwriters play a pivotal role in the IPO process. Information entropy, a tool for measuring the uncertainty and complexity of information, has been widely applied to various issues in complex networks. Information entropy can quantify the uncertainty and complexity of nodes in the network, providing a unique analytical perspective and methodological support for this study. This paper employs a bipartite network analysis method to construct the relationship network between underwriters and accounting firms, using the centrality of underwriters in the network as a measure of their influence to explore the impact of underwriters’ influence on the distribution of interests and audit outcomes. The findings indicate that a more pronounced influence of underwriters significantly increases the ratio of underwriting fees to audit fees. Higher influence often accompanies an increase in abnormal underwriting fees. Further research reveals that companies underwritten by more influential underwriters experience a decline in audit quality. Finally, the study reveals that a well-structured audit committee governance and the rationalization of market sentiments can mitigate the negative impacts of underwriters’ influence. The innovation of this paper is that it enriches the content related to underwriters by constructing the relationship network between underwriters and accounting firms for the first time using a bipartite network through the lens of information entropy. This conclusion provides new directions for thinking about the motives and possibilities behind financial institutions’ cooperation, offering insights for market regulation and policy formulation.
- Published
- 2024
- Full Text
- View/download PDF
27. Postnatal Development of Synaptic Plasticity at Hippocampal CA1 Synapses: Correlation of Learning Performance with Pathway-Specific Plasticity
- Author
-
Yuheng Yang, Yuya Sakimoto, and Dai Mitsushima
- Subjects
contextual learning ,synaptic diversity ,information entropy ,developmental critical periods ,AMPA receptor ,GABAA receptor ,Neurosciences. Biological psychiatry. Neuropsychiatry ,RC321-571 - Abstract
To determine the critical timing for learning and the associated synaptic plasticity, we analyzed developmental changes in learning together with training-induced plasticity. Rats were subjected to an inhibitory avoidance (IA) task prior to weaning. While IA training did not alter latency at postnatal day (PN) 16, there was a significant increase in latency from PN 17, indicating a critical day for IA learning between PN 16 and 17. One hour after training, acute hippocampal slices were prepared for whole-cell patch clamp analysis following the retrieval test. In the presence of tetrodotoxin (0.5 µM), miniature excitatory postsynaptic currents (mEPSCs) and inhibitory postsynaptic currents (mIPSCs) were sequentially recorded from the same CA1 neuron. Although no changes in the amplitude of mEPSCs or mIPSCs were observed at PN 16 and 21, significant increases in both excitatory and inhibitory currents were observed at PN 23, suggesting a specific critical day for training-induced plasticity between PN 21 and 23. Training also increased the diversity of postsynaptic currents at PN 23 but not at PN 16 and 21, demonstrating a critical day for training-induced increase in the information entropy of CA1 neurons. Finally, we analyzed the plasticity at entorhinal cortex layer III (ECIII)-CA1 or CA3-CA1 synapses for each individual rat. At either ECIII-CA1 or CA3-CA1 synapses, a significant correlation between mean α-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid/N-methyl-D-aspartic acid (AMPA/NMDA) ratio and learning outcomes emerged at PN 23 at both synapses, demonstrating a critical timing for the direct link between AMPA receptor-mediated synaptic plasticity and learning efficacy. Here, we identified multiple critical periods with respect to training-induced synaptic plasticity and delineated developmental trajectories of learning mechanisms at hippocampal CA1 synapses.
- Published
- 2024
- Full Text
- View/download PDF
28. Research on a Framework for Chinese Argot Recognition and Interpretation by Integrating Improved MECT Models
- Author
-
Mingfeng Li, Xin Li, Mianning Hu, and Deyu Yuan
- Subjects
argot recognition and interpretation ,information entropy ,semantic space ,MECT model ,transformer architecture ,large language model ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
In underground industries, practitioners frequently employ argots to communicate discreetly and evade surveillance by investigative agencies. Proposing an innovative approach using word vectors and large language models, we aim to decipher and understand the myriad argots in these industries, providing crucial technical support for law enforcement to detect and combat illicit activities. Specifically, positional differences in semantic space distinguish argots, and pre-trained language models’ corpora are crucial for interpreting them. Expanding on these concepts, the article assesses the semantic coherence of word vectors in the semantic space based on the concept of information entropy. Simultaneously, we devised a labeled argot dataset, MNGG, and developed an argot recognition framework named CSRMECT, along with an argot interpretation framework called LLMResolve. These frameworks leverage the MECT model, the large language model, prompt engineering, and the DBSCAN clustering algorithm. Experimental results demonstrate that the CSRMECT framework outperforms the current optimal model by 10% in terms of the F1 value for argot recognition on the MNGG dataset, while the LLMResolve framework achieves a 4% higher accuracy in interpretation compared to the current optimal model. The related experiments also indicate a potential correlation between vector information entropy and model performance.
- Published
- 2024
- Full Text
- View/download PDF
29. Adaptive Space-Aware Infotaxis II as a Strategy for Odor Source Localization
- Author
-
Shiqi Liu, Yan Zhang, and Shurui Fan
- Subjects
odor source localization ,information entropy ,Bayesian inference ,adaptive navigation ,salp swarm algorithm ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Mobile robot olfaction of toxic and hazardous odor sources is of great significance in anti-terrorism, disaster prevention, and control scenarios. Aiming at the low search efficiency of current odor source localization strategies and their tendency to fall into local optima, this paper proposes the adaptive space-aware Infotaxis II algorithm. To improve tracking efficiency, a new reward function is designed that considers spatial information and emphasizes the robot's exploration behavior. To keep this enhanced exploration in check, an adaptive navigation-update mechanism adjusts the movement range of the robot in real time through information entropy, avoiding the excessive exploration that may trap the robot in a local optimum. Subsequently, an improved adaptive cosine salp swarm algorithm is applied to determine the optimal information-adaptive parameter. Comparative simulation experiments between ASAInfotaxis II and classical search strategies in 2D and 3D scenarios, covering search efficiency and search behavior, show that ASAInfotaxis II improves search efficiency to a greater extent and achieves a better balance between exploration and exploitation.
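A simplified 1D sketch of the Infotaxis criterion assumed here: choose the sensing location that minimizes expected posterior entropy of the belief over the source position. The sensor model `detect_prob` is a toy assumption, not the paper's plume model:

```python
import math

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    return -sum(q * math.log2(q) for q in p if q > 0)

def detect_prob(cell, source):
    """Toy sensor model: detection likelihood decays with distance to the source."""
    return math.exp(-abs(cell - source))

def expected_entropy_after_sensing(belief, cell):
    """Expected posterior entropy if the robot senses at `cell`."""
    p_det = sum(b * detect_prob(cell, s) for s, b in enumerate(belief))
    result = 0.0
    for outcome, p_out in ((1, p_det), (0, 1.0 - p_det)):
        if p_out <= 0:
            continue
        post = [b * (detect_prob(cell, s) if outcome else 1 - detect_prob(cell, s))
                for s, b in enumerate(belief)]
        z = sum(post)
        result += p_out * entropy([q / z for q in post])  # Bayes update per outcome
    return result

def best_cell(belief):
    """Infotaxis-style choice: minimize expected posterior entropy."""
    return min(range(len(belief)),
               key=lambda c: expected_entropy_after_sensing(belief, c))
```

The paper's adaptive space-aware variant additionally reshapes the reward with spatial information and adapts the movement range online, which this sketch does not attempt.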
- Published
- 2024
- Full Text
- View/download PDF
30. Research on Risk Contagion in ESG Industries: An Information Entropy-Based Network Approach
- Author
-
Chenglong Hu and Ranran Guo
- Subjects
ESG ,tail risk network ,information entropy ,risk contagion ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Sustainable development is a practical path to optimize industrial structures and enhance investment efficiency. Investigating risk contagion within ESG industries is a crucial step towards reducing systemic risks and fostering the green evolution of the economy. This research constructs ESG industry indices, taking into account the possibility of extreme tail risks, and employs VaR and CoVaR as measures of tail risk. The TENET network approach is integrated to capture the structural evolution and direction of information flow among ESG industries, and information entropy is employed to quantify the topological characteristics of the network model, exploring the risk transmission paths and evolution patterns of ESG industries in extreme tail risk events. Finally, Mantel tests are conducted to examine whether significant risk spillover effects exist between ESG and traditional industries. The research finds strong correlations among ESG industry indices during stock market crashes, Sino–US trade frictions, and the COVID-19 pandemic, with industries such as COAL, CMP, COM, RT, and RE playing key roles in risk transmission within the network, transmitting risks to other industries. Affected by systemic risk, the information entropy of the TENET network significantly decreases, reducing market information uncertainty and leading market participants to adopt more uniform investment strategies, thus diminishing the diversity of market behaviors. ESG industries show resilience in the face of extreme risks, demonstrating a lack of significant risk contagion with traditional industries.
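As a hedged illustration of quantifying network topology with entropy (far simpler than the TENET construction), the Shannon entropy of a graph's normalized degree distribution separates homogeneous structures from hub-dominated ones, mirroring how falling entropy signals more uniform behavior:

```python
import math

def degree_entropy(edges, n):
    """Shannon entropy (bits) of the normalized degree distribution
    of an undirected graph with n nodes given as (u, v) edge pairs."""
    deg = [0] * n
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = sum(deg)
    probs = [d / total for d in deg if d > 0]
    return -sum(p * math.log2(p) for p in probs)
```

A ring (every node equally connected) attains the maximum entropy for its size, while a star (one dominant hub, analogous to a risk-transmitting industry) scores lower.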
- Published
- 2024
- Full Text
- View/download PDF
31. On Playing with Emotion: A Spatial Evolutionary Variation of the Ultimatum Game
- Author
-
D. Y. Charcon and L. H. A. Monteiro
- Subjects
emotional expression ,evolutionary game ,information entropy ,population dynamics ,social network ,spatial game ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The Ultimatum Game is a simplistic representation of bargaining processes occurring in social networks. In the standard version of this game, the first player, called the proposer, makes an offer on how to split a certain amount of money. If the second player, called the responder, accepts the offer, the money is divided according to the proposal; if the responder declines the offer, both players receive no money. In this article, an agent-based model is employed to evaluate the performance of five distinct strategies of playing a modified version of this game. A strategy corresponds to instructions on how a player must act as the proposer and as the responder. Here, the strategies are inspired by the following basic emotions: anger, fear, joy, sadness, and surprise. Thus, in the game, each interacting agent is a player endowed with one of these five basic emotions. In the modified version explored in this article, the spatial dimension is taken into account and the survival of the players depends on successful negotiations. Numerical simulations are performed in order to determine which basic emotion dominates the population in terms of prevalence and accumulated money. Information entropy is also computed to assess the time evolution of population diversity and money distribution. From the obtained results, a conjecture on the emergence of the sense of fairness is formulated.
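A toy agent-pair sketch of the game, with hypothetical per-emotion offer/threshold parameters (the numbers are illustrative, not the authors'); population diversity is measured with Shannon entropy as in the abstract:

```python
import math
from collections import Counter

# Hypothetical per-emotion parameters: (offer fraction made as proposer,
# minimum fraction accepted as responder). Values are illustrative only.
EMOTIONS = {"anger": (0.3, 0.5), "fear": (0.5, 0.1), "joy": (0.5, 0.3),
            "sadness": (0.4, 0.2), "surprise": (0.6, 0.4)}

def play_round(proposer, responder, pot=1.0):
    """One Ultimatum Game round; returns (proposer payoff, responder payoff)."""
    offer, _ = EMOTIONS[proposer]
    _, threshold = EMOTIONS[responder]
    if offer >= threshold:
        return pot * (1 - offer), pot * offer   # offer accepted: split the pot
    return 0.0, 0.0                             # offer rejected: both get nothing

def population_entropy(population):
    """Diversity of emotions in the population, in bits (as in the abstract)."""
    counts = Counter(population)
    n = len(population)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

The spatial grid, survival dynamics, and accumulated-money tracking of the actual model are omitted; only the round mechanics and the entropy diagnostic are sketched.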
- Published
- 2024
- Full Text
- View/download PDF
32. Event-Triggered Relearning Modeling Method for Stochastic System with Non-Stationary Variable Operating Conditions
- Author
-
Jiyan Liu, Yong Zhang, Yuyang Zhou, and Jing Chen
- Subjects
stochastic processes ,non-stationary and variable conditions ,event-triggered conditions ,sliding window algorithm ,information entropy ,Mathematics ,QA1-939 - Abstract
This study presents a novel event-triggered relearning framework for neural network modeling, designed to improve prediction precision in dynamic stochastic complex industrial systems under non-stationary and variable conditions. Firstly, a sliding window algorithm combined with entropy is applied to divide the input and output datasets across different operating conditions, establishing clear data boundaries. Following this, the prediction errors derived from the neural network under different operating states are harnessed to define a set of event-triggered relearning criteria. Once these criteria are triggered, the relevant dataset is used to recalibrate the model to the specific operating condition before prediction. When the predicted data fall within the training input range of a pre-trained model, we switch to that model for immediate prediction. Compared with the conventional BP neural network model and the random vector functional-link network, the proposed model produces better estimation accuracy and reduces computation costs. Finally, the effectiveness of the proposed method is validated through numerical simulation tests using nonlinear Hammerstein models with Gaussian noise, reflecting complex stochastic industrial processes.
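A minimal sketch of the sliding-window-plus-entropy segmentation step, assuming scalar readings binned uniformly; the window width, bin count, and jump threshold are illustrative parameters, not the paper's:

```python
import math
from collections import Counter

def window_entropy(window, bins=8, lo=0.0, hi=1.0):
    """Shannon entropy (bits) of a window of readings after uniform binning."""
    idx = [min(bins - 1, int((x - lo) / (hi - lo) * bins)) for x in window]
    counts = Counter(idx)
    n = len(window)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def split_conditions(series, width, jump):
    """Mark candidate condition boundaries where windowed entropy jumps by `jump`."""
    ents = [window_entropy(series[i:i + width])
            for i in range(len(series) - width + 1)]
    return [i for i in range(1, len(ents)) if abs(ents[i] - ents[i - 1]) > jump]
```

In the framework, such boundaries would delimit the per-condition datasets used for event-triggered relearning.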
- Published
- 2024
- Full Text
- View/download PDF
33. The Use of Information Entropy and Expert Opinion in Maximizing the Discriminating Power of Composite Indicators
- Author
-
Matheus Pereira Libório, Roxani Karagiannis, Alexandre Magno Alvez Diniz, Petr Iakovlevitch Ekel, Douglas Alexandre Gomes Vieira, and Laura Cozzi Ribeiro
- Subjects
composite indicators ,information entropy ,cost of doing business ,discriminating power ,hybrid weighting scheme ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This research offers a solution to a highly recognized and controversial problem within the composite indicator literature: sub-indicators weighting. The research proposes a novel hybrid weighting method that maximizes the discriminating power of the composite indicator with objectively defined weights. It considers the experts’ uncertainty concerning the conceptual importance of sub-indicators in the multidimensional phenomenon, setting maximum and minimum weights (constraints) in the optimization function. The hybrid weighting scheme, known as the SAW-Max-Entropy method, avoids attributing weights that are incompatible with the multidimensional phenomenon’s theoretical framework. At the same time, it reduces the influence of assessment errors and judgment biases on composite indicator scores. The research results show that the SAW-Max-Entropy weighting scheme achieves greater discriminating power than weighting schemes based on the Entropy Index, Expert Opinion, and Equal Weights. The SAW-Max-Entropy method has high application potential due to the increasing use of composite indicators across diverse areas of knowledge. Additionally, the method represents a robust response to the challenge of constructing composite indicators with superior discriminating power.
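For context, the classic Entropy Index weighting that the paper uses as a baseline can be sketched as follows; the SAW-Max-Entropy method itself adds expert-defined minimum/maximum weight constraints to an optimization, which is not shown here:

```python
import math

def entropy_weights(matrix):
    """Classic entropy weighting: rows = alternatives, columns = sub-indicators.
    Assumes strictly positive indicator values and at least one non-uniform column."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)                       # normalizes entropy to [0, 1]
    weights = []
    for j in range(n):
        col = [matrix[i][j] for i in range(m)]
        s = sum(col)
        p = [v / s for v in col]
        e = -k * sum(q * math.log(q) for q in p if q > 0)
        weights.append(1.0 - e)                 # divergence: more spread -> larger weight
    total = sum(weights)
    return [w / total for w in weights]
```

A sub-indicator that is identical across alternatives carries no discriminating information and receives (near-)zero weight, which is exactly the behavior expert constraints are meant to temper.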
- Published
- 2024
- Full Text
- View/download PDF
34. A Multi-Point Geostatistical Modeling Method Based on 2D Training Image Partition Simulation
- Author
-
Yifei Zhao, Jianhong Chen, Shan Yang, Kang He, Hideki Shimada, and Takashi Sasaoka
- Subjects
multi-point geostatistics ,training image ,variogram ,FILTERSIM ,information entropy ,Mathematics ,QA1-939 - Abstract
In this paper, a multi-point geostatistical (MPS) method based on variogram-partitioned simulation is proposed to solve the key problem of MPS 3D modeling from 2D training images. The new method uses the FILTERSIM algorithm framework; the variogram is used to construct simulation partitions and training image sequences, and only a small number of training images close to the unknown nodes participate in the MPS simulation within each partition. To enhance reliability, a new covariance filter is also designed to capture the diverse features of the training patterns and allow the filter to downsize the training patterns from any direction; in addition, an information entropy method is used to reconstruct the whole 3D space by selecting the global optimal solution from several locally similar training patterns. The stability and applicability of the new method in complex geological modeling are demonstrated by analyzing parameter sensitivity and algorithm performance. A geological model of a uranium deposit is simulated and tested against five reserved drill holes, and the results show that the accuracy of the new method's simulation results is improved by 11.36% compared with the traditional MPS method.
- Published
- 2023
- Full Text
- View/download PDF
35. Entropy Change of Historical and Cultural Heritage in Traditional Tibetan Area of China Based on Spatial-Temporal Distribution Pattern
- Author
-
Xiwei Xu, Junyu Zhang, Shupeng Liu, Jiaqi Liu, Zhen Zhang, and Xiaoyuan Tian
- Subjects
historical and cultural heritage ,spatial distribution pattern ,information entropy ,traditional Tibetan area of China ,Building construction ,TH1-9745 - Abstract
The traditional Tibetan area of China is an ethnically and culturally significant region with a historical geographical connection. This study investigates the spatial-temporal distribution patterns and entropy changes of historical and cultural heritage by examining the association between cultural heritage and socio-historical factors. It utilizes analytical methods such as information entropy and incorporates temporal, spatial, and typological information from the data obtained in the Third National Cultural Relics Census. The findings are as follows: (1) The three major regions in the Tibetan area of China alternately serve as development cores for the traditional Tibetan area, exhibiting a fluctuating “dispersion-aggregation” trend of historical and cultural heritage, which also displays notable regional variations. (2) The quantity and entropy change of historical and cultural heritage exhibit correlations between different periods, but there are also some intergenerational differences. (3) The spatial-temporal distribution pattern of historical and cultural heritage demonstrates an inter-era correlation, indicating that socio-historical development is a nonlinear process characterized by both “transition” and “accumulation”. These findings are of significant importance for further understanding the social evolutionary process of human settlements in high-altitude areas and for the comprehensive protection of cultural heritage in ethnic regions.
- Published
- 2023
- Full Text
- View/download PDF
36. A New Transformation Technique for Reducing Information Entropy: A Case Study on Greyscale Raster Images
- Author
-
Borut Žalik, Damjan Strnad, David Podgorelec, Ivana Kolingerová, Luka Lukač, Niko Lukač, Simon Kolmanič, Krista Rizman Žalik, and Štefan Kohek
- Subjects
computer science ,algorithm ,string transformation ,information entropy ,Hilbert space filling curve ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This paper proposes a new string transformation technique called Move with Interleaving (MwI). Four possible ways of rearranging 2D raster images into 1D sequences of values are applied, including scan-line, left-right, strip-based, and Hilbert arrangements. Experiments on 32 benchmark greyscale raster images of various resolutions demonstrated that the proposed transformation reduces information entropy to a similar extent as the combination of the Burrows–Wheeler transform followed by the Move-To-Front or the Inversion Frequencies. The proposed transformation MwI yields the best result among all the considered transformations when the Hilbert arrangement is applied.
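For comparison with the proposed MwI (whose details are not reproduced here), the well-known Move-To-Front baseline it is evaluated against, together with the entropy measure, can be sketched as:

```python
import math
from collections import Counter

def entropy(seq):
    """Shannon entropy (bits/symbol) of any sequence of hashable symbols."""
    counts = Counter(seq)
    n = len(seq)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def move_to_front(seq):
    """Move-To-Front transform: recently seen symbols get small indices,
    so repetitive inputs map to index streams with skewed (low-entropy) statistics."""
    alphabet = sorted(set(seq))
    out = []
    for s in seq:
        i = alphabet.index(s)
        out.append(i)
        alphabet.insert(0, alphabet.pop(i))     # move the symbol to the front
    return out
```

In the paper's pipeline, MTF (or Inversion Frequencies) follows a Burrows–Wheeler transform; MwI plus a Hilbert-curve arrangement of the raster achieves a comparable entropy reduction.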
- Published
- 2023
- Full Text
- View/download PDF
37. The Question of Studying Information Entropy in Poetic Texts
- Author
-
Olga Kozhemyakina, Vladimir Barakhnin, Natalia Shashok, and Elina Kozhemyakina
- Subjects
quantitative text analysis ,information entropy ,author’s style features ,Technology ,Engineering (General). Civil engineering (General) ,TA1-2040 ,Biology (General) ,QH301-705.5 ,Physics ,QC1-999 ,Chemistry ,QD1-999 - Abstract
One of the approaches to quantitative text analysis is to represent a given text in the form of a time series, which can be followed by an information entropy study for different text representations, such as “symbolic entropy”, “phonetic entropy” and “emotional entropy” of various orders. Studying authors’ styles based on such entropic characteristics of their works seems to be a promising area in the field of information analysis. In this work, the calculations of entropy values of the first, second and third order for the corpus of poems by A.S. Pushkin and other poets from the Golden Age of Russian Poetry were carried out. The values of “symbolic entropy”, “phonetic entropy” and “emotional entropy” and their mathematical expectations and variances were calculated for given corpora using the software application that automatically extracts statistical information, which is potentially applicable to tasks that identify features of the author’s style. The statistical data extracted could become the basis of the stylometric classification of authors by entropy characteristics.
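A minimal sketch of order-n "symbolic entropy" over character n-grams, the kind of statistic described in the abstract (the function name and tolerances are illustrative):

```python
import math
from collections import Counter

def symbolic_entropy(text, order=1):
    """Shannon entropy (bits) over character n-grams of the given order."""
    grams = [text[i:i + order] for i in range(len(text) - order + 1)]
    counts = Counter(grams)
    n = len(grams)
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```

"Phonetic entropy" and "emotional entropy" would apply the same formula to phoneme sequences and emotion-label sequences respectively; per-author means and variances of these values feed the stylometric comparison.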
- Published
- 2023
- Full Text
- View/download PDF
38. Real-Time Online Goal Recognition in Continuous Domains via Deep Reinforcement Learning
- Author
-
Zihao Fang, Dejun Chen, Yunxiu Zeng, Tao Wang, and Kai Xu
- Subjects
online goal recognition ,deep reinforcement learning ,continuous domain ,communication constraints ,information entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The problem of goal recognition involves inferring the high-level task goals of an agent based on observations of its behavior in an environment. Current methods rely on offline comparison inference over observed behavior in discrete environments, which presents several challenges. First, accurately modeling the behavior of the observed agent requires significant computational resources. Second, existing methods cannot accurately recognize goals in continuous simulation environments. Finally, real-time computing power is required to infer the likelihood of each potential goal. In this paper, we propose an efficient real-time online goal recognition algorithm based on deep reinforcement learning in continuous domains. By leveraging offline modeling of the observed agent’s behavior with deep reinforcement learning, our algorithm achieves real-time goal recognition. We evaluate the algorithm’s online goal recognition accuracy and stability in continuous simulation environments under communication constraints.
- Published
- 2023
- Full Text
- View/download PDF
39. Energy, Trophic Dynamics and Ecological Discounting
- Author
-
Georgios Karakatsanis and Nikos Mamassis
- Subjects
ecosystem services ,Eltonian Pyramid ,discounting ,risk ,uncertainty ,information entropy ,Agriculture - Abstract
Ecosystems provide humanity with a wide variety of services of high economic value added, from biomass structuring to genetic information, pollutant decomposition, water purification and climate regulation. The foundation of ecosystem services is the Eltonian Pyramid, where energy metabolism and biomass building take place via prey–predator relationships. In the context of existing ecosystem services classification and valuation methods (e.g., CICES, MEA, TEEB), financial investments in ecosystem services essentially address the conservation of trophic pyramids. Our work’s main target is to investigate how the dynamics (stability or instability) of trophic pyramids impact the long-run discounting of financial investments in ecosystem services’ value. Specifically, a trophic pyramid with highly fluctuating populations generates higher risks for the production of ecosystem services, and hence for the ecological finance instruments coupled to them, due to higher temporal uncertainty or information entropy that should be incorporated into their discount rates. As this uncertainty negatively affects the net present value (NPV) of financial capital on ecosystem services, we argue that minimizing biomass fluctuations in trophic pyramids via population control should be among the priorities of ecosystem management practices. To substantiate our hypothesis, we construct a logistic predation model consistent with the Eltonian Pyramid’s ecological energetics. As the model’s parameters determine the trophic pyramid’s dynamics and uncertainty, we develop an adjusted Shannon entropy index (H(N)ADJ) to measure this effect as part of the discount rate. Indicatively, we perform a Monte Carlo simulation of a pyramid with intrinsic growth parameter values that yield oscillating population sizes. Finally, we discuss, from an ecological energetics standpoint, issues of competition and diversity in trophic pyramids, as special dimensions and extensions of our analytical framework.
- Published
- 2023
- Full Text
- View/download PDF
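The idea above, an entropy term that penalizes fluctuating populations inside the discount rate, can be sketched as follows. The fixed binning scheme, the toy series, and the linear uncertainty premium are hypothetical stand-ins for the paper’s H(N)ADJ construction:

```python
import numpy as np

def fluctuation_entropy(series, bins=10, value_range=(0.0, 200.0)):
    """Shannon entropy (bits) of how population sizes spread across fixed
    bins: a stable population occupies few bins (low H), while an
    oscillating one spreads across many bins (high H)."""
    counts, _ = np.histogram(series, bins=bins, range=value_range)
    p = counts[counts > 0] / counts.sum()
    return float(-(p * np.log2(p)).sum())

t = np.arange(200)
stable = 100 + 0.1 * np.sin(0.3 * t)        # near-constant biomass
oscillating = 100 + 60 * np.sin(0.3 * t)    # boom-and-bust cycles

# hypothetical entropy-adjusted discount rate: a base rate plus an
# uncertainty premium scaled by the fluctuation entropy
base_rate = 0.03
adjusted_rate = base_rate + 0.01 * fluctuation_entropy(oscillating)
```

A higher adjusted rate lowers the NPV of an ecosystem-service investment, which is the direction of the effect the abstract argues for.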
40. An Ensemble Outlier Detection Method Based on Information Entropy-Weighted Subspaces for High-Dimensional Data
- Author
-
Zihao Li and Liumei Zhang
- Subjects
high-dimensional data ,outlier detection ,information entropy ,subspaces ,ensemble ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Outlier detection is an important task in the field of data mining and a highly active area of research in machine learning. In industrial automation, datasets are often high-dimensional, and studying all dimensions directly leads to data sparsity, causing outliers to be masked by noise effects in high-dimensional spaces. This “curse of dimensionality” renders many conventional outlier detection methods ineffective. This paper proposes a new outlier detection algorithm called EOEH (Ensemble Outlier Detection Method Based on Information Entropy-Weighted Subspaces for High-Dimensional Data). First, random secondary subsampling is performed on the data, and detectors are run on various small-scale sub-samples to provide diverse detection results. The results are then aggregated to reduce the global variance and enhance the robustness of the algorithm. Subsequently, information entropy is used to construct a dimension-space weighting method that can discern the influential factors within different dimensional spaces. This method generates weighted subspaces and dimensions for data objects, reducing the impact of noise created by high-dimensional data and improving detection performance on high-dimensional data. Finally, this study designs a new high-precision local outlier factor (HPLOF) detector that amplifies the differentiation between normal and outlier data, further improving the detection performance of the algorithm. The feasibility of the algorithm is validated through experiments on both simulated and UCI datasets. Compared with current popular outlier detection algorithms, EOEH improves detection performance by 6% on average and, for high-dimensional data, runs 20% faster.
- Published
- 2023
- Full Text
- View/download PDF
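One common way to realize the entropy-based dimension weighting described above is to down-weight near-uniform (noise-like) dimensions. This sketch uses histogram entropies and a `1 − normalized entropy` weight, which is an assumption rather than the paper’s exact formula:

```python
import numpy as np

def entropy_weights(X, bins=10):
    """Per-dimension weights from normalized Shannon entropy: dimensions
    whose values are concentrated (low entropy, more structure) receive
    higher weight; near-uniform, noise-like dimensions are down-weighted."""
    n, d = X.shape
    w = np.empty(d)
    for j in range(d):
        counts, _ = np.histogram(X[:, j], bins=bins)
        p = counts[counts > 0] / n
        h = -(p * np.log2(p)).sum()
        w[j] = 1.0 - h / np.log2(bins)   # 1 - normalized entropy, in [0, 1]
    return w / w.sum()

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0.0, 1.0, 500),     # structured dimension
                     rng.uniform(-3.0, 3.0, 500)])  # noise-like dimension
w = entropy_weights(X)
```

A weighted distance using `w` would then feed a local-outlier-factor-style detector such as the paper’s HPLOF.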
41. TMD Design by an Entropy Index for Seismic Control of Tall Shear-Bending Buildings
- Author
-
Yumei Wang and Zhe Qu
- Subjects
shear-bending building ,Grammians ,Hankel singular values (HSVs) ,information entropy ,optimal TMD ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This study proposes a new arrangement-tuning method to maximize the potential of tuned mass dampers (TMDs) in decreasing the seismic responses of tall buildings. The method relies on a Grammian-based entropy index with the physical meaning of covariance responses to white noise, without the involvement of external inputs. A twelve-story RC frame-shear wall building was used as an example to illustrate the method. Indices were computed for the building with TMDs placed on different stories and tuned to different modes, and were compared with responses to white noise (colored) time histories. Results showed that cases with greater index reduction agree well with cases with greater story-drift reduction, despite differences in the time step of the white noise and in structural model type (pure shear vs. shear-bending), and that the optimal TMD is not necessarily the traditional “roof—1st mode tuning” case. Comparisons were also made for the shear-bending building under seven earthquake excitations. It was found that, although TMDs are not full-band effective controllers, the index-selected TMDs still perform best in three out of seven earthquakes. Thus, the proposed internal-property-based entropy index provides a good controller design for large-scale structures under unpredictable non-stationary excitations.
- Published
- 2023
- Full Text
- View/download PDF
42. Quantifying Multi-Scale Performance of Geometric Features for Efficient Extraction of Insulators from Point Clouds
- Author
-
Jie Tang, Junxiang Tan, Yongyong Du, Haojie Zhao, Shaoda Li, Ronghao Yang, Tao Zhang, and Qitao Li
- Subjects
insulator extraction ,point clouds ,power inspection ,quantification ,information entropy ,Science - Abstract
Insulator extraction from images or 3D point clouds is an important part of automatic power inspection by unmanned aerial vehicles (UAVs), which is vital for improving the efficiency of inspection and the stability of power grids. However, for point cloud data, many challenges, such as the diversity of pylon shapes and insulator types, complex topology, and the similarity of structures, have not been tackled in studies of power element extraction. To efficiently identify small insulators in complex power transmission corridor (PTC) scenarios, this paper proposes a robust extraction method that fuses multi-scale neighborhoods and multi-feature entropy weighting. First, pylons are located based on height differences and continuous vertical distribution, and the pylon head is segmented according to the aspect ratio of horizontal slices. Then, to quantify the different contributions of features in decision-making and better segment insulators, a feature evaluation system combining information entropy, eigenentropy-based optimal neighborhood selection, and designed multi-scale features is constructed to identify suspension and tension insulators. In the optimization step, a region erosion and growing method is proposed to segment complete insulator strings by enlarging the perspectives to obtain more object representations. The extraction results for 82 pylons with 654 insulators demonstrate that the proposed method is suitable for different pylon shapes and sizes. The identification accuracy over the whole line reaches 98.23% and the average F1 score is 90.98%. The proposed method can provide technical support for automatic UAV inspection and pylon reconstruction.
- Published
- 2023
- Full Text
- View/download PDF
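The eigenentropy criterion mentioned above, often used for optimal neighborhood selection in point cloud feature extraction, can be sketched as the Shannon entropy of the normalized covariance eigenvalues of a local neighborhood: low entropy indicates a well-defined structure (line or plane), high entropy a volumetric blob. The toy neighborhoods are illustrative:

```python
import numpy as np

def eigen_entropy(neighbors):
    """Shannon entropy (nats) of the normalized covariance eigenvalues of
    a local point neighborhood (shape: n_points x 3). Minimizing this over
    candidate neighborhood sizes is one way to pick the 'optimal' scale."""
    cov = np.cov(neighbors.T)
    ev = np.clip(np.linalg.eigvalsh(cov), 1e-12, None)
    p = ev / ev.sum()
    return float(-(p * np.log(p)).sum())

rng = np.random.default_rng(1)
linear = np.column_stack([np.linspace(0.0, 1.0, 50),
                          np.zeros(50), np.zeros(50)])
linear += rng.normal(0.0, 1e-3, linear.shape)   # thin line: one dominant eigenvalue
volume = rng.uniform(0.0, 1.0, (50, 3))         # 3-D blob: three comparable eigenvalues
```

Scanning `eigen_entropy` over several neighborhood radii and keeping the minimizer is the usual form of the selection step.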
43. Identifying Influential Nodes in Complex Networks Based on Information Entropy and Relationship Strength
- Author
-
Ying Xi and Xiaohui Cui
- Subjects
complex networks ,information entropy ,influential node ,relationship strength ,SIR ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Identifying influential nodes is a key research topic in complex networks, and there have been many studies exploring node influence based on complex networks. Graph neural networks (GNNs) have emerged as a prominent deep learning architecture, capable of efficiently aggregating node information and discerning node influence. However, existing graph neural networks often ignore the strength of the relationships between nodes when aggregating information about neighboring nodes. In complex networks, neighboring nodes often do not exert the same influence on the target node, so existing graph neural network methods are less effective. In addition, the diversity of complex networks makes it difficult for node features with a single attribute to adapt to different types of networks. To address these problems, this paper constructs node input features using information entropy combined with the node degree and the average degree of its neighbors, and proposes a simple and effective graph neural network model. The model obtains the strength of the relationships between nodes by considering the degree of neighborhood overlap and uses this as the basis for message passing, thereby effectively aggregating information about nodes and their neighborhoods. Experiments are conducted on 12 real networks, using the SIR model to verify the effectiveness of the model against benchmark methods. The experimental results show that the model identifies influential nodes in complex networks more effectively.
- Published
- 2023
- Full Text
- View/download PDF
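A minimal sketch of two ingredients described above, degree-based entropy features and relationship strength from neighborhood overlap, on a toy adjacency structure. The Jaccard form of the overlap and the tiny example graph are assumptions, not the paper’s exact definitions:

```python
from math import log2

# toy undirected graph as adjacency sets (hypothetical example network)
adj = {
    "a": {"b", "c", "d"}, "b": {"a", "c"}, "c": {"a", "b", "d"},
    "d": {"a", "c", "e"}, "e": {"d"},
}

def degree_entropy(adj):
    """Each node's contribution -p*log2(p) to the entropy of the degree
    distribution, where p is the node's share of total degree: one way to
    build an entropy-based input feature."""
    total = sum(len(nbrs) for nbrs in adj.values())
    return {v: -len(n) / total * log2(len(n) / total) for v, n in adj.items()}

def avg_neighbor_degree(adj, v):
    """Average degree of v's neighbors (another input feature above)."""
    return sum(len(adj[u]) for u in adj[v]) / len(adj[v])

def overlap_strength(adj, u, v):
    """Relationship strength as the Jaccard overlap of neighborhoods,
    excluding the endpoints themselves; used to weight message passing."""
    inter = adj[u] & adj[v]
    union = (adj[u] | adj[v]) - {u, v}
    return len(inter) / len(union) if union else 0.0
```

In a GNN, `overlap_strength` would scale each neighbor’s message before aggregation.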
44. A Chunked and Disordered Data Privacy Protection Algorithm: Application to Resource Platform Systems
- Author
-
Daike Zhang, Junyang Chen, Yihui He, Xiaoqing Lan, Xian Chen, Chunlin Dong, and Jun Li
- Subjects
data privacy ,information security ,information entropy ,randomness ,resource platform ,secret key space ,Technology ,Engineering (General). Civil engineering (General) ,TA1-2040 ,Biology (General) ,QH301-705.5 ,Physics ,QC1-999 ,Chemistry ,QD1-999 - Abstract
This paper provides a systematic analysis of existing resource platforms, evaluating their advantages and drawbacks with respect to data privacy protection. To address the privacy and security risks associated with resource platform data, we propose a novel privacy protection algorithm based on chunking and disorder. Our algorithm exchanges the positions of data blocks within a specific range of chunk sizes and combines the chunked data with its MD5 value in a differential way, thus ensuring data privacy. To ensure the security of the algorithm, we also discuss the importance of preventing client and server decompilation during its implementation. The findings of our experiments are as follows. The proposed privacy-preserving algorithm is highly secure and easy to implement. It exhibits a significant avalanche effect, with values maintained at 0.61–0.85 and information entropy maintained at 4.5–4.9, indicating that the algorithm is highly efficient without compromising data security. Furthermore, the algorithm has stable encryption and decryption times. The key length can be up to 594 bits, rendering it challenging to decrypt. Compared with the traditional DES algorithm, our algorithm offers better security under the same conditions and approaches the levels of security offered by the AES and RC4 algorithms.
- Published
- 2023
- Full Text
- View/download PDF
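The two figures of merit quoted above, the avalanche effect and ciphertext information entropy, can be computed as follows. This is a sketch of the evaluation metrics only, not of the chunking algorithm itself:

```python
from math import log2

def byte_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte of a byte string (max 8.0);
    the abstract reports ciphertext entropy of 4.5-4.9 on this scale."""
    freq = {}
    for b in data:
        freq[b] = freq.get(b, 0) + 1
    n = len(data)
    return -sum(c / n * log2(c / n) for c in freq.values())

def avalanche_ratio(c1: bytes, c2: bytes) -> float:
    """Fraction of differing bits between two equal-length ciphertexts
    produced from inputs differing in a single bit (ideal: about 0.5)."""
    diff = sum(bin(x ^ y).count("1") for x, y in zip(c1, c2))
    return diff / (8 * len(c1))
```

In an evaluation harness, one would encrypt two plaintexts differing in one bit and feed both ciphertexts to `avalanche_ratio`.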
45. Adaptive Fusion Sampling Strategy Combining Geotechnical and Geophysical Data for Evaluating Two-Dimensional Soil Liquefaction Potential and Reconsolidation Settlement
- Author
-
Huajian Yang, Zhikui Liu, Yan Yan, Yuantao Li, and Guozheng Tao
- Subjects
liquefaction potential ,cone penetration test ,Bayesian compressive sampling ,information entropy ,data fusion ,differential settlement ,Technology ,Engineering (General). Civil engineering (General) ,TA1-2040 ,Biology (General) ,QH301-705.5 ,Physics ,QC1-999 ,Chemistry ,QD1-999 - Abstract
In engineering practice, properly characterizing the spatial distribution of soil liquefaction potential and induced surface settlement is essential for seismic hazard assessment and mitigation. However, geotechnical site investigations (e.g., the cone penetration test (CPT)) usually provide accurate but limited and sparse data. Geophysical surveys provide abundant two-dimensional (2D) data, yet their accuracy is lower than that of geotechnical investigations. Moreover, correlating geotechnical and geophysical data can effectively reduce site investigation costs. This study proposes a data-driven adaptive fusion sampling strategy that automatically develops an assessment model of the spatial distribution of soil liquefaction potential from spatially sparse geotechnical data, monitors liquefaction-induced settlement, and integrates spatiotemporally unconstrained geophysical data to update the model systematically and quantitatively. The proposed strategy is illustrated using real data, and the results indicate that it overcomes the difficulty of generating high-resolution spatial distributions of liquefaction potential from sparse geotechnical data, enables more accurate judgment of settlement variations in local areas, and is an effective tool for site liquefaction hazard analysis.
- Published
- 2023
- Full Text
- View/download PDF
46. A New Method for Crop Type Mapping at the Regional Scale Using Multi-Source and Multi-Temporal Sentinel Imagery
- Author
-
Xiaohu Wang, Shifeng Fang, Yichen Yang, Jiaqiang Du, and Hua Wu
- Subjects
crop type mapping ,the regional scale ,multi-source ,multi-temporal ,time-series ,information entropy ,Science - Abstract
Crop type mapping at high resolution is crucial for various purposes related to agriculture and food security, including monitoring crop yields, evaluating the potential effects of natural disasters on agricultural production, and analyzing the potential impacts of climate change on agriculture. However, accurately mapping crop types and ranges on large spatial scales remains a challenge. For the accurate mapping of crop types at the regional scale, this paper proposes a crop type mapping method based on the combination of multiple single-temporal feature images and time-series feature images derived from Sentinel-1 (SAR) and Sentinel-2 (optical) satellite imagery on the Google Earth Engine (GEE) platform. First, crop type classification was performed separately using multiple single-temporal feature images and the time-series feature image. Second, with the help of information entropy, this study proposes a pixel-scale crop type classification accuracy evaluation metric, the CA-score, which is used to conduct a vote on the classification results of the multiple single-temporal images and the time-series feature image to obtain the final crop type map. A comparative analysis showed that the proposed classification method performs excellently and can accurately map multiple crop types at a 10 m resolution over large spatial scales. The overall accuracy (OA) and kappa coefficient (KC) were 84.15% and 0.80, respectively. Compared with the classification results based on the time-series feature image alone, the OA was improved by 3.37% and the KC by 0.03. In addition, the CA-score effectively reflects the accuracy of crop identification and can serve as a pixel-scale classification accuracy evaluation metric, providing a more comprehensive visual interpretation of the classification accuracy. The proposed method and metrics have the potential to be applied to the mapping of larger study areas with more complex land cover types using remote sensing.
- Published
- 2023
- Full Text
- View/download PDF
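The entropy-based voting idea behind the CA-score can be sketched as agreement across per-pixel classification results: the final label comes from a majority vote, and the confidence is one minus the normalized entropy of the votes. The exact CA-score definition in the paper may differ; this is an illustrative reading:

```python
from collections import Counter
from math import log2

def ca_score(votes):
    """Pixel-level agreement score in [0, 1]: 1 minus the normalized
    Shannon entropy of the class votes cast by the individual
    classifications (1.0 = unanimous agreement)."""
    counts = Counter(votes)
    if len(counts) == 1:
        return 1.0
    n = len(votes)
    h = -sum(c / n * log2(c / n) for c in counts.values())
    return 1.0 - h / log2(len(counts))

def vote(votes):
    """Final per-pixel label by majority vote."""
    return Counter(votes).most_common(1)[0][0]

# hypothetical per-pixel results from single-temporal and time-series classifiers
pixel_votes = ["maize", "maize", "wheat", "maize"]
label, score = vote(pixel_votes), ca_score(pixel_votes)
```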
47. Psychological Implicit Motives Construct as an Emergent Fractal Attractor from Intermittent Neurophysiological Responses: Simulation and Entropy-like Characterization
- Author
-
Miguel Ángel Martín, Celia Vara, and Carlos García-Gutiérrez
- Subjects
Implicit Motives ,Iterated Random Function Systems ,information entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Implicit Motives are non-conscious needs that drive human behavior towards the achievement of incentives that are affectively incited. Repeated affective experiences that provide satisfying rewards have been held responsible for the building of Implicit Motives. Responses to rewarding experiences have a biological basis via close connections with the neurophysiological systems controlling neurohormone release. We propose an iterated random function system acting on a metric space to model experience–reward interactions. The model is based on key facts of Implicit Motive theory reported in a broad number of studies. It shows how (random) responses produced by intermittent random experiences create a well-defined probability distribution on an attractor, thus providing insight into the underlying mechanism leading to the emergence of Implicit Motives as psychological structures. The robustness and resilience properties of Implicit Motives appear theoretically explained by the model. The model also provides uncertainty entropy-like parameters to characterize Implicit Motives, which may prove useful, beyond the purely theoretical frame, when used in combination with neurophysiological methods.
- Published
- 2023
- Full Text
- View/download PDF
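The iterated random function system described above can be simulated with a simple chaos-game loop: at each step one contraction map is chosen with its probability and applied, and the orbit settles onto the attractor carrying the invariant distribution. The two affine maps below are hypothetical stand-ins for the paper’s experience–reward maps:

```python
import random

def iterate_ifs(maps, probs, n_steps=10000, x0=(0.0, 0.0), seed=42):
    """Iterate a random function system: repeatedly pick a contraction map
    according to probs and apply it; the visited points approximate the
    attractor and their visit frequencies the invariant distribution."""
    rng = random.Random(seed)
    x = x0
    orbit = []
    for _ in range(n_steps):
        f = rng.choices(maps, weights=probs)[0]
        x = f(x)
        orbit.append(x)
    return orbit

# two affine contractions on the unit square (a hypothetical two-experience system)
maps = [lambda p: (0.5 * p[0], 0.5 * p[1]),
        lambda p: (0.5 * p[0] + 0.5, 0.5 * p[1] + 0.5)]
orbit = iterate_ifs(maps, probs=[0.5, 0.5])
```

Entropy-like parameters can then be estimated from the visit frequencies of the orbit over a partition of the space.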
48. Fault Root Cause Tracking of the Mechanical Components of CNC Lathes Based on Information Transmission
- Author
-
Yingzhi Zhang, Guiming Guo, and Jialin Liu
- Subjects
fault root cause tracking ,signal acquisition ,information entropy ,net transfer entropy ,moving window method ,CNC lathe tool ,Chemical technology ,TP1-1185 - Abstract
This study proposes a new method for immediate fault warning and fault root cause tracing in CNC lathes. The information acquisition scheme was formulated based on an analysis of the coupling relationships between the mechanical parts of CNC lathes. Once the collected status signals were de-noised and coarse-grained, transfer entropy theory was introduced to calculate the net entropy of information transfer between the mechanical parts, after which the information transfer model was constructed. The sliding window method was used to determine the probability threshold interval of the net information transfer entropy between the lathe’s mechanical parts under different processing modes. The transition critical point was thus determined according to the information entropy, and the fault development process was clarified. By analyzing changes in information transfer between the parts, fault early warning and fault root cause tracking on the CNC lathe were realized. The proposed method makes fault diagnosis digital and intelligent and has the advantage of timely, efficient diagnosis. Finally, the effectiveness of the proposed method is verified by a CNC lathe tool processing experiment.
- Published
- 2023
- Full Text
- View/download PDF
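The net transfer entropy used above to detect directed information flow between components can be sketched with a plug-in estimator on discretized (coarse-grained) signals. The order-1 history and the toy lagged series are simplifying assumptions:

```python
from collections import Counter
from math import log2

def transfer_entropy(x, y):
    """Discrete transfer entropy TE(X->Y) in bits for equal-length symbol
    sequences, estimated from plug-in joint frequencies with an order-1
    history: sum over p(y1,y0,x0) * log2[ p(y1|y0,x0) / p(y1|y0) ]."""
    n = len(y) - 1
    triples = Counter((y[t + 1], y[t], x[t]) for t in range(n))
    pairs_yx = Counter((y[t], x[t]) for t in range(n))
    pairs_yy = Counter((y[t + 1], y[t]) for t in range(n))
    singles = Counter(y[t] for t in range(n))
    te = 0.0
    for (y1, y0, x0), c in triples.items():
        p_joint = c / n
        p_cond_full = c / pairs_yx[(y0, x0)]
        p_cond_hist = pairs_yy[(y1, y0)] / singles[y0]
        te += p_joint * log2(p_cond_full / p_cond_hist)
    return te

def net_transfer_entropy(x, y):
    """Positive values indicate net information flow from X to Y."""
    return transfer_entropy(x, y) - transfer_entropy(y, x)

# toy coarse-grained signals: y copies x with one step of lag, so
# information flows from x to y
x = [0, 1, 1, 0, 1, 0, 0, 1, 1, 0, 1, 1, 0, 0, 1, 0] * 10
y = [0] + x[:-1]
```

Sliding-window thresholds, as in the abstract, would then be computed by evaluating `net_transfer_entropy` over successive windows of the signals.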
49. An Information Entropy Masked Vision Transformer (IEM-ViT) Model for Recognition of Tea Diseases
- Author
-
Jiahong Zhang, Honglie Guo, Jin Guo, and Jing Zhang
- Subjects
information entropy ,masked autoencoder ,vision transformer ,tea disease image recognition ,Agriculture - Abstract
Tea is one of the most popular drinks in the world. The rapid and accurate recognition of tea diseases is of great significance for taking targeted preventive measures. In this paper, an information entropy masked vision transformer (IEM-ViT) model is proposed for the rapid and accurate recognition of tea diseases. The information entropy weighting (IEW) method is used to calculate the information entropy of each image segment, so that the model can learn the maximum amount of knowledge and information more quickly and accurately. An asymmetric encoder–decoder architecture is used in the masked autoencoder (MAE), where the encoder operates on only a subset of visible patches and the decoder recovers the masked patches, reconstructing the missing pixels for parameter sharing and data augmentation. The experimental results showed that the proposed IEM-ViT achieved an accuracy of 93.78% in recognizing seven types of tea diseases. In comparison to common image recognition algorithms, including ResNet18, VGG16, and VGG19, the recognition accuracy was improved by nearly 20%. Additionally, in comparison to six other published tea disease recognition methods, the proposed IEM-ViT model recognizes more types of tea diseases with simultaneously improved accuracy.
- Published
- 2023
- Full Text
- View/download PDF
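The patch-level information entropy weighting described above can be sketched as the grey-level histogram entropy of each image patch, so that informative (textured) patches can be prioritized when choosing what to mask or keep visible. The patch size and toy image are illustrative, not the paper’s settings:

```python
import numpy as np

def patch_entropy(img, patch=8):
    """Shannon entropy (bits) of the grey-level histogram of each
    non-overlapping patch of a 2-D uint8 image; flat patches score 0,
    textured patches score high."""
    h, w = img.shape
    out = {}
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            block = img[i:i + patch, j:j + patch]
            counts = np.bincount(block.ravel(), minlength=256)
            p = counts[counts > 0] / block.size
            out[(i, j)] = float(-(p * np.log2(p)).sum())
    return out

rng = np.random.default_rng(0)
img = np.zeros((16, 16), dtype=np.uint8)          # flat background
img[8:, 8:] = rng.integers(0, 256, (8, 8))        # textured region
ent = patch_entropy(img, patch=8)
```

Ranking patches by `ent` and masking accordingly is the weighting step; the MAE encoder and decoder then proceed as usual.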
50. Semi-Supervised Semantic Segmentation of Remote Sensing Images Based on Dual Cross-Entropy Consistency
- Author
-
Mengtian Cui, Kai Li, Yulan Li, Dany Kamuhanda, and Claudio J. Tessone
- Subjects
cross-entropy consistency ,information entropy ,semi-supervised ,channel attention mechanism ,remote sensing image ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Semantic segmentation is a growing topic in high-resolution remote sensing image processing. The information in remote sensing images is complex, and the effectiveness of most remote sensing image semantic segmentation methods depends on the number of labels; however, labeling images requires significant time and labor. To solve these problems, we propose a semi-supervised semantic segmentation method based on dual cross-entropy consistency and a teacher–student structure. First, we add a channel attention mechanism to the encoding network of the teacher model to reduce the predictive entropy of the pseudo-labels. Second, the two student networks share a common encoding network to ensure consistent input information entropy, and a sharpening function is used to reduce the information entropy of the unsupervised predictions of both student networks. Finally, we alternately train the models via two entropy-consistency tasks: (1) semi-supervising student predictions via pseudo-labels generated by the teacher model, and (2) cross-supervision between the student models. Experimental results on publicly available datasets indicate that the proposed model can fully exploit the hidden information in unlabeled images and reduce the information entropy of predictions, as well as reduce the number of labeled images required while guaranteeing accuracy. This allows the new method to outperform related semi-supervised semantic segmentation algorithms with half the proportion of labeled images.
- Published
- 2023
- Full Text
- View/download PDF
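The sharpening function mentioned above is commonly implemented as temperature sharpening: raising class probabilities to the power 1/T with T < 1 and renormalizing concentrates the distribution and lowers its entropy. The specific form here is a standard assumption from the semi-supervised learning literature, not necessarily the paper’s exact function:

```python
import numpy as np

def entropy(p):
    """Shannon entropy (bits) of a probability vector."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def sharpen(p, T=0.5):
    """Temperature sharpening: p_i^(1/T) renormalized; with T < 1 the
    distribution becomes peakier, so its entropy drops."""
    p = np.asarray(p, dtype=float)
    q = p ** (1.0 / T)
    return q / q.sum()

p = np.array([0.5, 0.3, 0.2])   # a soft per-pixel class prediction
q = sharpen(p, T=0.5)           # sharpened, lower-entropy prediction
```

Applied per pixel to the students’ unsupervised predictions, this is the entropy-reduction step the abstract refers to.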