363 results
Search Results
2. Foundations of Information Science: Selected Papers from FIS 2002
- Author
-
Pedro C. Marijuán
- Subjects
Information Science, Reductionism, Mechanics, Bioinformation, Adaptability, Entropy, Symmetry, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
The accompanying papers in the first issue of Entropy, volume 5, 2003 were presented at the electronic conference on Foundations of Information Science FIS 2002 (http://www.mdpi.net/fis2002/). The running title of this FIS e-conference was THE NATURE OF INFORMATION: CONCEPTIONS, MISCONCEPTIONS, AND PARADOXES. It was held on the Internet from 6 to 10 May 2002, and was followed by a series of discussions, structured as focused sessions, which took place on the net from 10 May 2002 until 31 January 2003 (more than 400 messages were exchanged; see: http://fis.iguw.tuwien.ac.at/mailings/). This Introduction briefly surveys the problems around the concept of information, presents the central ideas of the FIS initiative, and contrasts some of the basic differences between information and mechanics (reductionism).
- Published
- 2003
- Full Text
- View/download PDF
3. Efficient Detection of Malicious Traffic Using a Decision Tree-Based Proximal Policy Optimisation Algorithm: A Deep Reinforcement Learning Malicious Traffic Detection Model Incorporating Entropy
- Author
-
Yuntao Zhao, Deao Ma, and Wei Liu
- Subjects
network security, deep reinforcement learning, entropy, decision tree proximal policy optimisation, malicious traffic detection, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
With the popularity of the Internet and rising levels of information technology, cyber attacks have become an increasingly serious problem. They pose a great threat to the security of individuals, enterprises, and the state, making network intrusion detection technology critically important. In this paper, a malicious traffic detection model is constructed based on an entropy-based decision tree classifier and the proximal policy optimisation (PPO) algorithm from deep reinforcement learning. Firstly, the decision tree idea from machine learning is used to make a preliminary classification judgement on the dataset based on information entropy: the importance score of each feature is calculated, and the features with lower contributions are removed. The remaining features are then handed over to the PPO model for detection, with an entropy regularisation term introduced into the PPO update process. Finally, the deep reinforcement learning algorithm continuously trains and updates the parameters during detection, yielding a detection model with higher accuracy. Experiments show that the binary classification accuracy of the malicious traffic detection model based on the deep reinforcement learning PPO algorithm reaches 99.17% on the CIC-IDS2017 dataset used in this paper.
- Published
- 2024
- Full Text
- View/download PDF
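The entropy-based feature screening described in the abstract above is not spelled out; as a minimal sketch (with invented feature names and a toy dataset), information gain for discrete features could be computed like this:

```python
import math
from collections import Counter

def shannon_entropy(labels):
    """Shannon entropy (bits) of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def information_gain(feature_values, labels):
    """Entropy reduction achieved by splitting on a discrete feature."""
    n = len(labels)
    base = shannon_entropy(labels)
    weighted = 0.0
    for v in set(feature_values):
        subset = [y for x, y in zip(feature_values, labels) if x == v]
        weighted += len(subset) / n * shannon_entropy(subset)
    return base - weighted

# Toy traffic dataset: "proto" separates the classes perfectly, "len" not
# at all, so "len" would be dropped before the PPO detection stage.
X = {"proto": [0, 0, 1, 1], "len": [0, 1, 0, 1]}
y = [0, 0, 1, 1]
scores = {name: information_gain(vals, y) for name, vals in X.items()}
```

In the paper's pipeline, only features scoring above some contribution threshold would be passed on to the reinforcement learning model.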
4. Minimizing Entropy and Complexity in Creative Production: From Emergent Pragmatics to Action Semantics
- Author
-
Stephen Fox
- Subjects
active inference, assembly index, assembly theory, complexity, creativity, entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
New insights into intractable industrial challenges can be revealed by framing them in terms of natural science. One intractable industrial challenge is that creative production can be much more financially expensive and time consuming than standardized production. Creative products include a wide range of goods that have one or more original characteristics. The scaling up of creative production is hindered by high financial production costs and long production durations. In this paper, creative production is framed in terms of interactions between entropy and complexity during progressions from emergent pragmatics to action semantics. An analysis of interactions between entropy and complexity is provided that relates established practice in creative production to organizational survival in changing environments. The analysis in this paper is related to assembly theory, which is a recent theoretical development in natural science that addresses how open-ended generation of complex physical objects can emerge from selection in biology. Parallels between assembly practice in industrial production and assembly theory in natural science are explained through constructs that are common to both, such as assembly index. Overall, analyses reported in the paper reveal that interactions between entropy and complexity underlie intractable challenges in creative production, from the production of individual products to the survival of companies.
- Published
- 2024
- Full Text
- View/download PDF
5. The Holographic Principle Comes from Finiteness of the Universe’s Geometry
- Author
-
Arkady Bolotin
- Subjects
holographic principle, finite geometry, entropy, black hole, holographic image, magnification, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Discovered as an apparent pattern, a universal relation between geometry and information called the holographic principle has yet to be explained. This relation is unfolded in the present paper. As demonstrated there, the origin of the holographic principle lies in the fact that the geometry of physical space has only a finite number of points. Furthermore, it is shown that the puzzle of the holographic principle can be explained by a magnification of the grid cells used to discretize geometrical magnitudes, such as areas and volumes, into sets of points. To wit, when grid cells of the Planck scale are projected from the surface of the observable universe into its interior, they become enlarged. For that reason, the space inside the observable universe is described by a set of points whose cardinality is equal to the number of points that constitute the universe’s surface.
- Published
- 2024
- Full Text
- View/download PDF
6. Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information
- Author
-
Dah-Jing Jwo, Ta-Shun Cho, and Amita Biswal
- Subjects
multivariate Gaussians, correlated random variables, visualization, entropy, relative entropy, mutual information, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
In this paper, we provide geometric insights, with visualization, into the multivariate Gaussian distribution and its entropy and mutual information. In order to develop the multivariate Gaussian distribution with entropy and mutual information, several significant methodologies are presented through the discussion, supported by illustrations, both technically and statistically. The paper examines broad measures of structure for Gaussian distributions, showing that they can be described in terms of the information-theoretic relationship between the given covariance matrix and the correlated random variables (in terms of relative entropy). The material allows readers to better grasp the concepts, understand the techniques, and correctly apply software for further study of the topic and its implementations. It also helps readers understand the fundamental concepts needed to study multivariate sets of data under the Gaussian distribution. The simulation results also convey the behavior of different elliptical interpretations based on the multivariate Gaussian distribution with entropy for real-world applications in our daily lives, including information coding, nonlinear signal detection, etc. Involving relative entropy and mutual information, as well as potential correlated covariance analysis, a wide range of topics is addressed, from basic application concerns to clinical diagnostics for detecting multi-disease effects.
- Published
- 2023
- Full Text
- View/download PDF
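The entropy and mutual information quantities behind these geometric pictures have standard closed forms for Gaussians; a small sketch (not taken from the paper) is:

```python
import math

def gaussian_entropy(cov_det, k):
    """Differential entropy (nats) of a k-variate Gaussian:
    H = 0.5 * ln((2*pi*e)**k * |Sigma|)."""
    return 0.5 * math.log((2 * math.pi * math.e) ** k * cov_det)

def mutual_information_2d(var_x, var_y, cov_xy):
    """Mutual information of a bivariate Gaussian pair:
    I(X; Y) = -0.5 * ln(1 - rho**2), with rho the correlation."""
    rho = cov_xy / math.sqrt(var_x * var_y)
    return -0.5 * math.log(1.0 - rho ** 2)

# For independent components, MI is zero and entropy adds across dimensions:
h1 = gaussian_entropy(1.0, 1)  # standard normal, ~1.4189 nats
h2 = gaussian_entropy(1.0, 2)  # identity covariance in 2-D: exactly 2 * h1
```

The elliptical level sets discussed in the abstract correspond to the quadratic form in the exponent; the determinant term above is what ties their volume to entropy.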
7. Detection of Respiratory Events during Sleep Based on Fusion Analysis and Entropy Features of Cardiopulmonary Signals
- Author
-
Xinlei Yan, Juan Liu, Lin Wang, Shaochang Wang, Senlin Zhang, and Yi Xin
- Subjects
machine learning, SAHS, apnea event, entropy, respiratory signal, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Sleep apnea hypopnea syndrome (SAHS) is a common sleep disorder with a high prevalence. The apnea hypopnea index (AHI) is an important indicator used to diagnose the severity of SAHS disorders. The calculation of the AHI is based on the accurate identification of various types of sleep respiratory events. In this paper, we propose an automatic detection algorithm for respiratory events during sleep. In addition to the accurate recognition of normal breathing, hypopnea, and apnea events using heart rate variability (HRV), entropy, and other manual features, we also present a fusion of ribcage and abdomen movement data combined with the long short-term memory (LSTM) framework to distinguish between obstructive and central apnea events. Using only electrocardiogram (ECG) features, the accuracy, precision, sensitivity, and F1 score of the XGBoost model are 0.877, 0.877, 0.876, and 0.876, respectively, demonstrating that it performs better than other models. Moreover, the accuracy, sensitivity, and F1 score of the LSTM model for detecting obstructive and central apnea events were 0.866, 0.867, and 0.866, respectively. The research results of this paper can be used for the automatic recognition of sleep respiratory events as well as AHI calculation in polysomnography (PSG), providing a theoretical basis and algorithm references for out-of-hospital sleep monitoring.
- Published
- 2023
- Full Text
- View/download PDF
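The abstract above does not specify which entropy features were extracted; as one hedged illustration, sample entropy is a common regularity feature for physiological signals such as HRV or respiratory traces:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """Sample entropy: -ln(A/B), where B counts template pairs of length m
    within tolerance r (Chebyshev distance, self-matches excluded) and A is
    the same count for length m + 1. Lower values mean a more regular signal."""
    def pairs(mm):
        templates = [x[i:i + mm] for i in range(len(x) - mm + 1)]
        return sum(
            1
            for i in range(len(templates))
            for j in range(i + 1, len(templates))
            if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r
        )
    a, b = pairs(m + 1), pairs(m)
    return -math.log(a / b) if a and b else float("inf")

# A strictly periodic signal is highly regular (entropy near 0); a chaotic
# logistic-map series of the same length is far less regular.
periodic = [0.0, 1.0] * 20
chaotic = [0.3]
for _ in range(39):
    chaotic.append(4 * chaotic[-1] * (1 - chaotic[-1]))
```

In a pipeline like the paper's, such values would be one component of the feature vector fed to the XGBoost classifier.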
8. IoT Privacy Risks Revealed
- Author
-
Kai-Chih Chang, Haoran Niu, Brian Kim, and Suzanne Barber
- Subjects
identity, privacy, privacy policy, Internet of Things, privacy risks, entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
A user’s devices, such as their phone and computer, are constantly bombarded by IoT devices and associated applications seeking connection to them. These IoT devices may or may not seek explicit user consent, leaving users either completely unaware that the IoT device is collecting, using, and/or sharing their personal data, or only marginally informed, if the user consented to the connecting IoT device but did not read the associated privacy policies. Privacy policies are intended to inform users of what personally identifiable information (PII) will be collected about them and of the policies governing how those PII data will be used and shared. This paper presents novel tools, and the underlying algorithms, employed by the Personal Privacy Assistant app (UTCID PPA) developed by the University of Texas at Austin Center for Identity to inform users of IoT devices seeking to connect to their devices and to notify those users of potential privacy risks posed by the respective IoT device. The assessment of these privacy risks must deal with the uncertainty associated with sharing the user’s personal data. If privacy risk (R) equals the consequences (C) of an incident (i.e., personal data exposure) multiplied by the probability (P) of those consequences occurring (C × P), then efforts to control risks must seek to reduce both the possible consequences of an incident and the uncertainty of the incident and its consequences occurring. This research classifies risk according to two parameters: the expected value of the incident’s consequences and the uncertainty (entropy) of those consequences. It calculates the entropy of the privacy incident consequences by evaluating: (1) the data sharing policies governing the IoT resource and (2) the type of personal data exposed.
The data sharing policies of an IoT resource are scored by the UTCID PrivacyCheck™, which uses machine learning to read and score the IoT resource privacy policies against metrics set forth by best practices and international regulations. The UTCID Identity Ecosystem uses empirical identity theft and fraud cases to assess the entropy of privacy incident consequences involving a specific type of personal data, such as name, address, Social Security number, fingerprint, and user location. By understanding the entropy of a privacy incident posed by a given IoT resource seeking to connect to a user’s device, UTCID PPA offers actionable recommendations enhancing the user’s control over IoT connections, interactions, their personal data, and, ultimately, user-centric privacy control.
- Published
- 2024
- Full Text
- View/download PDF
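The abstract's risk definition and its entropy-based uncertainty measure can be sketched directly; the numeric scales below are invented purely for illustration:

```python
import math

def privacy_risk(consequence, probability):
    """R = C x P: expected value of the incident's consequences."""
    return consequence * probability

def consequence_entropy(probabilities):
    """Shannon entropy (bits) of the distribution over the possible
    consequences of exposing a given type of personal data."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Two hypothetical IoT resources with the same expected risk but very
# different uncertainty about what the exposure actually leads to:
r = privacy_risk(consequence=10.0, probability=0.5)  # expected value 5.0
low_uncertainty = consequence_entropy([0.99, 0.01])  # ~0.08 bits
high_uncertainty = consequence_entropy([0.5, 0.5])   # 1.0 bit
```

This is the distinction the paper draws: two resources can carry equal expected consequences while differing sharply in the entropy of those consequences.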
9. A Metric Based on the Efficient Determination Criterion
- Author
-
Jesús E. García, Verónica A. González-López, and Johsac I. Gomez Sanchez
- Subjects
partition Markov models, Bayesian information criterion, entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
This paper extends the concept of metrics based on the Bayesian information criterion (BIC), to achieve strongly consistent estimation of partition Markov models (PMMs). We introduce a set of metrics drawn from the family of model selection criteria known as efficient determination criteria (EDC). This generalization extends the range of options available in BIC for penalizing the number of model parameters. We formally specify the relationship that determines how EDC works when selecting a model based on a threshold associated with the metric. Furthermore, we improve the penalty options within EDC, identifying the penalty ln(ln(n)) as a viable choice that maintains the strongly consistent estimation of a PMM. To demonstrate the utility of these new metrics, we apply them to the modeling of three DNA sequences of dengue virus type 3, endemic in Brazil in 2023.
- Published
- 2024
- Full Text
- View/download PDF
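The abstract above does not give the full metric, but the penalty comparison it describes is easy to illustrate. The criterion form below is the generic penalized-likelihood shape, an assumption rather than the paper's exact definition:

```python
import math

def penalized_criterion(log_likelihood, n_params, penalty):
    """Generic model-selection criterion: -2*logL + n_params * penalty.
    BIC corresponds to penalty = ln(n); the EDC family widens the set of
    admissible penalties, and the paper identifies ln(ln(n)) as a viable
    choice that keeps estimation strongly consistent."""
    return -2.0 * log_likelihood + n_params * penalty

n = 10_000
bic_pen = math.log(n)            # ~9.21
edc_pen = math.log(math.log(n))  # ~2.22, a much lighter penalty
# The lighter penalty grows far more slowly with n, so it punishes extra
# parameters less harshly on large samples.
```
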
10. Identifying Heterogeneity in SAR Data with New Test Statistics
- Author
-
Alejandro C. Frery, Janeth Alpala, and Abraão D. C. Nascimento
- Subjects
SAR, heterogeneity, entropy, coefficient of variation, hypothesis tests, Science
- Abstract
This paper presents a statistical approach to identify the underlying roughness characteristics in synthetic aperture radar (SAR) intensity data. The physical modeling of this kind of data allows the use of the Gamma distribution in the presence of fully developed speckle, i.e., when there are infinitely many independent backscatterers per resolution cell, and none dominates the return. Such areas are often called “homogeneous” or “textureless” regions. The GI0 distribution is also a widely accepted law for heterogeneous and extremely heterogeneous regions, i.e., areas where the fully developed speckle hypotheses do not hold. We propose three test statistics to distinguish between homogeneous and inhomogeneous regions, i.e., between gamma and GI0 distributed data, both with a known number of looks. The first test statistic uses a bootstrapped non-parametric estimator of Shannon entropy, providing a robust assessment under uncertain distributional assumptions. The second test uses the classical coefficient of variation (CV). The third test uses an alternative form of estimating the CV based on the ratio of the mean absolute deviation from the median to the median. We apply our test statistics to create maps of p-values for the homogeneity hypothesis. Finally, we show that our proposal, the entropy-based test, outperforms existing methods, such as the classical CV and its alternative variant, in identifying heterogeneity when applied to both simulated and actual data.
- Published
- 2024
- Full Text
- View/download PDF
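Two of the three test statistics (the classical CV and its robust variant) can be sketched with the standard library; the entropy-based bootstrap test is omitted for brevity, and the sample values below are invented:

```python
import statistics

def cv_classical(x):
    """Classical coefficient of variation: sample standard deviation / mean."""
    return statistics.stdev(x) / statistics.fmean(x)

def cv_robust(x):
    """Alternative CV from the paper: mean absolute deviation from the
    median, divided by the median."""
    med = statistics.median(x)
    return statistics.fmean(abs(v - med) for v in x) / med

# Under fully developed speckle, intensity is Gamma-distributed and the CV
# is roughly 1/sqrt(L) for L looks; markedly larger CVs flag heterogeneity.
homogeneous = [0.90, 1.00, 1.10, 1.00, 0.95, 1.05]
heterogeneous = [0.20, 1.00, 5.00, 0.30, 4.00, 0.50]
```

Applied per window across an image, either statistic yields the kind of heterogeneity map the paper converts into p-values.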
11. DiffFSRE: Diffusion-Enhanced Prototypical Network for Few-Shot Relation Extraction
- Author
-
Yang Chen and Bowen Shi
- Subjects
relation extraction, diffusion model, prototypical networks, entropy, few-shot learning, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Supervised learning methods excel in traditional relation extraction tasks. However, the quality and scale of the training data heavily influence their performance. Few-shot relation extraction is gradually becoming a research hotspot whose objective is to learn and extract semantic relationships between entities with only a limited number of annotated samples. In recent years, numerous studies have employed prototypical networks for few-shot relation extraction. However, these methods often suffer from overfitting of the relation classes, making it challenging to generalize effectively to new relationships. Therefore, this paper seeks to utilize a diffusion model for data augmentation to address the overfitting issue of prototypical networks. We propose a diffusion model-enhanced prototypical network framework. Specifically, we design and train a controllable conditional relation generation diffusion model on the relation extraction dataset, which can generate the corresponding instance representation according to the relation description. Building upon the trained diffusion model, we further present a pseudo-sample-enhanced prototypical network, which is able to provide more accurate representations for prototype classes, thereby alleviating overfitting and better generalizing to unseen relation classes. Additionally, we introduce a pseudo-sample-aware attention mechanism to enhance the model’s adaptability to pseudo-sample data through a cross-entropy loss, further improving the model’s performance. A series of experiments are conducted to prove our method’s effectiveness. The results indicate that our proposed approach significantly outperforms existing methods, particularly in low-resource one-shot environments. Further ablation analyses underscore the necessity of each module in the model. As far as we know, this is the first research to employ a diffusion model for enhancing the prototypical network through data augmentation in few-shot relation extraction.
- Published
- 2024
- Full Text
- View/download PDF
12. Distinguishing the Leading Agents in Classification Problems Using the Entropy-Based Metric
- Author
-
Evgeny Kagan and Irad Ben-Gal
- Subjects
leading agents, classification, entropy, Rokhlin metric, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
The paper addresses the problem of distinguishing the leading agents in a group. The problem is considered in the framework of classification problems, where the agents in the group select items with respect to certain properties. The suggested method of distinguishing the leading agents utilizes the connectivity between the agents and the Rokhlin distance between subgroups of the agents. The method is illustrated by numerical examples. The method can be useful in considering the division of labor in swarm dynamics and in the analysis of data fusion in tasks based on wisdom-of-the-crowd techniques.
- Published
- 2024
- Full Text
- View/download PDF
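The Rokhlin distance named in the abstract above has a compact entropy-based definition, d(P, Q) = H(P|Q) + H(Q|P); a minimal sketch over partitions given as label lists is:

```python
import math
from collections import Counter

def _entropy(counts, n):
    """Shannon entropy (bits) from a collection of block sizes."""
    return -sum(c / n * math.log2(c / n) for c in counts if c)

def rokhlin_distance(p, q):
    """Rokhlin metric between two partitions of the same set,
    d(P, Q) = H(P|Q) + H(Q|P), with each partition encoded as a list
    assigning a block label to every element."""
    n = len(p)
    h_joint = _entropy(Counter(zip(p, q)).values(), n)
    h_p = _entropy(Counter(p).values(), n)
    h_q = _entropy(Counter(q).values(), n)
    # H(P|Q) = H(P,Q) - H(Q) and H(Q|P) = H(P,Q) - H(P)
    return (h_joint - h_q) + (h_joint - h_p)

# Identical partitions are at distance 0; independent ones are far apart.
d_same = rokhlin_distance([0, 0, 1, 1], [0, 0, 1, 1])
d_diff = rokhlin_distance([0, 0, 1, 1], [0, 1, 0, 1])
```

In the paper's setting, each agent's item selections induce a partition, and pairwise Rokhlin distances separate leading from following agents.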
13. Improving the Performance and Stability of TIC and ICE
- Author
-
Tyler Ward
- Subjects
generalization error, overfitting, information criteria, entropy, AIC, TIC, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Takeuchi’s Information Criterion (TIC) was introduced as a generalization of Akaike’s Information Criterion (AIC) in 1976. Though TIC avoids many of AIC’s strict requirements and assumptions, it is only rarely used. One of the reasons for this is that the trace term introduced in TIC is numerically unstable and computationally expensive to compute. An extension of TIC called ICE was published in 2021, which allows this trace term to be used for model fitting (where it was primarily compared to L2 regularization) instead of just model selection. That paper also examined numerically stable and computationally efficient approximations that could be applied to TIC or ICE, but these approximations were tested only on small synthetic models. This paper applies and extends these approximations to larger models on real datasets for both TIC and ICE. This work shows that practical models may use TIC and ICE in a numerically stable way to achieve superior results at a reasonable computational cost.
- Published
- 2023
- Full Text
- View/download PDF
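The trace term the abstract refers to can be illustrated on a tiny model; the 2x2 linear algebra below is hand-rolled to stay dependency-free, and the matrices are invented:

```python
def tic_trace_2x2(J, H):
    """tr(J H^{-1}) for 2x2 matrices given as nested lists. In TIC this
    trace replaces AIC's raw parameter count; when the model is well
    specified, J = H and the trace reduces to the parameter count k."""
    (a, b), (c, d) = H
    det = a * d - b * c
    h_inv = [[d / det, -b / det], [-c / det, a / det]]
    return sum(J[i][k] * h_inv[k][i] for i in range(2) for k in range(2))

def tic(log_likelihood, J, H):
    """TIC = -2*logL + 2*tr(J H^{-1}); AIC is the special case trace = k."""
    return -2.0 * log_likelihood + 2.0 * tic_trace_2x2(J, H)

# With J = H the trace is exactly the parameter count (2 here), so TIC
# collapses to AIC; instability arises in practice when H is near-singular.
well_specified = tic_trace_2x2([[2.0, 0.5], [0.5, 1.0]],
                               [[2.0, 0.5], [0.5, 1.0]])
```
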
14. Assessment of Product Variety Complexity
- Author
-
Vladimir Modrak and Zuzana Soltysova
- Subjects
complexity, mass customization, product variety, entropy, product configurations, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Product variety complexity assessment plays a vital role in system design, as it has tremendous negative effects on manufacturing complexity in assembly systems and supply chains. On the other hand, practitioners and researchers frequently consider the number of product variants as a sufficient measure to be used to manage this kind of complexity. However, as shown in this study, such a measure does not reflect all pertinent features of complexity. Therefore, the main goal of the paper is to develop a measurement method for product variety complexity that adequately reflects relevant relations between the portfolio of optional components and the number of product variants. As presented in the paper, the concept of information theory can be effectively applied to measure product variety complexity. Moreover, such a measure can also be useful to better understand this system’s properties in order to reduce the level of variety-induced complexity. As such, the proposed method can be viewed as a complementary tool for reducing manufacturing complexity in terms of mass customization. The developed complexity metric was successfully tested on a realistic design example.
- Published
- 2023
- Full Text
- View/download PDF
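The abstract's core point, that a raw variant count misses pertinent features of complexity, is easy to illustrate with an entropy measure; this sketch is an assumption about the information-theoretic idea, not the paper's exact metric:

```python
import math

def variety_entropy(variant_counts):
    """Shannon entropy (bits) of the distribution of demand (or output)
    across product variants. Counting variants alone ignores how volume
    spreads across them."""
    total = sum(variant_counts)
    return -sum(c / total * math.log2(c / total) for c in variant_counts if c)

# Same number of variants (4), very different variety-induced complexity:
balanced = variety_entropy([25, 25, 25, 25])  # 2.0 bits
skewed = variety_entropy([97, 1, 1, 1])       # ~0.24 bits
```

A portfolio whose volume is concentrated in one variant behaves almost like standardized production, even though the variant count is unchanged.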
15. Entropy-Based Methods for Motor Fault Detection: A Review
- Author
-
Sarahi Aguayo-Tapia, Gerardo Avalos-Almazan, and Jose de Jesus Rangel-Magdaleno
- Subjects
entropy, motor fault detection, artificial-intelligence-based classifiers, feature vectors, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
In the signal analysis context, the entropy concept can characterize signal properties for detecting anomalies or non-representative behaviors in physical systems. In motor fault detection theory, entropy can measure disorder or uncertainty, aiding in detecting and classifying faults or abnormal operating conditions. This is especially relevant in industrial processes, where early motor fault detection can prevent progressive damage, operational interruptions, or potentially dangerous situations. The study of motor fault detection based on entropy theory also holds significant academic relevance, effectively bridging theoretical frameworks with industrial exigencies. As industrial sectors progress, applying entropy-based methodologies becomes indispensable for ensuring machinery integrity through control and monitoring systems. This academic endeavor enhances the understanding of signal processing methodologies and accelerates progress in artificial intelligence and other modern knowledge areas. A wide variety of entropy-based methods have been employed for motor fault detection. The process involves assessing the complexity of measured signals from electrical motors, such as vibrations or stator currents, to form feature vectors. These vectors are then fed into artificial-intelligence-based classifiers to distinguish between healthy and faulty motor signals. This paper discusses recent entropy-based methods and summarizes the most relevant fault detection results reported over the last 10 years.
- Published
- 2024
- Full Text
- View/download PDF
16. It Ain’t Necessarily So: Ludwig Boltzmann’s Darwinian Notion of Entropy
- Author
-
Steven Gimbel
- Subjects
Ludwig Boltzmann, entropy, Charles Darwin, evolution, model, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Ludwig Boltzmann’s move in his seminal paper of 1877, introducing a statistical understanding of entropy, was a watershed moment in the history of physics. The work not only introduced quantization and provided a new understanding of entropy; it also challenged the understanding of what a law of nature could be. Traditionally, nomological necessity, that is, specifying the way in which a system must develop, was considered an essential element of proposed physical laws. Yet here was a new understanding of the Second Law of Thermodynamics that no longer possessed this property. While it was a new direction in physics, in other important scientific discourses of that time, specifically Huttonian geology and Darwinian evolution, similar approaches were taken in which a system’s development followed principles, but did so in a way that both provided a direction of time and allowed for non-deterministic, though rule-based, time evolution. Boltzmann referred to both of these theories, especially the work of Darwin, frequently. The possibility that Darwin influenced Boltzmann’s thought in physics can be seen as supported by Boltzmann’s later writings.
- Published
- 2024
- Full Text
- View/download PDF
17. Enhanced Heterogeneous Graph Attention Network with a Novel Multilabel Focal Loss for Document-Level Relation Extraction
- Author
-
Yang Chen and Bowen Shi
- Subjects
relation extraction, heterogeneous graph neural network, entropy, attention mechanism, dependency tree, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
Recent years have seen a rise in interest in document-level relation extraction, which is defined as extracting all relations between entities in multiple sentences of a document. Typically, there are multiple mentions corresponding to a single entity in this context. Previous research predominantly employed a holistic representation for each entity to predict relations, but this approach often overlooks valuable information contained in fine-grained entity mentions. We contend that relation prediction and inference should be grounded in specific entity mentions rather than abstract entity concepts. To address this, our paper proposes a two-stage mention-level framework based on an enhanced heterogeneous graph attention network for document-level relation extraction. Our framework employs two different strategies to model intra-sentential and inter-sentential relations between fine-grained entity mentions, yielding local mention representations for intra-sentential relation prediction and global mention representations for inter-sentential relation prediction. For inter-sentential relation prediction and inference, we propose an enhanced heterogeneous graph attention network to better model the long-distance semantic relationships and design an entity-coreference path-based inference strategy to conduct relation inference. Moreover, we introduce a novel cross-entropy-based multilabel focal loss function to address the class imbalance problem and multilabel prediction simultaneously. Comprehensive experiments have been conducted to verify the effectiveness of our framework. Experimental results show that our approach significantly outperforms the existing methods.
- Published
- 2024
- Full Text
- View/download PDF
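The cross-entropy-based multilabel focal loss mentioned above is not specified in the abstract; a hedged sketch of the standard binary focal loss applied independently per label (Lin et al.'s form; the paper's exact variant may differ) is:

```python
import math

def multilabel_focal_loss(probs, targets, gamma=2.0, alpha=0.25):
    """Focal loss summed over independent labels: for each label,
    FL = -alpha_t * (1 - p_t)**gamma * log(p_t), where p_t is the
    predicted probability of the true outcome for that label."""
    loss = 0.0
    for p, t in zip(probs, targets):
        p_t = p if t == 1 else 1.0 - p
        a_t = alpha if t == 1 else 1.0 - alpha
        loss += -a_t * (1.0 - p_t) ** gamma * math.log(p_t)
    return loss

# Confident correct predictions are down-weighted by (1 - p_t)**gamma,
# so training focuses on hard examples, mitigating class imbalance.
easy = multilabel_focal_loss([0.95], [1])
hard = multilabel_focal_loss([0.55], [1])
```

Summing the per-label terms handles the multilabel setting, which is why this family of losses suits document-level relation extraction, where one entity pair can hold several relations.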
18. Continual Reinforcement Learning for Quadruped Robot Locomotion
- Author
-
Sibo Gai, Shangke Lyu, Hongyin Zhang, and Donglin Wang
- Subjects
continual learning, quadruped robot locomotion, reinforcement learning, plasticity, entropy, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
The ability to learn continuously is crucial for a robot to achieve a high level of intelligence and autonomy. In this paper, we consider continual reinforcement learning (RL) for quadruped robots, which includes the ability to continuously learn subsequent tasks (plasticity) and maintain performance on previous tasks (stability). The policy obtained by the proposed method enables robots to learn multiple tasks sequentially while overcoming both catastrophic forgetting and loss of plasticity. At the same time, it achieves these goals with as little modification to the original RL learning process as possible. The proposed method uses the Piggyback algorithm to select protected parameters for each task and reinitializes the unused parameters to increase plasticity. Meanwhile, we encourage the policy network to explore by raising the entropy of its soft output. Our experiments show that traditional continual learning algorithms do not perform well on robot locomotion problems, and that our algorithm is more stable and less disruptive to the RL training progress. Several robot locomotion experiments validate the effectiveness of our method.
- Published
- 2024
- Full Text
- View/download PDF
19. Financial Network Analysis on the Performance of Companies Using Integrated Entropy–DEMATEL–TOPSIS Model
- Author
-
Kah Fai Liew, Weng Siew Lam, and Weng Hoe Lam
- Subjects
causal relationship, entropy, TOPSIS, DEMATEL, multi-criteria decision making, financial ratio, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
In this paper, we propose a multi-criteria decision making (MCDM) model by integrating the entropy–DEMATEL with the TOPSIS model to analyze the causal relationship of financial ratios towards the financial performance of companies. The proposed model is illustrated using the financial data of the companies of the Dow Jones Industrial Average (DJIA). The financial network analysis using entropy–DEMATEL shows that financial ratios such as the debt to equity ratio (DER) and return on equity (ROE) are classified into the cause criteria group, whereas the current ratio (CR), earnings per share (EPS), return on assets (ROA), and debt to assets ratio (DAR) are categorized into the effect criteria group. The top three most influential financial ratios are ROE, CR, and DER. The significance of this paper lies in determining the causal relationships of the financial network towards the financial performance of companies with the proposed entropy–DEMATEL–TOPSIS model. The ranking of the companies in this study is beneficial to investors in selecting companies with good performance for portfolio investment. The proposed model has been applied and validated in portfolio investment using a mean-variance model based on the selection of companies with good performance. The results show that the proposed model is able to generate a higher mean return than the benchmark DJIA index at minimum risk. However, short selling is not permitted in the application of the proposed model to portfolio investment.
- Published
- 2022
- Full Text
- View/download PDF
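The entropy component of the integrated model above typically means the entropy weighting method for MCDM; a minimal sketch of just that step (the DEMATEL and TOPSIS stages are omitted, and the matrix is invented) is:

```python
import math

def entropy_weights(matrix):
    """Entropy weighting for an MCDM decision matrix (rows = alternatives,
    columns = criteria): criteria whose values vary more across the
    alternatives have lower entropy and therefore receive more weight."""
    m, n = len(matrix), len(matrix[0])
    k = 1.0 / math.log(m)  # normalizes entropy to [0, 1]
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        e = -k * sum((v / total) * math.log(v / total) for v in col if v > 0)
        raw.append(1.0 - e)  # degree of divergence of criterion j
    s = sum(raw)
    return [w / s for w in raw]

# Criterion 0 is identical across the companies (no information, weight ~0);
# criterion 1 discriminates strongly, so it dominates the weights.
w = entropy_weights([[1.0, 0.90], [1.0, 0.05], [1.0, 0.05]])
```

These objective weights would then feed the DEMATEL causal analysis and the TOPSIS ranking in the paper's pipeline.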
20. Dense-Frequency Signal-Detection Based on the Primal–Dual Splitting Method
- Author
-
Jiaoyu Zheng, Zheng Liao, Xiaoyang Ma, Yanlin Jin, and Huangqi Ma
- Subjects
harmonic, interharmonic, dense-frequency signal, entropy, phase analysis, primal–dual splitting method, Science, Astrophysics, QB460-466, Physics, QC1-999
- Abstract
To address the problem of dense-frequency signals in power systems caused by the growing proportion of new energy sources, this paper proposes a dense-frequency signal-detection method based on the primal–dual splitting method. After establishing the Taylor–Fourier model of the signal, the proposed method uses the sparsity of the coefficient matrix to obtain a convex optimization form of the model. The optimal solution for the estimated phasor is then obtained by iterating over the fixed-point equation, finally yielding the optimal estimation result for the dense signal. When recasting the Taylor–Fourier model in convex optimization form, the introduction of a measuring-error entropy makes the solution of the model more rigorous. Simulation experiments further verify that the estimation accuracy of the proposed primal–dual splitting method for dense signals meets the M-class PMU accuracy requirements.
- Published
- 2022
- Full Text
- View/download PDF
21. Designing a Novel Approach Using a Greedy and Information-Theoretic Clustering-Based Algorithm for Anonymizing Microdata Sets
- Author
-
Reza Ahmadi Khatir, Habib Izadkhah, and Jafar Razmara
- Subjects
information theory ,entropy ,data anonymization ,clustering ,privacy-preserving ,individuals’ privacy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Data anonymization is a technique that safeguards individuals’ privacy by modifying attribute values in published data. However, increased modification enhances privacy but diminishes the utility of the published data, necessitating a balance between the privacy and utility levels. K-anonymity is a crucial anonymization technique that generates k-anonymous clusters, where the probability of disclosing a record is 1/k. However, k-anonymity fails to protect against attribute disclosure when the diversity of sensitive values within the anonymous cluster is insufficient. Several techniques have been proposed to address this issue, among which t-closeness is considered one of the most robust privacy techniques. In this paper, we propose a novel approach employing a greedy and information-theoretic clustering-based algorithm to achieve strict privacy protection. The proposed anonymization algorithm commences by clustering the data based on both the similarity of quasi-identifier values and the diversity of sensitive attribute values. In the subsequent adjustment phase, the algorithm splits and merges the clusters to ensure that each possesses at least k members and adheres to the t-closeness requirements. Finally, the algorithm replaces the quasi-identifier values of the records in each cluster with the values of the cluster center to attain k-anonymity and t-closeness. Experimental results on three microdata sets from Facebook, Twitter, and Google+ demonstrate the proposed algorithm’s ability to preserve the utility of the released data by minimizing the modification of attribute values while satisfying the k-anonymity and t-closeness constraints.
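The t-closeness requirement mentioned above can be sketched as a distribution-distance check. This minimal version uses total variation distance as a simple stand-in for the Earth Mover's Distance of the original t-closeness definition, with hypothetical sensitive values:

```python
from collections import Counter

def distribution(values):
    """Empirical distribution of a sensitive attribute."""
    counts = Counter(values)
    total = len(values)
    return {v: c / total for v, c in counts.items()}

def satisfies_t_closeness(cluster, table, t):
    """True if the cluster's sensitive-value distribution is within distance t
    of the whole table's distribution (total variation distance here; the
    original t-closeness definition uses the Earth Mover's Distance)."""
    p, q = distribution(cluster), distribution(table)
    support = set(p) | set(q)
    tv = 0.5 * sum(abs(p.get(v, 0.0) - q.get(v, 0.0)) for v in support)
    return tv <= t

# Hypothetical sensitive values for a published microdata table.
table = ["flu", "flu", "cancer", "flu", "cancer", "hiv"]
ok = satisfies_t_closeness(["flu", "cancer", "hiv"], table, t=0.3)
skewed = satisfies_t_closeness(["hiv", "hiv", "hiv"], table, t=0.3)
```

A cluster that mirrors the table's distribution passes, while a cluster concentrated on one sensitive value fails, which is exactly what the adjustment phase must repair by splitting and merging.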
- Published
- 2023
- Full Text
- View/download PDF
22. Relative Entropy of Correct Proximal Policy Optimization Algorithms with Modified Penalty Factor in Complex Environment
- Author
-
Weimin Chen, Kelvin Kian Loong Wong, Sifan Long, and Zhili Sun
- Subjects
correct proximal policy optimization ,approximation theory ,reinforcement learning ,optimization ,policy gradient ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
In the field of reinforcement learning, we propose a Correct Proximal Policy Optimization (CPPO) algorithm based on a modified penalty factor β and relative entropy, in order to address the robustness and stationarity issues of traditional algorithms. Firstly, in the process of reinforcement learning, this paper establishes a strategy evaluation mechanism through the policy distribution function. Secondly, the state space function is quantified by introducing entropy, whereby an approximation policy is used to approximate the real policy distribution, and kernel function estimation together with the calculation of relative entropy is used to fit the reward function for complex problems. Finally, through comparative analysis on classic test cases, we demonstrate that the proposed algorithm is effective, converges faster and performs better than the traditional PPO algorithm, and that the relative entropy measure can reveal the differences. In addition, it can use the information of a complex environment more efficiently to learn policies. At the same time, our paper not only explains the rationality of the policy distribution theory; the proposed framework can also balance iteration steps, computational complexity and convergence speed, and we introduce an effective measure of performance based on the relative entropy concept.
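The relative-entropy (KL) penalty at the heart of penalty-based PPO variants can be sketched as follows. This is a toy scalar version that only illustrates the shape of the objective, not the paper's CPPO or its adaptive β scheme:

```python
import math

def kl_divergence(p, q):
    """Relative entropy D(p || q) between discrete policy distributions."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def penalized_objective(ratio, advantage, p_old, p_new, beta):
    """Toy scalar surrogate in the spirit of KL-penalty PPO: the
    advantage-weighted probability ratio minus beta times the relative
    entropy between the old and new policies."""
    return ratio * advantage - beta * kl_divergence(p_old, p_new)

p_old = [0.5, 0.3, 0.2]   # hypothetical action distributions at one state
p_new = [0.4, 0.4, 0.2]
obj = penalized_objective(ratio=1.1, advantage=2.0,
                          p_old=p_old, p_new=p_new, beta=0.5)
```

Raising β penalizes large policy updates more strongly, which is the lever such algorithms tune to trade convergence speed against stability.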
- Published
- 2022
- Full Text
- View/download PDF
23. Breaking Barriers in Emerging Biomedical Applications
- Author
-
Konstantinos Katzis, Lazar Berbakov, Gordana Gardašević, and Olivera Šveljo
- Subjects
biomedical ,smart healthcare ,Internet of Things ,image compression ,entropy ,emerging technologies ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The recent global COVID-19 pandemic has revealed that the current healthcare system in modern society can hardly cope with the increased number of patients. Part of the load can be alleviated by incorporating smart healthcare infrastructure into the current system to enable patients’ remote monitoring and personalized treatment. Technological advances in communications and sensing devices have enabled the development of new, portable, and more power-efficient biomedical sensors, as well as innovative healthcare applications. Nevertheless, such applications require reliable, resilient, and secure networks. This paper aims to identify the communication requirements for the mass deployment of such smart healthcare sensors by providing an overview of the underlying Internet of Things (IoT) technologies. Moreover, it highlights the importance of information theory in understanding the limits and barriers in this emerging field. With this motivation, the paper indicates how data compression and the entropy used in security algorithms may pave the way towards the mass deployment of such IoT healthcare devices. Future medical practices and paradigms are also discussed.
- Published
- 2022
- Full Text
- View/download PDF
24. Reliability Analysis of the New Exponential Inverted Topp–Leone Distribution with Applications
- Author
-
Ahmed Sayed M. Metwally, Amal S. Hassan, Ehab M. Almetwally, B M Golam Kibria, and Hisham M. Almongy
- Subjects
new exponential-X ,stress–strength reliability ,entropy ,Bayesian ,maximum product spacing ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The inverted Topp–Leone distribution is a new, appealing model for reliability analysis. In this paper, a new distribution, named the new exponential inverted Topp–Leone (NEITL) distribution, is presented, which adds an extra shape parameter to the inverted Topp–Leone distribution. Graphical representations of its density, survival, and hazard rate functions are provided. The following properties are explored: quantile function, mixture representation, entropies, moments, and stress–strength reliability. We plot the skewness and kurtosis measures of the proposed model based on the quantiles. Three different estimation procedures are suggested to estimate the distribution parameters, reliability, and hazard rate functions, along with their confidence intervals. Additionally, stress–strength reliability estimators for the NEITL model are obtained. To illustrate the findings of the paper, two real datasets from the engineering and medical fields are analyzed.
- Published
- 2021
- Full Text
- View/download PDF
25. Spectrum Sensing Implemented with Improved Fluctuation-Based Dispersion Entropy and Machine Learning
- Author
-
Gianmarco Baldini, Jean-Marc Chareau, and Fausto Bonavitacola
- Subjects
spectrum sensing ,entropy ,machine learning ,signal processing ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Spectrum sensing is an important function in radio frequency spectrum management and cognitive radio networks. Spectrum sensing is used by one wireless system (e.g., a secondary user) to detect the presence of a wireless service with higher priority (e.g., a primary user) with which it has to coexist in the radio frequency spectrum. If the wireless signal is detected, the secondary user system releases the given frequency to avoid interference. This paper proposes a machine learning implementation of spectrum sensing using the entropy measure as a feature vector. In the training phase, information about the activity of the wireless service with higher priority is gathered, and the model is formed. In the classification phase, the wireless system compares the current sensing report to the created model to calculate the posterior probability and classify the sensing report into either the presence or absence of the wireless service with higher priority. This paper proposes the novel application of the Fluctuation Dispersion Entropy (FDE) measure, recently introduced in the research community, as a feature vector to build the model and implement the classification. An improved implementation of the FDE (IFDE) is used to enhance the robustness to noise. IFDE is further enhanced with an adaptive method (AIFDE) to automatically select the hyper-parameter introduced in IFDE. Then, this paper combines the machine learning approach with the entropy measure approach, which are both recent developments in spectrum sensing research. The approach is compared to similar approaches in the literature and to the classical energy detection method, using a generated radar signal data set with different SNR (dB) and fading conditions. The results show that the proposed approach is able to outperform the approaches from the literature based on other entropy measures, as well as the Energy Detector (ED), in a consistent way across different levels of SNR and fading conditions.
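The dispersion-entropy family of features used here can be sketched in its basic form: map the signal through the normal CDF, quantise into classes, and take the Shannon entropy of the embedding patterns. The fluctuation-based, improved and adaptive variants (FDE, IFDE, AIFDE) refine this recipe, so treat the code as an illustration rather than the paper's exact algorithm:

```python
import math
from collections import Counter

def dispersion_entropy(signal, classes=3, m=2):
    """Basic dispersion entropy: map the signal through the normal CDF,
    quantise into `classes` levels, count embedding patterns of length m,
    and take the Shannon entropy of the pattern frequencies."""
    n = len(signal)
    mu = sum(signal) / n
    sd = math.sqrt(sum((x - mu) ** 2 for x in signal) / n)
    # The normal CDF maps samples into (0, 1); quantise into classes 1..c.
    y = [0.5 * (1 + math.erf((x - mu) / (sd * math.sqrt(2)))) for x in signal]
    z = [min(classes, int(classes * yi) + 1) for yi in y]
    patterns = Counter(tuple(z[i:i + m]) for i in range(n - m + 1))
    total = sum(patterns.values())
    return -sum((c / total) * math.log(c / total) for c in patterns.values())

regular = [0.0, 1.0] * 32                                     # strictly periodic
irregular = [math.sin(1.7 * i) + math.sin(4.3 * i ** 1.1) for i in range(64)]
de_regular = dispersion_entropy(regular)
de_irregular = dispersion_entropy(irregular)
```

A periodic signal yields only a couple of patterns (entropy near ln 2), while an irregular one spreads over many patterns, which is what makes such entropies usable as classifier features for occupied versus idle spectrum.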
- Published
- 2021
- Full Text
- View/download PDF
26. Entropy in Landscape Ecology: A Quantitative Textual Multivariate Review
- Author
-
Samuel A. Cushman
- Subjects
entropy ,landscape ,review ,multivariate textual analysis ,pattern ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This paper presents a multivariate textual analysis of more than 1300 papers on entropy in ecology. Six main insights emerged. First, there is a large body of literature that has addressed some aspect of entropy in ecology, most of which has been published in the last 5–10 years. Second, the vast majority of these papers focus on species distribution, species richness, relative abundance or trophic structure and not landscape-scale patterns or processes, per se. Third, there have been few papers addressing landscape-level questions related to entropy. Fourth, the quantitative analysis with hierarchical clustering identified a strongly nested structure among papers that addressed entropy in ecology. Fifth, there is clear differentiation of papers focused on landscape-level applications of entropy from other papers, with landscape-focused papers clustered together at each level of the hierarchy in a relatively small and closely associated group. Sixth, this group of landscape-focused papers was substructured between papers that explicitly adopted entropy measures to quantify the spatial pattern of landscape mosaics, often using variations on Boltzmann entropy, versus those that utilize Shannon entropy measures from information theory, which are not generally explicit in their assessment of spatial configuration. This review provides a comprehensive, quantitative assessment of the scope, trends and relationships among a large body of literature related to entropy in ecology and for the first time puts landscape ecological research on entropy into that context.
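The Shannon/Boltzmann contrast the review draws can be made concrete: Shannon entropy of class proportions measures composition only and is blind to spatial configuration. A minimal sketch:

```python
import math

def shannon_entropy(proportions):
    """Shannon entropy (nats) of landscape class proportions. This measures
    composition only: any spatial rearrangement of the same class areas
    leaves it unchanged, unlike configurational (Boltzmann-style) entropy."""
    return -sum(p * math.log(p) for p in proportions if p > 0)

even = shannon_entropy([0.25, 0.25, 0.25, 0.25])       # four equally common classes
dominated = shannon_entropy([0.85, 0.05, 0.05, 0.05])  # one dominant class
```

A checkerboard and a fully segregated map with the same class areas get identical Shannon entropy, which is precisely why the landscape-mosaic papers surveyed here turn to Boltzmann-style configurational measures instead.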
- Published
- 2021
- Full Text
- View/download PDF
27. Entropy-Based Behavioural Efficiency of the Financial Market
- Author
-
Emil Dinga, Camelia Oprean-Stan, Cristina-Roxana Tănăsescu, Vasile Brătian, and Gabriela-Mariana Ionescu
- Subjects
behaviour ,entropy ,efficiency ,implicit information ,financial market ,EMH ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The best-known and most widely used abstract model of the financial market is based on the concept of the informational efficiency (EMH) of that market. The paper proposes an alternative, which could be named the behavioural efficiency of the financial market, based on behavioural entropy instead of informational entropy. More specifically, the paper supports the idea that, in the financial market, the only measure (if any) of entropy is the set of available behaviours indicated by the implicit information. Therefore, behavioural entropy is linked to the concept of behavioural efficiency. The paper argues that, in fact, financial markets exhibit not a (real) informational efficiency but a behavioural efficiency instead. The proposal is based both on a new typology of information in the financial market (which provides the concept of implicit information, that is, information “translated” by the economic agents from observing actual behaviours) and on a non-linear (more exactly, logistic) curve linking behavioural entropy to the behavioural efficiency of financial markets. Finally, the paper proposes a synergic overcoming of both EMH and AMH based on the new concept of behavioural entropy in the financial market.
- Published
- 2021
- Full Text
- View/download PDF
28. Wavelet-Based Multiscale Intermittency Analysis: The Effect of Deformation
- Author
-
José M. Angulo and Ana E. Madrid
- Subjects
complexity ,deformation ,energy transfer ,entropy ,intermittency ,wavelets ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Intermittency represents a certain form of heterogeneous behavior that has interest in diverse fields of application, particularly regarding the characterization of system dynamics and for risk assessment. Given its intrinsic location-scale-dependent nature, wavelets constitute a useful functional tool for technical analysis of intermittency. Deformation of the support may induce complex structural changes in a signal. In this paper, we study the effect of deformation on intermittency. Specifically, we analyze the interscale transfer of energy and its implications on different wavelet-based intermittency indicators, depending on whether the signal corresponds to a ‘level’- or a ‘flow’-type physical magnitude. Further, we evaluate the effect of deformation on the interscale distribution of energy in terms of generalized entropy and complexity measures. For illustration, various contrasting scenarios are considered based on simulation, as well as two segments corresponding to different regimes in a real seismic series before and after a significant earthquake.
- Published
- 2023
- Full Text
- View/download PDF
29. Consideration for Affects of an XOR in a Random Number Generator Using Ring Oscillators
- Author
-
Ryoichi Sato, Yuta Kodera, Md. Arshad Ali, Takuya Kusaka, Yasuyuki Nogami, and Robert H. Morelos-Zaragoza
- Subjects
entropy ,field programmable gate array ,true random number generator ,period ,ring oscillator ,stochastic process ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Cloud services that offer entropy have attracted much attention. As one entropy source, a physical random number generator is used as a true random number generator, relying on its irreproducibility. This paper focuses on a physical random number generator using a field-programmable gate array as an entropy source, employing ring oscillator circuits as a representative true random number generator. This paper investigates the effects of an XOR gate in the oscillation circuit by observing the output signal period. It aims to reveal the relationship between the inputs and the output of the XOR gate in the target generator. The authors conduct two experiments to examine this relationship. It is confirmed that combining two ring oscillators with an XOR gate increases the complexity of the output cycle. In addition, verification using state transitions showed that the probability of the state transitions became evenly distributed as the number of ring oscillator circuits increased.
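The bias-reducing effect of XOR-combining two imperfect sources can be illustrated with pseudo-random stand-ins for the ring-oscillator streams. This is a simulation sketch of the general principle (the piling-up lemma), not the paper's FPGA experiment:

```python
import random

def biased_bits(n, p_one, rng):
    """Pseudo-random stand-in for a ring-oscillator bit stream with bias p_one."""
    return [1 if rng.random() < p_one else 0 for _ in range(n)]

def bias(bits):
    """Distance of the ones-frequency from the ideal 1/2."""
    return abs(sum(bits) / len(bits) - 0.5)

rng = random.Random(42)
n = 200_000
a = biased_bits(n, 0.6, rng)   # two independent, individually biased sources
b = biased_bits(n, 0.7, rng)
x = [ai ^ bi for ai, bi in zip(a, b)]
# Piling-up lemma: P(a XOR b = 1) = p + q - 2pq, which is at least as close
# to 1/2 as either input, so the XOR output carries more entropy per bit.
```

Here the inputs have biases near 0.1 and 0.2, while the XOR output's bias drops to about |p + q − 2pq − 0.5| = 0.04, consistent with the observed increase in output complexity when oscillators are combined.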
- Published
- 2021
- Full Text
- View/download PDF
30. Model for Risk Calculation and Reliability Comparison of Level Crossings
- Author
-
Pamela Ercegovac, Gordan Stojić, Miloš Kopić, Željko Stević, Feta Sinani, and Ilija Tanackov
- Subjects
risk ,reliability ,level crossings ,queueing theory ,accident ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
No country in the world is rich enough to remove all level crossings or replace them with grade-separated crossings in order to completely eliminate the possibility of accidents at the intersections of railways and roads. In the Republic of Serbia alone, the largest number of accidents occur at passive crossings, which make up three-quarters of the total number of crossings. Therefore, it is necessary to continually prioritize the level crossings at which the level of safety should be raised, primarily by analyzing the risk and reliability at all level crossings. This paper presents a model that enables this. The calculation of the maximal risk of a level crossing is achieved under the conditions of generating the maximum entropy in the virtual operating mode. The basis of the model is a heterogeneous queueing system. Maximum entropy is based on the mandatory application of an exponential distribution. The system is Markovian and is solved by a standard analytical concept. The basic input parameters for the calculation of the maximal risk are the geometric characteristics of the level crossing and the intensities and structure of the flows of road and railway vehicles. The real risk is based on statistical records of accidents and flow intensities. The exact reliability of the level crossing is calculated from the ratio of real and maximal risk, which enables their further comparison in order to raise the level of safety, and that is the basic idea of this paper.
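The link asserted above between maximum entropy and the exponential distribution can be checked numerically: among nonnegative distributions with a fixed mean, the exponential has the largest differential entropy. A small sketch using closed-form entropies (an illustration of the principle, not the paper's queueing model):

```python
import math

def exponential_entropy(mu):
    """Differential entropy of Exp with mean mu: h = 1 + ln(mu)."""
    return 1 + math.log(mu)

def uniform_entropy(mu):
    """Differential entropy of Uniform(0, 2*mu), which has the same mean: ln(2*mu)."""
    return math.log(2 * mu)

mu = 3.0
h_exp = exponential_entropy(mu)   # the maximum-entropy choice for a fixed mean
h_uni = uniform_entropy(mu)       # a competing nonnegative law falls below it
```

Since 1 + ln(μ) > ln(2μ) for every μ > 0, the exponential beats the same-mean uniform, which is the information-theoretic rationale for the model's mandatory exponential assumption.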
- Published
- 2021
- Full Text
- View/download PDF
31. Performance Evaluation of Construction Companies Using Integrated Entropy–Fuzzy VIKOR Model
- Author
-
Weng Siew Lam, Weng Hoe Lam, Saiful Hafizah Jaaman, and Kah Fai Liew
- Subjects
entropy ,fuzzy VIKOR ,multi-criteria decision making ,financial ratio ,research framework ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The construction sector plays an important role in a country’s economic development. The financial performance of a company is a good indicator of its financial health and status. In Malaysia, the government encourages the construction industry to develop an advanced infrastructure related to health, transport, education and housing. In view of the COVID-19 pandemic, the operations and financial performance of construction sector companies have been affected recently. Additionally, uncertainty plays a vital role in the multi-criteria decision-making (MCDM) process. Based on previous studies, there has been no comprehensive study conducted on the evaluation of the financial performance of construction companies by integrating entropy and fuzzy VIKOR models. Therefore, this paper aims to propose an MCDM model to evaluate and compare the financial performance of construction companies with an integrated entropy–fuzzy VIKOR model. A case study is carried out by evaluating the listed construction companies in Malaysia with the proposed model. The findings of this paper indicate that the company ECONBHD achieves the best financial performance over the study period. The significance of this paper is to determine the priority of the financial ratios and ranking of the construction companies with the proposed entropy–fuzzy VIKOR model.
- Published
- 2021
- Full Text
- View/download PDF
32. The 'Real' Gibbs Paradox and a Composition-Based Resolution
- Author
-
Fabien Paillusson
- Subjects
Gibbs paradox ,mixtures ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
There is no documented evidence to suggest that J. W. Gibbs did not recognize the indistinguishable nature of states involving the permutation of identical particles or that he did not know how to justify on a priori grounds that the mixing entropy of two identical substances must be zero. However, there is documented evidence to suggest that Gibbs was puzzled by one of his theoretical findings, namely that the entropy change per particle would amount to k_B ln 2 when equal amounts of any two different substances are mixed, no matter how similar these substances may be, and would drop straight to zero as soon as they become exactly identical. The present paper is concerned with this latter version of the Gibbs paradox and, to this end, develops a theory characterising real finite-size mixtures as realisations sampled from a probability distribution over a measurable attribute of the constituents of the substances. In this view, two substances are identical, relative to this measurable attribute, if they have the same underlying probability distribution. This implies that two identical mixtures do not need to have identical finite-size realisations of their compositions. By averaging over composition realisations, it is found that (1) fixed composition mixtures behave as homogeneous single-component substances and (2) in the limit of a large system size, the entropy of mixing per particle shows a continuous variation from k_B ln 2 to 0, as two different substances are made more similar, thereby resolving the “real” Gibbs paradox.
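One simple way to see such a continuous interpolation between k_B ln 2 and 0 is the Jensen–Shannon divergence between the attribute distributions of the two substances. This is an illustrative analogue of the paper's composition-based resolution, not its exact construction:

```python
import math

def shannon(p):
    """Shannon entropy (nats) of a discrete distribution."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def mixing_entropy_per_particle(p, q):
    """Jensen-Shannon divergence between the attribute distributions of two
    substances, in units of k_B: ln 2 for fully distinguishable substances,
    0 for identical ones, varying continuously in between."""
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return shannon(m) - 0.5 * (shannon(p) + shannon(q))

disjoint = mixing_entropy_per_particle([1.0, 0.0], [0.0, 1.0])   # fully different
identical = mixing_entropy_per_particle([0.5, 0.5], [0.5, 0.5])  # same substance
similar = mixing_entropy_per_particle([0.6, 0.4], [0.4, 0.6])    # nearly alike
```

Treating substances as distributions over a measurable attribute, as the paper does, is exactly what lets the mixing entropy vary smoothly rather than jump discontinuously to zero.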
- Published
- 2023
- Full Text
- View/download PDF
33. Comparative Study on Feature Extraction of Marine Background Noise Based on Nonlinear Dynamic Features
- Author
-
Guanni Ji, Yu Wang, and Fei Wang
- Subjects
marine background noise ,feature extraction ,nonlinear dynamics feature ,entropy ,Lempel–Ziv complexity ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Marine background noise (MBN) is the background noise of the marine environment, which can be used to invert the parameters of the marine environment. However, due to the complexity of the marine environment, it is difficult to extract the features of the MBN. In this paper, we study feature extraction methods for MBN based on nonlinear dynamics features, where the nonlinear dynamics features fall into two main categories: entropy and Lempel–Ziv complexity (LZC). We performed single-feature and multiple-feature comparative experiments on entropy-based and LZC-based feature extraction, respectively: for the entropy-based experiments, we compared feature extraction methods based on dispersion entropy (DE), permutation entropy (PE), fuzzy entropy (FE), and sample entropy (SE); for the LZC-based experiments, we compared feature extraction methods based on LZC, dispersion LZC (DLZC), permutation LZC (PLZC), and dispersion entropy-based LZC (DELZC). The simulation experiments prove that all of these nonlinear dynamics features can effectively detect changes in time series complexity, and the actual experimental results show that both the entropy-based and the LZC-based feature extraction methods deliver good feature extraction performance for MBN.
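The LZC side of the comparison rests on the classic Lempel–Ziv (LZ76) phrase-counting algorithm, which can be sketched compactly; the DLZC/PLZC/DELZC variants change how the signal is symbolised, not this core count:

```python
def lempel_ziv_complexity(s):
    """LZ76 complexity: the number of distinct phrases found by the classic
    Kaspar-Schuster parsing of the symbol sequence s (length >= 2)."""
    n = len(s)
    if n < 2:
        return 1
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:          # reproducible to the end: one final phrase
                c += 1
                break
        else:
            if k > k_max:
                k_max = k
            i += 1
            if i == l:             # no copyable prefix: start a new phrase
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

regular = lempel_ziv_complexity("01" * 16)            # periodic: few phrases
irregular = lempel_ziv_complexity("1001111011000010")  # irregular: more phrases
```

A periodic symbol stream parses into very few phrases while an irregular one keeps spawning new phrases, which is what makes the count usable as a complexity feature for MBN.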
- Published
- 2023
- Full Text
- View/download PDF
34. Some Families of Jensen-like Inequalities with Application to Information Theory
- Author
-
Neri Merhav
- Subjects
Jensen’s inequality ,convex function ,concave function ,entropy ,capacity ,moment-generating function ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
It is well known that the traditional Jensen inequality is proved by lower bounding the given convex function, f(x), by the tangential affine function that passes through the point (E{X}, f(E{X})), where E{X} is the expectation of the random variable X. While this tangential affine function yields the tightest lower bound among all lower bounds induced by affine functions that are tangential to f, it turns out that when the function f is just part of a more complicated expression whose expectation is to be bounded, the tightest lower bound might belong to a tangential affine function that passes through a point different from (E{X}, f(E{X})). In this paper, we take advantage of this observation by optimizing the point of tangency with regard to the specific given expression in a variety of cases and thereby derive several families of inequalities, henceforth referred to as “Jensen-like” inequalities, which, to the best of the author's knowledge, are new. The degree of tightness and the potential usefulness of these inequalities are demonstrated in several application examples related to information theory.
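The central observation, that the optimal tangent point need not be E{X} when f sits inside a larger expression, can be illustrated numerically. Here the convex f(x) = x² sits inside an exponential, so each tangent point a yields a valid lower bound, and a grid search shows the best a lies far from E{X} (the discrete distribution is assumed for illustration):

```python
import math

# Convex f(x) = x^2 inside an exponential: since exp is increasing and
# f(x) >= f(a) + f'(a)(x - a) for every tangent point a,
#   E[exp(f(X))] >= exp(f(a) - a*f'(a)) * E[exp(f'(a) * X)].

xs = [-1.0, 0.0, 2.0]   # hypothetical discrete distribution of X
ps = [0.3, 0.4, 0.3]

def f(x):
    return x * x

def fprime(x):
    return 2 * x

exact = sum(p * math.exp(f(x)) for p, x in zip(ps, xs))

def tangent_bound(a):
    mgf = sum(p * math.exp(fprime(a) * x) for p, x in zip(ps, xs))  # E[exp(f'(a) X)]
    return math.exp(f(a) - a * fprime(a)) * mgf

mean_x = sum(p * x for p, x in zip(ps, xs))
grid = [i / 100 for i in range(-300, 301)]
best_a = max(grid, key=tangent_bound)   # tightest tangent point on the grid
```

For this toy case the best tangent point sits near the large outlier of X rather than at E{X} = 0.3, and its bound is an order of magnitude tighter than the naive choice a = E{X}.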
- Published
- 2023
- Full Text
- View/download PDF
35. A Hyperspectral Anomaly Detection Algorithm Based on Morphological Profile and Attribute Filter with Band Selection and Automatic Determination of Maximum Area
- Author
-
Ferdi Andika, Mia Rizkinia, and Masahiro Okuda
- Subjects
anomaly detection ,hyperspectral ,morphological profile ,attribute filter ,image histogram ,entropy ,Science - Abstract
Anomaly detection is one of the most challenging topics in hyperspectral imaging due to the high spectral resolution of the images and the lack of spatial and spectral information about the anomaly. In this paper, a novel hyperspectral anomaly detection method called the morphological profile and attribute filter (MPAF) algorithm is proposed. Aiming to increase detection accuracy and reduce computing time, it consists of three steps. First, a band containing rich information is selected for anomaly detection using a novel band selection algorithm based on entropy and histogram counts. Second, the background of the selected band is removed with a morphological profile. Third, false anomalous pixels are filtered with an attribute filter. A novel algorithm is also proposed in this paper to define the maximum area of anomalous objects. Experiments were run on real hyperspectral datasets to evaluate the performance, and analysis was also conducted to verify the contribution of each step of MPAF. The results show that MPAF yields competitive results in terms of average area under the curve (AUC) for receiver operating characteristic (ROC), precision-recall, and computing time, i.e., 0.9916, 0.7055, and 0.25 s, respectively. Compared with four other anomaly detection algorithms, MPAF yielded the highest average AUC for ROC and precision-recall in eight out of thirteen and nine out of thirteen datasets, respectively. Further analysis also confirmed that each step of MPAF contributes to the detection performance.
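The entropy-plus-histogram band-selection idea in the first step can be sketched as follows. This is a simplified criterion on hypothetical pixel data, not the exact MPAF rule:

```python
import math
from collections import Counter

def band_entropy(band_pixels, bins=8):
    """Shannon entropy (bits) of a band computed from its histogram counts --
    a simplified stand-in for an entropy-plus-histogram selection criterion."""
    lo, hi = min(band_pixels), max(band_pixels)
    width = (hi - lo) / bins if hi > lo else 1.0
    hist = Counter(min(bins - 1, int((v - lo) / width)) for v in band_pixels)
    n = len(band_pixels)
    return -sum((c / n) * math.log2(c / n) for c in hist.values())

flat_band = [0.5] * 100                                          # uninformative band
rich_band = [(i % 17) / 17 + (i % 5) / 50 for i in range(100)]   # varied band
selected = max([flat_band, rich_band], key=band_entropy)         # keep richest band
```

A nearly constant band has near-zero histogram entropy and is discarded, while a band whose values spread across many histogram bins scores high and is kept for the morphological-profile step.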
- Published
- 2020
- Full Text
- View/download PDF
36. A Novel Comprehensive Evaluation Method for Estimating the Bank Profile Shape and Dimensions of Stable Channels Using the Maximum Entropy Principle
- Author
-
Hossein Bonakdari, Azadeh Gholami, Amir Mosavi, Amin Kazemian-Kale-Kale, Isa Ebtehaj, and Amir Hossein Azimi
- Subjects
water resources ,channel ,mathematical entropy model ,bank profile shape ,gene expression programming (GEP) ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This paper presents an extensive and practical study of the estimation of stable channel bank shape and dimensions using the maximum entropy principle. The transverse slope (St) distribution of threshold channel bank cross-sections satisfies the properties of a probability space. The entropy of St is subject to two constraint conditions, and the principle of maximum entropy must be applied to find the least biased probability distribution. Accordingly, the Lagrange multiplier (λ), a critical parameter in the entropy equation, is calculated numerically based on the maximum entropy principle. The main goal of the present paper is a comprehensive investigation, using Gene Expression Programming (GEP), of the hydraulic parameters governing the mean transverse slope (S̄t), given only the initial information (discharge (Q) and mean sediment size (d50)) of the intended problem. An explicit and simple equation relating the S̄t of banks to the geometric and hydraulic parameters of flow is introduced based on the GEP, in combination with the shape profile equation of previous researchers. Therefore, a reliable numerical hybrid model, namely the Entropy-based Design Model of Threshold Channels (EDMTC), is designed based on entropy theory combined with the evolutionary algorithm of the GEP model for estimating the bank profile shape and the dimensions of threshold channels. A wide range of laboratory and field data are utilized to verify the proposed EDMTC. The results demonstrate that the Shannon entropy model is accurate, with a lower average Mean Absolute Relative Error (MARE) of 0.317 than the model proposed by Cao and Knight (1997) (MARE = 0.98), in estimating the bank profile shape of threshold channels based on entropy for the first time.
Furthermore, the EDMTC proposed in this paper has acceptable accuracy in predicting the shape profile and consequently, the dimensions of threshold channel banks with a wide range of laboratory and field data when only the channel hydraulic characteristics (e.g., Q and d50) are known. Thus, EDMTC can be used in threshold channel design and implementation applications in cases when the channel characteristics are unknown. Furthermore, the uncertainty analysis of the EDMTC supports the model’s high reliability with a Width of Uncertainty Bound (WUB) of ±0.03 and standard deviation (Sd) of 0.24.
- Published
- 2020
- Full Text
- View/download PDF
37. About the Entropy of a Natural Number and a Type of the Entropy of an Ideal
- Author
-
Nicuşor Minculete and Diana Savin
- Subjects
entropy ,numbers ,ideals ,ramification theory in algebraic number fields ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
In this article, we find some properties of certain types of entropies of a natural number. We study a way of measuring the “disorder” of the divisors of a natural number. We compare two of the entropies, H and H̄, defined for a natural number. A useful property of the Shannon entropy is additivity, H_S(p⊗q) = H_S(p) + H_S(q), where p⊗q denotes the tensor product, so we focus on its study in the case of numbers and ideals. We mention that only one of the two entropy functions discussed in this paper satisfies additivity, whereas the other does not. In addition, regarding the entropy H of a natural number, we generalize this notion to ideals, and we find some of its properties.
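For concreteness, one natural entropy of this kind treats the exponents of a number's prime factorisation as a probability distribution. This is a hedged sketch of the general idea of measuring multiplicative "disorder"; the paper's precise definitions of H and H̄ may differ:

```python
import math
from collections import Counter

def prime_exponents(n):
    """Exponents a_i in the prime factorisation of n (n >= 2)."""
    exps = Counter()
    d = 2
    while d * d <= n:
        while n % d == 0:
            exps[d] += 1
            n //= d
        d += 1
    if n > 1:
        exps[n] += 1
    return exps

def number_entropy(n):
    """Shannon entropy of the distribution a_i / Omega(n) built from the
    prime-factorisation exponents of n (an illustrative construction,
    not necessarily the paper's H or H-bar)."""
    exps = prime_exponents(n)
    omega = sum(exps.values())
    return -sum((a / omega) * math.log(a / omega) for a in exps.values())

h_prime_power = number_entropy(32)            # 2^5: a single prime, no "disorder"
h_squarefree = number_entropy(2 * 3 * 5 * 7)  # four distinct primes: maximal spread
```

Under this construction a prime power has zero entropy while a squarefree number with r prime factors attains the maximal value ln r, giving a concrete sense in which entropy measures how a number's multiplicative structure is spread out.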
- Published
- 2023
- Full Text
- View/download PDF
38. Spatiotemporal Landscape Pattern Analyses Enhanced by an Integrated Index: A Study of the Changbai Mountain National Nature Reserve
- Author
-
Ying Zhang, Jingxiong Zhang, Fengyan Wang, and Wenjing Yang
- Subjects
landscape pattern metrics ,integrated spatial landscape index (ISLI) ,variogram ,entropy ,spatial correlation ,Changbai Mountain National Nature Reserve ,Science - Abstract
The analysis of spatiotemporal changes of landscape patterns is of great significance for forest protection. However, the selection of landscape metrics is often subjective, and existing composite landscape metrics rarely consider the effects of spatial correlation. A more objective approach to formulating composite landscape metrics involves proper weighting that incorporates spatial structure information into integrating individual conventional metrics selected for building a composite metric. This paper proposes an integrated spatial landscape index (ISLI) based on variogram modeling and entropy weighting. It was tested through a case study, which sought to analyze spatiotemporal changes in the landscape pattern in the Changbai Mountains over 30 years based on six global land-cover products with a fine classification system at 30 m resolution (GLC_FCS30). The test results confirm: (1) spatial structure information is useful for weighting conventional landscape pattern metrics when constructing ISLI as validated by correlation analysis between the incorporated conventional metrics and their variogram ranges. In terms of the range parameters of different land cover types, broadleaf forest and needleleaf forest have much larger range values than those of other land cover types; (2) DIVISION and PLAND, two of the conventional landscape metrics considered for constructing ISLI, were assigned the greatest weights in computing ISLI for this study; and (3) ISLI values can be used to determine the dominant landscape types. For the study area, ISLI values of broadleaf forests remained the largest until 2020, indicating that forest landscape characteristics were the most prominent during that period. After 2020, the dominance of needleleaf forest gradually increased, with its ISLI value reaching a maximum of 0.91 in 2025. 
Therefore, the proposed ISLI not only functions as an extension and complement to conventional landscape metrics but also provides more comprehensive information concerning landscape pattern dynamics.
- Published
- 2023
- Full Text
- View/download PDF
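The entropy-weighting step described in the abstract above can be sketched with the standard entropy-weight method: metrics whose values vary more across spatial units carry more information and thus receive larger weights. This is a minimal illustration under that assumption; the paper's exact ISLI formulation (including the variogram-based component) is not reproduced here, and `entropy_weights` is a hypothetical helper name.

```python
import math

def entropy_weights(matrix):
    """Entropy-weight method: rows are spatial units, columns are metrics.

    Each column is normalized to a probability distribution; its Shannon
    entropy (scaled by 1/ln(n) so the maximum is 1) measures how evenly the
    metric is spread across units. The weight of a metric is its 'degree of
    diversification' 1 - H, normalized so all weights sum to one.
    """
    n_rows = len(matrix)
    n_cols = len(matrix[0])
    k = 1.0 / math.log(n_rows)
    diversification = []
    for j in range(n_cols):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        h = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        diversification.append(1.0 - h)
    s = sum(diversification)
    return [d / s for d in diversification]

# Example: three landscape units scored on two hypothetical metrics.
# The second metric is identical across units, so it carries no information
# and receives zero weight.
weights = entropy_weights([[0.2, 0.5],
                           [0.3, 0.5],
                           [0.5, 0.5]])
```

The dispersed first metric absorbs all the weight here; in the paper's study, the same mechanism is presumably what pushed DIVISION and PLAND to the top.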
39. Entropy and Cities: A Bibliographic Analysis towards More Circular and Sustainable Urban Environments
- Author
-
Daniel R. Rondinel-Oviedo and Naomi Keena
- Subjects
entropy ,sustainable cities ,circular economy ,thermodynamics ,urban systems ,urban entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Cities are critical to a sustainable future for our planet; yet the construction and operation of cities rely on intensive resource and energy use and transformation, leading to the generation of waste, effluents, and pollution, representing negative externalities both outside and inside the city. Within every process, transformation implies the use of energy and an increase in entropy. In an urban system, the transformation of energy and materials triggers the creation of entropic landscapes, mainly in the informal city and in unguarded natural landscapes, even hundreds of kilometers away, which generates substantial economic, social, and environmental impacts. In this sense, cities are significant contributors to the environmental crisis. Upstream, degradation of landscapes and ecosystems is frequent. Cities’ externalities and exogenous consumption are directly linked with entropy and entropic landscapes, which manifest as pollution (in the air, water, and land) or waste, and in the degradation of natural ecosystems and communities. Through a systematic review of the existing literature, this paper first outlines briefly how entropy has been applied in different disciplines, then focuses on recent developments in how entropy has been defined, used, and characterized in urban studies concerning sustainability in cities and architecture, and finally presents a definition of the concept in relation to urban systems, along with key aspects to consider.
- Published
- 2023
- Full Text
- View/download PDF
40. Efficient Video Watermarking Algorithm Based on Convolutional Neural Networks with Entropy-Based Information Mapper
- Author
-
Marta Bistroń and Zbigniew Piotrowski
- Subjects
CNN ,entropy ,information mapping ,neural networks ,watermarking ,video watermarking ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
This paper presents a method for the transparent, robust, and high-capacity watermarking of video signals using an information mapper. The proposed architecture is based on the use of deep neural networks to embed the watermark in the luminance channel of the YUV color space. An information mapper is used to transform a multi-bit binary signature of varying capacity, reflecting the entropy measure of the system, into a watermark embedded in the signal frame. To confirm the effectiveness of the method, tests were carried out on video frames with a resolution of 256 × 256 pixels and watermark capacities of 4 to 16,384 bits. Transparency metrics (SSIM and PSNR) and a robustness metric, the bit error rate (BER), were used to assess the performance of the algorithms.
- Published
- 2023
- Full Text
- View/download PDF
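The evaluation metrics named in this abstract, BER for robustness and PSNR for transparency, are standard and can be sketched in a few lines. This is a generic illustration, not the paper's implementation; the function names and the flat-pixel-list representation are assumptions made for brevity.

```python
import math

def bit_error_rate(embedded_bits, recovered_bits):
    """Fraction of watermark bits recovered incorrectly: 0.0 means the whole
    signature survived, while 0.5 is chance level for random bits."""
    errors = sum(a != b for a, b in zip(embedded_bits, recovered_bits))
    return errors / len(embedded_bits)

def psnr(original, distorted, peak=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size frames given
    as flat pixel lists; higher values mean the watermark is less visible."""
    mse = sum((a - b) ** 2 for a, b in zip(original, distorted)) / len(original)
    return float("inf") if mse == 0 else 10.0 * math.log10(peak ** 2 / mse)

# Example: one of four watermark bits flipped by an attack.
ber = bit_error_rate([1, 0, 1, 1], [1, 0, 0, 1])
```

In a watermarking evaluation such as the one described, BER would be computed after each attack on the decoded signature, and PSNR between the original and watermarked frames.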
41. Considerations for Applying Entropy Methods to Temporally Correlated Stochastic Datasets
- Author
-
Joshua Liddy and Michael Busa
- Subjects
entropy ,biomechanics ,human movement ,temporal correlations ,sample entropy ,information theory ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The goal of this paper is to highlight considerations and provide recommendations for analytical issues that arise when applying entropy methods, specifically Sample Entropy (SampEn), to temporally correlated stochastic datasets, which are representative of a broad range of biomechanical and physiological variables. To simulate a variety of processes encountered in biomechanical applications, autoregressive fractionally integrated moving average (ARFIMA) models were used to produce temporally correlated data spanning the fractional Gaussian noise/fractional Brownian motion model. We then applied ARFIMA modeling and SampEn to the datasets to quantify the temporal correlations and regularity of the simulated datasets. We demonstrate the use of ARFIMA modeling for estimating temporal correlation properties and classifying stochastic datasets as stationary or nonstationary. We then leverage ARFIMA modeling to improve the effectiveness of data cleaning procedures and mitigate the influence of outliers on SampEn estimates. We also emphasize the limitations of SampEn to distinguish among stochastic datasets and suggest the use of complementary measures to better characterize the dynamics of biomechanical variables. Finally, we demonstrate that parameter normalization is not an effective procedure for increasing the interoperability of SampEn estimates, at least not for entirely stochastic datasets.
- Published
- 2023
- Full Text
- View/download PDF
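Sample Entropy, the measure this abstract is concerned with, has a compact definition: SampEn(m, r) = -ln(A/B), where B counts pairs of length-m templates whose Chebyshev distance is within tolerance r, and A counts the same for length m + 1, with self-matches excluded. The following is a straightforward O(n²) sketch of that definition (not the authors' code); the fast variants discussed elsewhere in this listing avoid the all-pairs comparison.

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B): B counts template pairs of length m within
    Chebyshev tolerance r, A counts pairs of length m + 1. Self-matches are
    excluded. r is expressed in the units of x."""
    def pairs_within(length):
        templates = [x[i:i + length] for i in range(len(x) - length + 1)]
        count = 0
        for i in range(len(templates)):
            for j in range(i + 1, len(templates)):
                if max(abs(a - b) for a, b in zip(templates[i], templates[j])) <= r:
                    count += 1
        return count

    b = pairs_within(m)
    a = pairs_within(m + 1)
    return -math.log(a / b) if a > 0 and b > 0 else float("inf")

# A strictly periodic series is highly regular: nearly every length-2 match
# extends to length 3, so SampEn is close to zero.
sampen_periodic = sample_entropy([1.0, 2.0] * 20)
```

For temporally correlated stochastic data, the paper's point is precisely that a single SampEn value like this can be ambiguous, hence its recommendation to pair it with ARFIMA-based correlation estimates.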
42. Homogeneous Adaboost Ensemble Machine Learning Algorithms with Reduced Entropy on Balanced Data
- Author
-
Mahesh Thyluru Ramakrishna, Vinoth Kumar Venkatesan, Ivan Izonin, Myroslav Havryliuk, and Chandrasekhar Rohith Bhat
- Subjects
machine learning ,entropy ,breast cancer ,ensemble methods ,precision ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Today’s world faces a serious public health problem with cancer. One type of cancer that begins in the breast and spreads to other body areas is breast cancer (BC). Breast cancer is one of the most prevalent cancers that claim the lives of women. It is also becoming clearer that most cases of breast cancer are already advanced when they are brought to the doctor’s attention by the patient. The patient may have the evident lesion removed, but the seeds have reached an advanced stage of development or the body’s ability to resist them has weakened considerably, rendering them ineffective. Although it is still much more common in more developed nations, it is also quickly spreading to less developed countries. The motivation behind this study is to use an ensemble method for the prediction of BC, as an ensemble model aims to automatically balance the strengths and weaknesses of its separate models so that the best overall decision is made. The main objective of this paper is to predict and classify breast cancer using Adaboost ensemble techniques. The weighted entropy is computed for the target column by weighting each attribute, with the weights representing the likelihood of each class; the information gain increases as the entropy decreases. Both individual classifiers and homogeneous ensemble classifiers, created by combining Adaboost with different single classifiers, have been used in this work. To deal with the class imbalance issue as well as noise, the synthetic minority over-sampling technique (SMOTE) was used as part of the data mining pre-processing. The suggested approach uses a decision tree (DT) and naive Bayes (NB) with Adaboost ensemble techniques. The experimental findings showed 97.95% accuracy for prediction using the Adaboost-random forest classifier.
- Published
- 2023
- Full Text
- View/download PDF
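The weighted entropy and information gain mentioned in this abstract can be sketched generically: entropy is computed over a weighted class distribution (e.g. with Adaboost sample weights standing in for raw counts), and the gain of a split is the parent entropy minus the weight-averaged child entropies. This is one common formulation, assumed here since the paper's exact weighting scheme is not given; the function names are illustrative.

```python
import math
from collections import Counter

def weighted_entropy(labels, weights=None):
    """Shannon entropy (bits) of the class distribution; optional per-sample
    weights (e.g. boosting weights) replace raw counts."""
    if weights is None:
        weights = [1.0] * len(labels)
    totals = Counter()
    for y, w in zip(labels, weights):
        totals[y] += w
    z = sum(totals.values())
    return -sum((v / z) * math.log2(v / z) for v in totals.values() if v > 0)

def information_gain(labels, split, weights=None):
    """Parent entropy minus the weight-fraction-averaged child entropies."""
    if weights is None:
        weights = [1.0] * len(labels)
    parent = weighted_entropy(labels, weights)
    groups = {}
    for y, s, w in zip(labels, split, weights):
        groups.setdefault(s, ([], []))
        groups[s][0].append(y)
        groups[s][1].append(w)
    z = sum(weights)
    children = sum((sum(ws) / z) * weighted_entropy(ys, ws)
                   for ys, ws in groups.values())
    return parent - children

# A split that perfectly separates two balanced classes gains one full bit.
gain = information_gain([0, 0, 1, 1], [0, 0, 1, 1])
```

This matches the abstract's observation that information gain grows as the (weighted) entropy of the children falls.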
43. Entropy, Graph Homomorphisms, and Dissociation Sets
- Author
-
Ziyuan Wang, Jianhua Tu, and Rongling Lang
- Subjects
entropy ,graph homomorphisms ,dissociation sets ,independent sets ,bipartite graphs ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Given two graphs G and H, a mapping f:V(G)→V(H) is called a graph homomorphism from G to H if it maps adjacent vertices of G to adjacent vertices of H. For a graph G, a subset of vertices is called a dissociation set of G if it induces a subgraph of G containing no path of order three, i.e., a subgraph with maximum degree at most one. Graph homomorphisms and dissociation sets are two generalizations of the concept of independent sets. In this paper, by utilizing an entropy approach, we provide upper bounds on the number of graph homomorphisms from a bipartite graph G to a graph H and on the number of dissociation sets in a bipartite graph G.
- Published
- 2023
- Full Text
- View/download PDF
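The object being counted in this abstract is easy to make concrete: a dissociation set induces a subgraph of maximum degree at most one (equivalently, no path on three vertices). A brute-force counter, shown below as a small sanity-check tool rather than anything from the paper (whose contribution is entropy-based upper bounds, not enumeration):

```python
from itertools import combinations

def count_dissociation_sets(n, edges):
    """Count vertex subsets of an n-vertex graph (vertices 0..n-1) whose
    induced subgraph has maximum degree at most one, i.e. contains no path
    of order three. Exponential in n; for sanity checks only."""
    count = 0
    for size in range(n + 1):
        for subset in combinations(range(n), size):
            s = set(subset)
            deg = {v: 0 for v in s}
            for u, v in edges:
                if u in s and v in s:
                    deg[u] += 1
                    deg[v] += 1
            if all(d <= 1 for d in deg.values()):
                count += 1
    return count

# The path 0-1-2 has 7 dissociation sets: every subset except {0, 1, 2},
# whose middle vertex would have induced degree two.
p3_count = count_dissociation_sets(3, [(0, 1), (1, 2)])
```

Since every independent set is a dissociation set (induced degree zero), counts like this always dominate the independent-set count, which is the sense in which the abstract calls dissociation sets a generalization.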
44. New Fast ApEn and SampEn Entropy Algorithms Implementation and Their Application to Supercomputer Power Consumption
- Author
-
Jiří Tomčala
- Subjects
entropy ,measure of complexity ,approximate entropy ,sample entropy ,fast approximate entropy ,fast sample entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Approximate Entropy and especially Sample Entropy are frequently used algorithms for calculating a measure of complexity of a time series. A lesser-known fact is that there are also accelerated modifications of these two algorithms, namely Fast Approximate Entropy and Fast Sample Entropy. All of these algorithms are efficiently implemented in the R software package TSEntropies. This paper contains not only an explanation of all these algorithms but also the principle of their acceleration. Furthermore, the paper describes the functions of this software package and their parameters, and gives simple examples of using the package to calculate these measures of complexity for an artificial time series and for the time series of a complex real-world system, represented by the course of supercomputer infrastructure power consumption. These time series were also used to test the speed of this package and to compare its speed with that of another R package, pracma. The results show that TSEntropies is up to 100 times faster than pracma; another important result is that the computational times of the new Fast Approximate Entropy and Fast Sample Entropy algorithms are up to 500 times lower than those of their original versions. At the very end of this paper, possible uses of the TSEntropies software package are proposed.
- Published
- 2020
- Full Text
- View/download PDF
45. Information and Statistical Measures in Classical vs. Quantum Condensed-Matter and Related Systems
- Author
-
Adam Gadomski and Sylwia Zielińska-Raczyńska
- Subjects
information ,entropy ,classical vs. quantum system ,condensed matter ,soft matter ,complex systems ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
The presented editorial briefly summarizes the efforts of the ten (10) papers collected in the Special Issue (SI) “Condensed-Matter-Principia Based Information & Statistical Measures: From Classical to Quantum”. The SI called for papers dealing with condensed-matter systems, or their interdisciplinary analogs, for which well-defined classical statistical vs. quantum information measures can be inferred based on the entropy concept. The SI mainly rests upon objectives addressed by an international colloquium held in October 2019 at the University of Science and Technology (UTP) Bydgoszcz, Poland (see http://zmpf.imif.utp.edu.pl/rci-jcs/rci-jcs-4/), with an emphasis placed on the achievements of Professor Gerard Czajkowski (PGC). PGC commenced his research activity with diffusion-reaction (open) systems under the supervision of Roman S. Ingarden (Toruń), a father of Polish synergetics and of original thermodynamic approaches to self-organization. The active cooperation of PGC, mainly with German physicists (Friedrich Schloegl, Aachen; Werner Ebeling, Berlin), ought to be underlined. The subsequent development of Czajkowski’s research is also worth underscoring: moving from statistical thermodynamics to solid-state theory, pursued in terms of nonlinear solid-state optics (Franco Bassani, Pisa), and culminating very recently with large quasiparticles, termed Rydberg excitons, and their coherent interactions with light.
- Published
- 2020
- Full Text
- View/download PDF
46. Cross-Entropy as a Metric for the Robustness of Drone Swarms
- Author
-
Piotr Cofta, Damian Ledziński, Sandra Śmigiel, and Marta Gackowska
- Subjects
entropy ,cross-entropy ,drones ,swarms ,robustness ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Due to their growing number and increasing autonomy, drones and drone swarms are equipped with sophisticated algorithms that help them achieve mission objectives. Such algorithms vary in quality, so comparing them requires a metric that allows for their correct assessment. The novelty of this paper lies in analysing, defining, and applying to swarms the construct of cross-entropy, known from thermodynamics and information theory. It can be used as a synthetic measure of the robustness of algorithms that control swarms in the presence of obstacles and unforeseen problems; robustness, in turn, is an important aspect of overall quality. This paper presents the necessary formalisation and applies it to a few examples, based on generalised unexpected behaviour and on the results of collision-avoidance algorithms used to react to obstacles.
- Published
- 2020
- Full Text
- View/download PDF
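The cross-entropy construct this abstract applies to swarms has a standard information-theoretic form: H(p, q) = -Σᵢ pᵢ ln qᵢ, the expected code length when events drawn from p are modelled by q. The paper's swarm-specific formalisation is not reproduced here; this is the textbook quantity, with the swarm interpretation (observed vs. expected behaviour distributions) stated only as an illustrative reading.

```python
import math

def cross_entropy(p, q, eps=1e-12):
    """H(p, q) = -sum_i p_i * ln(q_i). Equals the Shannon entropy of p when
    q == p; any mismatch between the two distributions only increases it.
    eps guards against log(0) for events p assigns mass to but q does not."""
    return -sum(pi * math.log(max(qi, eps)) for pi, qi in zip(p, q) if pi > 0)

# Reading for a swarm metric: p could be the observed distribution of
# behaviours under disturbance, q the distribution the algorithm was
# designed for; a robust algorithm keeps the two close.
baseline = cross_entropy([0.5, 0.5], [0.5, 0.5])   # = ln 2, the entropy of p
disturbed = cross_entropy([0.5, 0.5], [0.9, 0.1])  # mismatch raises the value
```

The gap between the two values is exactly the Kullback-Leibler divergence D(p‖q), which is why cross-entropy works as a synthetic mismatch measure.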
47. Entropy-Based Effect Evaluation of Delineators in Tunnels on Drivers’ Gaze Behavior
- Author
-
Xueyan Han, Yang Shao, Shaowei Yang, and Peng Yu
- Subjects
tunnel safety ,delineator post configurations ,entropy ,gaze behavior ,driving fatigue ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Driving safety in tunnels has always been an issue of great concern. Establishing delineators to improve drivers’ instantaneous cognition of the surrounding environment in tunnels can effectively enhance driver safety. Through a simulation study, this paper explored how delineators affect drivers’ gaze behavior (including fixation and scanpath) in tunnels. In addition to analyzing typical parameters, such as fixation position and fixation duration in areas of interest (AOIs), by modeling drivers’ switching process as Markov chains and calculating the Shannon entropy of the fitted Markov model, this paper quantified the complexity of individual switching patterns between AOIs under different delineator configurations and with different road alignments. A total of 25 subjects participated in this research. The results show that setting delineators in tunnels can attract drivers’ attention and make them focus on the pavement. When driving in tunnels equipped with delineators, especially tunnels with both wall delineators and pavement delineators, the participants exhibited a smaller transition entropy H_t and stationary entropy H_s, which can greatly reduce drivers’ visual fatigue. Compared with the left-curve and right-curve sections, participants obtained higher H_t and H_s values in the straight section.
- Published
- 2020
- Full Text
- View/download PDF
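The two quantities in this abstract, transition entropy H_t and stationary entropy H_s, can be sketched from a sequence of AOI fixations. One common formulation, assumed here since the paper's exact estimator is not given, takes H_s as the Shannon entropy of the empirical AOI distribution and H_t as the entropy rate of the fitted first-order transition matrix weighted by that distribution; `gaze_entropies` is a hypothetical helper name.

```python
import math
from collections import Counter, defaultdict

def gaze_entropies(aoi_sequence):
    """Fit a first-order Markov chain to a sequence of AOI labels.

    Returns (H_t, H_s) in bits: H_s is the entropy of the empirical AOI
    distribution; H_t averages each AOI's outgoing-transition entropy,
    weighted by how often that AOI is visited."""
    visits = Counter(aoi_sequence)
    total = sum(visits.values())
    pi = {a: c / total for a, c in visits.items()}
    h_s = -sum(p * math.log2(p) for p in pi.values())

    transitions = defaultdict(Counter)
    for a, b in zip(aoi_sequence, aoi_sequence[1:]):
        transitions[a][b] += 1
    h_t = 0.0
    for a, row in transitions.items():
        row_total = sum(row.values())
        row_h = -sum((c / row_total) * math.log2(c / row_total)
                     for c in row.values())
        h_t += pi[a] * row_h
    return h_t, h_s

# A driver who alternates deterministically between two AOIs has maximal
# stationary entropy for two AOIs but zero transition entropy: the next
# fixation is always predictable.
h_t, h_s = gaze_entropies(list("ABABABAB"))
```

Lower H_t in the delineator conditions then corresponds to more predictable, less effortful switching between AOIs.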
48. A History of Thermodynamics: The Missing Manual
- Author
-
Wayne M. Saslow
- Subjects
thermodynamics ,energy ,heat ,temperature ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
We present a history of thermodynamics. Part 1 discusses definitions, a pre-history of heat and temperature, and steam engine efficiency, which motivated thermodynamics. Part 2 considers in detail three heat conservation-based foundational papers by Carnot, Clapeyron, and Thomson. For a reversible Carnot cycle operating between thermal reservoirs with Celsius temperatures t and t + dt, heat Q from the hot reservoir, and net work W, Clapeyron derived W/Q = dt/C(t), with C(t) material-independent. Thomson used μ = 1/C(t) to define an absolute temperature but, unaware that an additional criterion was needed, he first proposed a logarithmic function of the ideal gas temperature T_g. Part 3, following a discussion of conservation of energy, considers in detail a number of energy conservation-based papers by Clausius and Thomson. As noted by Gibbs, in 1850, Clausius established the first modern form of thermodynamics, followed by Thomson’s 1851 rephrasing of what he called the Second Law. In 1854, Clausius theoretically established for a simple Carnot cycle the condition Q_1/T_1 + Q_2/T_2 = 0. He generalized it to ∑_i Q_i/T_{g,i} = 0, and then ∮ dQ/T_g = 0. This both implied a new thermodynamic state function and, with appropriate integration factor 1/T, the thermodynamic temperature. In 1865, Clausius named this new state function the entropy S.
- Published
- 2020
- Full Text
- View/download PDF
49. Fast and Efficient Image Encryption Algorithm Based on Modular Addition and SPD
- Author
-
Khushbu Khalid Butt, Guohui Li, Sajid Khan, and Sohaib Manzoor
- Subjects
image encryption ,modular addition ,scrambling plus diffusion (spd) ,sha-512 ,security ,entropy ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Bit-level and pixel-level methods are two classifications of image encryption, describing the smallest processing elements manipulated in diffusion and permutation, respectively. Most pixel-level permutation methods merely alter the positions of pixels, resulting in similar histograms for the original and permuted images. Bit-level permutation methods, by contrast, can change the histogram of the image, but are usually not preferred because they are time-consuming, owing to the bit-level computation involved, unlike other permutation techniques. In this paper, we introduce a new image encryption algorithm which uses binary bit-plane scrambling and an SPD diffusion technique for the bit-planes of a plain image, based on a card game trick. Integer values of the hexadecimal SHA-512 key are also used, along with adaptive block-based modular addition of pixels, to encrypt the images. To demonstrate the first-rate encryption performance of our proposed algorithm, security analyses are provided in this paper. Simulations and other results confirmed the robustness of the proposed image encryption algorithm against many well-known attacks; in particular, brute-force attacks, known/chosen plain-text attacks, occlusion attacks, differential attacks, and gray-value difference attacks, among others.
- Published
- 2020
- Full Text
- View/download PDF
50. Adaptive Multiscale Symbolic-Dynamics Entropy for Condition Monitoring of Rotating Machinery
- Author
-
Chunhong Dou and Jinshan Lin
- Subjects
multiscale ,symbolic dynamics ,entropy ,condition monitoring ,rotating machinery ,Science ,Astrophysics ,QB460-466 ,Physics ,QC1-999 - Abstract
Vibration data from rotating machinery working in different conditions display different properties in spatial and temporal scales. As a result, insights into spatial- and temporal-scale structures of vibration data of rotating machinery are fundamental for describing running conditions of rotating machinery. However, common temporal statistics and typical nonlinear measures have difficulties in describing spatial and temporal scales of data. Recently, statistical linguistic analysis (SLA) has been pioneered in analyzing complex vibration data from rotating machinery. Nonetheless, SLA can examine data in spatial scales but not in temporal scales. To improve SLA, this paper develops symbolic-dynamics entropy for quantifying word-frequency series obtained by SLA. By introducing multiscale analysis to SLA, this paper proposes adaptive multiscale symbolic-dynamics entropy (AMSDE). By AMSDE, spatial and temporal properties of data can be characterized by a set of symbolic-dynamics entropy, each of which corresponds to a specific temporal scale. Afterward, AMSDE is employed to deal with vibration data from defective gears and rolling bearings. Moreover, the performance of AMSDE is benchmarked against five common temporal statistics (mean, standard deviation, root mean square, skewness and kurtosis) and three typical nonlinear measures (approximate entropy, sample entropy and permutation entropy). The results suggest that AMSDE performs better than these benchmark methods in characterizing running conditions of rotating machinery.
- Published
- 2019
- Full Text
- View/download PDF
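The core quantity behind AMSDE, a symbolic-dynamics entropy computed at multiple temporal scales, can be sketched generically: coarse-grain the signal at a given scale, symbolize it, and take the Shannon entropy of the resulting word frequencies. The paper's SLA pipeline and adaptive scale selection are not reproduced here; the median-split binarization and non-overlapping averaging below are simplifying assumptions, and `symbolic_dynamics_entropy` is an illustrative name.

```python
import math
from collections import Counter
from statistics import median

def symbolic_dynamics_entropy(signal, word_len=3, scale=1):
    """Coarse-grain the signal at the given temporal scale (non-overlapping
    averages), binarize around the median, and return the Shannon entropy
    (bits) of the frequencies of overlapping binary words of length word_len.
    Evaluating this over a range of scales yields one entropy per temporal
    scale, in the spirit of multiscale symbolic-dynamics analysis."""
    coarse = [sum(signal[i:i + scale]) / scale
              for i in range(0, len(signal) - scale + 1, scale)]
    m = median(coarse)
    symbols = [1 if v > m else 0 for v in coarse]
    words = [tuple(symbols[i:i + word_len])
             for i in range(len(symbols) - word_len + 1)]
    freq = Counter(words)
    total = sum(freq.values())
    return -sum((c / total) * math.log2(c / total) for c in freq.values())

# A constant signal yields a single word (entropy 0); a strict alternation
# yields exactly two equally frequent words (entropy 1 bit).
h_const = symbolic_dynamics_entropy([5.0] * 20)
h_alt = symbolic_dynamics_entropy([0.0, 1.0] * 10)
```

Vibration from a healthy machine and from a defective gear would typically separate in such entropy-vs-scale curves, which is the intuition the abstract's benchmark comparison builds on.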