1,676 results
Search Results
2. Adjustment mode decision based on support vector data description and evidence theory for assembly lines
- Author
-
Lv, Youlong, Qin, Wei, Yang, Jungang, and Zhang, Jie
- Published
- 2018
- Full Text
- View/download PDF
3. A Space Infrared Dim Target Recognition Algorithm Based on Improved DS Theory and Multi-Dimensional Feature Decision Level Fusion Ensemble Classifier.
- Author
-
Chen, Xin, Zhang, Hao, Zhang, Shenghao, Feng, Jiapeng, Xia, Hui, Rao, Peng, and Ai, Jianliang
- Subjects
RADIANT intensity, RECOGNITION (Psychology), SUPPORT vector machines, SITUATIONAL awareness, FEATURE extraction - Abstract
Space infrared dim target recognition is an important application of space situational awareness (SSA). Due to the weak observability and lack of geometric texture of the target, it may be unreliable to rely only on grayscale features for recognition. In this paper, an intelligent information decision-level fusion method for target recognition, which takes full advantage of the ensemble classifier and Dempster–Shafer (DS) theory, is proposed. To deal with the problem that DS produces counterintuitive results when evidence conflicts, a contraction–expansion function is introduced to modify the body of evidence and mitigate conflicts between pieces of evidence. In this method, preprocessing and feature extraction are first performed on the multi-frame dual-band infrared images to obtain the features of the target, which include long-wave radiant intensity, medium–long-wave radiant intensity, temperature, emissivity–area product, micromotion period, and velocity. Then, the radiation intensities are fed to the random convolutional kernel transform (ROCKET) architecture for recognition. For the micromotion period feature, a support vector machine (SVM) classifier is used, and the remaining categories of features are input into the long short-term memory network (LSTM) for recognition, respectively. The posterior probabilities corresponding to each category, which are the outputs of each classifier, are constructed using the basic probability assignment (BPA) function of DS theory. Finally, the discrimination of the space target category is implemented according to improved DS fusion rules and decision rules. Continuous multi-frame infrared images of six flight scenes are used to evaluate the effectiveness of the proposed method. The experimental results indicate that the recognition accuracy of the proposed method can reach 93% under a strong noise level (signal-to-noise ratio of 5).
It outperforms single-feature recognition and other benchmark algorithms based on DS theory, which demonstrates that the proposed method can effectively enhance the recognition accuracy of space infrared dim targets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
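Several of the entries above fuse classifier or sensor outputs with Dempster's rule of combination over basic probability assignments (BPAs). As a minimal generic sketch of that rule (not code from any of the listed papers), with hypothesis sets represented as frozensets:

```python
from itertools import product

def dempster_combine(m1, m2):
    """Combine two BPAs with Dempster's rule.

    Each BPA maps a frozenset of hypotheses to a mass in [0, 1].
    Mass falling on the empty set is the conflict K; the surviving
    masses are renormalized by 1 - K.
    """
    combined = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb  # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: Dempster's rule is undefined")
    norm = 1.0 - conflict
    return {s: m / norm for s, m in combined.items()}

# Two classifiers over the frame {target, clutter} (illustrative numbers)
m1 = {frozenset({"target"}): 0.7, frozenset({"target", "clutter"}): 0.3}
m2 = {frozenset({"target"}): 0.6, frozenset({"clutter"}): 0.4}
fused = dempster_combine(m1, m2)
```

The counterintuitive behavior under high conflict mentioned in entry 3 arises precisely from the `1 - conflict` renormalization step, which several of the listed papers modify.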
4. A Tunnel Fire Detection Method Based on an Improved Dempster-Shafer Evidence Theory.
- Author
-
Wang, Haiying, Shi, Yuke, Chen, Long, and Zhang, Xiaofeng
- Subjects
HEAT release rates, DEMPSTER-Shafer theory, MULTISENSOR data fusion, WIND tunnels, WIND speed, FIRE detectors - Abstract
Tunnel fires are generally detected using various sensors, including those measuring temperature, CO concentration, and smoke concentration. To address the ambiguity and inconsistency in multi-sensor data, this paper proposes a tunnel fire detection method based on an improved Dempster-Shafer (DS) evidence theory for multi-sensor data fusion. To solve the problem of evidence conflict in the DS theory, a two-level multi-sensor data fusion framework is adopted. The first level of fusion involves feature fusion of the same type of sensor data, removing ambiguous data to obtain characteristic data, and calculating the basic probability assignment (BPA) function through the feature interval. The second-level fusion derives basic probability numbers from the BPA, calculates the degree of evidence conflict, normalizes the BPA to obtain the relative conflict degree, and optimizes the BPA using the trust coefficient. The classical DS evidence theory is then used to integrate the evidence and obtain the probability of tunnel fire occurrence. Different heat release rates, tunnel wind speeds, and fire locations are set, forming six fire scenarios. Sensor monitoring data under each simulation condition are extracted and fused using the improved DS evidence theory. The results show a 67.5%, 83.5%, 76.8%, 83%, 79.6%, and 84.1% probability of detecting fire in the six scenarios, respectively; the method identifies fire occurrence in approximately 2.4 s, an improvement of 64.7% to 70% over traditional methods. This demonstrates the feasibility and superiority of the proposed method, highlighting its significant importance in ensuring personnel safety. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
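Entry 4's trust-coefficient optimization is specific to that paper, but the generic mechanism it resembles, classical Shafer discounting of a BPA before combination, can be sketched as follows (an illustrative assumption, not the authors' exact formula):

```python
def discount_bpa(m, alpha, frame):
    """Shafer discounting: scale each mass by a trust coefficient alpha
    in [0, 1] and move the withheld mass 1 - alpha to the whole frame,
    i.e. to total ignorance."""
    theta = frozenset(frame)
    out = {s: alpha * v for s, v in m.items() if s != theta}
    out[theta] = 1.0 - alpha + alpha * m.get(theta, 0.0)
    return out

# A fire-alarm BPA from one sensor, discounted by a trust of 0.9
# (frame and numbers are hypothetical)
m = {frozenset({"fire"}): 0.8, frozenset({"fire", "no_fire"}): 0.2}
discounted = discount_bpa(m, 0.9, {"fire", "no_fire"})
```

Discounting a low-trust source this way weakens its committed masses before Dempster's rule is applied, which is one standard way to soften evidence conflict.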
5. PSO-ECM: particle swarm optimization-based evidential C-means algorithm.
- Author
-
Cai, Yuxuan, Zhou, Qianli, and Deng, Yong
- Abstract
As an extension of Fuzzy C-Means (FCM), Evidential C-Means (ECM) was proposed in the framework of Dempster–Shafer theory (DST) and has been applied in many fields. However, the objective function of ECM involves only the distortion between the object and the prototype, which relies heavily on the initial prototype. Therefore, ECM may encounter the problem of local optima. To solve this problem, this paper introduces ECM with Particle Swarm Optimization (PSO) initialization to determine the initial clustering centroids, and proposes Particle Swarm Optimization-based Evidential C-Means (PSO-ECM), which reduces the influence of bad initial prototypes and mitigates the local-optimum problem of ECM. PSO-ECM is compared with three other clustering algorithms in four experiments and with ECM on a noise-containing dataset. According to the experimental results, PSO-ECM performs well in terms of different clustering validity metrics compared with existing clustering algorithms, has high clustering stability, and can effectively and stably cluster noise-containing datasets and accurately identify outlier points. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Multi-Objective Optimization of the Robustness of Complex Networks Based on the Mixture of Weighted Surrogates.
- Author
-
Nie, Junfeng, Yu, Zhuoran, and Li, Junli
- Subjects
DEMPSTER-Shafer theory, EVOLUTIONARY algorithms, CONTROLLABILITY in systems engineering, MIXTURES - Abstract
Network robustness is of paramount importance. Although great progress has been achieved in robustness optimization using single measures, such networks may still be vulnerable to many attack scenarios. Consequently, multi-objective network robustness optimization has recently garnered greater attention. A complex network structure plays an important role in both node-based and link-based attacks. In this paper, since multi-objective robustness optimization comes with a high computational cost, a surrogate model is adopted instead of network controllability robustness in the optimization process, and the Dempster–Shafer theory is used for selecting and mixing the surrogate models. The method has been validated on four types of synthetic networks, and the results show that the two selected surrogate models can effectively assist the multi-objective evolutionary algorithm in finding network structures with improved controllability robustness. The adaptive updating of surrogate models during the optimization process leads to better results than the selection of two surrogate models, albeit at the cost of longer processing times. Furthermore, the method demonstrated in this paper achieved better performance than existing methods, resulting in a marked increase in computational efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. A data fusion method in wireless sensor network based on belief structure.
- Author
-
Long, Chengfeng, Liu, Xingxin, Yang, Yakun, Zhang, Tao, Tan, Siqiao, Fang, Kui, Tang, Xiaoyong, and Yang, Gelan
- Subjects
MULTISENSOR data fusion, WIRELESS sensor networks, WIRELESS sensor nodes, ROUGH sets, GRANULAR computing, INFORMATION storage & retrieval systems - Abstract
Considering the high data redundancy and high cost of information collection in wireless sensor nodes, this paper proposes a data fusion method based on belief structure for attribute reduction in multi-granulation rough sets. By introducing a belief structure, attribute reduction is carried out for multi-granulation rough sets. From the view of granular computing, this paper studies the evidential characteristics of incomplete multi-granulation ordered information systems. On this basis, positive region reduction, belief reduction, and plausibility reduction are put forward for incomplete multi-granulation ordered information systems, and their consistency within the same level and transitivity across different levels are analyzed. Positive region reduction and belief reduction are equivalent, and at the same level both are sufficient but not necessary conditions for plausibility reduction; if the cover structure orders of different levels are the same, the corresponding positive region reductions are equivalent. The algorithm proposed in this paper not only performs all three reductions but also greatly reduces the time complexity. The method fuses node data, which reduces the amount of data that needs to be transmitted and effectively improves information processing efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
8. Network Security Assessment of Power Video Terminals Based on Improved DSET.
- Author
-
杨敏杰, 林静, 何良圆, and 夏飞
- Subjects
DEMPSTER-Shafer theory, INTERNET of things, MEDIAN (Mathematics), GENERATION gap, INTELLIGENT networks - Abstract
Copyright of Journal of Harbin University of Science & Technology is the property of Journal of Harbin University of Science & Technology and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2024
- Full Text
- View/download PDF
9. Multi-Agent Cooperative Camera-Based Semantic Grid Generation.
- Author
-
Caillot, Antoine, Ouerghi, Safa, Dupuis, Yohan, Vasseur, Pascal, and Boutteau, Rémi
- Abstract
The idea of cooperative perception for navigation assistance was introduced about a decade ago with the aim of increasing safety in dangerous areas like intersections. In this context, roadside infrastructure has appeared very recently to provide a new point of view of the scene. In this paper, we propose to combine the Vehicle-To-Vehicle (V2V) and Vehicle-To-Infrastructure (V2I) approaches in order to take advantage of the elevated points of view offered by the infrastructure and the in-scene points of view offered by the vehicles to build a semantic grid map of the moving elements in the scene. To create this map, we chose to use camera information and 2-Dimensional (2D) bounding boxes in order to minimize the impact on the network, and ignored possible depth information as opposed to all state-of-the-art methods. We propose a framework based on two fusion methods, one based on Bayesian theory and the other on the Dempster-Shafer Theory (DST), to merge the information and choose a label for each cell of the semantic grid in order to assess the best fusion method. Finally, we evaluate our approach on a set of datasets that we generated from the CARLA simulator, varying the proportion of Connected Vehicles (CVs) and the traffic density. We also show the superiority of the method based on the DST, with a gain in mean intersection over union of up to 23.35% between the two methods. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Enhancing Probabilistic Solar PV Forecasting: Integrating the NB-DST Method with Deterministic Models.
- Author
-
Ahmad, Tawsif, Zhou, Ning, Zhang, Ziang, and Tang, Wenyuan
- Subjects
ARTIFICIAL neural networks, DEMPSTER-Shafer theory, CUMULATIVE distribution function, FORECASTING, ELECTRIC power distribution grids, PHOTOCATHODES - Abstract
Accurate quantification of uncertainty in solar photovoltaic (PV) generation forecasts is imperative for the efficient and reliable operation of the power grid. In this paper, a data-driven non-parametric probabilistic method based on the Naïve Bayes (NB) classification algorithm and Dempster–Shafer theory (DST) of evidence is proposed for day-ahead probabilistic PV power forecasting. This NB-DST method extends traditional deterministic solar PV forecasting methods by quantifying the uncertainty of their forecasts by estimating the cumulative distribution functions (CDFs) of their forecast errors and forecast variables. The statistical performance of this method is compared with the analog ensemble method and the persistence ensemble method under three different weather conditions using real-world data. The study results reveal that the proposed NB-DST method coupled with an artificial neural network model outperforms the other methods in that its estimated CDFs have lower spread, higher reliability, and sharper probabilistic forecasts with better accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. An Artificial Intelligence Framework for Supporting Coarse-Grained Workload Classification in Complex Virtual Environments.
- Author
-
Cuzzocrea, Alfredo, Mumolo, Enzo, Belmerabet, Islam, and Hafsaou, Abderraouf
- Subjects
ARTIFICIAL intelligence, VIRTUAL reality, DEMPSTER-Shafer theory, VIRTUAL machine systems, INFORMATION storage & retrieval systems - Abstract
We propose Cloud-based machine learning tools for enhanced Big Data applications, where the main idea is that of predicting the "next" workload occurring against the target Cloud infrastructure via an innovative ensemble-based approach that combines the effectiveness of different well-known classifiers in order to enhance the overall accuracy of the final classification, which is highly relevant today in the specific context of Big Data. The so-called workload categorization problem plays a critical role in improving the efficiency and reliability of Cloud-based big data applications. Implementation-wise, our method proposes deploying Cloud entities that participate in the distributed classification approach on top of virtual machines, which represent classical "commodity" settings for Cloud-based big data applications. Given a number of known reference workloads, and an unknown workload, in this paper we deal with the problem of finding the reference workload which is most similar to the unknown one. The depicted scenario turns out to be useful in a plethora of modern information system applications. We name this problem coarse-grained workload classification, because, instead of characterizing the unknown workload in terms of finer behaviors, such as CPU, memory, disk, or network intensive patterns, we classify the whole unknown workload as one of the (possible) reference workloads. Reference workloads represent a category of workloads that are relevant in a given applicative environment. In particular, we focus our attention on the classification problem described above in the special case represented by virtualized environments. Today, Virtual Machines (VMs) have become very popular because they offer important advantages to modern computing environments such as cloud computing or server farms. In virtualization frameworks, workload classification is very useful for accounting, security reasons, or user profiling.
Hence, our research makes more sense in such environments, and it turns out to be very useful in a special context like Cloud Computing, which is emerging now. In this respect, our approach consists of running several machine learning-based classifiers of different workload models, and then deriving the best classifier produced by the Dempster-Shafer Fusion, in order to magnify the accuracy of the final classification. Experimental assessment and analysis clearly confirm the benefits derived from our classification framework. The running programs which produce unknown workloads to be classified are treated in a similar way. A fundamental aspect of this paper concerns the successful use of data fusion in workload classification. Different types of metrics are in fact fused together using the Dempster-Shafer theory of evidence combination, giving a classification accuracy of slightly less than 80%. The acquisition of data from the running process, the pre-processing algorithms, and the workload classification are described in detail. Various classical algorithms have been used for classification to classify the workloads, and the results are compared. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. Failure mode and effects analysis using an improved pignistic probability transformation function and grey relational projection method.
- Author
-
Tang, Yongchuan, Sun, Zhaoxing, Zhou, Deyun, and Huang, Yubo
- Subjects
FAILURE mode & effects analysis, DEMPSTER-Shafer theory, PROBABILITY theory, TURBINE blades - Abstract
Failure mode and effects analysis (FMEA) is an important risk analysis tool that has been widely used in diverse areas to manage risk factors. However, how to manage the uncertainty in FMEA assessments is still an open issue. In this paper, a novel FMEA model based on an improved pignistic probability transformation function in Dempster–Shafer evidence theory (DST) and the grey relational projection method (GRPM) is proposed to improve the accuracy and reliability of risk analysis with FMEA. The basic probability assignment (BPA) function in DST is used to model the assessments of experts with respect to each risk factor. Dempster's rule of combination is adopted for the fusion of assessment information from different experts. The improved pignistic probability function is proposed and used to transform the fused BPA into a probability function, yielding more accurate decision-making results in risk analysis with FMEA. GRPM is adopted to determine the risk priority order of all the failure modes, overcoming the shortcomings of the traditional risk priority number in FMEA. Applications to aircraft turbine rotor blades and a steel production process are presented to show the rationality and generality of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
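Entry 12 builds on the pignistic probability transformation, which converts a fused BPA into an ordinary probability distribution before decision-making. The classical (unimproved) transformation, assuming no mass on the empty set, simply shares each focal set's mass equally among its singletons:

```python
def pignistic(m):
    """Classical pignistic transformation BetP of a BPA.

    Assumes m(empty set) = 0: every focal set's mass is split equally
    among the elementary hypotheses it contains.
    """
    betp = {}
    for focal, mass in m.items():
        share = mass / len(focal)
        for x in focal:
            betp[x] = betp.get(x, 0.0) + share
    return betp

# Illustrative BPA over {a, b, c}
m = {
    frozenset({"a"}): 0.5,
    frozenset({"a", "b"}): 0.3,
    frozenset({"a", "b", "c"}): 0.2,
}
betp = pignistic(m)
```

The paper's "improved" function modifies this equal-split step; the sketch above is only the standard baseline it departs from.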
13. An On-Demand Fault-Tolerant Routing Strategy for Secure Key Distribution Network.
- Author
-
Wu, Zhiwei, Deng, Haojiang, and Li, Yang
- Subjects
SUPERLATTICES, SCALABILITY, FAULT tolerance (Engineering), DEMPSTER-Shafer theory, SEMICONDUCTOR devices, TRUST, BLOCK ciphers - Abstract
The point-to-point key distribution technology based on twinning semiconductor superlattice devices can provide high-speed secure symmetric keys, suitable for scenarios with high security requirements such as the one-time pad cipher. However, deploying these devices and scaling them in complex scenarios, such as many-to-many communication, poses challenges. To address this, an effective solution is to build a secure key distribution network for communication by selecting trusted relays and deploying such devices between them. The larger the network, the higher the likelihood of relay node failure or attack, which can impact key distribution efficiency and potentially result in communication key leakage. To deal with the above challenges, this paper proposes an on-demand fault-tolerant routing strategy based on the secure key distribution network to improve the fault tolerance of the network while ensuring scalability and availability. The strategy selects the path with better local key status through a fault-free on-demand path discovery mechanism. To improve the reliability of the communication key, we integrate an acknowledgment-based fault detection mechanism into the communication key distribution process to locate the fault, and then identify its cause based on the Dempster–Shafer evidence theory. The identified fault is then isolated through subsequent path discovery and the key status is transferred. Simulation results demonstrate that the proposed method outperforms OSPF, adaptive stochastic routing, and the multi-path communication scheme, achieving an average 20% higher packet delivery ratio and a lower corrupted key ratio, thus highlighting its reliability. Additionally, the proposed solution exhibits a relatively low local key overhead, indicating its practical value. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Lightning Failure Risk Assessment of Overhead Transmission Lines Based on Modified Dempster–Shafer Theory.
- Author
-
Liu, Jie, Jia, Boyan, Zhang, Zhimeng, Wang, Zimo, Wang, Ping, and Geng, Jianghai
- Subjects
DEMPSTER-Shafer theory, ELECTRIC lines, LIGHTNING, RISK assessment, LIGHTNING protection - Abstract
Lightning poses a degree of potential threat to the safe operation of overhead transmission lines. In order to make targeted lightning protection arrangements and reduce the impact of lightning on overhead transmission lines, it is necessary to conduct lightning risk assessments on overhead transmission lines. This paper proposes a lightning failure risk assessment method for overhead transmission lines based on a modified Dempster–Shafer theory. First, the historical lightning failure data of the line are analyzed, the lightning failure impact factors are determined, and confidence is used to express the relationship between lightning failures and the impact factors; then, entropy weight theory and gray relational theory are used to calculate the mass function values, which are modified on this basis; finally, Dempster–Shafer theory is used to determine the trust degree, which is fitted with the calculated lightning trip rate to produce the risk assessment. This paper analyzes the lightning failure data of overhead transmission lines in some areas of Hebei Province. The results show that, compared with the evaluation method based on the unmodified Dempster–Shafer theory, the accuracy of the evaluation is improved to a certain extent after correcting the mass function values. This method can thus integrate and comprehensively consider different data and can provide a reference for preventing lightning damage to transmission lines. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
15. Integrating Community Context Information Into a Reliably Weighted Collaborative Filtering System Using Soft Ratings.
- Author
-
Nguyen, Van-Doan, Huynh, Van-Nam, and Sriboonchitta, Songsak
- Subjects
RECOMMENDER systems, DEMPSTER-Shafer theory, SOCIAL networks, COMMUNITIES, PARALLEL algorithms - Abstract
In this paper, we aim at developing a new collaborative filtering recommender system using soft ratings, which is capable of dealing with both imperfect information about user preferences and the sparsity problem. On the one hand, Dempster–Shafer theory is employed for handling the imperfect information due to its advantage in providing not only a flexible framework for modeling uncertain, imprecise, and incomplete information, but also powerful operations for fusion of information from multiple sources. On the other hand, in dealing with the sparsity problem, community context information that is extracted from the social network containing all users is used for predicting unprovided ratings. As predicted ratings are not a hundred percent accurate, while the provided ratings are actually evaluated by users, we also develop a new method for calculating user–user similarities, in which provided ratings are considered to be more significant than predicted ones. In the experiments, the developed recommender system is tested on two different data sets; and the experiment results indicate that this system is more effective than CoFiDS, a typical recommender system offering soft ratings. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
16. A novel belief Tanimoto coefficient with its applications in multisource information fusion.
- Author
-
Lu, Yuhang and Xiao, Fuyuan
- Subjects
DEMPSTER-Shafer theory, MULTISENSOR data fusion, CONFLICT management, FAULT diagnosis - Abstract
Dempster-Shafer evidence theory (DST) is a versatile framework for handling uncertainty and provides a reliable method for data fusion. Managing conflicts between multiple bodies of evidence (BOEs) within DST poses a challenging problem that necessitates effective strategies. In this paper, we present a novel similarity measurement called the belief Tanimoto coefficient (BTC). The BTC accurately quantifies the consistency between BOEs by considering both the length and direction of the evidence vectors. Furthermore, we propose a conflict measurement approach based on BTC. We analyze and demonstrate the desirable properties of the proposed similarity and conflict measures. Numerical examples and comparisons are provided to illustrate the superior effectiveness and validity of BTC. Additionally, we introduce a multisource information fusion method called BTC-MSIF. The proposed BTC-MSIF method achieves higher accuracy rates compared to existing approaches in real-world scenarios, including fault diagnosis and pattern classification. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
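Entry 16's belief Tanimoto coefficient additionally accounts for overlap between focal sets when comparing bodies of evidence; the unweighted core of the idea, the Tanimoto (extended Jaccard) coefficient applied to BPAs viewed as vectors over their focal sets, can be sketched as follows (an assumption for illustration, not the paper's exact BTC):

```python
def tanimoto(m1, m2):
    """Plain Tanimoto coefficient between two BPAs, treated as vectors
    over the union of their focal sets:
        T = (a . b) / (|a|^2 + |b|^2 - a . b)
    Returns 1.0 for identical BPAs and 0.0 for BPAs with no shared
    focal set. (The paper's BTC also weights set intersections.)"""
    keys = set(m1) | set(m2)
    dot = sum(m1.get(k, 0.0) * m2.get(k, 0.0) for k in keys)
    n1 = sum(v * v for v in m1.values())
    n2 = sum(v * v for v in m2.values())
    return dot / (n1 + n2 - dot)

mA = {frozenset({"a"}): 0.6, frozenset({"b"}): 0.4}
mB = {frozenset({"c"}): 1.0}
```

A similarity like this captures both the "length and direction" of evidence vectors that the abstract mentions, which is why it can serve as the basis of a conflict measure.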
17. Probabilistic Hesitant Fuzzy Evidence Theory and Its Application in Capability Evaluation of a Satellite Communication System.
- Author
-
Liu, Jiahuan, Jian, Ping, Liu, Desheng, and Xiong, Wei
- Subjects
TELECOMMUNICATION satellites, DEMPSTER-Shafer theory, AMBIGUITY, FUZZY sets, SATISFACTION - Abstract
Evaluating the capabilities of a satellite communication system (SCS) is challenging due to its complexity and ambiguity. Uncertain situations are difficult to analyze accurately, making it hard for experts to determine appropriate evaluation values. To address this problem, this paper proposes an innovative approach by extending the Dempster-Shafer evidence theory (DST) to the probabilistic hesitant fuzzy evidence theory (PHFET). The proposed approach introduces the concept of probabilistic hesitant fuzzy basic probability assignment (PHFBPA) to measure the degree of support for propositions, along with a combination rule and a decision approach. Two methods are developed to generate PHFBPA, based on multi-classifier and distance techniques, respectively. In order to improve the consistency of evidence, discounting factors are proposed using an entropy measure and the Jousselme distance of PHFBPA. In addition, a model for evaluating the degree of satisfaction of SCS capability requirements based on PHFET is presented. Experimental classification and evaluation of SCS capability requirements are performed to demonstrate the effectiveness and stability of the PHFET method. By employing the DST framework and probabilistic hesitant fuzzy sets, PHFET provides a compelling solution for handling ambiguous data in multi-source information fusion, thereby improving the evaluation of SCS capabilities. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Drawbacks of Uncertainty Measures Based on the Pignistic Transformation.
- Author
-
Abellan, Joaquin and Bosse, Eloi
- Subjects
UNCERTAINTY, DEMPSTER-Shafer theory - Abstract
Recently, a measure of “total uncertainty” (TU) in Dempster–Shafer theory based on the pignistic distribution, called the ambiguity measure (AM), has been modified. The resulting new measure has been referred to simply as modified AM (MAM). In the literature, it has been shown that AM, in addition to exhibiting some undesirable behaviors, has important drawbacks related to two essential properties for such measures: 1) subadditivity and 2) monotonicity. The MAM measure was developed to solve the AM subadditivity problem, but this paper demonstrates that MAM suffers the same drawback as AM with respect to monotonicity. A measure of uncertainty that cannot meet the monotonicity requirement has an important drawback for its exploitation in operational contexts such as analytics, information fusion, and decision support. This paper aims at identifying and discussing the drawbacks of this type of measure (AM, MAM). Our main motivation is to insist upon the important requirement of monotonicity that a TU measure should possess to improve its potential of being used and trusted in applications. This discussion is timely, since the monotonicity problem needs to be solved first to avoid building too high expectations for the usefulness and potential exploitation of such measures in operational communities. [ABSTRACT FROM PUBLISHER]
- Published
- 2018
- Full Text
- View/download PDF
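The ambiguity measure (AM) discussed in entry 18 is the Shannon entropy of the pignistic distribution of a BPA. A compact sketch of AM itself (the paper's criticism concerns its properties, notably monotonicity, not this formula):

```python
from math import log2

def ambiguity_measure(m):
    """Ambiguity measure AM(m): Shannon entropy (in bits) of the
    pignistic distribution BetP induced by the BPA m.

    AM is 0 for a BPA certain of one hypothesis and maximal (log2 of
    the frame size) for the vacuous BPA on the whole frame.
    """
    betp = {}
    for focal, mass in m.items():
        for x in focal:
            betp[x] = betp.get(x, 0.0) + mass / len(focal)
    return -sum(p * log2(p) for p in betp.values() if p > 0)
```

For example, the vacuous BPA on a two-element frame pignistically becomes the uniform distribution, so its AM is 1 bit.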
19. Flexible Risk Evidence Combination Rules in Breast Cancer Precision Therapy.
- Author
-
Kenn, Michael, Karch, Rudolf, Singer, Christian F., Dorffner, Georg, and Schreiner, Wolfgang
- Subjects
DEMPSTER-Shafer theory, BREAST cancer, HORMONE receptors, CANCER treatment, DATA quality - Abstract
Evidence theory by Dempster and Shafer for the determination of hormone receptor status in breast cancer samples was introduced in our previous paper. One major topic pointed out here is the link between pieces of evidence found from different origins. In this paper, the challenge of selecting appropriate ways of fusing evidence, depending on the type and quality of the data involved, is addressed. A parameterized family of evidence combination rules is presented, covering the full range of potential needs, from emphasizing discrepancies in the measurements to seeking accordance. The consequences for real patient samples are shown by modeling different decision strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
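Entry 19's parameterized family of combination rules is specific to that work, but the general idea of tuning how conflict mass is handled can be illustrated by interpolating between Dempster's rule (conflict renormalized away) and Yager's rule (conflict sent to total ignorance). This simple one-parameter family is an assumption for illustration only, not the authors' rules:

```python
from itertools import product

def parametric_combine(m1, m2, frame, gamma):
    """Combine two BPAs, splitting the conflict mass K by gamma in [0, 1]:
    a gamma share is redistributed proportionally over the surviving
    focal sets (Dempster-style at gamma = 1), and the rest is moved to
    the whole frame (Yager-style at gamma = 0)."""
    theta = frozenset(frame)
    out = {}
    conflict = 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            out[inter] = out.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    total = sum(out.values())
    if total > 0 and conflict > 0:
        for s, v in list(out.items()):
            out[s] = v + gamma * conflict * v / total
    out[theta] = out.get(theta, 0.0) + (1 - gamma) * conflict
    return out

frame = {"t", "c"}
m1 = {frozenset({"t"}): 0.7, frozenset(frame): 0.3}
m2 = {frozenset({"t"}): 0.6, frozenset({"c"}): 0.4}
d1 = parametric_combine(m1, m2, frame, 1.0)  # Dempster behavior
d0 = parametric_combine(m1, m2, frame, 0.0)  # Yager behavior
```

Moving gamma toward 0 is the cautious end of the spectrum (conflict becomes ignorance), while gamma near 1 forces agreement, which mirrors the "discrepancy vs. accordance" trade-off described in the abstract.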
20. Apple grading method based on neural network with ordered partitions and evidential ensemble learning.
- Author
-
Ma, Liyao, Wei, Peng, Qu, Xinhua, Bi, Shuhui, Zhou, Yuan, and Shen, Tao
- Subjects
DEMPSTER-Shafer theory, APPLES, ARTIFICIAL neural networks, APPLE growing - Abstract
In order to improve the performance of the automatic apple grading and sorting system, in this paper, an ensemble model of ordinal classification based on a neural network with ordered partitions and Dempster–Shafer theory is proposed. As a non-destructive grading method, apples are graded into three grades based on the Soluble Solids Content value, with features extracted from the preprocessed near-infrared spectrum of the apple serving as model inputs. Considering the uncertainty in grading labels, a mass generation approach and an evidential encoding scheme for ordinal labels are proposed, with uncertainty handled within the framework of Dempster–Shafer theory. Constructing the neural network with ordered partitions as the base learner, the learning procedure of the Bagging-based ensemble model is detailed. Experiments on Yantai Red Fuji apples demonstrate the satisfactory grading performance of the proposed evidential ensemble model for ordinal classification. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
21. Benefits Realization: Novel Conceptual Model for Front End–Design Decision-Making Using Dempster–Shafer Theory and Quality Function Deployment.
- Author
-
Serugga, Joas, Kagioglou, Mike, and Tzortzopoulos, Patricia
- Subjects
QUALITY function deployment, DEMPSTER-Shafer theory, CONCEPTUAL models, DECISION making - Abstract
This paper proposes a new conceptual approach to address the gap in the understanding of benefits realization, which is an increasingly central element in the delivery of successful projects in architecture, engineering, and construction (AEC) design. The paper focuses on the link between uncertainty in front-end design (FED) and the delivery of project benefits as understood in the broader benefits realization theory. The paper builds on the nature of uncertainty and uses Dempster–Shafer theory and quality function deployment to model the various interdependences among design attributes. A social housing case study of Brazil's Minha Casa, Minha Vida program demonstrated the application of the modeling approach. Optimal belief and plausibility structures in design decision-making were observed to increase as more use models were considered. These findings demonstrate that modeling uncertainty in FED can contribute to improved design decision-making. The paper's novel contribution is introducing uncertainty modeling to support design decision-making, to overcome the insufficiencies of inherently rational design decision-making that is often unable to discern complexities in dynamic project contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
22. Fuzzy Assessment of Management Consulting Projects: Model Validation and Case Studies.
- Author
-
Sun, Hongyi, Ni, Wenbin, and Huang, Lanxuan
- Subjects
LITERATURE reviews ,MODEL validation ,DEMPSTER-Shafer theory ,PROJECT management ,CLIENT satisfaction - Abstract
Management consulting (MC) has been heavily involved in emerging business opportunities in mainland China. However, there are no well-known local MC project management models to help evaluate whether an MC project can be successful. This paper reports a model for the self-assessment of management consulting projects, which has been validated by 15 experts and 13 cases. The new model, with seven factors that are critical to the success of MC projects, was developed from a literature review. The model was then verified by developing a questionnaire that was sent to 15 experts and using Dempster–Shafer theory to obtain the weight of each part of the model. The model was applied to 13 real cases to verify its effectiveness in evaluating an MC project. This new MC model can help consulting teams conduct assessments in the early and middle stages of consulting projects and evaluations in the late stage, and can also help teams improve the probability of project success and client satisfaction. It can be used by consultants, client companies, or both. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
23. A Fuzzy Dempster–Shafer Evidence Theory Method with Belief Divergence for Unmanned Surface Vehicle Multi-Sensor Data Fusion.
- Author
-
Qiao, Shuanghu, Song, Baojian, Fan, Yunsheng, and Wang, Guofeng
- Subjects
MULTISENSOR data fusion ,DEMPSTER-Shafer theory ,AUTONOMOUS vehicles ,REMOTELY piloted vehicles ,COMPARATIVE method ,PROBLEM solving ,PRIOR learning - Abstract
The safe navigation of unmanned surface vehicles in the marine environment requires multi-sensor collaborative perception, and multi-sensor data fusion technology is a prerequisite for realizing the collaborative perception of different sensors. To address the poor fusion accuracy of existing multi-sensor fusion methods without prior knowledge, a fuzzy evidence theory multi-sensor data fusion method with belief divergence is proposed in this paper. First, an adjustable distance for measuring discrepancies between measurements is devised to evaluate how close a measurement is to the true value, which improves the adaptability of the method to different classes of sensor data; furthermore, an adaptive multi-sensor measurement fusion strategy is designed for the case where the sensor accuracy is known in advance. Second, the membership function of fuzzy theory is introduced into the evidence theory approach to assign initial evidence to measurements by defining the degree of fuzzy support between measurements, which improves the fusion accuracy of the method. Finally, the belief Jensen–Shannon divergence and the Rényi divergence are combined to measure the conflict between pieces of evidence and obtain a credibility degree as the reliability of the evidence, which solves the problem of high conflict between pieces of evidence. Three examples of multi-sensor data fusion in different domains are employed to validate the adaptability of the proposed method to different kinds of multi-sensors. The maximum relative error of the proposed method across the multi-sensor experiments is no more than 0.18%, far better than the best result of 0.46% among the other comparative methods. The experimental results verify that the proposed data fusion method is more accurate than other existing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
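The method above measures conflict with a belief Jensen–Shannon divergence (combined with a Rényi divergence). A sketch of just the belief Jensen–Shannon part, treating each BBA as a distribution over its focal elements; the BBAs below are illustrative, not the paper's data:

```python
import math

def bjs_divergence(m1, m2):
    """Belief Jensen-Shannon divergence between two BBAs
    (dicts mapping frozenset -> mass), base-2 logs, range [0, 1]."""
    keys = set(m1) | set(m2)
    d = 0.0
    for k in keys:
        p, q = m1.get(k, 0.0), m2.get(k, 0.0)
        avg = (p + q) / 2.0  # mass of the averaged BBA on this focal element
        if p > 0:
            d += 0.5 * p * math.log2(p / avg)
        if q > 0:
            d += 0.5 * q * math.log2(q / avg)
    return d

# Illustrative BBAs over two singleton hypotheses
A, B = frozenset("a"), frozenset("b")
m_same = {A: 0.7, B: 0.3}
```

Identical BBAs score 0 and totally conflicting certain BBAs score 1, which is why the divergence can serve as a normalized conflict measure.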
24. Damage Effectiveness Calculation of Hitting Targets with Ammunition Based on Bayesian Multinomial Distribution.
- Author
-
Liu, Haobang and Shi, Xianming
- Subjects
MULTINOMIAL distribution ,MARKOV chain Monte Carlo ,AMMUNITION ,DEMPSTER-Shafer theory ,DIGITAL divide - Abstract
Owing to the fact that ammunition can cause varying degrees of damage to its target, this article presents a damage-effectiveness calculation method for hitting targets with ammunition based on the Bayesian multinomial distribution, addressing the complex processes, small numbers of trials, and difficult damage-probability calculations in target-hitting tests with high-tech ammunition, using the occurrence probabilities of different damage levels as the calculation index of damage effectiveness. Based on the concept of symmetry, the idea of "divide damage level—determine distribution—integrate information—solve distribution" is adopted. Firstly, this paper describes the damage effectiveness test of ammunition attacking targets with a multinomial distribution; secondly, it integrates the damage effectiveness information of ammunition striking targets using Dempster–Shafer evidence theory (D–S evidence theory) and the symmetry advantage; finally, it solves the symmetric posterior distribution of the damage effectiveness parameters with Bayesian theory and the Markov chain Monte Carlo (MCMC) method. The results demonstrate that this method significantly improves the calculation accuracy of ammunition damage effectiveness and can describe the damage effectiveness of ammunition in detail by integrating prior information with multiple types of damage effectiveness data. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
25. Aggregation of Epistemic Uncertainty in Forms of Possibility and Certainty Factors.
- Author
-
Yamada, Koichi
- Subjects
EPISTEMIC uncertainty ,DEMPSTER-Shafer theory ,FUZZY sets ,COMPUTATIONAL intelligence ,INFORMATION resources - Abstract
Uncertainty aggregation is an important form of reasoning for making decisions in the real world, which is full of uncertainty. The paper proposes an information source model for aggregating epistemic uncertainties about truth and discusses uncertainty aggregation in the form of possibility distributions. A new combination rule of possibilities for truth is proposed. The paper then proceeds to a discussion of a traditional but seemingly forgotten representation of uncertainty, certainty factors (CFs), and proposes a new interpretation based on possibility theory. CFs have been criticized for their lack of a sound mathematical interpretation from the viewpoint of probability. Thus, this paper first establishes a sound interpretation using possibility theory. It then examines the aggregation of CFs based on this interpretation and some combination rules of possibility distributions. The paper proposes several combination rules for CFs with a sound theoretical basis, one of which is exactly the same as the oft-criticized combination. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
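The "oft-criticized combination" the abstract refers to is commonly identified with the MYCIN-style parallel combination of certainty factors. A sketch of that classical rule (the numeric values in the checks are illustrative):

```python
def combine_cf(cf1, cf2):
    """MYCIN-style parallel combination of two certainty factors in [-1, 1]."""
    if cf1 >= 0 and cf2 >= 0:          # both pieces of evidence confirm
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:            # both disconfirm (mirror case)
        return cf1 + cf2 * (1 + cf1)
    # conflicting signs: damped sum
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))
```

The rule is commutative and keeps results inside [-1, 1], but its probabilistic justification is exactly what the CF literature disputes.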
26. Dempster-Shafer Theory in Recommender Systems: A Survey.
- Author
-
Belmessous, Khadidja, Sebbak, Faouzi, Mataoui, M'hamed, Senouci, Mustapha Reda, and Cherifi, Walid
- Subjects
RECOMMENDER systems ,DEMPSTER-Shafer theory ,INFORMATION resources ,SYSTEMS theory ,SCIENTIFIC community - Abstract
Due to the limitations associated with the use of a single type of data during the recommendation process, recent research has focused on developing new fusion-based recommenders that make use of multiple heterogeneous sources of information to provide more accurate suggestions. However, the realistic and flexible methods available to users for expressing their preferences for products and services inherently generate uncertain, imperfect, and ambiguous data that feed recommenders and thus affect their accuracy. As a result, Recommender Systems (RS) make significant use of soft mathematical tools to deal with uncertainty. Among these tools is Dempster-Shafer Theory (DST), which has been shown to be effective at dealing with the inherent uncertainty in numerous applications. This article provides a survey of the use of DST in the RS field. Thus, after a brief introduction to recommender systems and the DST, this survey discusses recent DST applications in RS. It introduces a new taxonomy that encompasses the primary application context for DST-based RS solutions, as well as a comprehensive multi-criteria analysis of the peer-reviewed papers. The resulting comparisons are analyzed to draw conclusions, identify current study limitations, and define future research directions. This survey serves as a valuable resource for the entire research community that is interested in recommender systems and DST. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Belief entropy rate: a method to measure the uncertainty of interval-valued stochastic processes.
- Author
-
Wang, Zhiyuan, Zhou, Qianli, and Deng, Yong
- Subjects
STOCHASTIC processes ,UNCERTAINTY (Information theory) ,MARKOV processes ,DEMPSTER-Shafer theory ,STOCHASTIC models ,PROBABILITY theory - Abstract
Entropy rate, as an effective tool in information theory, can measure the uncertainty of stochastic processes modeled by a probability mass function. However, when the stochastic process to be measured cannot be accurately modeled, i.e., when it makes more sense to describe the phenomenon with an interval, a more general representation of the stochastic process is needed. In this paper, the interval-valued stochastic process is modeled with a basic belief assignment on an ordered frame of discernment, and the corresponding belief entropy rate is proposed to measure its uncertainty. Two common stochastic processes are discussed. The first is the case of independent identically distributed stochastic processes, where the belief entropy rate is formally the same as the Shannon entropy rate. The second is Markov processes: we construct the evidential Markov chain and calculate its belief entropy rate. Compared with the Shannon entropy rate, the belief entropy rate is easier to apply to Markov chains. Validation on a real dataset shows that the proposed method handles interval information better and is more practical. When encountering tiny disturbances, the variance of the Shannon entropy rate is more than 50 times the variance of the belief entropy rate, reflecting the stronger robustness of the belief entropy rate. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
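The belief entropy in this line of work is usually instantiated as Deng entropy, which reduces to Shannon entropy when every focal element is a singleton. A minimal sketch (the BBAs in the checks are illustrative):

```python
import math

def deng_entropy(m):
    """Deng (belief) entropy of a BBA m: dict frozenset -> mass.
    E_d = -sum over focal elements A of m(A) * log2( m(A) / (2^|A| - 1) ).
    The 2^|A| - 1 term rewards mass placed on larger (more uncertain) sets."""
    h = 0.0
    for a, w in m.items():
        if w > 0:
            h -= w * math.log2(w / (2 ** len(a) - 1))
    return h
```

For all-singleton BBAs this is exactly Shannon entropy, matching the abstract's claim that the belief entropy rate coincides with the Shannon entropy rate in the i.i.d. case.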
28. Multimodal feature selection from microarray data based on Dempster–Shafer evidence fusion.
- Author
-
Nekouie, Nadia, Romoozi, Morteza, and Esmaeili, Mahdi
- Subjects
MAJORITIES ,DEMPSTER-Shafer theory ,PLURALITY voting ,METAHEURISTIC algorithms ,CANCER diagnosis - Abstract
Microarray data have a crucial role in identifying and classifying different types of cancer tissues. In cancer research, the high dimensionality of microarray datasets has always caused problems in the design of classifiers. Thus, before classification, microarray data are preprocessed through feature selection (FS) techniques, whereby features with lower information value are discarded. Traditional FS, viewed as a search problem, involves selecting a single subset of features from a large number of features. However, other subsets of features also contain information, and the multimodal character of the FS problem should be considered. In multimodal metaheuristic searches, instead of one solution, several semi-optimized solutions are found, where each solution offers an independent view compared to the others. In this paper, in the first stage, multimodal FS is performed using a multimodal firefly algorithm (FA), where several feature subsets are obtained. Then, in the second stage, these subsets of features are used to assign weights to the classifiers and identify the effective classifiers in the ensemble process based on Dempster–Shafer evidence theory. The mentioned weights are calculated so that an ensemble can be developed through a weighted majority vote. Eventually, the proposed algorithm is evaluated on microarray data for cancer diagnosis. Its efficiency is assessed by applying it to 11 microarray datasets. The obtained results indicate good performance of the multimodal FS method relative to the other methods compared. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
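The DS-derived classifier weights above feed a weighted majority vote. A minimal sketch of that voting step, with hypothetical class labels and weight values (not the paper's learned weights):

```python
def weighted_majority_vote(predictions, weights):
    """Combine classifier predictions by summing each classifier's weight
    onto its predicted class; the class with the largest total wins."""
    scores = {}
    for label, w in zip(predictions, weights):
        scores[label] = scores.get(label, 0.0) + w
    return max(scores, key=scores.get)
```

Note how a single high-credibility classifier can outvote two low-credibility ones, which is the point of weighting the vote by evidential credibility rather than counting heads.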
29. A Novel Approach for Modeling and Evaluating Road Operational Resilience Based on Pressure-State-Response Theory and Dynamic Bayesian Networks.
- Author
-
Yu, Gang, Lin, Dinghao, Xie, Jiayi, and Wang, Ye. Ken
- Subjects
BAYESIAN analysis ,DEMPSTER-Shafer theory ,HUMAN behavior ,NATURAL disasters - Abstract
Urban roads face significant challenges from the unpredictable and destructive characteristics of natural or man-made disasters, emphasizing the importance of modeling and evaluating their resilience for emergency management. Resilience is the ability to recover from disruptions and is influenced by factors such as human behavior, road conditions, and the environment. However, current approaches to measuring resilience primarily focus on the functional attributes of road facilities, neglecting the vital feedback effects that occur during disasters. This study aims to model and evaluate road resilience under dynamic and uncertain emergency event scenarios. A new definition of road operational resilience is proposed based on the pressure-state-response theory, and the interaction mechanism between multidimensional factors and the stage characteristics of resilience is analyzed. A method for measuring road operational resilience using Dynamic Bayesian Networks (DBN) is proposed, and a hierarchical DBN structure is constructed based on domain knowledge to describe the influence relationship between resilience elements. The Best Worst method (BWM) and Dempster–Shafer evidence theory are used to determine the resilience status of network nodes in DBN parameter learning. A road operational resilience cube is constructed to visually integrate multidimensional and dynamic road resilience measurement results obtained from DBNs. The method proposed in this paper is applied to measure the operational resilience of roads during emergencies on the Shanghai expressway, achieving a 92.19% accuracy rate in predicting resilient nodes. Sensitivity analysis identifies scattered objects, casualties, and the availability of rescue resources as key factors affecting the rapidity of response disposal in road operations. These findings help managers better understand road resilience during emergencies and make informed decisions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
30. A novel context inconsistency elimination algorithm based on the optimized Dempster-Shafer evidence theory for context-awareness systems.
- Author
-
Liu, Qiang, Xu, Hongji, He, Bo, Yuan, Hui, Liu, Zhi, Fan, Shidi, Xu, Jie, Li, Tiankuo, Li, Juan, Wang, Mengmeng, and Li, Shijie
- Subjects
DEMPSTER-Shafer theory ,COSINE function ,SYSTEMS theory ,INFORMATION resources ,ALGORITHMS ,INTERNET of things ,ENTROPY (Information theory) - Abstract
With the fast advancement of the Internet of things (IoT), context-awareness systems (CASs) have been widely used in many different fields, such as the digital home and smart healthcare. However, low quality of context (QoC) usually causes CASs to make inappropriate decisions; therefore, context inconsistency has become an urgent problem that needs to be resolved. Although many researchers have adopted QoC parameters to solve this problem, they do not take sufficient account of the relationships between context information sources, nor of the impact of the uncertainty of context information on the credibility of those sources. In this paper, different distance measurement methods are utilized to divide context information sources into credible and incredible ones; on this basis, the Deng entropy is introduced to construct a new discounting factor, assigning different discounting factors to different kinds of context information sources, and a novel context inconsistency elimination algorithm based on the optimized Dempster-Shafer (D-S) evidence theory is proposed. The experimental results demonstrate that the proposed algorithm based on the Cosine distance can achieve a 94.33% context-judge rate under a high-precision sensor configuration. Moreover, under a low-precision sensor configuration, compared to the correlation coefficient based on generalized information quality (CIQ)-weighted algorithm, which has the highest context-judge rate among the other inconsistency elimination algorithms, the proposed algorithm based on the Cosine distance can resolve more inconsistent context information. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
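The Deng-entropy-based discounting factor described above plugs into classical Shafer discounting, which scales a source's BBA by its reliability and transfers the remainder to total ignorance. A sketch with an illustrative reliability value:

```python
def discount(m, alpha, theta):
    """Shafer discounting of a BBA m by reliability alpha in [0, 1]:
    every focal element's mass is scaled by alpha, and the discounted
    mass (1 - alpha) is transferred to the ignorance set theta."""
    out = {a: alpha * w for a, w in m.items() if a != theta}
    out[theta] = 1.0 - alpha + alpha * m.get(theta, 0.0)
    return out

# Illustrative source: fairly confident in "a", assumed reliability 0.8
theta = frozenset("abc")
m = {frozenset("a"): 0.7, theta: 0.3}
d = discount(m, 0.8, theta)
```

An incredible source gets a small alpha, so most of its mass moves to ignorance and it can no longer dominate the fused result.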
31. Generalized quantum evidence theory.
- Author
-
Xiao, Fuyuan
- Subjects
QUANTUM theory ,DEMPSTER-Shafer theory ,HILBERT space ,CLASSIFICATION algorithms ,OPEN spaces - Abstract
With the development of quantum decision making, how to bridge classical theory with the quantum framework has gained much attention in the past few years. Recently, a complex evidence theory (CET) was presented as a generalized Dempster–Shafer evidence theory to handle uncertainty on the complex plane. However, CET focuses on a closed world, where the frame of discernment is complete, with exhaustive elements. To address this limitation, in this paper we generalize CET to the quantum framework of Hilbert space in an open world and propose a generalized quantum evidence theory (GQET). On the basis of GQET, a quantum multisource information fusion algorithm is proposed to handle uncertainty in an open world. To verify its effectiveness, we apply the proposed quantum multisource information fusion algorithm to a practical classification fusion task. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
32. Power Transformer Condition-Based Evaluation and Maintenance (CBM) Using Dempster–Shafer Theory (DST).
- Author
-
Blažević, Damir, Keser, Tomislav, Glavaš, Hrvoje, and Noskov, Robert
- Subjects
CONDITION-based maintenance ,DEMPSTER-Shafer theory ,POWER transformers ,SYSTEMS availability - Abstract
Transformers are the most important elements in the power system. Due to their mass and complexity, they require constant monitoring and maintenance. Maintenance of power transformers increases the availability of the power system. The large number of substations and the specifics of their locations make condition-based maintenance (CBM) useful as part of the system's on-demand response. Unlike other system responses, the transformer involves a large amount of uncertain information, both qualitative and numerical. A large amount of information is necessary to implement CBM, but because information is often incomplete, an analysis tool is essential. In this paper, a multi-level condition assessment framework based on evidential reasoning is proposed. A model for condition-based maintenance of a power transformer and procedures for the aggregation process based on evidential reasoning are presented. A decomposition model with appropriate weights for the baseline and general attributes was implemented. Based on the decomposition model, the data and ratings of the baseline attributes were collected. By carrying out the aggregation process, the ratings of the baseline attributes, the condition ratings of the individual elements, and the overall rating of the system condition as a whole were obtained for several points in time. The scientific contribution of the work is an analysis that provides insight into the condition of a complex technical system based on a single numerical value, thus determining its priority in the maintenance process. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
33. Multisensor Data Fusion in IoT Environments in Dempster–Shafer Theory Setting: An Improved Evidence Distance-Based Approach.
- Author
-
Hamda, Nour El Imane, Hadjali, Allel, and Lagha, Mohand
- Subjects
MULTISENSOR data fusion ,DEMPSTER-Shafer theory ,SET theory ,PATTERN recognition systems ,INTERNET of things ,FAULT diagnosis - Abstract
In IoT environments, voluminous amounts of data are produced every single second. Due to multiple factors, these data are prone to various imperfections: they may be uncertain, conflicting, or even incorrect, leading to wrong decisions. Multisensor data fusion has proved powerful for managing data coming from heterogeneous sources and moving towards effective decision-making. Dempster–Shafer (D–S) theory is a robust and flexible mathematical tool for modeling and merging uncertain, imprecise, and incomplete data, and is widely used in multisensor data fusion applications such as decision-making, fault diagnosis, pattern recognition, etc. However, the combination of contradictory data has always been challenging in D–S theory; unreasonable results may arise when dealing with highly conflicting sources. In this paper, an improved evidence combination approach is proposed to represent and manage both conflict and uncertainty in IoT environments in order to improve decision-making accuracy. It mainly relies on an improved evidence distance based on the Hellinger distance and Deng entropy. To demonstrate the effectiveness of the proposed method, a benchmark example for target recognition and two real application cases in fault diagnosis and IoT decision-making are provided. Fusion results were compared with several similar methods, and simulation analyses have shown the superiority of the proposed method in terms of conflict management, convergence speed, fusion result reliability, and decision accuracy. In fact, our approach achieved remarkable accuracy rates of 99.32% in the target recognition example, 96.14% in the fault diagnosis problem, and 99.54% in the IoT decision-making application. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
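The improved evidence distance above builds on the Hellinger distance (combined with Deng entropy). A sketch of just the Hellinger part over a shared set of focal elements; the BBAs in the checks are illustrative:

```python
import math

def hellinger(m1, m2):
    """Hellinger distance between two BBAs (dicts frozenset -> mass):
    0 for identical BBAs, 1 for disjoint certain BBAs."""
    keys = set(m1) | set(m2)
    s = sum((math.sqrt(m1.get(k, 0.0)) - math.sqrt(m2.get(k, 0.0))) ** 2
            for k in keys)
    return math.sqrt(s / 2.0)

A, B = frozenset("a"), frozenset("b")
```

Being bounded in [0, 1] and symmetric makes it a convenient base for a conflict-aware evidence distance.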
34. A New Correlation Measure for Belief Functions and Their Application in Data Fusion.
- Author
-
Zhang, Zhuo, Wang, Hongfei, Zhang, Jianting, and Jiang, Wen
- Subjects
MULTISENSOR data fusion ,DEMPSTER-Shafer theory ,DATA fusion (Statistics) ,INFORMATION processing - Abstract
Measuring the correlation between belief functions is an important issue in Dempster–Shafer theory. From the perspective of uncertainty, analyzing the correlation may provide a more comprehensive reference for uncertain information processing. However, existing studies of correlation have not combined it with uncertainty. To address this problem, this paper proposes a new correlation measure based on belief entropy and relative entropy, named the belief correlation measure. This measure takes into account the influence of information uncertainty on relevance, which can provide a more comprehensive way of quantifying the correlation between belief functions. Meanwhile, the belief correlation measure has the mathematical properties of probabilistic consistency, non-negativity, non-degeneracy, boundedness, orthogonality, and symmetry. Furthermore, based on the belief correlation measure, an information fusion method is proposed. It introduces an objective weight and a subjective weight to assess the credibility and usability of belief functions, thus providing a more comprehensive measurement for each piece of evidence. Numerical examples and application cases in multi-source data fusion demonstrate that the proposed method is effective. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
35. An improved global land cover mapping in 2015 with 30 m resolution (GLC-2015) based on a multisource product-fusion approach.
- Author
-
Li, Bingjie, Xu, Xiaocong, Liu, Xiaoping, Shi, Qian, Zhuang, Haoming, Cai, Yaotong, and He, Da
- Subjects
LAND cover ,DEMPSTER-Shafer theory ,GLOBAL environmental change ,CLIMATE change ,PRODUCT improvement - Abstract
Global land cover (GLC) information with fine spatial resolution is a fundamental data input for studies on biogeochemical cycles of the Earth system and global climate change. Although there are several public GLC products with 30 m resolution, considerable inconsistencies were found among them, especially in fragmented regions and transition zones, which brings great uncertainties to various application tasks. In this paper, we developed an improved global land cover map in 2015 with 30 m resolution (GLC-2015) by fusing multiple existing land cover (LC) products based on the Dempster–Shafer theory of evidence (DSET). Firstly, we used more than 160 000 global point-based samples to locally evaluate the reliability of the input products for each land cover class within each 4° × 4° geographical grid for the establishment of the basic probability assignment (BPA) function. Then, Dempster's rule of combination was used for each 30 m pixel to derive the combined probability mass of each possible land cover class from all the candidate maps. Finally, each pixel was assigned a land cover class based on a decision rule. Through this fusing process, each pixel is expected to be assigned the land cover class that contributes to achieving a higher accuracy. We assessed our product separately with 34 711 global point-based samples and 201 global patch-based samples. Results show that the GLC-2015 map achieved the highest mapping performance globally, continentally, and ecoregionally compared with the existing 30 m GLC maps, with an overall accuracy of 79.5 % (83.6 %) and a kappa coefficient of 0.757 (0.566) against the point-based (patch-based) validation samples. Additionally, we found that the GLC-2015 map showed substantial outperformance in the areas of inconsistency, with an accuracy improvement of 19.3 %–28.0 % in areas of moderate inconsistency and 27.5 %–29.7 % in areas of high inconsistency. 
Hopefully, this improved GLC-2015 product can be applied to reduce uncertainties in the research on global environmental changes, ecosystem service assessments, and hazard damage evaluations. The GLC-2015 map developed in this study is available at 10.6084/m9.figshare.22358143.v2 (Li et al., 2023). [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
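The per-pixel decision rule in the abstract is not specified in detail; a common choice after combining masses is the pignistic transform, which spreads each focal element's mass evenly over its singletons before taking the argmax. A sketch with hypothetical land-cover classes and masses:

```python
def pignistic(m):
    """Pignistic transform BetP of a BBA: each focal element's mass is
    shared equally among its member singletons, yielding a probability
    distribution suitable for a final argmax decision."""
    bet = {}
    for a, w in m.items():
        share = w / len(a)
        for x in a:
            bet[x] = bet.get(x, 0.0) + share
    return bet

# Illustrative combined mass for one pixel over hypothetical classes
m = {frozenset({"forest"}): 0.5,
     frozenset({"forest", "grass"}): 0.4,
     frozenset({"forest", "grass", "crop"}): 0.1}
bet = pignistic(m)
```

The transform lets mass on ambiguous class sets (e.g. forest-or-grass transition zones) still contribute proportionally to each candidate class.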
36. A new basic probability assignment generation and combination method for conflict data fusion in the evidence theory.
- Author
-
Tang, Yongchuan, Zhou, Yonghao, Ren, Xiangxuan, Sun, Yufei, Huang, Yubo, and Zhou, Deyun
- Subjects
MULTISENSOR data fusion ,DEMPSTER-Shafer theory ,PARADOX ,ENTROPY - Abstract
Dempster–Shafer evidence theory is an effective method for information fusion. However, how to deal with fusion paradoxes when using Dempster's combination rule is still an open issue. To address this issue, a new basic probability assignment (BPA) generation method based on cosine similarity and belief entropy is proposed in this paper. Firstly, the Mahalanobis distance is used to measure the similarity between the test sample and the BPA of each focal element in the frame of discernment. Then, cosine similarity and belief entropy are used, respectively, to measure the reliability and uncertainty of each BPA, making adjustments to generate a standard BPA. Finally, Dempster's combination rule is used for the fusion of the new BPAs. Numerical examples demonstrate the effectiveness of the proposed method in resolving the classical fusion paradoxes. Besides, the accuracy rates of classification experiments on several datasets are also reported to verify the rationality and efficiency of the proposed method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
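The cosine-similarity reliability step above can be sketched as computing each BBA's support from its similarity to the other BBAs and normalizing into credibility weights, a common Murphy/Deng-style scheme (not necessarily the paper's exact formulation); the BBAs below are illustrative:

```python
import math

def cosine_similarity(m1, m2):
    """Cosine similarity between two BBAs viewed as vectors
    indexed by focal elements."""
    keys = set(m1) | set(m2)
    dot = sum(m1.get(k, 0.0) * m2.get(k, 0.0) for k in keys)
    n1 = math.sqrt(sum(v * v for v in m1.values()))
    n2 = math.sqrt(sum(v * v for v in m2.values()))
    return dot / (n1 * n2)

def credibility_weights(bbas):
    """Support of each BBA = sum of its similarities to the others;
    normalized supports act as credibility weights."""
    sup = [sum(cosine_similarity(mi, mj)
               for j, mj in enumerate(bbas) if j != i)
           for i, mi in enumerate(bbas)]
    total = sum(sup)
    return [s / total for s in sup]

A, B = frozenset("a"), frozenset("b")
w = credibility_weights([{A: 0.9, B: 0.1},
                         {A: 0.8, B: 0.2},
                         {A: 0.1, B: 0.9}])  # third source conflicts
```

The conflicting third source receives the smallest weight, so its BPA is down-weighted before Dempster's rule is applied.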
37. A framework for the fusion of non-exclusive and incomplete information on the basis of D number theory.
- Author
-
Deng, Xinyang and Jiang, Wen
- Subjects
NUMBER theory ,DEMPSTER-Shafer theory ,INFORMATION modeling ,ARTIFICIAL intelligence ,PROBLEM solving - Abstract
Uncertainty is of great concern in information fusion and artificial intelligence. Dempster-Shafer theory is a popular tool for dealing with uncertainty, but it cannot effectively represent and fuse uncertain information involving non-exclusiveness and incompleteness. To solve this problem, D number theory (DNT) has been proposed. In this paper, the basic theory of DNT for the fusion of non-exclusive and incomplete information is studied to strengthen its theoretical foundation, including concept formalization, uncertainty representation, information modelling, and fusion. First, non-exclusiveness in DNT is defined formally and its basic properties are discussed. Second, new measures of belief and plausibility for D numbers are developed. Third, the combination rule for D numbers is studied by extending the exclusive conflict redistribution rule. Fourth, a method to combine information-incomplete D numbers is proposed. The proposed concepts, definitions, and methods are analyzed mathematically and exemplified through illustrative examples. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
38. An Evidence Theoretic Approach for Traffic Signal Intrusion Detection.
- Author
-
Chowdhury, Abdullahi, Karmakar, Gour, Kamruzzaman, Joarder, Das, Rajkumar, and Newaz, S. H. Shah
- Subjects
SIGNAL detection ,TRAFFIC signs & signals ,UNCERTAINTY (Information theory) ,DEMPSTER-Shafer theory ,DECISION theory ,INTELLIGENT transportation systems - Abstract
The increasing attacks on traffic signals worldwide indicate the importance of intrusion detection. Existing traffic signal Intrusion Detection Systems (IDSs) that rely on inputs from connected vehicles and image analysis techniques can only detect intrusions created by spoofed vehicles; they fail to detect intrusions from attacks on in-road sensors, traffic controllers, and signals. In this paper, we propose an IDS based on detecting anomalies associated with flow rate, phase time, and vehicle speed, a significant extension of our previous work using additional traffic parameters and statistical tools. We theoretically modelled our system using Dempster–Shafer decision theory, considering the instantaneous observations of traffic parameters and their relevant historical normal traffic data. We also used Shannon's entropy to determine the uncertainty associated with the observations. To validate our work, we developed a simulation model based on the traffic simulator SUMO using many real scenarios and data recorded by the Victorian Transportation Authority, Australia. The scenarios for abnormal traffic conditions were generated considering attacks such as jamming, Sybil, and false data injection attacks. The results show that the overall detection accuracy of our proposed system is 79.3%, with fewer false alarms. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. A new correlation belief function in Dempster-Shafer evidence theory and its application in classification.
- Author
-
Tang, Yongchuan, Zhang, Xu, Zhou, Ying, Huang, Yubo, and Zhou, Deyun
- Subjects
DEMPSTER-Shafer theory ,STATISTICAL correlation ,CONFLICT management ,INFORMATION modeling ,CLASSIFICATION - Abstract
Uncertain information processing is a key problem in classification. Dempster-Shafer evidence theory (D-S evidence theory) is widely used in uncertain information modelling and fusion. For uncertain information fusion, Dempster's combination rule in D-S evidence theory has limitations in some cases, as it may produce counterintuitive fusion results. In this paper, a new correlation belief function is proposed to address this problem. The proposed method transfers belief from a certain proposition to other related propositions to avoid the loss of information during information fusion, which can effectively solve the problem of conflict management in D-S evidence theory. The experimental results of classification on UCI datasets show that the proposed method not only assigns a higher belief to the correct propositions than other methods, but also clearly expresses the conflict among the data. The robustness and superiority of the proposed method in classification are verified through experiments on different datasets with varying proportions of training data. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
40. A Multi-Source Data Fusion Method for Assessing the Tunnel Collapse Risk Based on the Improved Dempster–Shafer Theory.
- Author
-
Wu, Bo, Zeng, Jiajia, Zhu, Ruonan, Zheng, Weiqiang, and Liu, Cong
- Subjects
DEMPSTER-Shafer theory ,MULTISENSOR data fusion ,TUNNELS ,TUNNEL design & construction ,DECISION trees ,RISK assessment ,BUILDING failures ,PROGRESSIVE collapse - Abstract
Collapse is the main engineering disaster in tunnel construction using the drilling and blasting method, and risk assessment is one of the most important means of reducing engineering disasters. Aiming at the problems of arbitrary decision-making and misjudgment by single indices in traditional risk assessment, a high-accuracy multi-source data fusion method based on improved Dempster–Shafer evidence theory (D-S model) is proposed in this study, which can accurately assess the tunnel collapse risk value. The evidence conflict coefficient K is used as the identification index, and credibility and importance measures are introduced. Evidence is divided into two situations according to whether it conflicts, and the weight coefficient is determined accordingly. The advanced geological forecast data, on-site inspection data, and instrument monitoring data are trained by a Cloud Model (CM), Gradient Boosting Decision Tree (GBDT), and Support Vector Classification (SVC), respectively, to obtain the initial BPA values. Combined with the weight coefficients, the identified conflicting evidence is adjusted, and the evidence from different sources is then fused to obtain the overall collapse risk value. Finally, accuracy is used to verify the proposed method, which has been successfully applied to the Wenbishan Tunnel. The results show that the evaluation accuracy of the proposed multi-source information fusion method can reach 88%, which is 16% higher than that of the traditional D-S model and more than 20% higher than that of the single-source information method. The proposed high-precision multi-source data fusion method shows good universality and effectiveness in tunnel collapse risk assessment. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
41. An Interpretation of the Surface Temperature Time Series through Fuzzy Measures.
- Author
-
Devi, Rashmi Rekha and Chattopadhyay, Surajit
- Subjects
SURFACE temperature ,FUZZY measure theory ,TIME series analysis ,DEMPSTER-Shafer theory ,FUZZY numbers - Abstract
This paper reports a study interpreting surface temperature using time series and fuzzy measures. We demonstrate a method to identify the uncertainty around surface temperature data concerning the summer monsoon in India. The random variables were standardized, and the Dempster-Shafer Theory was used to generate common goals. Two criteria, represented as fuzzy numbers, were used for this purpose. We constructed three polynomials to illustrate a functional connection between the time series and the measure of joint belief. The analysis of the obtained results showed that the certainty increased over time, confirming that the degree of evidence becomes a more predictable parameter over longer periods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
42. A throughput analysis of an energy-efficient spectrum sensing scheme for the cognitive radio-based Internet of things.
- Author
-
Miah, Md Sipon, Schukat, Michael, and Barrett, Enda
- Subjects
COGNITIVE radio ,INTERNET of things ,DEMPSTER-Shafer theory ,SPECTRUM analysis ,ERROR probability ,RADIO networks - Abstract
Spectrum sensing in a cognitive radio network involves detecting when a primary user vacates their licensed spectrum, to enable secondary users to broadcast on the same band. Accurately sensing the absence of the primary user ensures maximum utilization of the licensed spectrum and is fundamental to building effective cognitive radio networks. In this paper, we address the issues of enhancing sensing gain, average throughput, energy consumption, and network lifetime in a cognitive radio-based Internet of things (CR-IoT) network using the non-sequential approach. As a solution, we propose a Dempster–Shafer theory-based throughput analysis of an energy-efficient spectrum sensing scheme for a heterogeneous CR-IoT network using the sequential approach, which first uses the signal-to-noise ratio (SNR) to evaluate the degree of reliability and then merges the reporting time slot into a flexible sensing time slot to assess the spectrum more efficiently. Before a global decision is made at the fusion center on the basis of both a soft decision fusion rule (Dempster–Shafer theory) and a hard decision fusion rule (the "n-out-of-k" rule), the flexible sensing time slot is used to adjust the measurement result. Using the proposed Dempster–Shafer theory, evidence is aggregated during the reporting time slot, and a global decision is then made at the fusion center. In addition, the throughput of the proposed scheme using the sequential approach is analyzed under both the soft and hard decision fusion rules. Simulation results indicate that the new approach improves primary user sensing accuracy by 13% over previous approaches, while concurrently increasing the detection probability and decreasing the false alarm probability.
It also improves overall throughput, reduces energy consumption, prolongs expected lifetime, and reduces global error probability compared to the previous approaches under any condition [part of this paper was presented at the EuCAP2018 conference (Md. Sipon Miah et al. 2018)]. [ABSTRACT FROM AUTHOR]
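The "n-out-of-k" hard decision fusion rule mentioned in this abstract is simple to state; a minimal sketch (illustrative, with hypothetical sensor decisions):

```python
def n_out_of_k(local_decisions, n):
    """Hard-decision fusion: declare the primary user present if at least
    n of the k local binary sensor decisions (1 = occupied) say so."""
    return sum(local_decisions) >= n

# Three of five CR-IoT sensors report the band as occupied; with n = 2
# ("2-out-of-5") the fusion center declares the primary user present.
occupied = n_out_of_k([1, 0, 1, 1, 0], n=2)
```

Special cases n = 1 and n = k recover the familiar OR and AND fusion rules, which is why n-out-of-k is the standard hard-decision baseline against which soft (e.g., Dempster–Shafer) fusion is compared.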
- Published
- 2021
- Full Text
- View/download PDF
43. Using Mutual Aggregate Uncertainty Measures in a Threat Assessment Problem Constructed by Dempster–Shafer Network.
- Author
-
Shahpari, Asghar and Seyedin, Seyed A.
- Subjects
PROBABILITY theory ,DEMPSTER-Shafer theory ,UNCERTAINTY ,CYBERNETICS ,AMBIGUITY - Abstract
Mutual information as a tool for measuring the amount of dependency between two variables is used in many applications in probability theory. In this paper, three mutual measures based on three aggregate uncertainty (AU) measures in Dempster–Shafer theory (DST) are proposed. These uncertainty measures are: 1) AU; 2) ambiguity measure (AM); and 3) modified AM (MAM), which is proposed in this paper. MAM is the modification of AM which resolves the nonsubadditivity problem of AM. A threat assessment problem constructed by a Dempster–Shafer network is used for testing these mutual measures. We use the proposed mutual measures to identify which input variables of the network are more influential on the threat value. Finally, we conclude that mutual uncertainty based on MAM is a justifiable measure to compute the dependency between two variables in applications related to the DST. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
44. An Improved Method for Multisensor High Conflict Data Fusion.
- Author
-
Wang, Like and Bao, Yu
- Subjects
PROBLEM solving ,DEMPSTER-Shafer theory ,MULTISENSOR data fusion ,SYSTEM identification ,ENTROPY ,DATA fusion (Statistics) ,INFORMATION processing - Abstract
Dempster-Shafer evidence theory can effectively process imperfect information and is widely used in data fusion systems. However, classical Dempster-Shafer evidence theory produces counterintuitive results when fusing highly conflicting multisensor data in a target identification system. To solve this problem, an improved evidence combination method is proposed in this paper. The proposed method combines conflicting evidence by calculating the support degree and the belief entropy of each sensor, using a new method to calculate the support degree. In addition, inspired by Deng entropy, a modified belief entropy is proposed that considers the scale of the frame of discernment (FOD) and the relative scale of the intersection between evidences with respect to the FOD. These two modifications improve performance in conflicting-data fusion. Several methods are compared and analyzed through examples, and the results suggest that the proposed method not only obtains reasonable and correct results but also has the highest fusion reliability in solving the problem of highly conflicting data fusion. [ABSTRACT FROM AUTHOR]
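The modified belief entropy in this abstract builds on Deng entropy; for orientation, a minimal sketch of the standard (unmodified) Deng entropy, where multi-element focal sets contribute extra uncertainty via the 2^|A| - 1 term:

```python
import math

def deng_entropy(m):
    """Deng entropy of a mass function whose focal elements are frozensets."""
    return -sum(
        mass * math.log2(mass / (2 ** len(focal) - 1))
        for focal, mass in m.items() if mass > 0
    )

# A mass function committed to a multi-element focal set is "more
# uncertain" than one split over singletons.
m_single = {frozenset("a"): 0.5, frozenset("b"): 0.5}  # Shannon-like case
m_vague = {frozenset("ab"): 1.0}                        # vague evidence
```

For singleton-only mass functions, Deng entropy reduces to Shannon entropy (here 1.0 bit for `m_single`), while `m_vague` scores log2(3) ≈ 1.585 bits.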
- Published
- 2021
- Full Text
- View/download PDF
45. E-APTDetect: Early Advanced Persistent Threat Detection in Critical Infrastructures with Dynamic Attestation.
- Author
-
Genge, Béla, Haller, Piroska, and Roman, Adrian-Silviu
- Subjects
SUPERVISORY control & data acquisition systems ,INFRASTRUCTURE (Economics) ,DEMPSTER-Shafer theory ,VINYL acetate ,ANOMALY detection (Computer security) - Abstract
Advanced Persistent Threats (APTs) represent a complex series of techniques directed against a particular organization, where the perpetrator is able to hide its presence for a long period of time (e.g., months or years). Previous such attacks have demonstrated the exceptional impact that a cyber attack may have on the operation of Supervisory Control And Data Acquisition (SCADA) systems and, more specifically, on the underlying physical process. Existing techniques for the detection of APTs focus on aggregating results originating from a collection of anomaly detection agents. However, such approaches may require an extensive time period when the process is in a steady state. Conversely, this paper documents E-APTDetect, an approach that uses dynamic attestation and multi-level data fusion for the early detection of APTs. The methodology leverages sensitivity analysis and Dempster-Shafer's Theory of Evidence as its building blocks. Extensive experiments are performed on a realistic Vinyl Acetate Monomer (VAM) process model. The model contains standard chemical unit operations and typical industrial characteristics, which make it suitable for a large variety of experiments. The experimental results on the VAM process demonstrate E-APTDetect's ability to efficiently detect APTs and show that the adversary's advantage is affected by two major factors: the number of compromised components and the precision of manipulation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
46. Solution-Space-Reduction-Based Evidence Theory Method for Stiffness Evaluation of Air Springs with Epistemic Uncertainty.
- Author
-
Yin, Shengwen, Jin, Keliang, Bai, Yu, Zhou, Wei, and Wang, Zhonggang
- Subjects
EPISTEMIC uncertainty ,DEMPSTER-Shafer theory ,EVALUATION methodology ,ARTIFICIAL joints ,UNCERTAIN systems - Abstract
In the Dempster–Shafer evidence theory framework, extremum analysis, which should be repeatedly executed for uncertainty quantification (UQ), produces a heavy computational burden, particularly for a high-dimensional uncertain system with multiple joint focal elements. Although the polynomial surrogate can be used to reduce computational expenses, the size of the solution space hampers the efficiency of extremum analysis. To address this, a solution-space-reduction-based evidence theory method (SSR-ETM) is proposed in this paper. The SSR-ETM invests minimal additional time for potentially high-efficiency returns in dealing with epistemic uncertainty. In the SSR-ETM, monotonicity analysis of the polynomial surrogate over the range of evidence variables is first performed. Thereafter, the solution space can be narrowed to a smaller size to accelerate extremum analysis if the surrogate model is at least monotonic in one dimension. Four simple functions and an air spring system with epistemic uncertainty demonstrated the efficacy of the SSR-ETM, indicating an apparent superiority over the conventional method. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. A New Reliability Coefficient Using Betting Commitment Evidence Distance in Dempster–Shafer Evidence Theory for Uncertain Information Fusion.
- Author
-
Tang, Yongchuan, Wu, Shuaihong, Zhou, Ying, Huang, Yubo, and Zhou, Deyun
- Subjects
DEMPSTER-Shafer theory ,INFORMATION theory ,INFORMATION modeling ,HUMAN information processing ,MACHINE learning ,CONFLICT theory ,RELIABILITY in engineering - Abstract
Dempster–Shafer evidence theory is widely used to deal with uncertain information through evidence modeling and evidence reasoning. However, if there is high contradiction between different pieces of evidence, the Dempster combination rule may give a counterintuitive fusion result. Many methods have been proposed for conflicting evidence fusion, and it remains an open issue. This paper proposes a new reliability coefficient using betting commitment evidence distance in Dempster–Shafer evidence theory for conflicting and uncertain information fusion. A single belief function for belief assignment in the initial frame of discernment is defined. After evidence preprocessing with the proposed reliability coefficient and the single belief function, the evidence fusion result can be calculated with the Dempster combination rule. To evaluate the effectiveness of the proposed uncertainty measure, a new method of uncertain information fusion based on the new evidence reliability coefficient is proposed. The experimental results on UCI machine learning data sets show the availability and effectiveness of the new reliability coefficient for uncertain information processing. [ABSTRACT FROM AUTHOR]
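Betting commitment evidence distances are built on the pignistic (betting commitment) transform; a minimal sketch under that assumption (the distance shown here is a simple variant for illustration, not necessarily the authors' exact definition):

```python
def pignistic(m):
    """Pignistic (betting commitment) transform of a mass function:
    each focal set's mass is shared equally among its singletons."""
    bet = {}
    for focal, mass in m.items():
        share = mass / len(focal)
        for x in focal:
            bet[x] = bet.get(x, 0.0) + share
    return bet

def betting_distance(m1, m2):
    """A simple distance between pignistic transforms: the largest
    difference in betting commitment to any singleton."""
    p1, p2 = pignistic(m1), pignistic(m2)
    return max(abs(p1.get(k, 0.0) - p2.get(k, 0.0))
               for k in set(p1) | set(p2))

m1 = {frozenset("a"): 0.6, frozenset("ab"): 0.4}
m2 = {frozenset("b"): 0.6, frozenset("ab"): 0.4}
d = betting_distance(m1, m2)  # p1 = {a: 0.8, b: 0.2}, p2 = {a: 0.2, b: 0.8}
```

A large distance flags evidence whose betting commitments disagree, so a reliability coefficient built on it can discount such evidence before applying the Dempster combination rule.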
- Published
- 2023
- Full Text
- View/download PDF
48. Identifying Qualified Public Safety Education Venues Using the Dempster–Shafer Theory-Based PROMETHEE Method under Linguistic Environments.
- Author
-
Zhang, Yiqian, Dai, Yutong, and Liu, Bo
- Subjects
SAFETY education ,PUBLIC safety ,PUBLIC education ,DEMPSTER-Shafer theory ,EXPERIENTIAL learning ,CRISIS management ,CRISIS communication - Abstract
How to improve safety awareness is an important topic, and it is of great significance for the public in reducing losses in the face of disasters and crises. A public safety education venue is an important carrier for realizing safety education, as its professionalism, comprehensiveness, experiential nature, interest, and participation arouse the public's enthusiasm for learning. As a meaningful supplement to "formal safety education", venue education has many advantages. However, current venue construction suffers from problems such as imperfect infrastructure, weak professionalism, poor service levels, chaotic organizational structure, and low safety, which affect the effect of safety education. To evaluate safety education venues effectively, this study proposes an evidential PROMETHEE method under linguistic environments. The innovation of this study lies in the integration of various linguistic expressions into the Dempster–Shafer theory (DST) framework, enabling the free expression and choice of evaluation information. The results and contributions of this study are summarized as follows. First, a two-tier evaluation index system for public safety education venues comprising 18 sub-criteria is constructed. Second, four quality-evaluation levels for public safety education venues are established. Third, the belief function is used to represent all kinds of linguistic information, so as to maximize the effect of linguistic information fusion. Fourth, an evidential PROMETHEE model is proposed to rank the venues. Fifth, a case study demonstrates the usage of the proposed method in detail, and the evaluation results are fully analyzed and discussed. The implications of this study are as follows. First, to enhance public safety education, people need to recognize the significance of experiential education venues. Second, experiential education venues can increase learners' enthusiasm for learning. Third, the evaluation index system provided in this paper can be used to guide the construction of appropriate education venues in cities. Fourth, the method of linguistic information transformation based on DST is also applicable to other decision-making and evaluation problems. Finally, the evidential PROMETHEE method can not only evaluate the quality of education venues but also rank a group of alternative venues. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
49. Cutting-state identification of machine tools based on improved Dempster-Shafer evidence theory.
- Author
-
Xu, Bo and Sun, Yingqiang
- Subjects
DEMPSTER-Shafer theory ,MACHINE tools ,BACK propagation ,GENETIC algorithms ,WAVELETS (Mathematics) ,EVOLUTIONARY algorithms - Abstract
The reliability of machine tools is highly influenced by the cutting state. Traditional cutting-state recognition methods rely on a single classifier and suffer from low identification accuracy and strong randomness. This paper proposes a cutting-state identification method based on improved Dempster-Shafer (DS) evidence theory. The method is divided into a multi-classifier preliminary-diagnosis layer and an improved DS information-fusion layer. Features extracted by wavelet packet analysis serve as the input to the multi-classifier layer, which comprises a Back Propagation (BP) neural network, a genetic algorithm (GA)-optimized BP neural network, and a mind evolutionary algorithm (MEA)-optimized BP neural network. After the preliminary judgment, the improved DS information-fusion method integrates the classifier outputs into a final judgment, and the effectiveness and feasibility of the method are verified. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
50. Decision-making for the anomalies in IIoTs based on 1D convolutional neural networks and Dempster–Shafer theory (DS-1DCNN).
- Author
-
Çavdar, Tuğrul, Ebrahimpour, Nader, Kakız, Muhammet Talha, and Günay, Faruk Baturalp
- Subjects
CONVOLUTIONAL neural networks ,DEMPSTER-Shafer theory ,DECISION making ,DEEP learning - Abstract
The main motivation of the Internet of Things (IoT) is to enable everyday physical objects to sense and process data and communicate with other objects. Its application in industry is called the industrial Internet of Things (IIoT), or Industry 4.0. One of the main goals of the IIoT is to automatically monitor and detect unexpected events, changes, and alterations in the collected data. Anomaly detection comprises all techniques that identify data patterns deviating from expected behavior. Deep learning (DL) can search for specific relationships in billions of corporate IoT data points and reach a meaningful goal by analyzing and classifying the collected data, leading to the right decisions. The realization of the IoT depends entirely on making proper decisions, yet conventional methods for processing voluminous IIoT data are inadequate. Hence, DL is indispensable for making the intended inferences from big IIoT data. Likewise, owing to advances in sensor technology, various sensor resources such as sound, vibration, and current can be used to obtain appropriate inferences. Accordingly, decision fusion theory can be used to make optimal decisions when there are multiple sources of information. Therefore, this paper proposes a method that combines one-dimensional convolutional neural networks (1DCNNs) and the Dempster–Shafer (DS) decision-fusion method (DS-1DCNN) for decision-making on IIoT anomalies. According to the obtained simulation results, the proposed method increases decision accuracy and significantly decreases uncertainty. The proposed method was compared with long short-term memory, random forest, and CNN models and outperformed all of them. On the Mill dataset, it achieved an average recall of 0.9763 and an average precision of 0.9899, which is an acceptable and reliable result for decision-making. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF