93 results on '"*CONSISTENCY models (Computers)"'
Search Results
2. Research for Practice: Convergence.
- Author
-
KLEPPMANN, MARTIN
- Subjects
- *
COMPUTER science research , *CONSISTENCY models (Computers) , *PROGRAMMING languages , *HUMAN-computer interaction , *DATA management , *DISTRIBUTED databases - Abstract
The article presents a selection of research papers focusing on the theme of convergence, also known as eventual consistency, and specific computer science topics including systems, programming languages, human-computer interaction, and data management. Topics include conflict-free replicated data type (CRDT), consistency as logical monotonicity (CALM), and mergeable replicated data types (MRDTs).
- Published
- 2022
- Full Text
- View/download PDF
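Illustrative aside (not part of the indexed record above): the CRDT convergence idea surveyed in this entry can be sketched with a minimal state-based grow-only counter, whose merge is an element-wise maximum, so replicas that exchange state in any order reach the same value. This is a generic textbook sketch, not code from any of the selected papers.

```python
# Minimal state-based G-Counter CRDT: replicas increment independently and converge
# after merging, illustrating the "convergence / eventual consistency" theme.

class GCounter:
    """Grow-only counter: one slot per replica, merge = element-wise max."""

    def __init__(self, replica_id):
        self.replica_id = replica_id
        self.slots = {}  # replica_id -> count

    def increment(self, n=1):
        self.slots[self.replica_id] = self.slots.get(self.replica_id, 0) + n

    def value(self):
        return sum(self.slots.values())

    def merge(self, other):
        for rid, count in other.slots.items():
            self.slots[rid] = max(self.slots.get(rid, 0), count)

a, b = GCounter("a"), GCounter("b")
a.increment(3)          # updates applied concurrently on different replicas
b.increment(2)
a.merge(b); b.merge(a)  # exchanging state in either order...
assert a.value() == b.value() == 5  # ...yields the same converged value
```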
3. Traceability Management of Socio-Cyber-Physical Systems Involving Goal and SysML Models †.
- Author
-
Anda, Amal Ahmed, Amyot, Daniel, and Mylopoulos, John
- Subjects
SYSML (Computer science) ,CYBER physical systems ,COMPUTER software development ,COMPLETENESS theorem ,CONSISTENCY models (Computers) - Abstract
Socio-cyber-physical systems (SCPSs) have emerged as networked heterogeneous systems that incorporate social components (e.g., business processes and social networks) along with physical (e.g., Internet-of-Things devices) and software components. Model-driven techniques for building SCPSs need actor and goal models to capture social concerns, whereas system issues are often addressed with the Systems Modeling Language (SysML). Comprehensive traceability between these types of models is essential to support consistency and completeness checks, change management, and impact analysis. However, traceability management between these complementary views is not well supported across SysML tools, particularly when models evolve, because SysML does not provide sophisticated out-of-the-box goal modeling capabilities. In our previous work, we proposed a model-based framework, called CGS4Adaptation, that supports basic traceability by importing goal and SysML models into a leading third-party requirement-management system, namely IBM Rational DOORS. In this paper, we present the framework's traceability management method and its use for automated consistency and completeness checks. Traceability management also includes implicit link detection, thereby improving the quality of traceability links while better aligning designs with requirements. The method is evaluated using an adaptive SCPS case study involving an IoT-based smart home. The results suggest that the tool-supported method is effective and useful in supporting the traceability management process involving complex goal and SysML models in one environment while saving development time and effort. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
4. Massive lighter quark corrections to boosted-top cross section.
- Author
-
Bris, Alejandro, Mateu, Vicent, and Gil, Fernando
- Subjects
- *
QUARK matter , *NUCLEAR cross sections , *QUANTUM computing , *CONSISTENCY models (Computers) , *STATISTICAL correlation - Abstract
In this work we present the computation of the missing pieces to get the bHQET thrust distribution with non-vanishing secondary quark masses at NNLO: the jet and hard functions. The difference with respect to the massless case is encoded in diagrams with massive-quark bubbles. For its computation we use a Mellin Barnes representation of the dispersive integral method that permits expressing the result as an analytic, fast-converging expansion in powers of a small parameter rather than integrals that can only be solved numerically. We also obtain the matching coefficient present when integrating out the quark mass. It is necessary for a continuous top-down running and to verify the consistency check. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
5. The PCL Theorem.
- Author
-
Bushkov, Victor, Dziuma, Dmytro, Fatourou, Panagiota, and Guerraoui, Rachid
- Subjects
ALGORITHMS ,CONSISTENCY models (Computers) ,ARTIFICIAL intelligence ,COMPUTER programming ,COMPUTER programmers - Abstract
We establish a theorem called the PCL theorem, which states that it is impossible to design a transactional memory algorithm that ensures (1) parallelism, i.e., transactions do not need to synchronize unless they access the same application objects, (2) very little consistency, i.e., a consistency condition, introduced here and called weak adaptive consistency, that is weaker than snapshot isolation, processor consistency, and any other consistency condition stronger than them (such as opacity, serializability, causal serializability, etc.), and (3) very little liveness, i.e., that transactions eventually commit if they run solo. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
6. A colorization method for historical videos.
- Author
-
Xin Jin, Yiqing Rong, Ke Liu, Chaoen Xiao, and Xiaokun Zhang
- Subjects
DEEP learning ,COMPUTER vision ,COLORIZATION of motion pictures ,IMAGE color analysis ,CONSISTENCY models (Computers) - Abstract
The development of imaging technology has allowed people to move beyond the black-and-white era and into the age of color. However, preserved black-and-white historical footage remains a precious memory for people. We propose a coloring method for historical videos that combines historical image coloring methods with temporal consistency methods, thus achieving color editing for historical videos. The temporal consistency technique uses deep video priors to model the video structure and effectively ensure smoothness between frames after video color editing, even with a small amount of training data. Meanwhile, we have collected a historical video dataset named MHMD-Video, which facilitates further research on colorization of historical videos for researchers. Finally, we demonstrate the effectiveness of the proposed method through objective and subjective evaluation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Analysing Snapshot Isolation.
- Author
-
Cerone, Andrea and Gotsman, Alexey
- Subjects
CONSISTENCY models (Computers) ,TRANSACTION systems (Computer systems) ,DISTRIBUTED computing ,ROBUST control ,ALGORITHMS - Abstract
Snapshot isolation (SI) is a widely used consistency model for transaction processing, implemented by most major databases and some transactional memory systems. Unfortunately, its classical definition is given in a low-level operational way, by an idealised concurrency-control algorithm, and this complicates reasoning about the behaviour of applications running under SI. We give an alternative specification of SI that characterises it in terms of the transactional dependency graphs of Adya et al., generalising serialisation graphs. Unlike previous work, our characterisation does not require adding information about the start and commit points of transactions to dependency graphs. We then exploit our specification to obtain two kinds of static analyses. The first one checks when a set of transactions running under SI can be chopped into smaller pieces without introducing new behaviours, to improve performance. The other analysis checks whether a set of transactions running under a weakening of SI behaves the same as when running under SI. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
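To make the kind of anomaly this analysis targets concrete, the following hedged sketch simulates the classic write-skew pattern that snapshot isolation admits but serializability forbids. The tiny `run_under_si` store (one shared snapshot, first-committer-wins on overlapping write sets) is an assumed toy model, not the paper's dependency-graph formalism.

```python
# Write skew under snapshot isolation: each transaction reads a consistent snapshot,
# and both commits succeed because their write sets are disjoint.

db = {"x": 1, "y": 1}  # invariant the application wants to keep: x + y >= 1

def run_under_si(txns):
    committed = []
    snapshots = {t: dict(db) for t in txns}        # every txn reads the same snapshot
    writes = {t: txns[t](snapshots[t]) for t in txns}
    for t, ws in writes.items():                   # first-committer-wins on overlapping keys
        if all(not (set(ws) & set(prev)) for prev in committed):
            db.update(ws)
            committed.append(ws)
    return db

# Each txn checks the invariant on its snapshot, then decrements a *different* key.
t1 = lambda snap: {"x": snap["x"] - 1} if snap["x"] + snap["y"] > 1 else {}
t2 = lambda snap: {"y": snap["y"] - 1} if snap["x"] + snap["y"] > 1 else {}

print(run_under_si({"t1": t1, "t2": t2}))  # {'x': 0, 'y': 0}: invariant violated;
                                           # no serial order of t1, t2 produces this
```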
8. Keeping Calm.
- Author
-
Sharifi, Mohsen, Sayyadabdi, Amirhossein, Hellerstein, Joseph M., and Alvaro, Peter
- Subjects
- *
CONSISTENCY models (Computers) , *COMPUTER software - Published
- 2022
- Full Text
- View/download PDF
9. Consistently Eventual.
- Author
-
HELLAND, PAT
- Subjects
- *
COMPUTER science , *INFORMATION science , *RELATIONAL databases , *SOFTWARE architecture , *APPLICATION software , *CONSISTENCY models (Computers) - Abstract
The article discusses the computer science concept of eventual consistency. Alleged differences between data inside and outside relational databases are compared, the algorithmic treatment of replica data is scrutinized, and the author discusses the alleged need for trust and resolved uncertainty in managing the permissions and data access structures of application software.
- Published
- 2018
- Full Text
- View/download PDF
10. Quantifying Eventual Consistency with PBS.
- Author
-
Bailis, Peter, Venkataraman, Shivaram, Franklin, Michael J., Hellerstein, Joseph M., and Stoica, Ion
- Subjects
- *
LATENT semantic analysis , *CONSISTENCY models (Computers) , *DISTRIBUTED computing , *PROBABILISTIC generative models , *OPEN source software , *PERFORMANCE of distributed shared memory - Abstract
Data replication results in a fundamental trade-off between operation latency and consistency. At the weak end of the spectrum of possible consistency models is eventual consistency, which provides no limit to the staleness of data returned. However, anecdotally, eventual consistency is often “good enough” for practitioners given its latency and availability benefits. In this work, we explain this phenomenon and demonstrate that, despite their weak guarantees, eventually consistent systems regularly return consistent data while providing lower latency than their strongly consistent counterparts. To quantify the behavior of eventually consistent stores, we introduce Probabilistically Bounded Staleness (PBS), a consistency model that provides expected bounds on data staleness with respect to both versions and wall clock time. We derive a closed-form solution for version-based staleness and model real-time staleness for a large class of quorum replicated, Dynamo-style stores. Using PBS, we measure the trade-off between latency and consistency for partial, non-overlapping quorum systems under Internet production workloads. We quantitatively demonstrate how and why eventually consistent systems frequently return consistent data within tens of milliseconds while offering large latency benefits. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
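A hedged back-of-the-envelope illustration of the version-based staleness idea behind PBS, under simplifying assumptions not made by the paper (a single acknowledged write, a read quorum chosen uniformly at random, no anti-entropy in between): the read is stale only if it misses every replica holding the write.

```python
# Probability that a read of R replicas misses all W replicas holding the latest write,
# out of N replicas total. Strict quorums (R + W > N) are never stale.
from math import comb

def p_stale_read(n, r, w):
    if r + w > n:              # quorum overlap guaranteed: never stale
        return 0.0
    return comb(n - w, r) / comb(n, r)

for n, r, w in [(3, 1, 1), (3, 1, 2), (3, 2, 2)]:
    print(f"N={n} R={r} W={w}: P(stale) = {p_stale_read(n, r, w):.3f}")
# N=3 R=1 W=1: 0.667   N=3 R=1 W=2: 0.333   N=3 R=2 W=2: 0.000
```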
11. Consistency model for runtime objects in the Open Community Runtime.
- Author
-
Dokulil, Jiri
- Subjects
- *
RUN time systems (Computer science) , *CONSISTENCY models (Computers) , *APPLICATION program interfaces , *COMPUTER programming , *C++ , *PROGRAMMING languages - Abstract
Task-based runtime systems are seen as a way to deal with increasing heterogeneity and performance variability in modern high performance computing systems. The Open Community Runtime is an example of such a runtime system, with the benefit of having an open specification that defines its interface and behavior. This allows others to study the design and create alternative implementations of the runtime system. However, it turns out that the consistency model that defines how OCR programs are synchronized when executed in parallel is not sufficiently defined. In the following text, we complete the consistency model by filling in the missing pieces and extending the memory model given in the specification. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
12. Supporting automated containment checking of software behavioural models using model transformations and model checking.
- Author
-
UL Muram, Faiz, Tran, Huy, and Zdun, Uwe
- Subjects
- *
SOFTWARE engineering , *COMPUTER software development , *CONSISTENCY models (Computers) , *AUTOMATION , *COMPUTER software testing - Abstract
Models are extensively used in many areas of software engineering to represent the behaviour of software systems at different levels of abstraction. Because of the involvement of different stakeholders in constructing these models and their independent evolution, inconsistencies might occur between the models. It is thus crucial to detect these inconsistencies at early phases of the software development process, and especially as soon as refined models deviate from their abstract counterparts. In this article, we introduce a containment checking approach to verify whether a certain low-level behaviour model, typically created by refining and enhancing a high-level model, is still consistent with the specification provided in its high-level counterpart. We interpret the containment checking problem as a model checking problem, which has not received special treatment in the literature so far. Because containment checking is based on model checking, it requires both formal consistency constraints and specifications of these models. Unfortunately, creating formal consistency constraints and specifications is currently done manually, and is therefore labour-intensive and error-prone. To alleviate this issue, we define and develop a fully automated transformation of behaviour models into formal specifications and properties. The generated formal specifications and properties can directly be used by existing model checkers for detecting any discrepancy between the input models and yield corresponding counterexamples. Moreover, our approach can provide developers with more informative and comprehensive feedback regarding the inconsistency issues, and therefore helps them to efficiently identify and resolve the problems. The evaluation of various scenarios from industrial case studies demonstrates that the proposed approach efficiently translates the behaviour models into formal specifications and properties. Highlights:
• Containment checking of software behaviour models at different abstraction levels.
• Automated transformation of models into LTL formulas and formal SMV descriptions.
• Counterexample interpretation for resolving the inconsistencies between models.
• Applicability and feasibility of the approach for realistically sized models.
• Performance evaluation in a typical developer's working environment. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
13. A mobility-aware approach for distributed data update on unstructured mobile P2P networks.
- Author
-
Lai, Chuan-Chi and Liu, Chuan-Ming
- Subjects
- *
DISTRIBUTED databases , *CYBERINFRASTRUCTURE , *CONSISTENCY models (Computers) , *DATA transmission systems , *COMMUNICATION infrastructure - Abstract
In unstructured mobile peer-to-peer (MP2P) systems, frequent link breakages caused by the high mobility of nodes lead to frequent topology mismatching problems and data transmission failures. The overhead of data transmission and synchronization cannot be neglected. In order to keep the data consistent, flooding is a fundamental and straightforward data synchronization mechanism, since a mobile node does not know which other nodes hold the same shared data item. However, data flooding causes the broadcast storm problem. In this paper, we propose a mobility-aware data update approach (MADU) to improve data dissemination and to reduce the overhead of maintaining the consistency of shared data items in an MP2P network. We use a safe-time, derived from a neighbor's location and speed, to determine when a node should check and update data with its neighbors. We also consider the network connectivity of a mobile node and the access frequency of a data item as factors that trigger the update process. By combining the mobility information of nodes, network connectivity, and the access frequency of data items, we obtain a reasonable data update mechanism that can significantly decrease the number of retransmissions and redundant messages, and thus reduce the overhead of maintaining data consistency. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
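The safe-time notion mentioned in the abstract above can be illustrated geometrically: given a neighbor's relative position and velocity and a radio range R, the link survives until the relative distance reaches R. The sketch below is an assumed reconstruction of that idea; the paper's exact derivation may differ.

```python
# Time until a neighbour, moving with known relative velocity, leaves the radio range.
# Assumes the neighbour currently lies within range (so the discriminant is non-negative).
import math

def safe_time(rel_pos, rel_vel, radio_range):
    """Smallest t >= 0 with |rel_pos + t*rel_vel| = radio_range (inf if never reached)."""
    px, py = rel_pos
    vx, vy = rel_vel
    a = vx * vx + vy * vy
    b = 2 * (px * vx + py * vy)
    c = px * px + py * py - radio_range ** 2
    if a == 0:                             # no relative motion: link never breaks
        return math.inf
    t = (-b + math.sqrt(b * b - 4 * a * c)) / (2 * a)   # larger root: leaving the range
    return max(t, 0.0)

# Neighbour 50 m away, drifting apart at 5 m/s, with a 100 m radio range.
print(round(safe_time((50.0, 0.0), (5.0, 0.0), 100.0), 2))   # 10.0 seconds
```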
14. Strong Time-Consistent Subset of the Core in Cooperative Differential Games with Finite Time Horizon.
- Author
-
Petrosian, O. L., Gromova, E. V., and Pogozhev, S. V.
- Subjects
- *
COOPERATIVE game theory , *CONSISTENCY models (Computers) , *MULTIPLE imputation (Statistics) , *TRAJECTORY optimization , *OPTIMALITY theory (Linguistics) - Abstract
Time consistency is one of the most important properties of solutions in cooperative differential games. This paper uses the core as a cooperative solution of the game. We design a strong time-consistent subset of the core. The design method of this subset is based on a special class of imputation distribution procedures (IDPs). [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
15. Source Estimation in Time Series and the Surprising Resilience of HMMs.
- Author
-
Kozdoba, Mark and Mannor, Shie
- Subjects
- *
TIME series analysis , *HIDDEN Markov models , *MARKOV processes , *CONSISTENCY models (Computers) , *TYPE theory , *ESTIMATION theory - Abstract
Suppose that we are given a time series where consecutive samples are believed to come from a probabilistic source, that the source changes from time to time and that the total number of sources is fixed. Our objective is to estimate the distributions of the sources. A standard approach to this problem is to model the data as a hidden Markov model (HMM). However, since the data often lacks the Markov or the stationarity properties of an HMM, one can ask whether this approach is still suitable or perhaps another approach is required. In this paper, we show that a maximum likelihood HMM estimator can be used to approximate the source distributions in a much larger class of models than HMMs. Specifically, we propose a natural and fairly general non-stationary model of the data, where the only restriction is that the sources do not change too often. Our main result shows that for this model, a maximum-likelihood HMM estimator produces the correct second moment of the data, and the results can be extended to higher moments. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
16. A Study of Chance-Corrected Agreement Coefficients for the Measurement of Multi-Rater Consistency.
- Author
-
Zheng Xie, Gadepalli, Chaitanya, and Cheetham, Barry M. G.
- Subjects
CONSISTENCY models (Computers) ,COEFFICIENTS (Statistics) - Abstract
Chance-corrected agreement coefficients such as the Cohen and Fleiss Kappas are commonly used for the measurement of consistency in the decisions made by clinical observers or raters. However, the way that they estimate the probability of agreement (Pe) or cost of disagreement (De) 'by chance' has been strongly questioned, and alternatives have been proposed, such as the Aickin Alpha coefficient and the Gwet AC1 and AC2 coefficients. A well-known paradox illustrates deficiencies of the Kappa coefficients, which may be remedied by scaling Pe or De according to the uniformity of the scoring. The AC1 and AC2 coefficients result from the application of this scaling to the Brennan-Prediger coefficient, which may be considered a simplified form of Kappa. This paper examines some commonly used multi-rater agreement coefficients, including AC1 and AC2. It then proposes an alternative subject-by-subject scaling approach that may be applied to weighted and unweighted multi-rater Cohen and Fleiss Kappas, and also to Intra-Class Correlation (ICC) coefficients. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
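For readers unfamiliar with the baseline the paper critiques, the two-rater Cohen kappa, kappa = (Po - Pe) / (1 - Pe) with Pe computed from the raters' marginal frequencies, takes only a few lines. This generic sketch is background only; it does not implement the paper's multi-rater coefficients or its subject-by-subject scaling proposal.

```python
# Basic two-rater Cohen kappa from two equally long lists of categorical ratings.
from collections import Counter

def cohen_kappa(rater_a, rater_b):
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n       # observed agreement
    ma, mb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ma[c] * mb[c] for c in set(ma) | set(mb)) / n**2    # chance agreement
    return (p_o - p_e) / (1 - p_e)

a = ["yes", "yes", "no", "no", "yes", "no"]
b = ["yes", "no",  "no", "no", "yes", "yes"]
print(round(cohen_kappa(a, b), 3))   # 0.333 for this toy example
```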
17. Systematic Review of Software Behavioral Model Consistency Checking.
- Author
-
UL MURAM, FAIZ, TRAN, HUY, and ZDUN, UWE
- Subjects
- *
UNIFIED modeling language , *COMPUTER software development , *CONSISTENCY models (Computers) , *COMPUTER software research , *COMPUTER software development -- Management - Abstract
In software development, models are often used to represent multiple views of the same system. Such models need to be properly related to each other in order to provide a consistent description of the developed system. Models may contain contradictory system specifications, for instance, when they evolve independently. Therefore, it is very crucial to ensure that models conform to each other. In this context, we focus on consistency checking of behavior models. Several techniques and approaches have been proposed in the existing literature to support behavioral model consistency checking. This article presents a Systematic Literature Review (SLR) that was carried out to obtain an overview of the various consistency concepts, problems, and solutions proposed regarding behavior models. In our study, the identification and selection of the primary studies was based on a well-planned search strategy. The search process identified a total of 1770 studies, out of which 96 have been thoroughly analyzed according to our predefined SLR protocol. The SLR aims to highlight the state-of-the-art of software behavior model consistency checking and identify potential gaps for future research. Based on research topics in selected studies, we have identified seven main categories: targeted software models, types of consistency checking, consistency checking techniques, inconsistency handling, type of study and evaluation, automation support, and practical impact. The findings of the systematic review also reveal suggestions for future research, such as improving the quality of study design and conducting evaluations, and application of research outcomes in industrial settings. For this purpose, appropriate strategy for inconsistency handling, better tool support for consistency checking and/or development tool integration should be considered in future studies. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
18. On the speed of constraint propagation and the time complexity of arc consistency testing.
- Author
-
Berkholz, Christoph and Verbitsky, Oleg
- Subjects
- *
CONSISTENCY models (Computers) , *CONSTRAINT satisfaction , *HEURISTIC algorithms , *COLOR computer graphics , *COMBINATORICS - Abstract
Establishing arc consistency on two relational structures is one of the most popular heuristics for the constraint satisfaction problem. We aim at determining the time complexity of arc consistency testing. The input structures G and H can be supposed to be connected colored graphs, as the general problem reduces to this particular case. We first observe the upper bound O(e(G)v(H) + v(G)e(H)), which implies the bound O(e(G)e(H)) in terms of the number of edges and the bound O((v(G) + v(H))^3) in terms of the number of vertices. We then show that both bounds are tight as long as an arc consistency algorithm is based on constraint propagation (as all current algorithms are). Our lower bounds are based on examples of slow constraint propagation. We measure the speed of constraint propagation observed on a pair G, H by the size of a combinatorial proof that Spoiler wins the existential 2-pebble game on G, H. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
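The arc consistency procedure whose complexity this paper analyses can be summarised by a generic AC-3-style propagation loop. The sketch below is a standard textbook version (not the authors' algorithm), and the tiny two-colour triangle at the end shows that establishing arc consistency alone need not detect unsatisfiability.

```python
# AC-3-style propagation: shrink domains until every arc (X, Y) is consistent,
# i.e. each remaining value of X has some support in Y.
from collections import deque

def ac3(domains, constraints):
    """domains: var -> set of values; constraints: (x, y) -> predicate on (vx, vy)."""
    arcs = deque(constraints)
    while arcs:
        x, y = arcs.popleft()
        pred = constraints[(x, y)]
        unsupported = {vx for vx in domains[x]
                       if not any(pred(vx, vy) for vy in domains[y])}
        if unsupported:                    # drop values and re-check arcs into x
            domains[x] -= unsupported
            arcs.extend(arc for arc in constraints if arc[1] == x)
    return all(domains.values())           # False => arc consistency proves unsatisfiability

# Tiny graph-colouring instance: three mutually "not equal" variables, two colours.
doms = {v: {"red", "blue"} for v in "ABC"}
neq = lambda a, b: a != b
cons = {(x, y): neq for x in "ABC" for y in "ABC" if x != y}
print(ac3(doms, cons), doms)   # arc consistency alone does not detect this conflict
```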
19. Triangular bounded consistency of fuzzy preference relations.
- Author
-
Chang, Wen-Jun, Fu, Chao, Xu, Dong-Ling, and Xue, Min
- Subjects
- *
TRIANGULAR operator algebras , *FUZZY algorithms , *CONSISTENCY models (Computers) , *DECISION making , *MULTIPLY transitive groups - Abstract
There are typically two types of consistency of fuzzy preference relations (FPR), namely additive and multiplicative consistency. They are defined based on the assumption that decision makers are rational and can provide strictly consistent FPRs. To take into consideration the bounded rationality of decision makers, the current study relaxes this assumption and proposes a new measure called triangular bounded consistency for judging the consistency of FPRs. To define triangular bounded consistency, a directed triangle is used to represent three FPRs among any three alternatives, with each directed edge representing an FPR. The condition of restricted max–max transitivity (RMMT) in the directed triangle is quantitatively examined. Under the assumption that the bounded rationality of a decision maker is characterized by their historical FPRs, which are represented by directed triangles that satisfy RMMT, triangular bounded consistency is determined using the historical FPRs. We then illustrate how triangular bounded consistency can be used to verify the consistency of FPRs that are newly provided by decision makers and how to estimate some missing FPRs that are not provided by decision makers. Finally, to demonstrate the application of triangular bounded consistency of FPRs in multi-attribute decision analysis, we investigate a problem that involves selecting areas to market products for a company. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
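As background for the entry above, restricted max-max transitivity (RMMT) can be checked directly on a numeric fuzzy preference relation. The sketch below assumes one common formulation of RMMT (if r[i][k] >= 0.5 and r[k][j] >= 0.5 then r[i][j] >= max(r[i][k], r[k][j])); the paper's directed-triangle treatment is richer than this toy check.

```python
# Check restricted max-max transitivity on an n x n fuzzy preference relation
# (entries in [0, 1], r[i][j] = degree to which alternative i is preferred to j).

def satisfies_rmmt(r, tol=1e-9):
    n = len(r)
    for i in range(n):
        for k in range(n):
            for j in range(n):
                if i != k != j and r[i][k] >= 0.5 and r[k][j] >= 0.5:
                    if r[i][j] + tol < max(r[i][k], r[k][j]):
                        return False
    return True

fpr = [[0.5, 0.7, 0.8],
       [0.3, 0.5, 0.6],
       [0.2, 0.4, 0.5]]
print(satisfies_rmmt(fpr))   # True: e.g. 0.8 >= max(0.7, 0.6)
```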
20. Decidability and complexity for quiescent consistency and its variations.
- Author
-
Dongol, Brijesh and Hierons, Robert M.
- Subjects
- *
CONSISTENCY models (Computers) , *DECIDABILITY (Mathematical logic) , *LIMITS (Mathematics) , *COMPUTATIONAL complexity , *MEMBERSHIP - Abstract
Quiescent consistency is a notion of correctness for a concurrent object that gives meaning to the object's behaviour in its quiescent states. This paper shows that the membership problem for quiescent consistency is NP-complete and that the correctness problem is decidable, but coNEXPTIME-complete. We consider restricted versions of quiescent consistency by assuming an upper limit on the number of events between two quiescent points. Here, we show that the membership problem is in PTIME, whereas correctness is PSPACE-complete. We also consider quiescent sequential consistency, which strengthens quiescent consistency with an additional sequential consistency condition. We show that the unrestricted versions of membership and correctness are NP-complete and undecidable, respectively. When placing a limit on the number of events between two quiescent points, membership is in PTIME, while correctness is PSPACE-complete. Given an upper limit on the number of processes for every run of the implementation, the membership problem is in PTIME. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
21. Managing critical supply chain issues in Indian healthcare.
- Author
-
Singh, Sudhanshu, Verma, Rakesh, and Koul, Saroj
- Subjects
HEALTH care industry ,SUPPLY chain management ,GROUP decision making ,CONSISTENCY models (Computers) ,ANALYTIC hierarchy process - Abstract
This exploratory study proposes a new approach that utilizes pre-built issues libraries in the healthcare supply chain. Supply-chain-related issues are collected and deduced from the literature to build the issues libraries. This is followed by the application of group decision-making for their prioritization and for defining solution requirements from doctors' perspectives. A new approach to shared decision-making is proposed, in which the literature is used to develop pre-built issues libraries as an input to shared decision-making. Quick identification and resolution mean that an organisation is continually learning and moving towards excellence. The libraries can be used as a checklist for comparison within and across organisations. Open-source applications such as Google Sheets and WhatsApp were used to involve geographically dispersed experts. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
22. A multi-server information-sharing environment for cross-party collaboration on a private cloud.
- Author
-
Zhang, Jianping, Liu, Qiang, Hu, Zhenzhong, Lin, Jiarui, and Yu, Fangqiang
- Subjects
- *
CLIENT/SERVER computing , *INFORMATION sharing , *STAKEHOLDERS , *CONSISTENCY models (Computers) , *SOFTWARE verification - Abstract
Interoperability remains the key problem in multi-discipline collaboration based on building information modeling (BIM). Although various methods have been proposed to solve the technical issues of interoperability, such as data sharing and data consistency; organizational issues, including data ownership and data privacy, remain unresolved to date. These organizational issues prevent different stakeholders from sharing their data due to concerns regarding losing control of the data. This study proposes a multi-server information-sharing approach on a private cloud after analyzing the requirements for cross-party collaboration to address the aforementioned issues and prepare for massive data handling in the near future. This approach adopts a global controller to track the location, ownership and privacy of the data, which are stored in different servers that are controlled by different parties. Furthermore, data consistency conventions, parallel sub-model extraction, and sub-model integration with model verification are investigated in depth to support information sharing in a distributed environment and to maintain data consistency. Thus, with this approach, the ownership and privacy of the data can be controlled by its owner while still enabling certain required data to be shared with other parties. Application of the multi-server approach for information interoperability and cross-party collaboration is illustrated using a real construction project of an airport terminal. Validation shows that the proposed approach is feasible for maintaining the ownership and privacy of the data while supporting cross-party data sharing and collaboration at the same time, thus avoiding possible legal problems regarding data copyrights or other legal issues. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
23. Saliency detection using suitable variant of local and global consistency.
- Author
-
Jiazhong Chen, Jie Chen, Hua Cao, Rong Li, Tao Xia, Hefei Ling, and Yang Chen
- Subjects
- *
COMPUTER vision , *CONSISTENCY models (Computers) , *IMAGE processing , *COST functions , *IMAGE - Abstract
In the existing local and global consistency (LGC) framework, the cost functions related to classifying functions adopt the sum of each row of the weight matrix as an important factor. Some of these classifying functions have been successfully applied to saliency detection. From the point of view of saliency detection, this factor is inversely proportional to the colour contrast between image regions and their surroundings. However, an image region that has a large colour contrast against its surroundings is not necessarily a salient region. Therefore, a suitable variant of LGC is introduced by removing this factor from the cost function, and a suitable classifying function (SCF) is determined. Then a saliency detection method that utilises the SCF, a content-based initial label assignment scheme, and an appearance-based label assignment scheme is presented. By updating the content-based initial labels and appearance-based labels with the SCF, a coarse saliency map and several intermediate saliency maps are obtained. Furthermore, to enhance the detection accuracy, a novel optimisation function is presented to fuse the intermediate saliency maps that have a high detection performance for final saliency generation. Numerous experimental results demonstrate that the proposed method achieves competitive performance against recent state-of-the-art algorithms for saliency detection. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
24. Models of data envelopment analysis and stochastic frontier analysis in the efficiency assessment of universities.
- Author
-
Aleskerov, F., Belousova, V., and Petrushchenko, V.
- Subjects
- *
DATA envelopment analysis , *STOCHASTIC frontier analysis , *STOCHASTIC analysis , *CONSISTENCY models (Computers) , *UNIVERSITY rankings - Abstract
This paper systematizes the empirical results on efficiency concepts applied to higher education institutions, data envelopment analysis (DEA) adjusted to heterogeneous samples, inputs and outputs chosen for these institutions and factors tended to make universities efficient. Special attention is paid to the consistency of results yielded by different models. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
25. Consistency-Constrained Nonnegative Coding for Tracking.
- Author
-
Tian, Xiaolin, Jiao, Licheng, Gan, Zhipeng, Zheng, Xiaoli, and Wang, Chaohui
- Subjects
- *
CODING theory , *OBJECT tracking (Computer vision) , *NONNEGATIVE matrices , *CONSISTENCY models (Computers) , *FLOW charts - Abstract
A novel visual object tracking method based on consistency-constrained nonnegative coding (CNC) is proposed in this paper. For computational efficiency, superpixels are first extracted from each observed video frame. CNC is then performed on the obtained superpixels, where locality on the manifold is preserved by enforcing temporal and spatial smoothness. The coding result is achieved via an iterative update scheme, which is proved to converge. The proposed method enhances the coding stability and makes the tracker more robust for object tracking. The tracking performance has been evaluated on ten challenging benchmark sequences involving drastic motion, partial or severe occlusions, large variation in pose, and illumination variation. The experimental results demonstrate the superior performance of our method in comparison with ten state-of-the-art trackers. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
26. Approximating the correction of weighted and unweighted orthology and paralogy relations.
- Author
-
Dondi, Riccardo, Lafond, Manuel, and El-Mabrouk, Nadia
- Subjects
- *
GENETIC genealogy , *APPROXIMATION algorithms , *SATISFIABILITY (Computer science) , *CONSISTENCY models (Computers) , *POLYNOMIAL time algorithms - Abstract
Background: Given a gene family, the relations between genes (orthology/paralogy), are represented by a relation graph, where edges connect pairs of orthologous genes and "missing" edges represent paralogs. While a gene tree directly induces a relation graph, the converse is not always true. Indeed, a relation graph is not necessarily "satisfiable", i.e. does not necessarily correspond to a gene tree. And even if that holds, it may not be "consistent", i.e. the tree may not represent a true history in agreement with a species tree. Previous studies have addressed the problem of correcting a relation graph for satisfiability and consistency. Here we consider the weighted version of the problem, where a degree of confidence is assigned to each orthology or paralogy relation. We also consider a maximization variant of the unweighted version of the problem. Results: We provide complexity and algorithmic results for the approximation of the considered problems. We show that minimizing the correction of a weighted graph does not admit a constant factor approximation algorithm assuming the unique game conjecture, and we give an n-approximation algorithm, n being the number of vertices in the graph. We also provide polynomial time approximation schemes for the maximization variant for unweighted graphs. Conclusions: We provided complexity and algorithmic results for variants of the problem of correcting a relation graph for satisfiability and consistency. For the maximization variants we were able to design polynomial time approximation schemes, while for the weighted minimization variants we were able to provide the first inapproximability results. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
27. Phase Compensation Sensor for Ranging Consistency in Inter-Satellite Links of Navigation Constellation.
- Author
-
Zhijun Meng, Jun Yang, Xiye Guo, and Mei Hu
- Subjects
- *
GLOBAL Positioning System , *ARTIFICIAL satellite tracking , *ARTIFICIAL satellites , *SYNCHRONIZATION , *CONSISTENCY models (Computers) - Abstract
The performance of the global navigation satellite system (GNSS) can be enhanced significantly by introducing the inter-satellite links (ISL) of a navigation constellation. In particular, the improvement of the position, velocity, and time accuracy, and the realization of autonomous functions require the ISL distance measurement data as the original input. For building a high-performance ISL, the ranging consistency between navigation satellites becomes a crucial problem to be addressed. Considering the frequency aging drift and the relativistic effect of the navigation satellite, the frequency and phase adjustment (FPA) instructions for the 10.23 MHz must be injected from the ground station to ensure the time synchronization of the navigation constellation. Moreover, the uncertainty of the initial phase each time the onboard clock equipment boots also results in a pseudo-range offset. In this paper, we focus on the influence of the frequency and phase characteristics of the onboard clock equipment on the ranging consistency of the ISL and propose a phase compensation sensor design method for the phase offset. The simulation and experimental results show that the proposed method not only realizes phase compensation for the pseudo-range jitter, but, when the 1 PPS (1 pulse per second) falls in the 10.23 MHz skip area, also overcomes the problem of compensating the ambiguous phase by directly tracking the 10.23 MHz to ensure consistency in the ranging. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
28. A volume-consistent discrete formulation of particle breakage equation.
- Author
-
Saha, Jitraj, Kumar, Jitendra, and Heinrich, Stefan
- Subjects
- *
FINITE volume method , *DIMENSIONAL analysis , *MATHEMATICAL formulas , *STOCHASTIC convergence , *CONSISTENCY models (Computers) - Abstract
We introduce a finite volume scheme to approximate the one dimensional breakage equations. An interesting feature is that it is simple in mathematical formulation and predicts particle number density and its moments with improved accuracy. Efficiency of the new scheme is compared with the existing finite volume scheme proposed by Bourgade and Filbet (2008) over some test problems. It is seen that the new scheme preserves the volume conservative property of the previous scheme and additionally gives an improved estimation of the particle number density and its zero-order moment. Furthermore, the new scheme is computationally more efficient than the existing one. A detailed mathematical analysis including convergence and consistency of the new scheme is also performed. This analysis proves that the new scheme follows a second order convergence rate irrespective of the nature of the meshes. Several example problems are solved numerically to validate the results. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
29. Impact of Spatial Sampling on Continuity of MODIS–VIIRS Land Surface Reflectance Products: A Simulation Approach.
- Author
-
Pahlevan, Nima, Sarkar, Sudipta, Devadiga, Sadashiva, Lin, Guoqing, Wolfe, Robert E., Roman, Miguel, Vermote, Eric, and Xiong, Xiaoxiong
- Subjects
- *
BIOSPHERE , *CONSISTENCY models (Computers) , *GEOMETRIC modeling , *MODIS (Spectroradiometer) , *LAND surface temperature , *GEOGRAPHIC spatial analysis , *INFRARED imaging - Abstract
With the increasing need to construct long-term climate-quality data records to understand, monitor, and predict climate variability and change, it is vital to continue systematic satellite measurements along with the development of new technology for more quantitative and accurate observations. The Suomi National Polar-orbiting Partnership mission provides continuity in monitoring the Earth's surface and its atmosphere in a similar fashion as the heritage MODIS instruments onboard the National Aeronautics and Space Administration's Terra and Aqua satellites. In this paper, we aim at quantifying the consistency of Aqua MODIS and Suomi-NPP Visible Infrared Imaging Radiometer Suite (VIIRS) Land Surface Reflectance (LSR) and NDVI products as related to their inherent spatial sampling characteristics. To avoid interferences from sources of measurement and/or processing errors other than spatial sampling, including calibration, atmospheric correction, and the effects of the bidirectional reflectance distribution function, the MODIS and VIIRS LSR products were simulated using the Landsat-8's Operational Land Imager (OLI) LSR products. The simulations were performed using the instruments' point spread functions on a daily basis for various OLI scenes over a 16-day orbit cycle. It was found that the daily mean differences due to discrepancies in spatial sampling remain below 0.0015 (1%) in absolute surface reflectance at subgranule scale (i.e., OLI scene size). We also found that the MODIS–VIIRS product intercomparisons appear to be minimally impacted when differences in the corresponding view zenith angles (VZAs) are within the range of -15° to -35° (VZA_V - VZA_M), where the VIIRS and MODIS footprints are similar in size. In general, depending on the spatial heterogeneity of the OLI scene contents, per-grid-cell differences can reach up to 20%. Further spatial analysis of the simulated NDVI and LSR products revealed that, depending on the user accuracy requirements for product intercomparisons, spatial aggregations may be used. It was found that if per-grid-cell differences on the order of 10% (in LSR or NDVI) are tolerated, the product intercomparisons are expected to be immune from differences in spatial sampling. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
30. Limitations of Highly-Available Eventually-Consistent Data Stores.
- Author
-
Attiya, Hagit, Ellen, Faith, and Morrison, Adam
- Subjects
- *
INFORMATION retrieval , *CLIENT/SERVER computing , *WIDE area networks , *CONSISTENCY models (Computers) , *SYSTEMS availability - Abstract
Modern replicated data stores aim to provide high availability, by immediately responding to client requests, often by implementing objects that expose concurrency. Such objects, for example, multi-valued registers (MVRs), do not have sequential specifications. This paper explores a recent model for replicated data stores that can be used to precisely specify causal consistency for such objects, and liveness properties like eventual consistency, without revealing details of the underlying implementation. The model is used to prove the following results: 1) An eventually consistent data store implementing MVRs cannot satisfy a consistency model strictly stronger than observable causal consistency (OCC) [...] -bit message is sent. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
31. Decision making with interval-valued intuitionistic fuzzy preference relations based on additive consistency analysis.
- Author
-
Tang, Jie, Zhang, Yongliang, and Meng, Fanyong
- Subjects
- *
FUZZY numbers , *CONSISTENCY models (Computers) , *VECTOR analysis , *COMPUTER programming , *DECISION making - Abstract
This paper discusses the theory and application of interval-valued intuitionistic fuzzy preference relations (IVIFPRs). To do this, we first introduce an additive consistency concept for IVIFPRs that can address the issues in previous ones. Then, several additive consistency-based programming models for judging the consistency of IVIFPRs and for determining missing values are constructed. Furthermore, an approach for calculating the interval-valued intuitionistic fuzzy priority weight vector is offered. With respect to group decision making with IVIFPRs, a consensus index is offered, and an interactive algorithm for improving the consensus level is proposed that can ensure the additive consistency. After that, a consensus analysis-based programming model for determining the weights of decision makers is built. Furthermore, a consistency-and-consensus-based group decision-making method that can address inconsistent and incomplete IVIFPRs is developed. Finally, a practical decision-making problem about evaluating talents in universities is offered to show the feasibility and efficiency of the new method. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
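For orientation, the scalar additive-consistency condition that the entry above generalises to interval-valued intuitionistic FPRs can be checked directly on an ordinary fuzzy preference relation, assuming the usual formulation r[i][j] = r[i][k] + r[k][j] - 0.5. The toy check below does not handle interval-valued or intuitionistic entries.

```python
# Additive consistency of an ordinary fuzzy preference relation (entries in [0, 1]):
# r[i][j] = r[i][k] + r[k][j] - 0.5 must hold for every triple (i, j, k).

def is_additively_consistent(r, tol=1e-9):
    n = len(r)
    return all(abs(r[i][j] - (r[i][k] + r[k][j] - 0.5)) <= tol
               for i in range(n) for j in range(n) for k in range(n))

r = [[0.5, 0.6, 0.7],
     [0.4, 0.5, 0.6],
     [0.3, 0.4, 0.5]]
print(is_additively_consistent(r))   # True: e.g. r[0][2] = 0.6 + 0.6 - 0.5 = 0.7
```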
32. A goal programming approach to deriving interval weights in analytic form from interval Fuzzy preference relations based on multiplicative consistency.
- Author
-
Wang, Zhou-Jing
- Subjects
- *
FUZZY systems , *COMPUTER programming , *CONSISTENCY models (Computers) , *APPROXIMATION theory , *MATHEMATICAL equivalence - Abstract
This paper focuses on how to find an analytic solution of optimal interval weights from consistent interval fuzzy preference relations (IFPRs) and obtain approximate-solution-based interval weights in analytic form from inconsistent IFPRs. The paper first analyzes the popularly used interval weight additive normalization model and illustrates its drawbacks on the existence and uniqueness for characterizing ]0, 1[-valued interval weights obtained from IFPRs. By examining equivalency of ]0, 1[-valued interval weight vectors, a novel framework of multiplicatively normalized interval fuzzy weights (MNIFWs) is then proposed and used to define multiplicatively consistent IFPRs. The paper presents significant properties for multiplicatively consistent IFPRs and their associated MNIFWs. These properties are subsequently used to establish two goal programming (GP) models for obtaining optimal MNIFWs from consistent IFPRs. By the Lagrangian multiplier method, analytic solutions of the two GP models are found for consistent IFPRs. The paper further devises a two-step procedure for deriving approximate-solution-based MNIFWs in analytic form from inconsistent IFPRs. Two visualized computation formulas are developed to determine the left and right bounds of approximate-solution-based MNIFWs of any IFPR. The paper shows that this approximate solution is an optimal solution if an IFPR is multiplicatively consistent. Three numerical examples including three IFPRs and comparative analyses are offered to demonstrate rationality and validity of the developed model. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
33. On the consistency of orthology relationships.
- Author
-
Jones, Mark, Paul, Christophe, and Scornavacca, Céline
- Subjects
- *
GENOMICS , *POLYNOMIAL time algorithms , *COMPARATIVE genomics , *ANALOGY , *CONSISTENCY models (Computers) , *PYTHON programming language - Abstract
Background: Ortholog inference is the starting point of most comparative genomics studies, and a plethora of methods have been designed in the last decade to address this challenging task. In this paper we focus on the problem of deciding consistency with a species tree (known or not) of a partial set of orthology/paralogy relationships C on a collection of n genes. Results: We give the first polynomial-time algorithm, more precisely an O(n^3)-time algorithm, to decide whether C is consistent, even when the species tree is unknown. We also investigate a biologically meaningful optimization version of these problems, in which we wish to minimize the number of duplication events; unfortunately, we show that all these optimization problems are NP-hard and are unlikely to have good polynomial-time approximation algorithms. Conclusions: Our polynomial-time algorithm for checking consistency has been implemented in Python and is available. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
34. Non-parametric Algorithm to Isolate Chunks in Response Sequences.
- Author
-
Alamia, Andrea, Solopchuk, Oleg, Olivier, Etienne, Zenon, Alexandre, De Leonibus, Elvira, and Baraduc, Pierre
- Subjects
SHORT-term memory ,TIME series analysis ,RANKING (Statistics) ,CLUSTER analysis (Statistics) ,CONSISTENCY models (Computers) - Abstract
Chunking consists in grouping items of a sequence into small clusters, named chunks, with the assumed goal of lessening working memory load. Despite extensive research, the current methods used to detect chunks, and to identify different chunking strategies, remain discordant and difficult to implement. Here, we propose a simple and reliable method to identify chunks in a sequence and to determine their stability across blocks. This algorithm is based on a ranking method and its major novelty is that it provides concomitantly both the features of individual chunk in a given sequence, and an overall index that quantifies the chunking pattern consistency across sequences. The analysis of simulated data confirmed the validity of our method in different conditions of noise, chunk lengths and chunk numbers; moreover, we found that this algorithm was particularly efficient in the noise range observed in real data, provided that at least 4 sequence repetitions were included in each experimental block. Furthermore, we applied this algorithm to actual reaction time series gathered from 3 published experiments and were able to confirm the findings obtained in the original reports. In conclusion, this novel algorithm is easy to implement, is robust to outliers and provides concurrent and reliable estimation of chunk position and chunking dynamics, making it useful to study both sequence-specific and general chunking effects. The algorithm is available at: https://github.com/artipago/Non-parametric-algorithm-to-isolate-chunks-in-responsesequences. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
35. Improved Mean Shift Target Localization using True Background Weighted Histogram and Geometric Centroid Adjustment.
- Author
-
MEHMOOD, Rashid, ul HUDA, Noor, SONG, Jinho, RIAZ, M. Moshin, IQBAL, Naveed, and Tae Sun CHOI
- Subjects
HISTOGRAMS ,CENTROID ,INDOOR positioning systems ,ALGORITHMS ,CONSISTENCY models (Computers) ,ELECTRIC filters - Abstract
Mean Shift (MS) tracking using histogram features alone may cause inaccuracy in target localization. The problem becomes worse in the presence of mingled background features in the target model representation. To improve MS target localization, this paper proposes a spatiospectral technique. The true background features in the target model representation are identified using spectral and spatial weighting, and a transformation is then applied to minimize their effect on the target model representation and improve localization. The target localization is further improved by adjusting the MS-estimated target position through edge-based centroid repositioning. The paper also proposes a method of target model update for background-weighted-histogram-based algorithms, followed by a weighted transformation based on online feature consistency data. The proposed method is designed for single-object tracking in complex scenarios and is compared with existing state-of-the-art techniques. Experimental results on numerous challenging video sequences verify the significance of the proposed technique in terms of robustness to complex backgrounds, occlusions, appearance changes, and avoidance of similarly colored objects. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
36. Maximum consistency method for data fitting under interval uncertainty.
- Author
-
Shary, Sergey
- Subjects
CURVE fitting ,GLOBAL optimization ,INTERVAL analysis ,CONSISTENCY models (Computers) ,PARAMETER estimation ,NONCONVEX programming - Abstract
The work is devoted to application of global optimization in data fitting problem under interval uncertainty. Parameters of the linear function that best fits intervally defined data are taken as the maximum point for a special ('recognizing') functional which is shown to characterize consistency between the data and parameters. The new data fitting technique is therefore called 'maximum consistency method'. We investigate properties of the recognizing functional and present interpretation of the parameter estimates produced by the maximum consistency method. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
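A hedged reconstruction of the "maximum consistency" idea described above: fit a straight line to interval-valued observations by maximising a recognizing functional of the form min_i(rad_i - |mid_i - (b0 + b1*t_i)|), where a positive maximum means the fitted line passes through every data interval. The exact functional and optimiser used in the paper may differ; SciPy's Nelder-Mead is used here purely for illustration.

```python
# Maximise a recognizing functional over the line parameters (b0, b1) for interval data.
import numpy as np
from scipy.optimize import minimize

t = np.array([0.0, 1.0, 2.0, 3.0])
y_lo = np.array([0.8, 1.7, 3.1, 3.9])           # interval observations [y_lo, y_hi]
y_hi = np.array([1.2, 2.3, 3.5, 4.3])
mid, rad = (y_lo + y_hi) / 2, (y_hi - y_lo) / 2

def neg_recognizing(beta):
    b0, b1 = beta
    return -np.min(rad - np.abs(mid - (b0 + b1 * t)))

res = minimize(neg_recognizing, x0=[0.0, 1.0], method="Nelder-Mead")
b0, b1 = res.x
print(f"b0={b0:.3f}, b1={b1:.3f}, max consistency={-res.fun:.3f}")
# A positive "max consistency" indicates the fitted line intersects every interval.
```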
37. Agent-based fuzzy constraint-directed negotiation mechanism for distributed job shop scheduling.
- Author
-
Hsu, Chia-Yu, Kao, Bo-Ruei, Ho, Van Lam, and Lai, K. Robert
- Subjects
- *
PRODUCTION scheduling , *FUZZY algorithms , *CONSTRAINT algorithms , *BUSINESS negotiation , *INTELLIGENT agents , *CONSISTENCY models (Computers) - Abstract
This paper presents an agent-based fuzzy constraint-directed negotiation (AFCN) mechanism to solve distributed job shop scheduling problems (JSSPs). The scheduling problem is modelled as a set of fuzzy constraint satisfaction problems (FCSPs), interlinked by inter-agent constraints. Each FCSP represents the perspective of the participants and is governed by autonomous agents. The novelty of the proposed AFCN is to bring the concept of a fuzzy membership function to represent the imprecise preferences of task start time for job and resource agents. This added information sharing is crucial for the effectiveness of distributed coordination. It not only can speed up the convergence, but also enforce a global consistency through iterative exchange of offers and counter-offers. The AFCN mechanism can also flexibly adopt different negotiation strategies, such as competitive, win–win, and collaborative strategies, for different production environments. The experimental results demonstrate that the proposed model can provide not only high-quality and cost-effective job shop scheduling (i.e., comparable to that of centralized methods) but also superior performance in terms of the makespan and average flow time compared with other negotiation models for agent-based manufacturing scheduling. As a result, the proposed AFCN mechanism is flexible and useful for distributed manufacturing scheduling with unforeseen disturbances. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
38. Pitfalls in Memory Consistency Modelling*.
- Author
-
Higham, Lisa and Kawash, Jalal
- Subjects
- *
COMPUTER storage devices , *CONSISTENCY models (Computers) , *ARTIFICIAL intelligence , *MULTIPROCESSORS , *ALGORITHMS - Abstract
Five pitfalls or potential mistakes in memory consistency modelling are identified. Each pitfall can have dramatic impact on our ability to write correct and efficient programs, to determine the algorithmic capabilities or limitations of particular architectures, and to compare various systems. These potential problems are highlighted by illustrating each pitfall with examples from known consistency model definitions. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
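A companion illustration of why precise consistency-model definitions matter (not taken from the paper): the store-buffering litmus test. Enumerating all sequentially consistent interleavings shows that the outcome r1 = r2 = 0 never occurs under sequential consistency, although weaker models such as TSO allow it, exactly the kind of subtlety the surveyed pitfalls concern.

```python
# Enumerate every sequentially consistent interleaving of the store-buffering test:
# T1: x = 1; r1 = y        T2: y = 1; r2 = x        (x, y initially 0)
from itertools import permutations

PROGRAM = {  # thread -> list of (op, shared_var, register)
    "T1": [("store", "x", None), ("load", "y", "r1")],
    "T2": [("store", "y", None), ("load", "x", "r2")],
}

def sc_outcomes(program):
    threads = list(program)
    slots = [t for t in threads for _ in program[t]]      # one slot per instruction
    outcomes = set()
    for order in set(permutations(slots)):                # every SC interleaving
        mem, regs, pc = {"x": 0, "y": 0}, {}, {t: 0 for t in threads}
        for t in order:
            op, var, reg = program[t][pc[t]]
            pc[t] += 1
            if op == "store":
                mem[var] = 1
            else:
                regs[reg] = mem[var]
        outcomes.add((regs["r1"], regs["r2"]))
    return outcomes

print(sorted(sc_outcomes(PROGRAM)))   # (0, 0) is absent under sequential consistency
```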
40. An Efficient Approach for Tackling Large Real World Qualitative Spatial Networks.
- Author
-
Sioutis, Michael, Condotta, Jean-François, and Koubarakis, Manolis
- Subjects
- *
QUALITATIVE research , *CONSISTENCY models (Computers) , *SPATIAL analysis (Statistics) , *CONSTRAINT satisfaction , *WEB services , *DATA analysis - Abstract
We improve the state-of-the-art method for checking the consistency of large qualitative spatial networks that appear in the Web of Data by exploiting the scale-free-like structure observed in their constraint graphs. We propose an implementation scheme that triangulates the constraint graphs of the input networks and uses a hash table based adjacency list to efficiently represent and reason with them. We generate random scale-free-like qualitative spatial networks using the Barabási-Albert (BA) model with a preferential attachment mechanism. We test our approach on the already existing random datasets that have been extensively used in the literature for evaluating the performance of qualitative spatial reasoners, our own generated random scale-free-like spatial networks, and real spatial datasets that have been made available as Linked Data. The analysis and experimental evaluation of our method presents significant improvements over the state-of-the-art approach, and establishes our implementation as the only possible solution to date to reason with large scale-free-like qualitative spatial networks efficiently. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
41. Smart knowledge sharing system for cyberinfrastructure.
- Author
-
Kim, Jin, Yu, Sung, and Park, Sang
- Subjects
- *
DISTRIBUTED databases , *CONSISTENCY models (Computers) , *CYBERINFRASTRUCTURE , *DISTRIBUTED computing , *ELECTRONIC data processing - Abstract
To advance science and technology, many physical infrastructures have been constructed and are used in many fields. Recently, the concepts of Big Data, mass information, and rich content have emerged. Infrastructures that handle such data present many obstacles to users because of their scale and complexity. At the same time, many smart devices have been developed and widely adopted, so it is important to consider their use. In this paper, a smart knowledge sharing system related to the CyberLab is introduced. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
42. Motion-free exposure fusion based on inter-consistency and intra-consistency.
- Author
-
Zhang, Wei, Hu, Shengnan, Liu, Kan, and Yao, Jian
- Subjects
- *
PHOTOGRAPHIC exposure , *CONSISTENCY models (Computers) , *IMAGE processing , *IMAGE segmentation , *HISTOGRAMS , *PIXELS - Abstract
Exposure fusion often suffers from ghost artifacts, which are caused by the movement of objects when a dynamic scene is captured. In this paper, two types of consistency concepts are introduced to enforce the guidance of a reference image for motion detection and ghost removal. Specifically, the inter-consistency, which represents the similarity of pixel intensities among different exposures, is weakened by the use of different exposure settings; histogram matching is employed to recover it. After this step, pixel differences are mostly the result of content changes caused by object movements, so motion can easily be detected. To further restrain the weights of outliers in fusion, motion detection is performed at the super-pixel level, ensuring that pixels with similar intensities and structures share similar fusion weights; this is referred to as intra-consistency. Experiments in various dynamic scenes demonstrate that the proposed algorithm detects motion more effectively than existing methods and produces high-quality fusion results that are free of ghost artifacts. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
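The inter-consistency step above is histogram matching: remapping a source exposure's intensities so that its histogram follows the reference image before pixel differences are interpreted as motion. Below is a generic NumPy sketch of CDF-based histogram matching; it is not the authors' code, and the array names and toy data are illustrative.
```python
import numpy as np

def match_histogram(source, reference):
    """Remap the intensities of `source` so its histogram matches `reference`
    (standard CDF matching)."""
    src = source.ravel()
    ref = reference.ravel()
    src_values, src_idx, src_counts = np.unique(src, return_inverse=True,
                                                return_counts=True)
    ref_values, ref_counts = np.unique(ref, return_counts=True)
    src_cdf = np.cumsum(src_counts) / src.size
    ref_cdf = np.cumsum(ref_counts) / ref.size
    # for each source intensity, pick the reference intensity with the same CDF value
    matched_values = np.interp(src_cdf, ref_cdf, ref_values)
    return matched_values[src_idx].reshape(source.shape)

# toy example: pull a dark exposure towards a brighter reference exposure
dark = np.random.randint(0, 80, (64, 64))
ref = np.random.randint(100, 256, (64, 64))
print(match_histogram(dark, ref).mean())   # mean shifts towards the reference
```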
43. Chapter 4: A Mathematical Meta-Theory of Contracts: 4.1 Components.
- Author
-
Benveniste, A., Caillaud, B., Nickovic, D., Passerone, R., Raclet, J.-B., Reinkemeier, Ph., Sangiovanni-Vincentelli, A., Damm, W., Henzinger, T., and Larsen, K. G.
- Subjects
CONTRACTS ,MATHEMATICAL models ,METATHEORY ,SOFTWARE compatibility ,CONSISTENCY models (Computers) - Published
- 2018
- Full Text
- View/download PDF
44. The Consistency and Complexity of Multiplicative Additive System Virtual.
- Author
-
HORNE, Ross
- Subjects
CONSISTENCY models (Computers) ,COMPUTATIONAL complexity ,MULTIPLICATION ,VIRTUAL reality ,MATHEMATICAL proofs ,STATISTICAL decision making - Abstract
This paper investigates the proof theory of multiplicative additive system virtual (MAV). MAV combines two established proof calculi: multiplicative additive linear logic (MALL) and basic system virtual (BV). Due to the presence of the self-dual non-commutative operator from BV, the calculus MAV is defined in the calculus of structures -- a generalisation of the sequent calculus where inference rules can be applied in any context. A generalised cut elimination result is proven for MAV, thereby establishing the consistency of linear implication defined in the calculus. The cut elimination proof involves a termination measure based on multisets of multisets of natural numbers to handle subtle interactions between operators of BV and MAV. Proof search in MAV is proven to be a PSPACE-complete decision problem. The study of this calculus is motivated by observations about applications in computer science to the verification of protocols and to querying. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
45. The method of polynomial ring calculus and its potentialities.
- Author
-
Carnielli, Walter and Matulovic, Mariana
- Subjects
- *
POLYNOMIAL rings , *LAMBDA calculus , *DERIVATIVES (Mathematics) , *MATHEMATICAL proofs , *CONSISTENCY models (Computers) - Abstract
This paper surveys some results on the role of formal polynomials as a representation method for logical derivation in classical and non-classical logics, emphasizing many-valued logics, paraconsistent logics and non-deterministic logics, as well as their potential for alternative algebraic representation and for automation. The resulting mechanizable proof method presented here is of interest for automated proof theory, as it is comparable to analytic tableaux in generality and intuitiveness, and it also seems to indicate a new avenue for investigating questions of complexity. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
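One simple instance of the polynomial ring calculus is classical propositional logic over GF(2) with idempotent variables (x² = x): connectives become polynomial operations and a formula is a tautology exactly when its polynomial reduces to the constant 1. The sketch below covers only this classical fragment; the paper's method also handles many-valued, paraconsistent and non-deterministic logics, which this toy code does not attempt.
```python
# A polynomial is a set of monomials; a monomial is a frozenset of variable
# names; the empty monomial is the constant 1.  Coefficients live in GF(2),
# so addition is symmetric difference; multiplication unions variable sets
# (x * x = x enforces idempotency).

ONE = {frozenset()}

def padd(p, q):
    return p ^ q

def pmul(p, q):
    out = set()
    for m1 in p:
        for m2 in q:
            out ^= {m1 | m2}
    return out

def var(name):
    return {frozenset([name])}

def NOT(p):        return padd(ONE, p)
def AND(p, q):     return pmul(p, q)
def OR(p, q):      return padd(padd(p, q), pmul(p, q))
def IMPLIES(p, q): return padd(ONE, padd(p, pmul(p, q)))

def is_tautology(p):
    return p == ONE          # the polynomial reduces to the constant 1

p, q = var("p"), var("q")
print(is_tautology(OR(p, NOT(p))))           # True  (excluded middle)
print(is_tautology(IMPLIES(AND(p, q), p)))   # True
print(is_tautology(IMPLIES(p, q)))           # False
```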
46. A consistency model for group decision making problems with interval multiplicative preference relations.
- Author
-
Zhang, Huimin
- Subjects
CONSISTENCY models (Computers) ,GROUP decision making ,LOGARITHMS ,INFORMATION theory ,PROBLEM solving - Abstract
The main aim of this paper is to present a consistency model for interval multiplicative preference relations (IMPRs). To measure the consistency level of an IMPR, a referenced consistent IMPR of a given IMPR is defined, which has the minimum logarithmic distance from the given IMPR. Based on the referenced consistent IMPR, the consistency level of an IMPR can be measured, and an IMPR with unacceptable consistency can be adjusted by a proposed algorithm so that the revised IMPR is of acceptable consistency. A consistency model for group decision making (GDM) problems with IMPRs is proposed to obtain the collective IMPR with the highest consistency level. Numerical examples are provided to illustrate the validity of the proposed approaches in decision making. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
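For the crisp (non-interval) case, the idea of measuring consistency by the logarithmic distance to a referenced consistent relation can be sketched directly: the consistent relation built from geometric row means minimises the logarithmic least-squares distance. The code below shows only this simplified crisp analogue; the paper's interval construction and adjustment algorithm are not reproduced here.
```python
import numpy as np

def log_consistency_index(A):
    """Logarithmic distance from a multiplicative preference relation A
    (a_ij > 0, a_ji = 1/a_ij) to its nearest consistent relation built from
    geometric row means.  Crisp analogue only, not the interval version."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    w = np.exp(np.log(A).mean(axis=1))   # geometric row means (priority weights)
    C = np.outer(w, 1.0 / w)             # consistent relation: c_ij = w_i / w_j
    return np.sqrt(((np.log(A) - np.log(C)) ** 2).sum() / (n * n)), C

A = [[1, 2, 4],
     [1/2, 1, 2],
     [1/4, 1/2, 1]]
idx, _ = log_consistency_index(A)
print(idx)   # ~0.0 -- this relation is perfectly consistent (a_ik * a_kj == a_ij)
```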
47. Design of distributed database systems: an iterative genetic algorithm.
- Author
-
Song, Sukkyu
- Subjects
DISTRIBUTED databases ,CONSISTENCY models (Computers) ,GENETIC algorithms ,REACTION time ,DECISION support systems - Abstract
The two important aspects of the design of distributed database systems are operation allocation and data allocation. Operation allocation refers to the query execution plan indicating which operations (subqueries) should be allocated to which sites in a computer network so that query processing costs are minimized. Data allocation assigns relations to sites so that the performance of the distributed database is improved. In this research, we developed a solution technique for the operation allocation and data allocation problem, using three objective functions: total time minimization, response time minimization, and the combined minimization of total time and response time. We formulated these allocation problems and provided analytical cost models for each objective function. Since the problem is NP-hard, we proposed a heuristic solution based on a genetic algorithm (GA). Comparison of the results with exhaustive enumeration indicated that the GA produced optimal solutions in all cases in much less time. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
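A genetic algorithm for allocation problems of this kind works on chromosomes that assign each relation (or operation) to a site and evolves them against a cost function. The sketch below uses a deliberately simplified, hypothetical cost model (local access cheap, remote access penalised) and generic GA parameters; it is not the paper's formulation of total or response time.
```python
import random

N_RELATIONS, N_SITES = 8, 3
random.seed(1)
# access[s][r]: how often site s queries relation r (hypothetical workload)
access = [[random.randint(0, 10) for _ in range(N_RELATIONS)]
          for _ in range(N_SITES)]
REMOTE_COST = 5   # extra cost per remote access (illustrative)

def cost(chrom):
    total = 0
    for s in range(N_SITES):
        for r in range(N_RELATIONS):
            total += access[s][r] * (1 if chrom[r] == s else REMOTE_COST)
    return total

def evolve(pop_size=40, generations=100, mut_rate=0.1):
    pop = [[random.randrange(N_SITES) for _ in range(N_RELATIONS)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        parents = pop[:pop_size // 2]                 # truncation selection
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, N_RELATIONS)    # one-point crossover
            child = a[:cut] + b[cut:]
            if random.random() < mut_rate:            # mutation: move one relation
                child[random.randrange(N_RELATIONS)] = random.randrange(N_SITES)
            children.append(child)
        pop = parents + children
    return min(pop, key=cost)

best = evolve()
print(best, cost(best))
```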
48. Smooth change point estimation in regression models with random design.
- Author
-
Döring, Maik and Jensen, Uwe
- Subjects
- *
ESTIMATION theory , *REGRESSION analysis , *RANDOM effects model , *CONSISTENCY models (Computers) , *GAUSSIAN processes - Abstract
We consider the problem of estimating the location of a change point $\theta_0$ in a regression model. Most change point models studied so far were based on regression functions with a jump. However, we focus on regression functions which are continuous at $\theta_0$. The degree of smoothness $q_0$ has to be estimated as well. We investigate the consistency, as the sample size $n$ increases, of the least squares estimates $(\hat{\theta}_n, \hat{q}_n)$ of $(\theta_0, q_0)$. It turns out that the rates of convergence of $\hat{\theta}_n$ depend on $q_0$: for $q_0 > 1/2$ we have a rate of $\sqrt{n}$ and the asymptotic normality property; for $q_0 < 1/2$ the rate is $n^{1/(2q_0+1)}$ and the change point estimator converges to a maximizer of a Gaussian process; for $q_0 = 1/2$ the rate is $\sqrt{n \cdot \ln(n)}$. Interestingly, in the last case the limiting distribution is also normal. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
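To make the estimation problem concrete, one can fit a toy continuous change point model by least squares over a grid of candidate (theta, q) values. The regression function, noise level and grid below are illustrative assumptions, not the model studied in the paper.
```python
import numpy as np

# Toy model (assumption for illustration only): y = b * max(x - theta, 0) ** q + noise,
# continuous at theta; (theta, q) are fitted jointly by minimising the residual
# sum of squares over a grid, with the slope b profiled out by least squares.

rng = np.random.default_rng(0)
theta0, q0 = 0.4, 0.75
x = rng.uniform(0, 1, 500)
y = 2.0 * np.maximum(x - theta0, 0.0) ** q0 + rng.normal(0, 0.1, x.size)

def rss(theta, q):
    z = np.maximum(x - theta, 0.0) ** q
    b = (z @ y) / (z @ z)                # profile out the slope
    return np.sum((y - b * z) ** 2)

thetas = np.linspace(0.05, 0.95, 181)
qs = np.linspace(0.25, 1.5, 126)
best = min(((rss(t, q), t, q) for t in thetas for q in qs))
print("theta_hat=%.3f  q_hat=%.3f" % (best[1], best[2]))   # close to (0.4, 0.75)
```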
49. A new ensemble-based consistency test for the Community Earth System Model.
- Author
-
Baker, A. H., Hammerling, D. M., Levy, M. N., Xu, H., Dennis, J. M., Eaton, B. E., Edwards, J., Hannay, C., Mickelson, S. A., Neale, R. B., Nychka, D., Shollenberger, J., Tribbia, J., Vertenstein, M., and Williamson, D.
- Subjects
- *
CLIMATE change models , *QUALITY assurance , *CONSISTENCY models (Computers) , *DISTRIBUTION (Probability theory) , *COMPUTER simulation - Abstract
Climate simulation codes, such as the Community Earth System Model (CESM), are especially complex and continually evolving. Their ongoing state of development requires frequent software verification in the form of quality assurance, both to preserve the quality of the code and to instill model confidence. To formalize and simplify this previously subjective and computationally expensive aspect of the verification process, we have developed a new tool for evaluating climate consistency. Because an ensemble of simulations allows us to gauge the natural variability of the model's climate, our new tool uses an ensemble approach for consistency testing. In particular, an ensemble of CESM climate runs is created, from which we obtain a statistical distribution that can be used to determine whether a new climate run is statistically distinguishable from the original ensemble. The CESM Ensemble Consistency Test, referred to as CESM-ECT, is objective in nature and accessible to CESM developers and users. The tool has proven its utility in detecting errors in software and hardware environments and in providing rapid feedback to model developers. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
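The basic idea of an ensemble consistency test can be sketched as follows: summarise each ensemble member by a vector of statistics, estimate the ensemble's natural variability, and flag a new run whose statistics fall outside it. The real CESM test is more sophisticated (it works in principal-component space); the per-variable z-score check below is only a schematic illustration with made-up data.
```python
import numpy as np

def consistency_check(ensemble, new_run, z_threshold=3.0):
    """ensemble: (n_members, n_vars) array of summary statistics;
    new_run: (n_vars,) summary of the run under test.
    Returns indices of variables whose z-score exceeds the threshold."""
    mu = ensemble.mean(axis=0)
    sigma = ensemble.std(axis=0, ddof=1)
    z = np.abs(new_run - mu) / sigma
    return np.where(z > z_threshold)[0]

rng = np.random.default_rng(42)
ensemble = rng.normal(0.0, 1.0, size=(151, 20))   # e.g. 151 members, 20 variables
ok_run = rng.normal(0.0, 1.0, size=20)
bad_run = ok_run.copy(); bad_run[3] += 10.0       # inject an inconsistency
print(consistency_check(ensemble, ok_run))        # usually empty
print(consistency_check(ensemble, bad_run))       # flags variable 3
```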
50. A Novel Approach to Improve the Processor Performance with Page Replacement Method.
- Author
-
Abraham, Jisha P. and Mathew, Sheena
- Subjects
COMPUTER memory management ,CONSISTENCY models (Computers) ,LOOP tiling (Computer science) ,COMPUTER scheduling ,DISTRIBUTED shared memory - Abstract
Memory management techniques available in modern operating systems permit the execution of a program while it is only partially resident in memory, providing the user with the illusion of a very large memory and freeing the user from concerns about program size. The basic assumptions behind traditional page replacement methods have been invalidated, which has led to a revival of this research area. This paper focuses mainly on improving processor performance. To achieve this, we try to reduce the number of miss conditions by using the page replacement method in the correct way. Modern operating systems mainly use the Least Recently Used (LRU) page replacement method, in which only a time parameter is used. Here we use both the time stamp and a frequency parameter to handle page replacement. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
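A page replacement policy that uses both a time stamp and a frequency parameter can be sketched as LFU with an LRU tie-break: on a miss with all frames full, evict the page with the lowest use count, breaking ties by the oldest access time. The class below is a generic illustration of that idea, not necessarily the authors' exact method.
```python
class FreqRecencyCache:
    def __init__(self, frames):
        self.frames = frames
        self.pages = {}        # page -> (frequency, last_used)
        self.clock = 0
        self.misses = 0

    def access(self, page):
        self.clock += 1
        if page in self.pages:
            freq, _ = self.pages[page]
            self.pages[page] = (freq + 1, self.clock)
            return
        self.misses += 1
        if len(self.pages) >= self.frames:
            # victim: lowest frequency, then oldest time stamp
            victim = min(self.pages, key=lambda p: self.pages[p])
            del self.pages[victim]
        self.pages[page] = (1, self.clock)

cache = FreqRecencyCache(frames=3)
for p in [1, 2, 3, 1, 1, 4, 1, 2, 5, 1]:
    cache.access(p)
print(cache.misses)   # frequently used page 1 is never evicted
```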