661 results for "Axiomatic system"
Search Results
2. A new axiomatic approach to the impartial nomination problem
- Author
-
Paul H. Edelman and Attila Pór
- Subjects
Set (abstract data type) ,Mathematics::Logic ,Economics and Econometrics ,Consistency (negotiation) ,Computer science ,Structure (category theory) ,Axiomatic system ,Nomination ,Mathematical economics ,Finance ,Axiom ,Variable (mathematics) - Abstract
In this paper we introduce a new set of axioms that characterize uniform random dictatorship (URD) as a randomized impartial nomination rule. Unlike earlier work, we use a variable population model, which allows us to employ axioms that reflect consistency and proportionality, axioms that ensure that the rule behaves well with respect to the combinatorial structure of the nomination profile. Earlier work characterizing URD employed strong symmetry axioms, so it is surprising that our axioms characterize the same rule.
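As a hedged illustration of the rule being characterized (the formal statement and the axioms are the paper's own), uniform random dictatorship can be read as follows: draw one agent uniformly at random and return that agent's nominee.

```latex
% Hedged sketch, not the paper's formalization: agents N = {1,...,n}, each
% agent i nominates f(i) in N \ {i}; a dictator is drawn uniformly at random
% and their nominee is selected, so for every agent j
\Pr[\, j \text{ is nominated} \,] \;=\; \frac{\bigl|\{\, i \in N : f(i) = j \,\}\bigr|}{n}.
```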
- Published
- 2021
3. Slack extender mechanism for greening dependent-tasks scheduling on DVFS-enabled computing platforms
- Author
-
Tarek Hagras
- Subjects
Schedule ,Computer science ,Distributed computing ,Axiomatic system ,Theoretical Computer Science ,Scheduling (computing) ,Reduction (complexity) ,Task (computing) ,Hardware and Architecture ,Key (cryptography) ,Scaling ,Software ,Energy (signal processing) ,Information Systems - Abstract
The task's slack is the key resource for reducing the energy consumed by DVFS-enabled computing platforms. Despite the large number of scheduling algorithms presented in the literature, only a single scaling axiomatic approach (SAA) is used in the scaling phase of these algorithms. SAA simply extends the execution of a task within its slack if a suitable scaling frequency is available. Unfortunately, when dependent-task applications are scheduled on such platforms, scheduling algorithms minimize the tasks' slacks in order to reduce the overall completion time of the application. This paper presents a mechanism that can be applied to any schedule produced by a dependent-task scheduling algorithm on both homogeneous and heterogeneous DVFS-enabled computing platforms. The proposed mechanism, called BlackLight, attempts to extend the tasks' slacks by rescheduling the application tasks without violating the overall completion time of the application. The mechanism was applied to a large number of dependent-task schedules of both randomly generated application graphs and two real-world application graphs. The experimental results, based on computer simulation, show that the proposed mechanism significantly extends the tasks' slacks compared with SAA, which leads to a greater reduction in the consumed energy.
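To make the scaling phase concrete, below is a minimal, hypothetical sketch of the frequency-selection idea the abstract attributes to SAA (not the BlackLight mechanism). It assumes a common simplified DVFS model in which execution time scales with f_max/f and dynamic power with f^3; the function names and numbers are invented for illustration.

```python
# Hedged sketch of the scaling idea attributed to SAA (not the paper's BlackLight
# mechanism): stretch a task within its slack by picking the lowest frequency
# whose slowed-down runtime still meets the deadline, under the common
# simplification that runtime ~ f_max / f and dynamic power ~ f**3.

def pick_frequency(exec_time_at_fmax, slack, frequencies, f_max):
    """Lowest frequency whose stretched runtime fits within exec time + slack."""
    deadline = exec_time_at_fmax + slack
    feasible = [f for f in frequencies
                if exec_time_at_fmax * (f_max / f) <= deadline]
    return min(feasible) if feasible else f_max

def relative_dynamic_energy(exec_time_at_fmax, f, f_max):
    """Relative dynamic energy: power ~ f**3, runtime scales as f_max / f."""
    return (f ** 3) * exec_time_at_fmax * (f_max / f)

if __name__ == "__main__":
    f_max, freqs = 2.0, [0.8, 1.0, 1.2, 1.6, 2.0]   # hypothetical frequency levels
    t, slack = 10.0, 5.0                             # hypothetical runtime and slack
    f = pick_frequency(t, slack, freqs, f_max)
    print(f,
          relative_dynamic_energy(t, f, f_max),      # scaled-down energy
          relative_dynamic_energy(t, f_max, f_max))  # energy at full speed
```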
- Published
- 2021
4. On the axiomatic approach to sharing the revenues from broadcasting sports leagues
- Author
-
Gustavo Bergantiños and Juan D. Moreno-Ternero
- Subjects
Structure (mathematical logic) ,Computer Science::Computer Science and Game Theory ,Economics and Econometrics ,Equal treatment of equals ,Computer science ,media_common.quotation_subject ,Axiomatic system ,Impartiality ,Additivity ,Broadcasting (networking) ,Order (exchange) ,Broadcasting problems ,Revenue ,Resource allocation ,Mathematical economics ,Social Sciences (miscellaneous) ,Axiom ,media_common ,Complement (set theory) - Abstract
We take the axiomatic approach to uncover the structure of the revenue-sharing problem from broadcasting sports leagues. We formalize two notions of impartiality, depending on the stance one takes with respect to the revenue generated in the games involving each pair of teams. We show that the resulting two axioms lead towards two broad categories of rules when combined with additivity and some other basic axioms. We complement those results by strengthening the impartiality notions to consider axioms of order preservation.
- Published
- 2021
5. The Problem of Quantifying Utility
- Author
-
V. M. Romanchak
- Subjects
Mathematical optimization ,Computer science ,Decision theory ,05 social sciences ,Axiomatic system ,utility function ,Measurement problem ,rating ,Function (mathematics) ,hierarchy analysis method ,01 natural sciences ,050105 experimental psychology ,stevens's law ,010309 optics ,criteria importance theory ,Economics as a science ,0103 physical sciences ,Ordered pair ,Independence (mathematical logic) ,measurement theory ,0501 psychology and cognitive sciences ,fechner's law ,HB71-74 ,Equivalence (measure theory) ,Axiom - Abstract
Purpose of the study. Analysis of the literature shows that the ordinal theory of utility is widespread in the theory of consumer behavior. To analyze consumer preferences, a utility function is used that characterizes the utility of consumed goods and services on an ordinal scale. However, to find the marginal utility of a product, arithmetic operations are used, which are not admissible on an ordinal scale. To allow arithmetic operations, a quantitative analysis of the utility function is required; consequently, the problem of quantitative measurement of the utility function is relevant. The measurement problem also arises in decision theory. For example, the analytic hierarchy process is a popular method for solving multicriteria problems, but it contains an erroneous model of subjective measurement. For this reason, other methods intended to replace it appear in decision-making theory; the theory of the importance of criteria is being actively developed, but it also does not solve the problem of quantitative measurement. The measurement problem has long existed in psychophysics as well: the existence of two mismatched psychophysical laws contradicts the classical theory of measurement. Recently, a solution based on ratings has been proposed, and the equivalence of the basic laws of psychophysics has been proved. In this paper, we propose to use the rating method to measure preferences in utility theory and in decision theory. Materials and methods. The domain of the rating is the set of ordered pairs of objects, and a composition (addition operation) of objects is defined on this set. A rating is a number assigned during a measurement to an ordered pair of objects; the rating is assumed to preserve the composition of ordered pairs. An arithmetic operation is selected to carry out the measurement, and the measurement result must match the result of that arithmetic operation, which is the difference or the ratio of the values of the quantity. The rating values coincide with the result of the arithmetic operation (up to isomorphism). The additivity of the rating is used to check the adequacy of the measurement results; for this, the rating is assumed to be independent of the measurement method. The theoretical justification for independence is the isomorphism condition, and the empirical confirmation is the equivalence of the basic psychophysical laws. Results. The paper presents an axiomatic approach to the measurement problem. Measurement can be carried out both objectively and subjectively. Axiomatic and classical definitions of the rating are formulated; the axiomatic definition implies the classical one for a special set of objects, and the classical definition is constructive. To check the adequacy of the measurement results, it is enough to compare the ratings obtained by different measurement methods (the method of alternatives). Conclusion. The rating method is a quantitative measurement method that can be used to construct models of consumer behavior and in decision-making theory.
- Published
- 2021
6. Centralized Control System of Flotation Department: Axiomatic Approach
- Author
-
Pavel Khudyakov, O. S. Logunova, Yuliya Kukhta, Marat Muslimov, JSC 'Uchalinsky Gok', and Anatoly V. Lednov
- Subjects
Operations research ,Computer science ,Control system ,Axiomatic system - Published
- 2021
7. Comprehensive nonmodularity and interaction indices for decision analysis
- Author
-
Jian-Zhang Wu and Gleb Beliakov
- Subjects
Statistics and Probability ,0209 industrial biotechnology ,Theoretical computer science ,Index (economics) ,Property (philosophy) ,Computer science ,General Engineering ,Probabilistic logic ,Axiomatic system ,02 engineering and technology ,Multiple-criteria decision analysis ,020901 industrial engineering & automation ,Artificial Intelligence ,Phenomenon ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Construct (philosophy) ,Decision analysis - Abstract
Nonmodularity is a prominent property of a capacity that is deeply linked to the interaction among multiple decision criteria. Following the common architectures of the simultaneous interaction indices as well as of the bipartition interaction indices, in this paper we construct and study the notion of the probabilistic nonmodularity index and its particular cases, such as the Shapley and Banzhaf nonmodularity indices, which can be used to describe the comprehensive interaction situations of decision criteria. The connections and differences among three categories of interaction indices are also investigated and compared, theoretically and empirically. It is shown that the three types of interaction indices have the same roots in their first and second orders, but the nonmodularity indices involve fewer subsets and can be adopted to describe the interaction phenomenon in decision analysis.
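For orientation, here is a hedged statement of the modularity notion that nonmodularity indices are built around; the local deviation below is standard capacity terminology, while the way the paper aggregates it into probabilistic, Shapley, and Banzhaf indices is specified in the paper itself.

```latex
% A capacity \mu on the criteria set N (monotone, \mu(\emptyset)=0, \mu(N)=1)
% is modular on A, B \subseteq N when
\mu(A \cup B) + \mu(A \cap B) \;=\; \mu(A) + \mu(B),
% so a natural local measure of nonmodularity is the deviation
d_\mu(A, B) \;=\; \mu(A \cup B) + \mu(A \cap B) - \mu(A) - \mu(B).
```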
- Published
- 2021
8. An axiomatic method for goal-dependent allocation in life cycle assessment
- Author
-
Philippe Loubet, Guido Sonnemann, and Dieuwertje Schrijvers
- Subjects
Product system ,Identification (information) ,Operations research ,Computer science ,media_common.quotation_subject ,Substitution (logic) ,Axiomatic system ,Product (category theory) ,Life-cycle assessment ,Axiom ,General Environmental Science ,Reputation ,media_common - Abstract
How to apply allocation in a life cycle assessment (LCA) is a long-running and controversial debate. Consensus seems to exist on the fact that the allocation procedure should follow logically from the LCA goal definition. This paper proposes an axiomatic method to (1) identify an allocation procedure for co-production, joint treatment, and recycling that best responds to a specific LCA goal and (2) communicate the rationale applied by the LCA practitioner transparently. The method is illustrated via a case study. The specific goal definition for which a suitable allocation procedure is identified is to evaluate what impacts can be attributed to a product, which could inform a company about potential sources of reputation damage. Subjective assumptions that reflect our vision of "what impacts can be attributed to a product" are described in definitions and axioms. Axioms are formulated that describe the system boundaries of the product system and the partitioning criterion. The derived allocation procedure corresponds to "Allocation at the Point of Substitution," which is applied in one of the system models of ecoinvent. Partitioning is based on market information, which corresponds to a cause-oriented perspective on "what impacts can be attributed to a product." Other LCA goal definitions and rationales could require different system boundaries or a different partitioning criterion. The axiomatic method presented in this paper supports the identification of a suitable allocation procedure for a defined LCA goal and the transparent communication of the rationale that backs up this procedure. Building on the approach of this paper, a collection of axioms and corresponding allocation procedures could be developed on which consensus might exist within the LCA community. Such goal-dependent allocation procedures could form the basis of future guidance on LCA.
- Published
- 2021
9. Exploring the Jungle of Intuitionistic Temporal Logics
- Author
-
Martín Diéguez, David Fernández-Duque, Philip Kremer, and Joseph Boudou
- Subjects
FOS: Computer and information sciences ,Computer Science - Logic in Computer Science ,Class (set theory) ,Theoretical computer science ,Semantics (computer science) ,Computer science ,02 engineering and technology ,01 natural sciences ,Theoretical Computer Science ,Linear temporal logic ,Artificial Intelligence ,Computer Science::Logic in Computer Science ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,0101 mathematics ,Logic programming ,Soundness ,Functional programming ,010102 general mathematics ,Axiomatic system ,Logic in Computer Science (cs.LO) ,Mathematics::Logic ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Type theory ,Computational Theory and Mathematics ,Hardware and Architecture ,TheoryofComputation_LOGICSANDMEANINGSOFPROGRAMS ,Software - Abstract
The importance of intuitionistic temporal logics in Computer Science and Artificial Intelligence has become increasingly clear in the last few years. From the proof-theoretic point of view, intuitionistic temporal logics have made it possible to extend functional languages with new features via type theory, while from the semantic perspective several logics for reasoning about dynamical systems and several semantics for logic programming have their roots in this framework. In this paper we consider several axiomatic systems for intuitionistic linear temporal logic and show that each of these systems is sound for a class of structures based either on Kripke frames or on dynamic topological systems. Our topological semantics features a new interpretation for the 'henceforth' modality that is a natural intuitionistic variant of the classical one. Using the soundness results, we show that the seven logics obtained from the axiomatic systems are distinct.
- Published
- 2021
10. An axiomatic characterization of a generalized ecological footprint
- Author
-
Radomir Pestow, Anja Zenker, and Thomas Kuhn
- Subjects
Economics and Econometrics ,Ecological footprint ,Computer science ,05 social sciences ,Axiomatic system ,010501 environmental sciences ,Management, Monitoring, Policy and Law ,Environmental Science (miscellaneous) ,Characterization (mathematics) ,01 natural sciences ,Set (abstract data type) ,Mathematics::Logic ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Measurement theory ,TheoryofComputation_LOGICSANDMEANINGSOFPROGRAMS ,Computer Science::Logic in Computer Science ,0502 economics and business ,050202 agricultural economics & policy ,Mathematical economics ,Axiom ,0105 earth and related environmental sciences - Abstract
The purpose of this paper is to propose an axiomatic characterization of ecological footprint indices. Using an axiomatic approach, we define a set of axioms representing the properties considered ...
- Published
- 2021
11. Axiomatic Theories and Improving the Relevance of Information Systems Research
- Author
-
Jae Kyu Lee, Victoria Y. Yoon, Shirley Gregor, and Jinsoo Park
- Subjects
Information Systems and Management ,Computer Networks and Communications ,Management science ,Computer science ,Axiomatic system ,Library and Information Sciences ,Design science ,Management Information Systems ,Information system ,Technology acceptance model ,Relevance (information retrieval) ,Axiom ,Information Systems ,Social influence ,Electronic data interchange - Abstract
This paper examines the fact that a significant number of empirical information systems (IS) studies engage in confirmatory testing of self-evident axiomatic theories without yielding highly relevant knowledge for the IS community. The authors conduct both a horizontal analysis of 72 representative IS theories and an in-depth vertical analysis of 3 well-known theories (i.e., the technology acceptance model, diffusion of innovation theory, and institutional theory) in order to measure how pervasive such testing of axiomatic theories is. The authors discovered that more than 60% of 666 hypotheses from the horizontal analysis could be regarded as axiomatic theory elements. In the vertical analysis, 68.1% of 1,301 hypotheses from 148 articles were axiomatic. Based on these findings, the authors propose four complementary IS research approaches: (1) identifying disconfirming boundary conditions, (2) measuring the relative importance of axiomatic causal factors, (3) measuring the stage of progression toward visionary goals when the nature of the axiomatic theory can be extended to future visions, and (4) engaging in the conceptual design of visionary axiomatic goals. They argue that these complementary IS research approaches can enhance the relevance of IS research outcomes without sacrificing methodological rigor.
- Published
- 2021
12. Mutual Influence between Different Views of Probability and Statistical Inference
- Author
-
Manfred Borovcnik
- Subjects
Conceptual-teórico ,Inferencia ,Interpretation (logic) ,Computer science ,Axiomatic system ,Mathematical theory ,Bayes' theorem ,Meaning (philosophy of language) ,Frequentist inference ,Significado ,Statistical inference ,Probabilidad ,Mathematical economics ,Statistical hypothesis testing - Abstract
In this paper, we analyse the various meanings of probability and its different applications, focusing especially on the classical, the frequentist, and the subjectivist view. We describe the different problems of how probability can be measured in each of the approaches, and how each of them can be well justified by a mathematical theory. We analyse the foundations of probability, where the scientific analysis of the theory that allows for a frequentist interpretation leads to unsolvable problems. Kolmogorov's axiomatic theory does not suffice to establish statistical inference without further definitions and principles. Finally, we show how statistical inference essentially determines the meaning of probability, and a shift emerges from purely objectivist views to a complementary conception of probability with frequentist and subjectivist constituents. For didactical purposes, the present analysis explains basic problems of teaching that originate from a biased focus on frequentist aspects of probability. It also indicates a high priority for the design of suitable learning paths towards a complementary conception of probability. In applications, modellers use information pragmatically, processing it regardless of its connotation into formal mathematical models, which are always thought of as essentially wrong but useful.
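For reference, the Kolmogorov axioms mentioned in the abstract, in their standard textbook form (not quoted from the paper):

```latex
% A probability measure P on a sigma-algebra \mathcal{F} over \Omega satisfies
P(A) \ge 0 \ \ \text{for all } A \in \mathcal{F}, \qquad P(\Omega) = 1, \qquad
P\Bigl(\bigcup_{i=1}^{\infty} A_i\Bigr) = \sum_{i=1}^{\infty} P(A_i)
\ \ \text{for pairwise disjoint } A_1, A_2, \ldots \in \mathcal{F}.
```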
- Published
- 2021
13. Stochastic dynamic utilities and intertemporal preferences
- Author
-
Andrea Maran and Marco Maggis
- Subjects
Statistics and Probability ,050208 finance ,Computer science ,Mathematical finance ,media_common.quotation_subject ,05 social sciences ,Axiomatic system ,Decision maker ,01 natural sciences ,010104 statistics & probability ,State dependent ,0502 economics and business ,0101 mathematics ,Statistics, Probability and Uncertainty ,Function (engineering) ,Representation (mathematics) ,Preference (economics) ,Mathematical economics ,Finance ,media_common - Abstract
We propose an axiomatic approach which economically underpins the representation of dynamic intertemporal decisions in terms of a utility function, which randomly reacts to the information available to the decision maker throughout time. Our construction is iterative and based on time dependent preference connections, whose characterization is inspired by the original intuition given by Debreu’s State Dependent Utilities (1960).
- Published
- 2021
14. Robust Hierarchical Clustering for Directed Networks: An Axiomatic Approach
- Author
-
Santiago Segarra, Gunnar E. Carlsson, and Facundo Mémoli
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Algebra and Number Theory ,Computer science ,Applied Mathematics ,Stability (learning theory) ,Axiomatic system ,0102 computer and information sciences ,02 engineering and technology ,16. Peace & justice ,computer.software_genre ,01 natural sciences ,Machine Learning (cs.LG) ,Hierarchical clustering ,010201 computation theory & mathematics ,Robustness (computer science) ,Consistency (statistics) ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Geometry and Topology ,Data mining ,computer - Abstract
We provide a complete taxonomic characterization of robust hierarchical clustering methods for directed networks following an axiomatic approach. We begin by introducing three practical properties associated with the notion of robustness in hierarchical clustering: linear scale preservation, stability, and excisiveness. Linear scale preservation enforces imperviousness to change in units of measure whereas stability ensures that a bounded perturbation in the input network entails a bounded perturbation in the clustering output. Excisiveness refers to the local consistency of the clustering outcome. Algorithmically, excisiveness implies that we can reduce computational complexity by only clustering a subset of our data while theoretically guaranteeing that the same hierarchical outcome would be observed when clustering the whole dataset. In parallel to these three properties, we introduce the concept of representability, a generative model for describing clustering methods through the specification of their action on a collection of networks. Our main result is to leverage this generative model to give a precise characterization of all robust -- i.e., excisive, linear scale preserving, and stable -- hierarchical clustering methods for directed networks. We also address the implementation of our methods and describe an application to real data.
- Published
- 2021
15. The Borda class
- Author
-
Ulle Endriss and Zoi Terzopoulou
- Subjects
Economics and Econometrics ,Class (set theory) ,Computer science ,Applied Mathematics ,media_common.quotation_subject ,Rank (computer programming) ,Axiomatic system ,Set (abstract data type) ,Voting ,Mathematical economics ,Social choice theory ,Preference (economics) ,Axiom ,media_common - Abstract
The Borda rule, originally defined on profiles of individual preferences modelled as linear orders over the set of alternatives, is one of the most important voting rules. But voting rules often need to be used on preferences of a different format as well, such as top-truncated orders, where agents rank just their most preferred alternatives. What is the right generalisation of the Borda rule to such richer models of preference? Several suggestions have been made in the literature, typically considering specific contexts where the rule is to be applied. In this work, taking an axiomatic perspective, we conduct a principled analysis of the different options for defining the Borda rule on top-truncated preferences.
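For readers unfamiliar with the baseline, here is a minimal sketch of the classical Borda count on complete linear orders, the rule the paper generalizes; the example profile is invented for illustration, and the paper's rules for top-truncated ballots are not reproduced here.

```python
# Minimal sketch of the classical Borda rule on complete linear orders (the
# baseline the paper generalizes); the ballots below are hypothetical.
from collections import defaultdict

def borda(profile, alternatives):
    """profile: list of ballots, each a list ranking every alternative best-first."""
    m = len(alternatives)
    score = defaultdict(int)
    for ballot in profile:
        for position, alt in enumerate(ballot):
            score[alt] += m - 1 - position        # m-1 points for top, 0 for last
    winner = max(alternatives, key=lambda a: score[a])
    return winner, dict(score)

profile = [["a", "b", "c"], ["a", "c", "b"], ["b", "c", "a"]]
print(borda(profile, ["a", "b", "c"]))            # ('a', {'a': 4, 'b': 3, 'c': 2})
```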
- Published
- 2021
16. Multicriteria Choice Based on Fuzzy Information
- Author
-
V. D. Noghin
- Subjects
Set (abstract data type) ,Mathematical optimization ,General Computer Science ,Computer science ,Goal programming ,Fuzzy set ,Feasible region ,Axiomatic system ,Multi-objective optimization ,Fuzzy logic ,Membership function - Abstract
This paper proposes a new method for solving the problem of multicriteria optimization of a numerical vector function on a fuzzy set. The membership function of the fuzzy feasible set is joined to the original set of criteria, which allows the original multicriteria optimization problem to be treated as the task of finding a suitable compromise (Pareto-optimal) solution with respect to an extended set of criteria. It is assumed that in the search for the "best" compromise solution there is only fuzzy information about the preferences of the decision maker, in the form of information quanta. At the first stage of the proposed method, the search for a compromise is made on the basis of an axiomatic approach with which the Pareto set is reduced. The result of the reduction is a fuzzy set with a membership function that is determined on the basis of the fuzzy information used. At the second stage, the obtained membership function is added to the extended set of criteria, after which a scalarization procedure based on the idea of goal programming is used to solve the resulting multicriteria problem.
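As a rough illustration of the overall idea (not the paper's algorithm), the sketch below treats a hypothetical membership degree as an extra criterion to be maximized and keeps only Pareto-optimal alternatives; the reduction of the Pareto set using information quanta and the goal-programming scalarization are the paper's own contributions and are not reproduced.

```python
# Hedged sketch: append the fuzzy feasible set's membership degree as an extra
# criterion (all criteria to be maximized) and keep the Pareto-optimal points.
# Data and names are hypothetical.

def pareto_optimal(alternatives):
    """alternatives: dict name -> tuple of criterion values (larger is better)."""
    def dominates(a, b):
        return all(x >= y for x, y in zip(a, b)) and any(x > y for x, y in zip(a, b))
    return {n: v for n, v in alternatives.items()
            if not any(dominates(w, v) for m, w in alternatives.items() if m != n)}

# Two original criteria plus the membership degree mu(x) as a third component.
alts = {"x1": (3.0, 1.0, 0.9), "x2": (2.0, 2.0, 0.6), "x3": (1.0, 1.5, 0.4)}
print(pareto_optimal(alts))        # x3 is dominated by x2; x1 and x2 remain
```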
- Published
- 2020
17. Encoder-Decoder Attention ≠ Word Alignment: Axiomatic Method of Learning Word Alignments for Neural Machine Translation
- Author
-
Tiejun Zhao, Masao Utiyama, Eiichiro Sumita, Akihiro Tamura, and Chunpeng Ma
- Subjects
Machine translation ,Computer science ,business.industry ,Speech recognition ,Deep learning ,Axiomatic system ,Encoder decoder ,Artificial intelligence ,computer.software_genre ,business ,computer ,Word (computer architecture) - Published
- 2020
18. On Proving a Program Shortest
- Author
-
Arindama Singh
- Subjects
Discrete mathematics ,Kolmogorov complexity ,Computer science ,String (computer science) ,Computer Science::Programming Languages ,Axiomatic system ,Education ,Zero (linguistics) - Abstract
We revisit a problem faced by all programmers: can one write a program that determines whether any given program is the shortest program? How does one prove that a given program is the shortest? After answering these questions, we discuss very briefly the Kolmogorov complexity of a string of zeros and ones, which leads to a barrier on any axiomatic system, known as Chaitin's barrier.
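For context, the barrier referred to at the end of the abstract is usually stated as Chaitin's incompleteness theorem; a hedged, standard form (not quoted from the paper) is:

```latex
% For any consistent, recursively axiomatizable theory T that can express facts
% about programs, there is a constant L_T such that for no particular string s
% can T prove that the Kolmogorov complexity K(s) exceeds L_T:
\exists L_T \ \forall s : \quad T \nvdash \bigl( K(s) > L_T \bigr).
% In particular, beyond this bound one cannot prove that a given program is the
% shortest one producing its output.
```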
- Published
- 2020
19. Bitcoin: An Axiomatic Approach and an Impossibility Theorem
- Author
-
Philipp Strack and Jacob D. Leshno
- Subjects
050208 finance ,Computer science ,05 social sciences ,Axiomatic system ,Decentralised system ,Arrow's impossibility theorem ,Incentive ,0502 economics and business ,General Earth and Planetary Sciences ,050207 economics ,Impossibility ,Protocol (object-oriented programming) ,Mathematical economics ,Axiom ,General Environmental Science ,Anonymity - Abstract
Bitcoin's main innovation lies in allowing a decentralized system that relies on anonymous, profit-driven miners who can freely join the system. We formalize these properties in three axioms: anonymity of miners, no incentive for miners to consolidate, and no incentive to assume multiple fake identities. This novel axiomatic formalization allows us to characterize which other protocols are feasible: every protocol with these properties must have the same reward scheme as Bitcoin. This implies an impossibility result for risk-averse miners. Furthermore, any protocol either gives up some degree of decentralization or its reward scheme is equivalent to Bitcoin's. (JEL D82, E42, O33)
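A hedged reading of the reward scheme the characterization points to (the proportional allocation commonly attributed to Bitcoin; the paper's exact formalization is not reproduced here):

```latex
% With miners i = 1, \dots, n contributing computational power x_i, the block
% reward goes to a single miner drawn with probability proportional to power,
% so the expected reward share of miner i is
\Pr[\text{miner } i \text{ wins the block}] \;=\; \frac{x_i}{\sum_{j=1}^{n} x_j}.
```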
- Published
- 2020
20. An axiomatic approach to CG′3 logic
- Author
-
Mauricio Osorio Galindo, José R. Arrazola Ramírez, Miguel Pérez-Gaspar, and Alejandro Hernández-Tello
- Subjects
Algebra ,010201 computation theory & mathematics ,Logic ,Computer science ,010102 general mathematics ,Axiomatic system ,0102 computer and information sciences ,0101 mathematics ,01 natural sciences - Abstract
In memoriam José Arrazola Ramírez (1962–2018). The logic $\textbf{G}^{\prime}_3$ was introduced by Osorio et al. in 2008; it is a three-valued logic, closely related to the paraconsistent logic $\textbf{CG}^{\prime}_3$ introduced by Osorio et al. in 2014. The logic $\textbf{CG}^{\prime}_3$ is defined in terms of a multi-valued semantics and has the property that each theorem in $\textbf{G}^{\prime}_3$ is a theorem in $\textbf{CG}^{\prime}_3$. Kripke-type semantics has been given to $\textbf{CG}^{\prime}_3$ in two different ways by Borja et al. in 2016. In this work, we continue the study of $\textbf{CG}^{\prime}_3$, obtaining a Hilbert-type axiomatic system and proving a soundness and completeness theorem for this logic.
- Published
- 2020
21. An axiomatic approach to corpus-based cross-language information retrieval
- Author
-
Azadeh Shakery, Ali Montazeralghaem, and Razieh Rahimi
- Subjects
Computer science ,business.industry ,Computer Science::Information Retrieval ,InformationSystems_INFORMATIONSTORAGEANDRETRIEVAL ,Axiomatic system ,Computer Science::Computation and Language (Computational Linguistics and Natural Language and Speech Processing) ,Library and Information Sciences ,computer.software_genre ,Translation (geometry) ,Weighting ,Term (time) ,Ranking (information retrieval) ,Pattern recognition (psychology) ,Corpus based ,Artificial intelligence ,business ,computer ,Natural language processing ,Cross-language information retrieval ,Information Systems - Abstract
A major challenge in cross-language information retrieval (CLIR) is the adoption of translation knowledge in retrieval models, as it affects term weighting, which is known to strongly impact retrieval performance. Despite its importance, how different approaches for integrating translation knowledge into retrieval models perform relative to one another has not been analytically examined. In this paper, we present an analytical investigation of using translation knowledge in CLIR. In particular, by adopting the axiomatic analysis framework, we formulate the impacts of using translation knowledge on document ranking as constraints that any cross-language retrieval model should satisfy. We then consider state-of-the-art CLIR methods and check whether they satisfy these constraints. Our study shows that none of the existing methods satisfies all constraints. Based on the defined constraints, we propose the hierarchical query modeling method for CLIR, which satisfies more constraints and achieves higher CLIR performance compared to the existing methods.
- Published
- 2020
22. Jeffrey Meets Kolmogorov
- Author
-
Alexander Meehan and Snow Zhang
- Subjects
Computer science ,010102 general mathematics ,Bayesian probability ,Foundation (evidence) ,Conditional probability ,Axiomatic system ,06 humanities and the arts ,Conditional probability distribution ,0603 philosophy, ethics and religion ,01 natural sciences ,Philosophy ,General theory ,060302 philosophy ,0101 mathematics ,Mathematical economics - Abstract
Jeffrey conditionalization is a rule for updating degrees of belief in light of uncertain evidence. It is usually assumed that the partitions involved in Jeffrey conditionalization are finite and only contain positive-credence elements, but there are interesting examples, involving continuous quantities, in which this is not the case. (Q1) Can Jeffrey conditionalization be generalized to accommodate continuous cases? Meanwhile, several authors, such as Kenny Easwaran and Michael Rescorla, have been interested in Kolmogorov's theory of regular conditional distributions (rcds) as a possible framework for conditional probability which handles probability-zero events. However, the theory faces a major shortcoming: it seems messy and ad hoc. (Q2) Is there some axiomatic theory which would justify and constrain the use of rcds, thus serving as a possible foundation for conditional probability? These two questions appear unrelated, but they are not, and this paper answers both. We show that when one appropriately generalizes Jeffrey conditionalization as in Q1, one obtains a framework which necessitates the use of rcds. It is then a short step to develop a general theory which addresses Q2, which we call the theory of extensions. The theory is a formal model of conditioning which recovers Bayesian conditionalization, Jeffrey conditionalization, and conditionalization via rcds as special cases.
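For context, the finite, positive-credence form of Jeffrey's rule that the paper generalizes reads as follows (standard statement, not the paper's continuous generalization):

```latex
% Jeffrey conditionalization on a finite partition \{E_1, \dots, E_n\} with new
% exogenous credences q_i = P_{\mathrm{new}}(E_i) and P(E_i) > 0 for all i:
P_{\mathrm{new}}(A) \;=\; \sum_{i=1}^{n} P(A \mid E_i)\, q_i .
% Ordinary Bayesian conditionalization is the special case with q_k = 1 for one cell.
```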
- Published
- 2020
23. Nonlinear Dynamics and the Synchronization of Neural Ensembles in the Shaping of Attention
- Author
-
M. E. Mazurov
- Subjects
Nonlinear system ,Process (engineering) ,Computer science ,Synchronization (computer science) ,General Physics and Astronomy ,Axiomatic system ,Synchronizing ,Relaxation (approximation) ,Topology - Abstract
A way of studying the synchronization of relaxation self-oscillations is considered. The technique is based on a modified axiomatic approach that uses the properties of uniform near-periodic functions. Five regimes of synchronizing neural ensembles of 100 peripheral neurons in the shaping of attention are studied. The adequacy of the mathematical computational model used to describe the process of shaping attention is discussed.
- Published
- 2020
24. Dual and Axiomatic Systems for Constructive S4, a Formally Verified Equivalence
- Author
-
Favio Ezequiel Miranda-Perea, P. Selene Linares-Arévalo, and Lourdes del Carmen González Huesca
- Subjects
Natural deduction ,General Computer Science ,Computer science ,Proof assistant ,Modal logic ,Axiomatic system ,Constructive ,Theoretical Computer Science ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,TheoryofComputation_LOGICSANDMEANINGSOFPROGRAMS ,Computer Science::Logic in Computer Science ,Calculus ,Derivation ,Equivalence (formal languages) ,Equivalence (measure theory) ,Axiom - Abstract
We present a proof of the equivalence between two deductive systems for the constructive modal logic S4. On one side, an axiomatic characterization inspired by Hakli and Negri's Hilbert-style system of derivations from assumptions for modal logic K. On the other side, the judgmental reconstruction given by Pfenning and Davies by means of a so-called dual natural deduction approach that makes a distinction between valid, true and possible formulas. Both systems and the proof of their equivalence are formally verified using the Coq proof assistant.
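As background, a hedged sketch of a standard Hilbert-style axiomatization of the box fragment of S4; the constructive system verified in the paper also treats possibility and differs in detail.

```latex
% Standard Hilbert-style axioms for the \Box fragment of S4 over a propositional base:
\mathrm{K}:\ \Box(\varphi \to \psi) \to (\Box\varphi \to \Box\psi), \qquad
\mathrm{T}:\ \Box\varphi \to \varphi, \qquad
\mathrm{4}:\ \Box\varphi \to \Box\Box\varphi,
% closed under modus ponens and necessitation (from \varphi infer \Box\varphi).
```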
- Published
- 2020
25. Students' Cognitive Conflict in Understanding the Concepts of Hyperbolic and Elliptic Geometry (Konflik Kognitif Mahasiswa dalam Memahami Konsep Geometri Hiperbolik dan Elliptik)
- Author
-
Rini Setyaningsih and Mega Teguh Budiarto
- Subjects
Elliptic geometry ,Computer science ,Hyperbolic geometry ,Axiomatic system ,Parallel ,Education ,Algebra ,cognitive conflict, hyperbolic geometry, elliptic, assimilation, accommodation ,Position (vector) ,Euclidean geometry ,QA1-939 ,Representation (mathematics) ,Parallels ,Mathematics - Abstract
Using schemas of Euclidean geometry concepts held in long-term memory to understand hyperbolic and elliptic geometry concepts through assimilation and accommodation can give rise to cognitive conflict. This study aims to reduce the occurrence of cognitive conflict by examining how students understand the mathematical content of the three geometries: Euclidean, hyperbolic, and elliptic. The research was conducted as a descriptive exploratory study. The results indicate that Euclidean representations are still used when representing hyperbolic and elliptic geometry, so that cognitive conflict occurs. The cognitive conflicts observed concern the relative position of two lines, parallels, triangles with equal corresponding angles, a line intersecting one of two parallel lines, the angle sum of a triangle, and the valid Saccheri hypothesis. Efforts that can be made to reduce cognitive conflict are to modify existing schemas or create new ones, so that the information obtained can be integrated into existing schemas through a deductive axiomatic approach to the material content via the accommodation process.
- Published
- 2020
26. A Relevant Logic of Questions
- Author
-
Vít Punčochář
- Subjects
Programming language ,Computer science ,Semantics (computer science) ,010102 general mathematics ,Duality (mathematics) ,Axiomatic system ,Relevance logic ,06 humanities and the arts ,Extension (predicate logic) ,0603 philosophy, ethics and religion ,computer.software_genre ,01 natural sciences ,Dual (category theory) ,Philosophy ,Completeness (logic) ,060302 philosophy ,0101 mathematics ,computer - Abstract
This paper introduces the inquisitive extension of R, denoted as InqR, which is a relevant logic of questions based on the logic R as the background logic of declaratives. A semantics for InqR is developed, and it is shown that this semantics is, in a precisely defined sense, dual to Routley-Meyer semantics for R. Moreover, InqR is axiomatized and completeness of the axiomatic system is established. The philosophical interpretation of the duality between Routley-Meyer semantics and the semantics for InqR is also discussed.
- Published
- 2020
27. Extremal risk management: expected shortfall value verification using the bootstrap method
- Author
-
Marta Małecka
- Subjects
Computer science ,business.industry ,Applied Mathematics ,Axiomatic system ,Sample (statistics) ,Failure rate ,Residual ,Computer Science Applications ,Expected shortfall ,Market risk ,Econometrics ,business ,Formal verification ,Finance ,Risk management - Abstract
In this paper, we refer to the axiomatic theory of risk and investigate the problem of formal verification of the expected shortfall (ES) model based on a sample ES. Recognizing the infeasibility of parametric methods, we explore the bootstrap technique, which, unlike the current value-at-risk model-based (VaR model-based) Basel III testing framework, permits the creation of more powerful sample ES-based procedures. Our contribution to the debate on the possibilities of sample ES-based testing is twofold. First, we introduce a bootstrap test based on the idea of ES prediction corrected variables. In this way, we obtain a procedure that makes no distributional assumptions about the underlying returns process, and whose p-value computation does not assume any asymptotic convergence. Second, we provide a unifying framework for ES value verification, in which we compare alternative sample ES-based approaches: the residual-based procedures versus the ES prediction corrected tests as well as the VaR model-dependent approach versus the fixed failure rate tests. By examining its statistical properties and practical applicability, we find evidence that the proposed bootstrap procedure, based on ES prediction corrected variables, is superior to other methods. This provides important guidance for developing international standards of market risk management.
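As a rough illustration only (the paper's ES prediction corrected bootstrap test is more elaborate and model-aware), the sketch below computes a sample ES and a naive one-sided bootstrap p-value for the exceedance losses; all data and parameters are hypothetical.

```python
# Hedged sketch: sample expected shortfall (ES) and a naive bootstrap test of
# H0: mean exceedance loss <= predicted ES. Not the paper's procedure.
import numpy as np

def sample_es(losses, alpha=0.975):
    """Average loss beyond the empirical alpha-quantile (the sample VaR)."""
    var = np.quantile(losses, alpha)
    return losses[losses >= var].mean()

def bootstrap_pvalue(exceedances, es_predicted, n_boot=10_000, seed=0):
    """One-sided bootstrap p-value obtained by recentring exceedances at the null."""
    rng = np.random.default_rng(seed)
    observed = exceedances.mean() - es_predicted
    centred = exceedances - exceedances.mean() + es_predicted
    boot = rng.choice(centred, size=(n_boot, exceedances.size), replace=True)
    return float(np.mean(boot.mean(axis=1) - es_predicted >= observed))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    losses = rng.standard_t(df=5, size=1000)                 # hypothetical losses
    exceed = losses[losses >= np.quantile(losses, 0.975)]
    print(sample_es(losses), bootstrap_pvalue(exceed, es_predicted=2.5))
```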
- Published
- 2020
28. On the properties that characterize privacy
- Author
-
Gail Gilboa-Freedman and Rann Smorodinsky
- Subjects
Sociology and Political Science ,Relation (database) ,Computer science ,Decision theory ,05 social sciences ,Rank (computer programming) ,General Social Sciences ,Axiomatic system ,Context (language use) ,Data science ,0502 economics and business ,050206 economic theory ,Statistics, Probability and Uncertainty ,Set (psychology) ,Personally identifiable information ,General Psychology ,Axiom ,050205 econometrics - Abstract
Privacy, in the sense of control over access to one's personal information, is a central concern in the context of online decision making, both in general and in relation to online platforms in particular. For at least some agents, a belief that one online platform jeopardizes users' privacy more than another may tip the scales in favor of the latter. Thus, understanding how privacy considerations come into play is central for any economic or social analysis. To this end, we study how agents rank online platforms (or mechanisms, as we call them) from a privacy perspective. We propose a very simple model of privacy-jeopardizing mechanisms, along with a normative methodology for understanding how these mechanisms are ranked. Similarly to classic work in decision theory, we postulate several axioms that we believe a privacy order should satisfy, and then characterize the set of orders that comply with these axioms. These orders turn out to be related to the notion of f-divergence from information theory, one example of which is KL divergence. We test the usefulness of our theoretical result by using it to rank clustering models based on data provided by the Recommendation Team at Microsoft Research.
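For reference, a hedged statement of the f-divergence family mentioned in the abstract (standard definition; KL divergence is the case f(t) = t log t):

```latex
% f-divergence between distributions p and q, for convex f with f(1) = 0:
D_f(p \,\|\, q) \;=\; \sum_{i} q_i \, f\!\left(\frac{p_i}{q_i}\right),
\qquad f(t) = t \log t \;\Rightarrow\; D_f(p\,\|\,q) = D_{\mathrm{KL}}(p \,\|\, q).
```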
- Published
- 2020
29. A Formal System of Axiomatic Set Theory in Coq
- Author
-
Tianyu Sun and Wensheng Yu
- Subjects
General Computer Science ,Computer science ,Cardinal number ,02 engineering and technology ,Mathematical proof ,Hausdorff maximal principle ,Coq proof assistant ,020204 information systems ,Peano axioms ,0502 economics and business ,0202 electrical engineering, electronic engineering, information engineering ,Calculus ,General Materials Science ,Axiom of choice ,Set theory ,Formal verification ,Abstract algebra ,Axiom ,05 social sciences ,Proof assistant ,General Engineering ,formalized mathematics ,Axiomatic system ,Axiom schema ,Formal system ,formal system ,TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES ,Axiomatic set theory ,050211 marketing ,lcsh:Electrical engineering. Electronics. Nuclear engineering ,lcsh:TK1-9971 - Abstract
Formal verification technology has been widely applied in the fields of mathematics and computer science. The formalization of fundamental mathematical theories is particularly essential. Axiomatic set theory is a foundational system of mathematics and has important applications in computer science. Most of the basic concepts and theories in computer science are described and demonstrated in terms of set theory. In this paper, we present a formal system of axiomatic set theory based on the Coq proof assistant. The axiomatic system used in the formal system refers to Morse-Kelley set theory, which is a relatively complete and concise axiomatic set theory. In this formal system, we complete the formalization of the basic definitions of sets, functions, ordinal numbers, and cardinal numbers and prove the most commonly used theorems in Coq. Moreover, the non-negative integers are defined, and Peano's postulates are proved as theorems. Using the axiom of choice, we also present formal proofs of the Hausdorff maximal principle and the Schröder-Bernstein theorem. The whole formalization of the system includes eight axioms, one axiom schema, 62 definitions, and 148 corollaries or theorems. The "axiomatic set theory" formal system is free from the more apparent paradoxes, and a complete axiomatic system is constructed through it. It is designed to give a foundation for mathematics quickly and naturally. On the basis of the system, we can prove many famous mathematical theorems and quickly formalize the theories of topology, modern algebra, data structures, databases, artificial intelligence, and so on. It will become an essential theoretical basis for mathematics, computer science, philosophy, and other disciplines.
- Published
- 2020
30. How linguistic meaning harmonizes with information through meaning conservation
- Author
-
Prakash Chandra Mondal
- Subjects
Structure (mathematical logic) ,Linguistics and Language ,General Computer Science ,Computer science ,Black hole information paradox ,Axiomatic system ,02 engineering and technology ,Lexicon ,Language and Linguistics ,Linguistics ,Behavioral Neuroscience ,History and Philosophy of Science ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Natural (music) ,020201 artificial intelligence & image processing ,The Symbolic ,Meaning (existential) ,Natural language - Abstract
This paper aims to characterize the relationship between information as defined in the information-theoretic approach and linguistic meaning by way of formulation of computations over the lexicon of a natural language. Information in its information theoretic sense is supposed not to be equivalent to linguistic meaning, whereas linguistic meaning has an intrinsic connection to information as far as the form and structure of the lexicon of a language (in a non-lexicological sense) is concerned. We argue that these two apparently conflicting aspects of the relationship between information and linguistic meaning can be unified by showing that information conserves linguistic meaning, only insofar as computations of a certain kind are defined on the symbolic elements of a lexicon. This has consequences not merely for the nature of lexical learning – natural or artificial or otherwise – but also for the conservation of information in axiomatic systems.
- Published
- 2019
31. On Methods of the Verification and Elaboration of Development Programs for Agricultural Territories
- Author
-
J. L. Vega Vice and V. Y. Mikhailov
- Subjects
Structure (mathematical logic) ,Scheme (programming language) ,business.industry ,Computer science ,Axiomatic system ,Analogy ,02 engineering and technology ,Planning Domain Definition Language ,Algorithmic logic ,Development (topology) ,Control and Systems Engineering ,Signal Processing ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Software engineering ,business ,computer ,Software ,Axiom ,computer.programming_language - Abstract
Nowadays, the methods of program-targeted management for the development of various socio-economic systems of complex structure, such as agricultural areas, have become ubiquitous. Therefore, the current tasks at hand are the verification of already created development programs and the elaboration of "proper" development programs for such systems, by analogy with the verification and development of correct computer programs in theoretical programming. In this paper, in order to solve the problem of verifying development programs for agricultural territories, a structural scheme of the program is first constructed, from which an axiomatic theory is created using Hoare's algorithmic logic. The main problem in the construction of the axiomatic theory is the development of axioms that reflect the preconditions and effects of the meaningful actions indicated in the text of the development program. Verification of the development program then corresponds to checking the provability of a Hoare triple formed from the initial and target conditions of the program. For the task of elaborating proper development programs, we describe the mechanism for constructing a domain model using description languages of the PDDL family. The description of a specific model is purely declarative and consists of descriptions of predicates and actions of the chosen subject area. In this paper, using the described model with the help of intelligent planners, including temporal planners such as OPTIC, we show how to automatically build solutions to the targets of development programs. Based on expert knowledge and industry standards, a model of an agricultural territory is constructed, a brief description of which is given in this work. The conducted experiments showed the effectiveness of the proposed approach for the elaboration of proper development programs.
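For readers unfamiliar with the notation used in the verification step, here is a hedged reminder of Hoare triples and two standard rules; these are textbook forms, not the domain-specific axioms developed in the paper.

```latex
% A Hoare triple \{P\}\, S \,\{Q\} asserts: if precondition P holds and the
% program (or program step) S terminates, then postcondition Q holds afterwards.
% Assignment axiom:   \{\, Q[e/x] \,\}\ x := e\ \{\, Q \,\}
% Sequencing rule:    from \{P\}\, S_1 \,\{R\} and \{R\}\, S_2 \,\{Q\}
%                     infer \{P\}\, S_1 ; S_2 \,\{Q\}.
```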
- Published
- 2019
32. Analysis of decision making methods in the event of technogenic accidents in real-time management systems
- Author
-
V. Zavgorodnii and A. Zavgorodnya
- Subjects
Adaptive control ,business.industry ,Event (computing) ,Computer science ,Process (engineering) ,Organic Chemistry ,Axiomatic system ,Object (computer science) ,Biochemistry ,Automation ,Industrial Accident ,Risk analysis (engineering) ,Decision-making ,business - Abstract
New approaches are considered that expand the logical capabilities of control algorithms, based on a thorough analysis of the logic of the decision-making process, in order to formalize the process of object management. Based on the analysis of decision-making methods, it is established that the automation of decision-making processes in the event of anthropogenic emergencies in the management of forces and facilities should be based on the semiotic principle. For the problem of developing solutions in anthropogenic emergency situations, the axiomatic approach is the most acceptable. The development of a method for synthesizing solution options allows the use of adaptive control algorithms that model human mental activity. Formalizing and solving the problem of developing solutions for managing the risks of an industrial accident with an axiomatic approach is possible only on the basis of quantitative mathematical methods.
- Published
- 2019
33. Reliable population signal of subjective economic value from unreliable neurons in primate orbitofrontal cortex
- Author
-
Simone Ferrari-Toniolo and Wolfram Schultz
- Subjects
education.field_of_study ,Computer science ,Population ,Econometrics ,Code (cryptography) ,Axiomatic system ,Orbitofrontal cortex ,education ,Value (mathematics) ,Multiple choice ,Weighting ,Coding (social sciences) - Abstract
Behavior-related neuronal signals often vary between neurons. Despite the unreliability of individual neurons, brains are able to accurately represent and drive behavior. The notion may also apply to economic ('value-based') choices and the underlying reward signals. Reward value is subjective and can be defined by nonlinear weighting of magnitude (utility) and probability. Using a wide variety of reward magnitudes and probabilities, we assessed subjective reward value at choice indifference between safe and risky rewards, as prescribed by the continuity axiom that provides stringent criteria for meaningful choice. We found that individual neurons in the orbitofrontal cortex (OFC) of monkeys carry unreliable and heterogeneous neuronal signals for subjective value that largely fail to match the animal's choice. However, the averaged neuronal signals matched the animals' choices well, suggesting reliable subjective economic value encoding by the observed population of unreliable neurons. Highlights: Different from widely held views, reliable neuronal information processing may not require reliable processors. Neurons in monkey orbitofrontal cortex (OFC) process reward magnitude and probability heterogeneously and unreliably. Despite unreliable neuronal processing, OFC population activity codes choices reliably. Reliable systems performance from unreliable elements seems to be a broad feature of neuronal reward coding. In brief: Using stringent concepts of behavioral choice, Ferrari-Toniolo and Schultz describe unreliable individual reward neurons in monkey orbitofrontal cortex whose activity combines to a reliable population code for economic choice.
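The continuity axiom invoked in the abstract is standard in expected-utility theory; a hedged statement (not quoted from the paper) is:

```latex
% Continuity (Archimedean) axiom: for gambles A \succ B \succ C there is a
% probability p \in (0,1) at which the decision maker is indifferent between B
% and the compound gamble giving A with probability p and C otherwise:
\exists\, p \in (0,1): \quad p\,A + (1-p)\,C \;\sim\; B .
```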
- Published
- 2021
34. A Neural Approach for Detecting Morphological Analogies
- Author
-
Miguel Couceiro, Amandine Decker, Pierre-Alexandre Murena, Puthineath Lay, Esteban Marquer, and Safa Alsaidi
- Subjects
FOS: Computer and information sciences ,Computer Science - Machine Learning ,Computer Science - Artificial Intelligence ,Semantics (computer science) ,Computer science ,Analogy ,Of the form ,[INFO.INFO-NE]Computer Science [cs]/Neural and Evolutionary Computing [cs.NE] ,computer.software_genre ,[INFO.INFO-CL]Computer Science [cs]/Computation and Language [cs.CL] ,Machine Learning (cs.LG) ,Data modeling ,[INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI] ,[INFO.INFO-LG]Computer Science [cs]/Machine Learning [cs.LG] ,semantic analogy ,analogy classification ,[INFO]Computer Science [cs] ,ComputingMilieux_MISCELLANEOUS ,Computer Science - Computation and Language ,morphological anaogy ,Kolmogorov complexity ,business.industry ,Deep learning ,Axiomatic system ,deep learning ,Artificial Intelligence (cs.AI) ,Character (mathematics) ,morphological analogy ,Artificial intelligence ,business ,Computation and Language (cs.CL) ,computer ,Natural language processing - Abstract
Analogical proportions are statements of the form "A is to B as C is to D" that are used for several reasoning and classification tasks in artificial intelligence and natural language processing (NLP). For instance, there are analogy-based approaches to semantics as well as to morphology. In fact, symbolic approaches were developed to solve or to detect analogies between character strings, e.g., the axiomatic approach as well as the one based on Kolmogorov complexity. In this paper, we propose a deep learning approach to detect morphological analogies, for instance with reinflexion or conjugation. We present empirical results that show that our framework is competitive with the above-mentioned state-of-the-art symbolic approaches. We also explore empirically its transferability capacity across languages, which highlights interesting similarities between them.
- Published
- 2021
35. Symmetry and partial belief geometry
- Author
-
Stefan Lukits
- Subjects
Philosophy ,History and Philosophy of Science ,Alethic modality ,Computer science ,Scoring rule ,Credence ,Closeness ,Formal epistemology ,Axiomatic system ,Geometry ,Probabilism ,Principle of indifference - Abstract
When beliefs are quantified as credences, they are related to each other in terms of closeness and accuracy. The "accuracy first" approach in formal epistemology wants to establish a normative account for credences (probabilism, Bayesian conditioning, the principle of indifference, and so on) based entirely on the alethic properties of the credence: how close it is to the truth. To carry out this project, a scoring rule is needed. There is widespread agreement about some constraints on this scoring rule (for example, propriety), but not about whether a unique scoring rule stands above the rest. The Brier score equips credences with a structure similar to a metric space and induces a "geometry of reason." It enjoys great popularity in the current debate. I point out many of its weaknesses in this article. An alternative approach is to reject the geometry of reason and accept information theory in its stead. Information theory comes fully equipped with an axiomatic approach which covers probabilism, standard conditioning, and Jeffrey conditioning. It is not based on an underlying topology of a metric space, but uses a non-commutative divergence instead of a symmetric distance measure. I show that information theory, despite initial promise, also fails to accommodate basic epistemic intuitions, and I speculate on how it might be remedied.
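For concreteness, hedged statements of the two measures contrasted in the abstract (standard definitions, not the article's own formulations):

```latex
% Brier score of a credence function c against the truth-value vector v, v_i \in \{0,1\}:
\mathrm{Brier}(c, v) \;=\; \sum_{i} (c_i - v_i)^2 .
% Kullback-Leibler divergence, a non-commutative divergence from information theory:
D_{\mathrm{KL}}(p \,\|\, q) \;=\; \sum_{i} p_i \log \frac{p_i}{q_i},
\qquad D_{\mathrm{KL}}(p\,\|\,q) \neq D_{\mathrm{KL}}(q\,\|\,p) \ \text{in general}.
```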
- Published
- 2021
36. A Reasoning System for Fuzzy Distributed Knowledge Representation in Multi-Agent Systems
- Author
-
Yoshihiro Maruyama
- Subjects
Reasoning system ,Distributed knowledge ,Theoretical computer science ,Knowledge representation and reasoning ,Computer science ,Multi-agent system ,Axiomatic system ,Completeness (statistics) ,Fuzzy logic ,Bounded rationality - Abstract
Knowledge does not necessarily exist only within a single agent; it may exist collectively within a number of agents combined together (cf. the wisdom of the crowd), and this sort of knowledge is called distributed knowledge. In light of bounded rationality and uncertainty, distributed knowledge can be incomplete, and there may be gradations in the certainty of distributed knowledge, that is, knowledge may be distributed within a system of agents up to some degree of certainty only. Fagin, Halpern, Moses, and Vardi gave an axiomatic system for reasoning about distributed knowledge, and developed a special technique to prove its fundamental properties such as completeness. Here we extend their classic results so as to incorporate fuzzy distributed knowledge; in addition we prove several other properties of fuzzy modal systems such as the finite model property and Gödel-style translation theorems.
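As a minimal illustration of the underlying semantics (crisp rather than fuzzy, and not the authors' axiomatic system), distributed knowledge of a group can be evaluated on a toy Kripke model by intersecting the agents' indistinguishability relations:

```python
# Toy evaluation of distributed knowledge D_G p on a hand-built Kripke model.
# The model, the proposition p, and the agents are assumptions for illustration.
from itertools import product

worlds = {"w1", "w2", "w3"}
# Each agent's indistinguishability relation, given as a set of world pairs.
rel = {
    "a": {(w, v) for w, v in product(worlds, repeat=2) if {w, v} <= {"w1", "w2"} or w == v},
    "b": {(w, v) for w, v in product(worlds, repeat=2) if {w, v} <= {"w2", "w3"} or w == v},
}
val = {"p": {"w2"}}   # the proposition p is true only at w2

def distributed_knowledge(group, prop, world):
    """D_G prop holds at `world` iff prop is true at every world reachable
    under the intersection of the group members' relations."""
    joint = set.intersection(*(rel[i] for i in group))
    reachable = {v for (w, v) in joint if w == world}
    return reachable <= val[prop]

print(distributed_knowledge({"a"}, "p", "w2"))       # False: a alone cannot rule out w1
print(distributed_knowledge({"b"}, "p", "w2"))       # False: b alone cannot rule out w3
print(distributed_knowledge({"a", "b"}, "p", "w2"))  # True: together they pin down w2
```

The fuzzy case studied in the paper would replace the crisp relations and valuation with graded ones; the snippet only fixes the classical intuition.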
- Published
- 2021
37. Substantiation of the method of integrated group unification of machine and appliance designs
- Author
-
Gennadii Golub, Nataliya Tsyvenkova, Viacheslav Chuba, Anna Holubenko, and Marina Tereshchuk
- Subjects
Unification ,Standardization ,Process (engineering) ,Computer science ,020209 energy ,0211 other engineering and technologies ,Energy Engineering and Power Technology ,02 engineering and technology ,Industrial and Manufacturing Engineering ,Management of Technology and Innovation ,lcsh:Technology (General) ,021105 building & construction ,0202 electrical engineering, electronic engineering, information engineering ,lcsh:Industry ,Production (economics) ,Electrical and Electronic Engineering ,Structure (mathematical logic) ,primary element ,theorem of unification ,Applied Mathematics ,Mechanical Engineering ,theory of groups ,Axiomatic system ,Industrial engineering ,Computer Science Applications ,Control and Systems Engineering ,lcsh:T1-995 ,Design process ,A priori and a posteriori ,lcsh:HD2321-4730.9 ,integrated group unification - Abstract
The study object is the group unification of designs of process machines and appliances. Unification is one of the important means of improving the efficiency of production and operation of assemblies (parts), since it reduces the cost of their manufacture and repair; it is also a subsystem of standardization, which further increases the interest in its study and implementation. One of the problems in developing group unification of designs is the lack of a sufficient theoretical base, so studies of unification are often reduced to mere simplification. This worsens production efficiency, because the creation and introduction of unified designs slows down while the nomenclature of assemblies (parts), equipment and tools keeps growing. The proposed approach is based on the hypothesis that criteria (formulas) can be found that allow designers to assess a priori the conformity of a design structure to the established levels of unification, to define laws and to specify methods for optimizing design structures by adapting them to the technological equipment. The approach was implemented using axiomatic theory, laws of composition, group theory and symbolic logic. As a result of the study, a definition of the primary element was obtained and a procedure for its construction was presented, formulas of unified parts were derived, and the theorem of unification of the assembly (part) design structure was formulated. Features of the integrated unification of groups of parts and of the equipment for their manufacture were considered. The results will allow designers to improve the intellectual design process and promote the widespread use of systems for the automatic design of process equipment. The results are of interest to: designers at enterprises creating closed databases of unified parts (assemblies), which will significantly reduce the time needed to develop and introduce new products into manufacture and increase their efficiency; and software users creating accessible open databases of unified parts (assemblies) aimed at concealed advertising and promotion of sales of unified products.
- Published
- 2019
38. Multi-objective algebraic rewriting in XOR-majority graphs
- Author
-
Lunyao Wang, Lei Shi, Zhufei Chu, and Yinshui Xia
- Subjects
Theoretical computer science ,Computer science ,020208 electrical & electronic engineering ,Axiomatic system ,02 engineering and technology ,Extension (predicate logic) ,Inversion (discrete mathematics) ,Cellular automaton ,020202 computer hardware & architecture ,Transformation (function) ,Hardware and Architecture ,0202 electrical engineering, electronic engineering, information engineering ,Benchmark (computing) ,Rewriting ,Electrical and Electronic Engineering ,Representation (mathematics) ,Software ,Hardware_LOGICDESIGN - Abstract
The XOR-Majority graph (XMG) is a logic representation that uses exclusive-OR (XOR), majority-of-three (MAJ), and inverters as primitives; it is an extension of the Majority-Inverter Graph (MIG). Previous work presented an axiomatic system, Ω, and transformation rules derived from it for manipulating MIGs. With the additional XOR primitive, enabling powerful logic rewriting in XMGs is inseparable from developing identities over MAJ-XOR operations. Further, since many emerging nanotechnologies, such as quantum-dot cellular automata (QCA), are inherently majority-based, XMGs/MIGs are efficient logic representations to map onto these technologies. In this paper, we first propose two MAJ-XOR identities and exploit the optimization opportunities they open up during structural rewriting. We then consider inversion optimization in XMGs, since the cost of implementing inversion is prohibitive in some nanotechnologies. After that, we discuss rewriting rules for multi-objective optimization. Finally, experiments on the EPFL benchmark suites show that the proposed method reduces the number of nodes, inverters, and levels of XMGs, which in turn lowers the implementation cost of these circuits in QCA as well as in quantum circuits.
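The flavor of such identities can be checked exhaustively over Boolean values. The snippet below verifies two Ω-style majority axioms and one standard MAJ-XOR interplay (the self-duality of MAJ expressed through XOR); these are textbook facts used for illustration and are not claimed to be the two identities proposed in the paper.

```python
# Exhaustive truth-table checks of majority/XOR identities (illustrative only).
from itertools import product

def MAJ(a, b, c):
    return (a & b) | (a & c) | (b & c)

def check(name, lhs, rhs, arity):
    ok = all(lhs(*v) == rhs(*v) for v in product((0, 1), repeat=arity))
    print(f"{name}: {'holds' if ok else 'fails'}")

# Two Omega-style axioms used in MIG/XMG rewriting.
check("M(x,x,y) = x",  lambda x, y: MAJ(x, x, y),     lambda x, y: x, 2)
check("M(x,!x,y) = y", lambda x, y: MAJ(x, 1 - x, y), lambda x, y: y, 2)
# An XOR/MAJ interplay following from the self-duality of majority.
check("x ^ M(a,b,c) = M(x^a, x^b, x^c)",
      lambda x, a, b, c: x ^ MAJ(a, b, c),
      lambda x, a, b, c: MAJ(x ^ a, x ^ b, x ^ c), 4)
```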
- Published
- 2019
39. AXIOMATIC-DEDUCTIVE STRATEGY FOR IT DISCIPLINE CONTENT FORMATION
- Author
-
Volodymyr Pasichnyk, Nataliia Kunanets, and Serhii Lupenko
- Subjects
Structure (mathematical logic) ,axiomatic-deductive system ,Management science ,Computer science ,Axiomatic system ,Web Ontology Language ,ontological approach ,Formal system ,lcsh:LB5-3640 ,lcsh:Theory and practice of education ,e-learning system ,Ontology ,ontology ,Mathematical structure ,computer ,Discipline ,Axiom ,computer.programming_language - Abstract
The paper presents an axiomatic-deductive strategy for organizing the content of an academic discipline with the help of an ontological approach in e-learning systems in the field of information technologies. The authors take into account that a necessary property of a system of axiomatic statements is its consistency. On the basis of the axiomatic-deductive strategy, new approaches to the formation of discipline content are proposed. It is proved that the system of true statements of an academic discipline is based on its terminological-conceptual apparatus, in particular on axiomatic statements. The article presents mathematical structures that describe the axiomatic-deductive substrategy for organizing the general statements of an academic discipline and the taxonomically oriented substrategy for deploying its content. This ensures the transition from a content-based representation of the discipline's set of statements to their presentation by means of the artificial languages of mathematical logic. The use of description logic formalizes the procedure for mapping an informal axiomatic system into a formal axiomatic system. The mathematical structures describe and detail the abstract logical-semantic core of the academic discipline as a group of axiomatic systems. The basic core of the discipline's content contains its basic concepts and judgments, which ensures a strictly logical transition from abstract general concepts and statements to concepts and assertions at lower levels of universality and abstraction. To organize the content of an academic discipline, it is advisable to develop a taxonomically oriented substrategy based on the repeated application of operations that divide general concepts. The mathematical structures also allow analysis of the generalized structure of interactions between the verbal description of the discipline's subject area, its formal description, and its description at the level of a computer ontology, implemented through formalization, interpretation, encoding and decoding in a computer-ontology development environment. As an example of the application of the proposed axiomatic-deductive strategy, elements of the glossary and taxonomies of concepts for the discipline "Computer Logic" have been developed and embodied in the Protégé environment using the OWL ontology description language.
- Published
- 2019
40. Verification Methods for the Computationally Complete Symbolic Attacker Based on Indistinguishability
- Author
-
Mitsuhiro Okada, Rohit Chadha, Gergei Bana, and Ajay Kumar Eeralla
- Subjects
Soundness ,Theoretical computer science ,General Computer Science ,Logic ,Computer science ,Proof assistant ,Axiomatic system ,0102 computer and information sciences ,Cryptographic protocol ,Mathematical proof ,01 natural sciences ,Theoretical Computer Science ,First-order logic ,Computational Mathematics ,010201 computation theory & mathematics ,Dolev–Yao model ,Axiom - Abstract
In recent years, a new approach has been developed for verifying security protocols with the aim of combining the benefits of symbolic attackers and the benefits of unconditional soundness: the technique of the computationally complete symbolic attacker of Bana and Comon (BC) [8]. In this article, we argue that the real breakthrough of this technique is the recent introduction of its version for indistinguishability [9], because, with the extensions we introduce here, for the first time, there is a computationally sound symbolic technique that is syntactically strikingly simple, to which translating standard computational security notions is a straightforward matter, and that can be effectively used for verification of not only equivalence properties but trace properties of protocols as well. We first fully develop the core elements of this newer version by introducing several new axioms. We illustrate the power and the diverse use of the introduced axioms on simple examples first. We introduce an axiom expressing the Decisional Diffie-Hellman property. We analyze the Diffie-Hellman key exchange, both in its simplest form and an authenticated version as well. We provide computationally sound verification of real-or-random secrecy of the Diffie-Hellman key exchange protocol for multiple sessions, without any restrictions on the computational implementation other than the DDH assumption. We also show authentication for a simplified version of the station-to-station protocol using UF-CMA assumption for digital signatures. Finally, we axiomatize IND-CPA, IND-CCA1, and IND-CCA2 security properties and illustrate their usage. We have formalized the axiomatic system in an interactive theorem prover, Coq, and have machine-checked the proofs of various auxiliary theorems and security properties of Diffie-Hellman and station-to-station protocol.
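For orientation, the sketch below only fixes the notation of the analyzed protocol: a plain Diffie-Hellman exchange in which both parties derive the shared value g^(ab) mod p. The toy parameters are assumptions for illustration and say nothing about the BC proof system or the Coq formalization themselves.

```python
# Unauthenticated Diffie-Hellman key exchange with toy-sized, insecure parameters.
import secrets

p = 0xFFFFFFFFFFFFFFC5     # a 64-bit prime; real deployments use standardized groups
g = 5                      # assumed toy generator

a = secrets.randbelow(p - 2) + 1   # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1   # Bob's secret exponent

A = pow(g, a, p)                   # message Alice -> Bob
B = pow(g, b, p)                   # message Bob -> Alice

k_alice = pow(B, a, p)
k_bob = pow(A, b, p)
assert k_alice == k_bob            # both sides derive g^(a*b) mod p
```

The real-or-random secrecy property discussed above says, roughly, that under the DDH assumption this shared value is indistinguishable from a fresh random group element for the network attacker.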
- Published
- 2019
41. Attachment centrality: Measure for connectivity in networks
- Author
-
Talal Rahwan, Oskar Skibski, Tomasz Michalak, and Makoto Yokoo
- Subjects
Linguistics and Language ,Theoretical computer science ,Computer science ,Existential quantification ,Node (networking) ,Axiomatic system ,02 engineering and technology ,Measure (mathematics) ,Language and Linguistics ,Artificial Intelligence ,Chordal graph ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Centrality ,Time complexity ,Value (mathematics) - Abstract
Centrality indices aim to quantify the importance of nodes or edges in a network. Much interest has been recently raised by the body of work in which a node's connectivity is understood less as its contribution to the quality or speed of communication in the network and more as its role in enabling communication altogether. Consequently, a node is assessed based on whether or not the network (or part of it) becomes disconnected if this node is removed. While these new indices deliver promising insights, to date very little is known about their theoretical properties. To address this issue, we propose an axiomatic approach. Specifically, we prove that there exists a unique centrality index satisfying a number of desirable properties. This new index, which we call the Attachment centrality, is equivalent to the Myerson value of a certain graph-restricted game. Building upon our theoretical analysis we show that, while computing the Attachment centrality is #P-complete, it has certain computational properties that are more attractive than the Myerson value for an arbitrary game. In particular, it can be computed in chordal graphs in polynomial time.
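A rough illustration of the "enabling communication" view: for each node, count how many pairs of other nodes lose all connection when that node is removed. This simple index is a hypothetical stand-in for exposition only; it is not the Attachment centrality itself, which is characterized axiomatically and equals the Myerson value of a graph-restricted game.

```python
# Removal-based connectivity scoring on a small undirected graph (illustrative).
def components(nodes, edges):
    """Connected components of an undirected graph (edges given as 2-tuples)."""
    adj = {v: set() for v in nodes}
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    seen, comps = set(), []
    for s in nodes:
        if s in seen:
            continue
        stack, comp = [s], set()
        while stack:
            x = stack.pop()
            if x in comp:
                continue
            comp.add(x)
            stack.extend(adj[x] - comp)
        seen |= comp
        comps.append(comp)
    return comps

def connected_pairs(nodes, edges):
    return sum(len(c) * (len(c) - 1) // 2 for c in components(nodes, edges))

def disconnection_score(nodes, edges, v):
    """Pairs of remaining nodes connected with v present but not without it."""
    rest = nodes - {v}
    with_v = sum(len(c - {v}) * (len(c - {v}) - 1) // 2
                 for c in components(nodes, edges))
    without_v = connected_pairs(rest, {e for e in edges if v not in e})
    return with_v - without_v

V = {0, 1, 2, 3}
E = {(0, 1), (0, 2), (0, 3)}                          # a star centered at node 0
print({v: disconnection_score(V, E, v) for v in V})   # node 0 scores 3, the leaves score 0
```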
- Published
- 2019
42. On the Logic of Balance in Social Networks
- Author
-
Zuojun Xiong and Thomas Ågotnes
- Subjects
Linguistics and Language ,Theoretical computer science ,Computer science ,Field (Bourdieu) ,010102 general mathematics ,Axiomatic system ,Modal logic ,06 humanities and the arts ,0603 philosophy, ethics and religion ,01 natural sciences ,Bridge (interpersonal) ,Philosophy ,Completeness (order theory) ,060302 philosophy ,Computer Science (miscellaneous) ,0101 mathematics ,Balance theory ,Social network analysis ,Axiom - Abstract
Modal logic for reasoning about social networks is currently an active field of research. There is still a gap, however, between the state of the art in logical formalisations of concepts related to social networks and the much more mature field of social network analysis. In this paper we take a step towards bridging that gap. One of the key foundations of social network analysis is balance theory, which is used to analyse signed social networks in which agents can have positive (“friends”) or negative (“enemies”) relationships. Certain combinations of positive and negative relationships are considered unbalanced, or unstable, in particular cycles with an odd number of negative relationships. Relatively short cycles of this kind are thought to put pressure on the agents to change one or more of the involved relationships from negative to positive or the other way around. Most existing logics for reasoning about social networks are defined for unsigned networks. In this paper we develop a modal logic for reasoning about structural properties of signed social networks and give a sound and complete Hilbert-style axiomatic system. Furthermore, we completely axiomatise the classes of signed social networks that are balanced up to a degree n, in the sense that there are no cycles of length up to n with an odd number of negative relationships. Finally, we completely axiomatise the class of all fully balanced complete signed social networks, i.e., networks where everyone is connected with everyone else. Axiomatic completeness is non-trivial because neither the balance properties nor the dichotomy between positive and negative relations are modally definable. The paper thus provides a logical basis for reasoning about signed social networks in general and balanced networks in particular.
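Balance in the sense used above can be tested structurally with Harary's classical characterization: a signed network is balanced iff its agents split into two camps such that positive edges stay within a camp and negative edges cross camps, equivalently iff there is no cycle with an odd number of negative edges. The check below is a standard illustration of that criterion, not the paper's modal-logic machinery.

```python
# Two-coloring test for balance of a signed graph (standard construction).
def is_balanced(nodes, signed_edges):
    """signed_edges: iterable of (u, v, sign) with sign +1 or -1."""
    adj = {v: [] for v in nodes}
    for u, v, s in signed_edges:
        adj[u].append((v, s))
        adj[v].append((u, s))
    side = {}
    for start in nodes:
        if start in side:
            continue
        side[start] = 0
        stack = [start]
        while stack:
            u = stack.pop()
            for v, s in adj[u]:
                want = side[u] if s > 0 else 1 - side[u]
                if v not in side:
                    side[v] = want
                    stack.append(v)
                elif side[v] != want:
                    return False          # an odd negative cycle exists
    return True

# A triangle with one negative edge is unbalanced; with two negative edges it is balanced.
print(is_balanced({1, 2, 3}, [(1, 2, 1), (2, 3, 1), (1, 3, -1)]))    # False
print(is_balanced({1, 2, 3}, [(1, 2, -1), (2, 3, -1), (1, 3, 1)]))   # True
```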
- Published
- 2019
43. Extension Removal in Abstract Argumentation – An Axiomatic Approach
- Author
-
Ringo Baumann and Gerhard Brewka
- Subjects
0301 basic medicine ,03 medical and health sciences ,030104 developmental biology ,Formalism (philosophy) ,Computer science ,030106 microbiology ,Calculus ,Axiomatic system ,General Medicine ,Impossibility ,Argumentation framework ,Axiom ,Argumentation theory - Abstract
This paper continues the rather recent line of research on the dynamics of non-monotonic formalisms. In particular, we consider semantic changes in Dung’s abstract argumentation formalism. One of the most studied problems in this context is the so-called enforcing problem which is concerned with manipulating argumentation frameworks (AFs) such that a certain desired set of arguments becomes an extension. Here we study the inverse problem, namely the extension removal problem: is it possible – and if so how – to modify a given argumentation framework in such a way that certain undesired extensions are no longer generated? Analogously to the well known AGM paradigm we develop an axiomatic approach to the removal problem, i.e. a certain set of axioms will determine suitable manipulations. Although contraction (that is, the elimination of a particular belief) is conceptually quite different from extension removal, there are surprisingly deep connections between the two: it turns out that postulates for removal can be directly obtained as reformulations of the AGM contraction postulates. We prove a series of formal results including conditional and unconditional existence and semantical uniqueness of removal operators as well as various impossibility results – and show possible ways out.
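To fix intuitions about the objects being removed, the brute-force helper below enumerates the stable extensions of a tiny Dung framework; the particular framework and the restriction to stable semantics are illustrative assumptions, and the snippet does not implement the removal operators studied in the paper.

```python
# Enumerate stable extensions (conflict-free sets attacking every outside argument).
from itertools import combinations

def stable_extensions(args, attacks):
    """args: set of argument names; attacks: set of (attacker, target) pairs."""
    exts = []
    for r in range(len(args) + 1):
        for S in map(set, combinations(sorted(args), r)):
            conflict_free = not any((a, b) in attacks for a in S for b in S)
            attacks_all_outside = all(any((a, b) in attacks for a in S)
                                      for b in args - S)
            if conflict_free and attacks_all_outside:
                exts.append(S)
    return exts

A = {"a", "b", "c"}
R = {("a", "b"), ("b", "a"), ("b", "c")}   # a and b attack each other; b attacks c
print(stable_extensions(A, R))             # two stable extensions: {'b'} and {'a', 'c'}
```

An extension-removal operator in the sense of the abstract would modify the framework so that, say, {'b'} is no longer generated while disturbing the remaining extensions as little as possible.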
- Published
- 2019
44. Fairness measures for decision-making and conflict resolution
- Author
-
Apoorva M. Sampat and Victor M. Zavala
- Subjects
021103 operations research ,Control and Optimization ,Management science ,Computer science ,Mechanical Engineering ,0211 other engineering and technologies ,Aerospace Engineering ,Axiomatic system ,Social Welfare ,02 engineering and technology ,Financial engineering ,Conflict resolution ,Social planning ,Fairness measure ,Entropy (information theory) ,021108 energy ,Electrical and Electronic Engineering ,Game theory ,Software ,Civil and Structural Engineering - Abstract
Allocating utility among stakeholders is a fundamental decision-making task that arises in complex organizations, social planning, infrastructures, and markets. In this work, we reconcile concepts of fairness from the perspectives of game theory, economics, statistics, and engineering by using an axiomatic approach. Our work reveals significant deficiencies in the social welfare allocation approach (which is widely used in the engineering literature) and highlights interesting and desirable properties and connections between Nash and entropy allocation approaches.
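A toy allocation problem makes the contrast concrete: with unequal "efficiency" coefficients (hypothetical numbers, not one of the paper's case studies), maximizing the sum of utilities hands the whole budget to the more efficient stakeholder, whereas the Nash approach, which maximizes the sum of log-utilities, splits it.

```python
# Utilitarian (social welfare) vs. Nash allocation of a unit budget between two
# stakeholders with utilities u_i = coeff_i * x_i (hypothetical coefficients).
import numpy as np

budget = 1.0
coeff = np.array([3.0, 1.0])

grid = np.linspace(1e-6, budget - 1e-6, 10001)        # candidate splits (x, budget - x)
allocs = np.stack([grid, budget - grid], axis=1)
utils = allocs * coeff

utilitarian = allocs[np.argmax(utils.sum(axis=1))]    # maximize total utility
nash = allocs[np.argmax(np.log(utils).sum(axis=1))]   # maximize sum of log-utilities

print("utilitarian split:", utilitarian)   # ~[1, 0]: everything to the efficient agent
print("Nash split:", nash)                 # ~[0.5, 0.5]: proportionally fair
```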
- Published
- 2019
45. Axiomatizing Congestion Control
- Author
-
Michael Schapira, Scott Shenker, Radhika Mittal, and Doron Zarchy
- Subjects
Computer Networks and Communications ,Computer science ,Distributed computing ,Axiomatic system ,020206 networking & telecommunications ,02 engineering and technology ,Task (project management) ,Network congestion ,Range (mathematics) ,Hardware and Architecture ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Computer Science (miscellaneous) ,Safety, Risk, Reliability and Quality ,Design space ,computer ,Simula ,computer.programming_language - Abstract
The overwhelmingly large design space of congestion control protocols, along with the increasingly diverse range of application environments, makes evaluating such protocols a daunting task. Simulation and experiments are very helpful in evaluating the performance of designs in specific contexts, but give limited insight into the more general properties of these schemes and provide no information about the inherent limits of congestion control designs (such as which properties are simultaneously achievable and which are mutually exclusive). In contrast, traditional theoretical approaches are typically focused on the design of protocols that achieve specific, predetermined objectives (e.g., network utility maximization), or on the analysis of specific protocols (e.g., from control-theoretic perspectives), as opposed to the inherent tensions between desired properties. To complement today's prevalent experimental and theoretical approaches, we put forth a novel principled framework for reasoning about congestion control protocols, inspired by the axiomatic approach from social choice theory and game theory. We consider several natural requirements ("axioms") of congestion control protocols, e.g., efficient resource utilization, loss avoidance, fairness, stability, and TCP-friendliness, and investigate which combinations of these can be achieved within a single design. Thus, our framework allows us to investigate the fundamental tradeoffs between desiderata, and to identify where existing and new congestion control architectures fit within the space of possible outcomes.
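As a concrete backdrop for such axioms, the toy simulation below shows two AIMD flows sharing a link and converging toward an efficient and roughly fair operating point; the parameters are arbitrary assumptions, and the snippet only illustrates the kind of properties (efficiency, fairness, stability) that the axiomatic framework reasons about, not any result from the paper.

```python
# Two synchronized AIMD flows sharing a link of capacity 100 (toy parameters).
capacity = 100.0
rates = [10.0, 80.0]          # deliberately unfair starting point

for step in range(200):
    if sum(rates) > capacity:                 # congestion signal: multiplicative decrease
        rates = [r * 0.5 for r in rates]
    else:                                     # no congestion: additive increase
        rates = [r + 1.0 for r in rates]

print([round(r, 1) for r in rates])   # the two rates end up close to each other
```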
- Published
- 2019
46. On the Procedural Character of Hilbert’s Axiomatic Method
- Author
-
Giambattista Formica
- Subjects
Philosophy ,Reflection (mathematics) ,Character (mathematics) ,Computer science ,Calculus ,Axiomatic system ,Image (mathematics) - Abstract
Hilbert’s methodological reflection has certainly shaped a new image of the axiomatic method. However, the discussion on the procedural character of the method is still open, with commentators subs...
- Published
- 2019
47. Covering-based intuitionistic fuzzy rough sets and applications in multi-attribute decision-making
- Author
-
Bingzhen Sun and Jianming Zhan
- Subjects
Linguistics and Language ,BETA (programming language) ,Generalization ,Computer science ,05 social sciences ,Granular computing ,050301 education ,Axiomatic system ,Intuitionistic fuzzy ,TOPSIS ,02 engineering and technology ,computer.software_genre ,Fuzzy logic ,Language and Linguistics ,Artificial Intelligence ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,Data mining ,Rough set ,0503 education ,computer ,computer.programming_language - Abstract
The covering-based intuitionistic fuzzy (IF) rough set is a generalization of granular computing and covering-based rough sets. By combining covering-based rough sets, IF sets and fuzzy rough sets, we introduce three classes of covering-based IF rough set models via IF β-neighborhoods and IF complementary β-neighborhoods (IFCβ-neighborhoods). The corresponding axiomatic systems are investigated, respectively. In particular, the rough and precision degrees of covering-based IF rough set models are discussed. The relationships among these models and the covering-based IF rough set models proposed by Huang et al. (Knowl Based Syst 107:155–178, 2016) are also examined. Based on the theoretical analysis of covering-based IF rough set (CIFRS) models, we put forward an intuitionistic fuzzy TOPSIS (IF-TOPSIS) methodology for the multi-attribute decision-making (MADM) problem with the evaluation of IF information. An example illustrates the proposed methodology. Finally, we deal with the MADM problem with the evaluation of fuzzy information based on covering-based fuzzy rough set (CFRS) models. By comparative analysis, we find that handling the MADM problem with the evaluation of IF information based on CIFRS models is more effective than handling it with the evaluation of fuzzy information based on CFRS models.
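For readers unfamiliar with the decision-making side, the sketch below runs the standard crisp TOPSIS procedure on a made-up decision matrix; the paper's IF-TOPSIS works with intuitionistic fuzzy evaluations on top of its covering-based rough set models, so this is only the classical skeleton.

```python
# Classical TOPSIS on a hypothetical 3-alternative, 3-criterion decision matrix.
import numpy as np

X = np.array([[250., 16., 12.],
              [200., 16.,  8.],
              [300., 32., 16.]])            # rows: alternatives, columns: criteria
w = np.array([0.4, 0.3, 0.3])               # criterion weights
benefit = np.array([False, True, True])     # first criterion is a cost, the others benefits

R = X / np.linalg.norm(X, axis=0)           # vector normalization
V = R * w                                   # weighted normalized matrix

ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

d_plus = np.linalg.norm(V - ideal, axis=1)
d_minus = np.linalg.norm(V - anti, axis=1)
closeness = d_minus / (d_plus + d_minus)

print(np.argsort(-closeness))               # alternatives ranked best-first
```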
- Published
- 2018
48. An Axiomatic Approach to Detect Information Leaks in Concurrent Programs
- Author
-
Sandip Ghosal and R. K. Shyamasundar
- Subjects
FOS: Computer and information sciences ,Computer Science - Programming Languages ,Computer Science - Cryptography and Security ,Correctness ,Computer science ,Programming language ,Control (management) ,Axiomatic system ,Context (language use) ,Security context ,computer.software_genre ,Timing attack ,TheoryofComputation_LOGICSANDMEANINGSOFPROGRAMS ,Concurrent computing ,Sleep (system call) ,Cryptography and Security (cs.CR) ,computer ,Programming Languages (cs.PL) - Abstract
Realizing flow security in a concurrent environment is extremely challenging, primarily due to the non-deterministic nature of execution. The difficulty is further exacerbated from a security angle if sequential threads disclose control locations through publicly observable statements like print, sleep, delay, etc. Such observations lead to internal and external timing attacks. Inspired by previous works that use classical Hoare-style proof systems for establishing the correctness of distributed (real-time) programs, in this paper we describe a method for finding information leaks in concurrent programs through the introduction of leaky assertions at observable program points. Specifying leaky assertions akin to classic assertions, we demonstrate how information leaks can be detected in a concurrent context. To our knowledge, this is the first such work that enables the integration of the different notions of non-interference used in functional and security contexts. While the approach is sound and relatively complete in the classic sense, it also admits algorithmic techniques that help programmers come up with leaky assertions for checking information leaks in sensitive applications., 5 pages, 2 figures; accepted paper for the 43rd International Conference on Software Engineering (ICSE 2021), Track: New Ideas and Emerging Results (NIER)
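A minimal example of the kind of leak such assertions are meant to flag (hypothetical code, not the paper's proof system): the relative order of two publicly observable prints depends on a secret, so an observer of the output learns one bit.

```python
# An internal-timing leak: the order of observable prints depends on `secret`.
import threading
import time

secret = 1          # high-security input

def branch_on_secret():
    if secret == 1:
        time.sleep(0.05)        # control location disclosed through timing
    print("A")                  # publicly observable statement

def low_observer():
    time.sleep(0.025)
    print("B")                  # another publicly observable statement

t1 = threading.Thread(target=branch_on_secret)
t2 = threading.Thread(target=low_observer)
t1.start(); t2.start()
t1.join(); t2.join()
# With secret == 1 the output is typically "B" then "A"; with secret == 0 it is
# typically "A" then "B", so the observable order leaks one bit of the secret.
```

In the approach described above, leaky assertions placed at the print statements would make this dependence on the secret explicit and checkable.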
- Published
- 2021
49. To Know Them, Remove Them: An Outer Methodological Approach to Biophysics and Humanities
- Author
-
Arturo Tozzi
- Subjects
Operationalization ,Action (philosophy) ,Computer science ,other ,Axiomatic system ,Non-classical logic ,Set (psychology) ,Turing ,computer ,Natural language ,computer.programming_language ,Complement (set theory) ,Epistemology - Abstract
Set theory faces two difficulties: formal definitions of sets/subsets are incapable of assessing biophysical issues; formal axiomatic systems are complete/inconsistent or incomplete/consistent. To overcome these problems, reminiscent of the old-fashioned principle of individuation, we provide formal treatment/validation/operationalization of a methodological weapon termed “outer approach” (OA). The observer’s attention shifts from the system under evaluation to its surroundings, so that objects are investigated from outside. Subsets become just “holes” devoid of information inside larger sets. Sets are no longer passive containers, rather active structures enabling their content’s examination. Consequences/applications of OA include: a) operationalization of paraconsistent logics, anticipated by unexpected forerunners, in terms of advanced truth theories of natural language, anthropic principle and quantum dynamics; b) assessment of embryonic craniocaudal migration in terms of Turing’s spots; c) evaluation of hominids’ social behaviors in terms of evolutionary modifications of facial expression’s musculature; d) treatment of cortical action potentials in terms of collective movements of extracellular currents, setting aside what happens inside the neurons; e) a critique of Shannon’s information in terms of the Arabic thinkers’ active/potential intellects. Also, OA provides an outer view of a) humanistic issues such as the enigmatic Celestino of Verona’s letter, Dante Alighieri’s “Hell” and the puzzling Voynich manuscript; b) historical issues such as Aldo Moro’s death and the Liston/Clay boxing fight. Summarizing, the safest methodology to quantify phenomena is to remove them from our observation and tackle an outer view, since mathematical/logical issues such as selective information deletion and set complement rescue incompleteness/inconsistency of biophysical systems.
- Published
- 2021
50. Nonhuman Primates Satisfy Utility Maximization in Compliance with the Continuity Axiom of Expected Utility Theory
- Author
-
Simone Ferrari-Toniolo, Philipe M. Bujold, Wolfram Schultz, Fabian Grabenhorst, Raymundo Báez-Mendoza, Ferrari-Toniolo, Simone [0000-0001-9009-0764], Bujold, Philipe M [0000-0002-2901-003X], Grabenhorst, Fabian [0000-0002-6455-0648], Báez-Mendoza, Raymundo [0000-0002-5989-7647], Schultz, Wolfram [0000-0002-8530-4518], and Apollo - University of Cambridge Repository
- Subjects
0301 basic medicine ,Male ,Primates ,Class (set theory) ,Computer science ,expected utility theory ,Behavioral/Cognitive ,Measure (mathematics) ,Choice Behavior ,decision making ,03 medical and health sciences ,0302 clinical medicine ,Reward ,0502 economics and business ,Animals ,continuity axiom ,Research Articles ,Axiom ,Expected utility hypothesis ,030304 developmental biology ,050205 econometrics ,0303 health sciences ,General Neuroscience ,05 social sciences ,Utility maximization ,Probabilistic logic ,Axiomatic system ,Function (mathematics) ,Subjective expected utility ,Macaca mulatta ,030104 developmental biology ,utility ,Non-human ,reward risk ,Mathematical economics ,030217 neurology & neurosurgery ,Photic Stimulation ,Psychomotor Performance ,Coding (social sciences) - Abstract
Expected Utility Theory (EUT), the first axiomatic theory of risky choice, describes choices as a utility maximization process: decision makers assign a subjective value (utility) to each choice option and choose the one with the highest utility. The continuity axiom, central to EUT and its modifications, is a necessary and sufficient condition for the definition of numerical utilities. The axiom requires decision makers to be indifferent between a gamble and a specific probabilistic combination of a more preferred and a less preferred gamble. While previous studies demonstrated that monkeys choose according to combinations of objective reward magnitude and probability, a concept-driven experimental approach for assessing the axiomatically defined conditions for maximizing subjective utility by animals is missing. We experimentally tested the continuity axiom for a broad class of gamble types in four male rhesus macaque monkeys, showing that their choice behavior complied with the existence of a numerical utility measure as defined by the economic theory. We used the numerical quantity specified in the continuity axiom to characterize subjective preferences in a magnitude-probability space. This mapping highlighted a trade-off relation between reward magnitudes and probabilities, compatible with the existence of a utility function underlying subjective value computation. These results support the existence of a numerical utility function able to describe choices, allowing for the investigation of the neuronal substrates responsible for coding such rigorously defined quantity. SIGNIFICANCE STATEMENT: A common assumption of several economic choice theories is that decisions result from the comparison of subjectively assigned values (utilities). This study demonstrated the compliance of monkey behavior with the continuity axiom of Expected Utility Theory, implying a subjective magnitude-probability trade-off relation which supports the existence of numerical subjective utility directly linked to the theoretical economic framework. We determined a numerical utility measure able to describe choices, which can serve as a correlate for the neuronal activity in the quest for brain structures and mechanisms guiding decisions.
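Numerically, the continuity axiom pins down a unique indifference probability that can itself serve as a utility; the short computation below uses hypothetical utility values, not the monkeys' measured preferences.

```python
# Indifference point required by the continuity axiom for gambles A > B > C:
# B ~ p*A + (1-p)*C at exactly one mixing probability p (hypothetical numbers).
u_A, u_B, u_C = 1.0, 0.65, 0.0          # utilities normalized so that u(C)=0 and u(A)=1

p_indiff = (u_B - u_C) / (u_A - u_C)
print(p_indiff)   # 0.65: on this normalized scale the indifference probability equals u(B)
```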
- Published
- 2021