828 results on '"propositional logic"'
Search Results
2. The existential fragment of second-order propositional intuitionistic logic is undecidable.
- Author
-
Fujita, Ken-etsu, Schubert, Aleksy, Urzyczyn, Paweł, and Zdanowski, Konrad
- Subjects
PROPOSITION (Logic), LAMBDA calculus, FIRST-order logic - Abstract
The provability problem in intuitionistic propositional second-order logic with the existential quantifier and implication $(\exists, \to)$ is proved to be undecidable in the presence of free type variables (constants). This contrasts with the result that intuitionistic propositional second-order logic with the existential quantifier, conjunction, and negation is decidable. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
3. A Method for Backward Failure Propagation in Conceptual System Design.
- Author
-
Mansoor, Ali, Diao, Xiaoxu, and Smidts, Carol
- Subjects
-
*CONCEPTUAL design, *PRESSURIZED water reactors, *LOGIC, *SYSTEM failures - Abstract
The increased complexity of modern system designs and demands for quicker time to market have made safety-related verification and validation of such systems more challenging. Incorporating safety and risk considerations at the early stages of design is one way to acquire a more robust initial design for novel systems. Inductive fault analysis has its significance at the final stages of design, e.g., verification and validation. However, to preclude certain system failure states, especially for systems with high failure consequences, a designer would innately prefer to trace back and remedy the causes of failure, as compared to the more cumbersome activity of identifying the faults individually and sifting the combinations that lead to the failure of interest. The work presented in this paper is aimed at the development of a backward failure propagation methodology for analyzing the origins of functional failures in a conceptual design of systems, including but not limited to nuclear, mechanical, aerospace, process, electrical/electronics, telecommunication, automotive, etc. This method allows the designer to achieve a robust early design based on analyses of the system's functional dependencies before proceeding to the detailed design and testing stages. The insights provided by the analysis at the conceptual design stage also reduce redesign efforts, testing costs, and project delays. The proposed method is a functional analysis approach that extends the Integrated System Failure Analysis for backward failure propagation. When provided with an abstract system configuration, a system's functional model, and a system's behavioral model, it utilizes a known functional state (typically a failure) to acquire system component modes and the states of other functions. 
The method includes inversion of the functional failure logic and component behavioral rules using propositional logic and deductive analysis to assess valid states of a system that satisfy the given initial conditions. To test the method's scalability, we applied the proposed method to a simplified representation of the secondary loop of a typical pressurized water reactor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
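The backward analysis this abstract describes, inverting behavioral rules expressed in propositional logic to deduce component states from an observed functional failure, can be sketched as a small model enumeration. All rules and component names below are hypothetical illustrations, not taken from the paper:

```python
from itertools import product

# Hypothetical component behavioral rules (not from the paper), written as a
# propositional constraint: the output function fails iff the pump fails, or
# both the valve sticks and the sensor misreads.
def rules(pump_fail, valve_stuck, sensor_bad, output_fail):
    return output_fail == (pump_fail or (valve_stuck and sensor_bad))

def backward_propagate(observed_output_fail):
    """Enumerate all component states consistent with an observed functional state."""
    return [
        (pump, valve, sensor)
        for pump, valve, sensor in product([False, True], repeat=3)
        if rules(pump, valve, sensor, observed_output_fail)
    ]

# Deductive step: which component modes can explain the observed failure?
causes = backward_propagate(True)
```

Enumerating satisfying assignments of the inverted rules is what lets the designer trace back from a failure state to its possible causes rather than testing fault combinations one by one.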
4. The Logic of Lexical Connectives.
- Author
-
Sbardolini, Giorgio
- Subjects
-
*NATURAL languages, *LINGUISTICS, *EXPRESSIVE language, *LOGIC, *LEXICON, *NEGATION (Logic) - Abstract
Natural language does not express all connectives definable in classical logic as simple lexical items. Coordination in English is expressed by conjunction ("and"), disjunction ("or"), and negated disjunction ("nor"). Other languages pattern similarly. Non-lexicalized connectives are typically expressed compositionally: in English, negated conjunction is typically expressed by combining negation and conjunction ("not both"). This is surprising: if ∧ and ∨ are duals, and the negation of the latter can be expressed lexically ("nor"), why not the negation of the former? I present a two-tiered model of the semantics of the binary connectives. The first tier captures the expressive power of the lexicon: it is a bilateral state-based semantics that, under a restriction, can express all and only the distinctions that can be expressed by the lexicon of natural language ("and", "or", "nor"). This first tier is characterized by rejection as non-assertion and a Neglect Zero assumption. The second tier is obtained by dropping the Neglect Zero assumption and enforcing a stronger notion of rejection, thereby recovering classical logic and thus definitions for all Boolean connectives. On the two-tiered model, we distinguish the limited expressive resources of the lexicon and the greater combinatorial expressive power of the language as a whole. This gives us a logic-based account of compositionality for the Boolean fragment of the language. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. Large Language Models and Logical Reasoning
- Author
-
Robert Friedman
- Subjects
large language models, deep learning, symbolic logic, propositional logic, logical reasoning, Science - Abstract
In deep learning, large language models are typically trained on data from a corpus as representative of current knowledge. However, natural language is not an ideal form for the reliable communication of concepts. Instead, formal logical statements are preferable since they are subject to verifiability, reliability, and applicability. Another reason for this preference is that natural language is not designed for an efficient and reliable flow of information and knowledge, but is instead designed as an evolutionary adaptation as formed from a prior set of natural constraints. As a formally structured language, logical statements are also more interpretable. They may be informally constructed in the form of a natural language statement, but a formalized logical statement is expected to follow a stricter set of rules, such as with the use of symbols for representing the logic-based operators that connect multiple simple statements and form verifiable propositions.
- Published
- 2023
- Full Text
- View/download PDF
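The abstract argues that formal logical statements are preferable to natural language because they are verifiable. A minimal illustration of that verifiability, checking a propositional statement against every valuation, is sketched below (not code from the article):

```python
from itertools import product

def implies(p, q):
    # Material implication: p -> q is false only when p is true and q is false.
    return (not p) or q

def is_tautology(formula, n_vars):
    """Verify a propositional statement by checking all 2**n_vars valuations."""
    return all(formula(*vals) for vals in product([False, True], repeat=n_vars))

# Modus ponens as a verifiable proposition: ((p -> q) and p) -> q
modus_ponens = lambda p, q: implies(implies(p, q) and p, q)
print(is_tautology(modus_ponens, 2))  # True
```

A natural-language paraphrase of the same claim admits no such mechanical check; the formal statement does, which is the sense of "verifiability" the abstract invokes.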
6. Unified Deductive Systems: An Outline.
- Author
-
Citkin, Alex
- Abstract
Our goal is to develop a syntactical apparatus for propositional logics in which the accepted and rejected propositions have the same status and are treated in the same way. The suggested approach is based on the ideas Łukasiewicz used for classical logic and, in addition, includes the use of multiple-conclusion rules. More precisely, a unified consequence relation is defined on a set of statements of the forms "proposition A is accepted" and "proposition A is rejected", where A is a proposition. Accordingly, the rules defining a unified consequence relation (the unified rules) have such statements as premises and conclusions. Special attention is paid to the logics in which each proposition is either accepted or rejected. If we express this property via unified rules and add them to a unified deductive system, the resulting system defines a reversible unified consequence: a statement "proposition B is accepted" is derived from the statement "proposition A is accepted" if and only if the statement "proposition A is rejected" is derived from the statement "proposition B is rejected". [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
7. Students’ propositional logic thinking in higher education from the perspective of disciplines
- Author
-
Zoltán Fehér, Ladislav Jaruska, Katarína Szarka, and Eva Tóthová Tarová
- Subjects
propositional logic, logical thinking, scientific tasks, fields of study, university students, Education (General), L7-991 - Abstract
Logic and logical thinking are present and play an important role in most of the disciplines at the university level but in different ways. In our research, which has been ongoing for several years, we are investigating the use of propositional logic among university students in different study programmes. Our current study evaluated data from 1,429 respondents involving students from 15 universities. The non-standardised knowledge test was previously pilot-tested and consisted of 15 tasks from selected elements of propositional logic in a different natural science subject-specific context. Significant differences in average results were found in terms of students’ gender, age, type of secondary school leaving exam and parents’ highest education level. Our research mainly aimed to compare students’ test scores by students’ fields of study. On average, mathematics-informatics students had the highest success rate of 67.4%, compared to students in engineering (61.0%), economics (57.9%), education (56.6%), science (56.5%) and humanities (54.7%). The result is significant (F = 13.521, p-value < 0.001). Furthermore, we found that the students performed differently in three selected areas of formal logic (F = 1108, df = 2, p < 0.001), with the lowest performance on statement negation tasks. The difference in means across groups of tasks is significant by the gender of the students and by their secondary education level.
- Published
- 2023
- Full Text
- View/download PDF
8. Text as tautology: an exploration in inference, transitivity, and logical compression.
- Author
-
Potter, Andrew
- Subjects
-
*PLEONASM, *DISCOURSE analysis, *ISOMORPHISM (Mathematics), *SEMANTICS, *PHILOSOPHY - Abstract
Rhetorical structure theory (RST) and relational propositions have been shown useful in analyzing texts as expressions in propositional logic. Because these expressions are systematically derived, they may be expected to model discursive reasoning as articulated in the text. If this is the case, it would follow that logical operations performed on the expressions would be reflected in the texts. In this paper the logic of relational propositions is used to demonstrate the applicability of transitive inference to discourse. Starting with a selection of RST analyses from the research literature, analyses of the logic of relational propositions are performed to identify their corresponding logical expressions and within each expression to identify the inference path implicit within the text. By eliminating intermediary relational propositions, transitivity is then used to progressively compress the expression. The resulting compressions are applied to the corresponding texts and their compressed RST analyses. The application of transitive inference to logical expressions results in abridged texts that are intuitively coherent and logically compatible with their originals. This indicates an underlying isomorphism between the inferential structure of logical expressions and discursive coherence, and it confirms that these expressions function as logical models of the text. Potential areas for application include knowledge representation, logic and argumentation, and RST validation. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
9. A Deletion Algorithm for the Marginal Problem in Propositional Logic Based on Boolean Arrays.
- Author
-
Díaz-Macías, Efraín and Moral, Serafín
- Subjects
-
*PROPOSITION (Logic), *BAYESIAN analysis, *PYTHON programming language, *ALGORITHMS - Abstract
This paper proposes a deletion algorithm for the marginal problem in propositional logic. The algorithm is based on the general Davis and Putnam deletion algorithm DP, expressed as a bucket elimination algorithm, representing sets of clauses with the same set of variables by means of a Boolean array. The main contribution is the development of alternative procedures for deleting a variable which allow more efficient computations. In particular, it takes advantage of the case in which the variable to delete is determined by a subset of the rest of the variables. It also provides a set of useful results and tools for reasoning with Boolean tables. The algorithms are implemented using Python and the NumPy library. Experiments show that this procedure is feasible for intermediate problems and for difficult problems from hard Bayesian network cases. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
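The bucket-elimination step the abstract describes, combining Boolean tables by conjunction and deleting a variable by existential projection, can be sketched with NumPy roughly as follows (a simplified illustration of the general idea, not the paper's implementation):

```python
import numpy as np

def combine(a, b):
    # Conjunction of two Boolean tables defined over the same variable order.
    return np.logical_and(a, b)

def eliminate(table, axis):
    # Delete a variable by existential projection: a partial assignment
    # survives if some value of the deleted variable satisfies the table.
    return np.logical_or.reduce(table, axis=axis)

# Clauses over (x, y), indexed as table[x, y]: (x OR y) AND (NOT x OR y).
x, y = np.ix_([0, 1], [0, 1])
c1 = (x | y).astype(bool)          # x OR y
c2 = ((1 - x) | y).astype(bool)    # NOT x OR y
marginal_y = eliminate(combine(c1, c2), axis=0)  # delete x; only y = True survives
```

Repeating this combine-then-project step per bucket, in some variable ordering, yields the marginal over the remaining variables.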
10. Large Language Models and Logical Reasoning.
- Author
-
Friedman, Robert
- Subjects
-
*LANGUAGE models, *MODEL-based reasoning, *NATURAL languages, *DEEP learning - Abstract
Definition: In deep learning, large language models are typically trained on data from a corpus as representative of current knowledge. However, natural language is not an ideal form for the reliable communication of concepts. Instead, formal logical statements are preferable since they are subject to verifiability, reliability, and applicability. Another reason for this preference is that natural language is not designed for an efficient and reliable flow of information and knowledge, but is instead designed as an evolutionary adaptation as formed from a prior set of natural constraints. As a formally structured language, logical statements are also more interpretable. They may be informally constructed in the form of a natural language statement, but a formalized logical statement is expected to follow a stricter set of rules, such as with the use of symbols for representing the logic-based operators that connect multiple simple statements and form verifiable propositions. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
11. A Comprehensive Formalization of Propositional Logic in Coq: Deduction Systems, Meta-Theorems, and Automation Tactics.
- Author
-
Guo, Dakai and Yu, Wensheng
- Subjects
-
*PROPOSITION (Logic), *LOGIC, *COMPLETENESS theorem, *SOFTWARE verification, *MATHEMATICAL proofs, *DIGITAL electronics - Abstract
The increasing significance of theorem proving-based formalization in mathematics and computer science highlights the necessity for formalizing foundational mathematical theories. In this work, we employ the Coq interactive theorem prover to methodically formalize the language, semantics, and syntax of propositional logic, a fundamental aspect of mathematical reasoning and proof construction. We construct four Hilbert-style axiom systems and a natural deduction system for propositional logic, and establish their equivalences through meticulous proofs. Moreover, we provide formal proofs for essential meta-theorems in propositional logic, including the Deduction Theorem, Soundness Theorem, Completeness Theorem, and Compactness Theorem. Importantly, we present an exhaustive formal proof of the Completeness Theorem in this paper. To bolster the proof of the Completeness Theorem, we also formalize concepts related to mappings and countability, and deliver a formal proof of the Cantor–Bernstein–Schröder theorem. Additionally, we devise automated Coq tactics explicitly designed for the propositional logic inference system delineated in this study, enabling the automatic verification of all tautologies, all internal theorems, and the majority of syntactic and semantic inferences within the system. This research contributes a versatile and reusable Coq library for propositional logic, presenting a solid foundation for numerous applications in mathematics, such as the accurate expression and verification of properties in software programs and digital circuits. This work holds particular importance in the domains of mathematical formalization, verification of software and hardware security, and in enhancing comprehension of the principles of logical reasoning. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
12. A Fuzzy Model for Reasoning and Predicting Student's Academic Performance.
- Author
-
Hegazi, Mohamed O., Almaslukh, Bandar, and Siddig, Khadra
- Subjects
ACADEMIC achievement, EDUCATIONAL planning, PROPOSITION (Logic), FUZZY sets, DECISION making - Abstract
Evaluating students' academic performance is crucial for assessing the quality of education and educational strategies. However, it can be challenging to predict and evaluate academic performance under uncertain and imprecise conditions. To address this issue, many research works have employed fuzzy concepts to analyze, predict, and make decisions about students' academic performance. This paper investigates the use of fuzzy concepts in research related to evaluating, analyzing, predicting, or making decisions about student academic performance. The paper proposes a fuzzy model, called FPM (Fuzzy Propositional Model), for reasoning and predicting students' academic performance. FPM aims to address the limitations of previous studies by incorporating propositional logic with fuzzy sets concept, which allows for the representation of uncertainty and imprecision in the data. FPM integrates and transforms if-then rules into weighted fuzzy production rules to predict and evaluate academic performance. This paper tests and evaluates the FPM in two scenarios. In the first scenario, the model predicts and examines the impact of absenteeism on academic performance where there is no clear relation between the two parts of the dataset. In the second scenario, the model predicts the final exam results using the lab exam results, where the data are more related. The FPM provides good results in both scenarios, demonstrating its effectiveness in predicting and evaluating students' academic performance. A comparison study of the FPM's results with a linear regression model and previous work showed that the FPM performs better in predicting academic performance and provides more insights into the underlying factors affecting it. Therefore, the FPM could be useful in educational institutions to predict and evaluate students' academic performance, identify underlying factors affecting it, and improve educational strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
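The weighted fuzzy production rules the abstract mentions can be illustrated with a minimal sketch: a fuzzy membership degree for a rule's antecedent is scaled by the rule's weight. The membership function, the rule, and the weight below are hypothetical examples, not the FPM's actual rules:

```python
def triangular(x, a, b, c):
    # Triangular membership function on [a, c], peaking at b.
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fire_rule(membership, weight):
    # A weighted fuzzy production rule scales the antecedent's membership
    # degree by the rule's weight (product inference).
    return membership * weight

# Hypothetical rule: IF absenteeism is "high" THEN performance is "low", w = 0.8
absenteeism = 12  # days absent
mu_high = triangular(absenteeism, 5, 15, 25)
low_performance_degree = fire_rule(mu_high, 0.8)
```

Firing several such weighted rules and aggregating their degrees is the general pattern by which if-then knowledge becomes a graded prediction under uncertainty.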
13. Measuring Inconsistency in Generalized Propositional Logic Extended with Nonunary Operators.
- Author
-
Grant, John
- Abstract
As consistency is such an important topic in logic, researchers have for a long time investigated how to attain and maintain it. But consistency can also be studied from the point of view of its opposite, inconsistency. The problem with inconsistency in classical logic is that by the principle of explosion a single inconsistency leads to triviality. Paraconsistent logics were introduced to get around this problem by defining logics in such a way that the explosion principle does not apply to them. Another approach stays in the classical framework and evaluates the amount of inconsistency in a set of formulas. The great bulk of this work has been done for propositional logic and presents many interesting issues about inconsistency. A previous paper introduced the concept of generalized propositional logic (GPL) to provide a uniform method for measuring inconsistency in logics that allow the application of unary operator pairs, such as for modality, time, and space, to propositional logic formulas. The universality of GPL manifests itself in the fact that such an operator pair is evaluated in a uniform manner across all such logics. The difference lies solely in the choice of a frame for each logic. But some logics also contain nonunary operators. For example, temporal logics typically contain a binary Until operator. The purpose of this paper is to show how to extend generalized propositional logic to extended generalized propositional logic (EGPL) by adding nonunary operator pairs and measure inconsistency in such logics. The universality of EGPL manifests itself in the fact that once the evaluation of the nonunary operators is given, it carries over to all such logics. For example, the temporal Until operator becomes applicable to modal logic. Furthermore, while relative inconsistency measures were previously considered for GPL, they are now extended to EGPL and a new approach removes an undesirable feature from the previous version. 
Also, this paper provides results about various properties of the new inconsistency measures. Many examples and explanations are given to illustrate the issues involving this extension. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
14. Modular normalisation of classical proofs
- Author
-
Ralph, Benjamin, Laird, James, and Guglielmi, Alessio
- Subjects
004, proof theory, deep inference, normalisation, propositional logic, first-order logic, Herbrand's theorem - Abstract
The main contribution of this thesis is a study of two normalisation theorems and proofs in classical logic: one propositional, one first-order. For propositional logic, we show a local cycle-removal procedure through reductions on merge contractions that ensures proofs can be decomposed, that is, that contractions can be pushed to the bottom of a proof, in a straightforward way. For first-order logic, we show how decomposition of proofs can correspond to two presentations of Herbrand's Theorem, and how we can use translations into expansion proofs to give a new, indirect cut-elimination theorem for first-order logic. In addition, an old but interesting cut-elimination method for propositional logic, the experiments method, is formally presented for the first time, and we extend the theory of merge contractions to first-order logic.
- Published
- 2019
15. Enhancing a student productivity model for adaptive problem-solving assistance.
- Author
-
Maniktala, Mehak, Chi, Min, and Barnes, Tiffany
- Subjects
INTELLIGENT tutoring systems, PROBLEM solving, HELP-seeking behavior - Abstract
Research on intelligent tutoring systems has been exploring data-driven methods to deliver effective adaptive assistance. While much work has been done to provide adaptive assistance when students seek help, they may not seek help optimally. This has led to growing interest in proactive adaptive assistance, where the tutor provides unsolicited assistance upon predictions of struggle or unproductivity. Determining when and whether to provide personalized support is a well-known challenge called the assistance dilemma. Addressing this dilemma is particularly challenging in open-ended domains, where there can be several ways to solve problems. Researchers have explored methods to determine when to proactively help students, but few of these methods have taken prior hint usage into account. In this paper, we present a novel data-driven approach to incorporate students' hint usage in predicting their need for help. We explore its impact in an intelligent tutor that deals with the open-ended and well-structured domain of logic proofs. We present a controlled study to investigate the impact of an adaptive hint policy based on predictions of HelpNeed that incorporate students' hint usage. We show empirical evidence to support that such a policy can save students a significant amount of time in training and lead to improved posttest results, when compared to a control without proactive interventions. We also show that incorporating students' hint usage significantly improves the adaptive hint policy's efficacy in predicting students' HelpNeed, thereby reducing training unproductivity, reducing possible help avoidance, and increasing possible help appropriateness (a higher chance of receiving help when it was likely to be needed). We conclude with suggestions on the domains that can benefit from this approach as well as the requirements for adoption. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
16. Rare Correlated Coherent Association Rule Mining With CLS-MMS.
- Author
-
Datta, Subrata, Mali, Kalyani, Ghosh, Udit, Bose, Subrata, Das, Sourav, and Ghosh, Sourav
- Subjects
-
*ASSOCIATION rule mining, *PROPOSITION (Logic), *DATABASES - Abstract
The study of coherent association rules based on propositional logic is an important area of association rule mining. Users may get a large number of itemsets for a low minsup and lose valuable itemsets for a high minsup. Mining without minsup may cause itemset explosions that contain spurious itemsets with low correlations and take a long time to mine. For mining coherent rules, existing approaches consider only the frequent itemsets, ignoring rare itemsets. Moreover, all items in the database are regarded as equally important, which is not practical in real-world applications. By using confidence-lift specified multiple minimum supports combined with propositional logic, we propose an efficient approach called rare correlated coherent association rule mining that addresses all of the problems stated above. We define and incorporate a termination bound of support ($s_{TB}$) and a termination bound of dissociation ($d_{TB}$) for early pruning of the candidate itemsets. In the proposed approach, support thresholds are automatically applied to the itemsets, and coherent association rules are derived from the frequent and rare itemsets with high correlation and confidence. Experimental results obtained from real-life datasets show the effectiveness of the proposed approach in terms of itemset and rule generation, correlation, confidence, runtime and scalability. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
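The support, confidence, and lift measures underlying the abstract's confidence-lift specified thresholds are standard in association rule mining; a minimal sketch (toy transactions, not the paper's datasets):

```python
def support(itemset, transactions):
    # Fraction of transactions containing every item in the itemset.
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent, transactions):
    # P(consequent | antecedent), estimated from the transactions.
    return support(antecedent | consequent, transactions) / support(antecedent, transactions)

def lift(antecedent, consequent, transactions):
    # Lift > 1 indicates positive correlation between antecedent and consequent.
    return confidence(antecedent, consequent, transactions) / support(consequent, transactions)

transactions = [{"bread", "milk"}, {"bread"}, {"milk"}, {"bread", "milk"}]
rule_lift = lift({"bread"}, {"milk"}, transactions)
```

Thresholding on confidence and lift rather than on support alone is what allows rules over rare but strongly correlated itemsets to survive pruning.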
17. The Truth Table Formulation of Propositional Logic.
- Author
-
Haze, Tristan Grøtvedt
- Subjects
-
*TRUTH tables (Mathematical logic), *PROPOSITION (Logic), *SEMANTICS, *PROOF theory, *SYMBOLISM, *FINITE groups, *LOGIC, *FORMAL languages, *INTELLECT - Abstract
Developing a suggestion of Wittgenstein, I provide an account of truth tables as formulas of a formal language. I define the syntax and semantics of TPL (the language of Tabular Propositional Logic) and develop its proof theory. Single formulas of TPL, and finite groups of formulas with the same top row and TF matrix (depiction of possible valuations), are able to serve as their own proofs with respect to metalogical properties of interest. The situation is different, however, for groups of formulas whose top rows differ. [ABSTRACT FROM AUTHOR]
- Published
- 2023
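The TF matrix mentioned in the abstract, the depiction of a formula's value under every possible valuation, is what an ordinary truth table computes; a minimal sketch (TPL's actual syntax is not reproduced here):

```python
from itertools import product

def truth_table(formula, names):
    """List each valuation of the variables with the formula's value under it."""
    return [
        (vals, formula(**dict(zip(names, vals))))
        for vals in product([True, False], repeat=len(names))
    ]

# The column of outcomes is the formula's "TF matrix" over all valuations.
table = truth_table(lambda p, q: p and not q, ["p", "q"])
for vals, out in table:
    print(*("T" if v else "F" for v in vals), "|", "T" if out else "F")
```

Treating such a table itself as a formula of the object language, rather than as metalinguistic bookkeeping, is the move the paper develops.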
18. On the Convergence of Tsetlin Machines for the IDENTITY- and NOT Operators.
- Author
-
Zhang, Xuan, Jiao, Lei, Granmo, Ole-Christoffer, and Goodwin, Morten
- Subjects
-
*PATTERN recognition systems, *MACHINE learning, *TIME perspective, *MACHINERY, *MATHEMATICAL analysis - Abstract
The Tsetlin Machine (TM) is a recent machine learning algorithm with several distinct properties, such as interpretability, simplicity, and hardware-friendliness. Although numerous empirical evaluations report on its performance, the mathematical analysis of its convergence is still open. In this article, we analyze the convergence of the TM with only one clause involved for classification. More specifically, we examine two basic logical operators, namely, the “IDENTITY”- and “NOT” operators. Our analysis reveals that the TM, with just one clause, can converge correctly to the intended logical operator, learning from training data over an infinite time horizon. Besides, it can capture arbitrarily rare patterns and select the most accurate one when two candidate patterns are incompatible, by configuring a granularity parameter. The analysis of the convergence of the two basic operators lays the foundation for analyzing other logical operators. These analyses altogether, from a mathematical perspective, provide new insights on why TMs have obtained state-of-the-art performance on several pattern recognition problems. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
19. Multi-adjoint lattice logic and truth-stressing hedges.
- Author
-
Cornejo, M. Eugenia, Fariñas del Cerro, Luis, and Medina, Jesús
- Subjects
-
*PROPOSITION (Logic), *MANY-valued logic, *LOGIC - Abstract
This paper presents the multi-adjoint lattice logic (MLL) and its completeness and soundness. Specifically, the proposed many-valued propositional logic framework is defined on a multi-adjoint algebra whose underlying algebraic structure is a bounded lattice, which embeds the well-known basic logic (BL) given by Hájek on residuated lattices. The consideration of truth-stressing hedges in the multi-adjoint algebra is also studied and, as a consequence, two new extensions of MLL arise: the multi-adjoint lattice logic very true intensified (MLLvt) and the multi-adjoint lattice logic ∨-very true intensified (MLL∨-vt). Finally, the soundness and completeness of the aforementioned logics are also proven. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
20. A Deletion Algorithm for the Marginal Problem in Propositional Logic Based on Boolean Arrays
- Author
-
Efraín Díaz-Macías and Serafín Moral
- Subjects
marginal problem, satisfiability problem, propositional logic, propagation algorithm, calculus with potentials, Mathematics, QA1-939 - Abstract
This paper proposes a deletion algorithm for the marginal problem in propositional logic. The algorithm is based on the general Davis and Putnam deletion algorithm DP, expressed as a bucket elimination algorithm, representing sets of clauses with the same set of variables by means of a Boolean array. The main contribution is the development of alternative procedures for deleting a variable which allow more efficient computations. In particular, it takes advantage of the case in which the variable to delete is determined by a subset of the rest of the variables. It also provides a set of useful results and tools for reasoning with Boolean tables. The algorithms are implemented using Python and the NumPy library. Experiments show that this procedure is feasible for intermediate problems and for difficult problems from hard Bayesian network cases.
- Published
- 2023
- Full Text
- View/download PDF
21. A Comprehensive Formalization of Propositional Logic in Coq: Deduction Systems, Meta-Theorems, and Automation Tactics
- Author
-
Dakai Guo and Wensheng Yu
- Subjects
Coq, formalization, propositional logic, theorem proving, Hilbert-style axiom system, natural deduction system, Mathematics, QA1-939 - Abstract
The increasing significance of theorem proving-based formalization in mathematics and computer science highlights the necessity for formalizing foundational mathematical theories. In this work, we employ the Coq interactive theorem prover to methodically formalize the language, semantics, and syntax of propositional logic, a fundamental aspect of mathematical reasoning and proof construction. We construct four Hilbert-style axiom systems and a natural deduction system for propositional logic, and establish their equivalences through meticulous proofs. Moreover, we provide formal proofs for essential meta-theorems in propositional logic, including the Deduction Theorem, Soundness Theorem, Completeness Theorem, and Compactness Theorem. Importantly, we present an exhaustive formal proof of the Completeness Theorem in this paper. To bolster the proof of the Completeness Theorem, we also formalize concepts related to mappings and countability, and deliver a formal proof of the Cantor–Bernstein–Schröder theorem. Additionally, we devise automated Coq tactics explicitly designed for the propositional logic inference system delineated in this study, enabling the automatic verification of all tautologies, all internal theorems, and the majority of syntactic and semantic inferences within the system. This research contributes a versatile and reusable Coq library for propositional logic, presenting a solid foundation for numerous applications in mathematics, such as the accurate expression and verification of properties in software programs and digital circuits. This work holds particular importance in the domains of mathematical formalization, verification of software and hardware security, and in enhancing comprehension of the principles of logical reasoning.
- Published
- 2023
- Full Text
- View/download PDF
22. A Fuzzy Model for Reasoning and Predicting Student’s Academic Performance
- Author
-
Mohamed O. Hegazi, Bandar Almaslukh, and Khadra Siddig
- Subjects
propositional logic ,fuzzy set ,machine learning ,prediction ,student performance ,absenteeism ,Technology ,Engineering (General). Civil engineering (General) ,TA1-2040 ,Biology (General) ,QH301-705.5 ,Physics ,QC1-999 ,Chemistry ,QD1-999 - Abstract
Evaluating students’ academic performance is crucial for assessing the quality of education and educational strategies. However, it can be challenging to predict and evaluate academic performance under uncertain and imprecise conditions. To address this issue, many research works have employed fuzzy concepts to analyze, predict, and make decisions about students’ academic performance. This paper investigates the use of fuzzy concepts in research related to evaluating, analyzing, predicting, or making decisions about student academic performance. The paper proposes a fuzzy model, called FPM (Fuzzy Propositional Model), for reasoning and predicting students’ academic performance. FPM aims to address the limitations of previous studies by incorporating propositional logic with fuzzy sets concept, which allows for the representation of uncertainty and imprecision in the data. FPM integrates and transforms if-then rules into weighted fuzzy production rules to predict and evaluate academic performance. This paper tests and evaluates the FPM in two scenarios. In the first scenario, the model predicts and examines the impact of absenteeism on academic performance where there is no clear relation between the two parts of the dataset. In the second scenario, the model predicts the final exam results using the lab exam results, where the data are more related. The FPM provides good results in both scenarios, demonstrating its effectiveness in predicting and evaluating students’ academic performance. A comparison study of the FPM’s results with a linear regression model and previous work showed that the FPM performs better in predicting academic performance and provides more insights into the underlying factors affecting it. Therefore, the FPM could be useful in educational institutions to predict and evaluate students’ academic performance, identify underlying factors affecting it, and improve educational strategies.
- Published
- 2023
- Full Text
- View/download PDF
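As an illustration of the weighted fuzzy production rules that FPM builds on, here is a generic sketch; the membership functions, rule weights, and the absenteeism scale are invented for the example and are not taken from the paper:

```python
def trapezoid(x, a, b, c, d):
    """Trapezoidal membership function: 0 outside [a, d], 1 on [b, c]."""
    if x <= a or x >= d:
        return 0.0
    if b <= x <= c:
        return 1.0
    return (x - a) / (b - a) if x < b else (d - x) / (d - c)

# Hypothetical weighted fuzzy production rules (weights are illustrative):
#   R1: IF absenteeism IS high THEN performance IS low   (weight 0.9)
#   R2: IF absenteeism IS low  THEN performance IS high  (weight 0.8)
high = lambda x: trapezoid(x, 20, 40, 100, 101)   # % of sessions missed
low  = lambda x: trapezoid(x, -1, 0, 10, 30)

def fire(absenteeism):
    """Firing strength of each rule: membership degree times rule weight."""
    return {"low_performance":  0.9 * high(absenteeism),
            "high_performance": 0.8 * low(absenteeism)}

print(fire(50))  # R1 fully fires, R2 not at all
```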
23. The logic behind desirable sets of things, and its filter representation.
- Author
-
de Cooman, Gert, Van Camp, Arthur, and De Bock, Jasper
- Subjects
- *
PROPOSITION (Logic) , *LOGIC , *SET theory , *INTERNET gambling , *GAMBLING - Abstract
We identify the (filter representation of the) logic behind the recent theory of coherent sets of desirable (sets of) things, which generalise coherent sets of desirable (sets of) gambles as well as coherent choice functions, and show that this identification allows us to establish various representation results for such coherent models in terms of simpler ones. • We identify the (filter representation of the) logic behind the recent theory of coherent sets of desirable (sets of) things. • We represent coherent sets of desirable sets of things in terms of collections of coherent sets of desirable things. • We identify two interesting special cases: uncertainty modelling using choice functions, and classical propositional logic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. Complete inference via knowledge Petri nets and resolution rules.
- Author
-
Tan, Kaicheng, Luo, Jiliang, Li, Zepei, Zhou, Jiazhong, and Li, Zhiwu
- Subjects
- *
PETRI nets , *INFERENCE (Logic) , *COMPUTATIONAL complexity , *KNOWLEDGE base , *PROPOSITION (Logic) , *VIDEO coding - Abstract
A novel inference approach is presented based on knowledge Petri nets and resolution rules. First, a knowledge base is modeled as a knowledge Petri net, where logical clauses are represented by monitor places, called semantic places. Second, a resolution pair is defined to identify a pair of semantic places, which, corresponding to two clauses that can be resolved, can be used to generate a new place in the knowledge Petri net. Such a new place represents an unknown clause implied by the knowledge base. Third, a method is proposed to decide whether a resolution inference is redundant based on the resolution pair. The size of a knowledge Petri net can be reduced, and the inference computation is saved in this way. Fourth, an inference algorithm is proposed employing the resolution pair and knowledge Petri net, proven to be sound and complete. Its computational complexity is polynomial with respect to the number of logical variables and clauses. In addition, another inference algorithm is developed for a given complete knowledge base and a set of newfound clauses. The algorithm is proven to be sound and complete, which offers significantly reduced computational complexity and specifically is linear concerning the number of new clauses. Finally, the wumpus world problem is taken as an example to illustrate and verify the proposed inference algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
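The inference step that a resolution pair of semantic places encodes is ordinary binary resolution on clauses; a minimal Python sketch of that step (the Petri net machinery itself is not modeled here):

```python
def resolve(c1, c2):
    """Binary resolution: if clause c1 contains literal L and c2 contains -L,
    produce the resolvent (c1 union c2) minus the complementary pair.
    Literals are nonzero ints; -n negates atom n. Returns the set of
    resolvents (frozensets), one per complementary pair."""
    out = set()
    for lit in c1:
        if -lit in c2:
            out.add(frozenset((c1 - {lit}) | (c2 - {-lit})))
    return out

# {P, Q} and {-Q, R} resolve on Q to give {P, R}
print(resolve(frozenset({1, 2}), frozenset({-2, 3})))  # {frozenset({1, 3})}
```

Deriving the empty clause (`frozenset()`) by repeated resolution is the refutation that the knowledge Petri net approach makes polynomial for its class of inputs.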
25. Embeddings as epistemic states: Limitations on the use of pooling operators for accumulating knowledge.
- Author
-
Schockaert, Steven
- Subjects
- *
GRAPH neural networks , *NONMONOTONIC logic , *KNOWLEDGE representation (Information theory) , *SATISFACTION - Abstract
Various neural network architectures rely on pooling operators to aggregate information coming from different sources. It is often implicitly assumed in such contexts that vectors encode epistemic states, i.e. that vectors capture the evidence that has been obtained about some properties of interest, and that pooling these vectors yields a vector that combines this evidence. We study, for a number of standard pooling operators, under what conditions they are compatible with this idea, which we call the epistemic pooling principle. While we find that all the considered pooling operators can satisfy the epistemic pooling principle, this only holds when embeddings are sufficiently high-dimensional and, for most pooling operators, when the embeddings satisfy particular constraints (e.g. having non-negative coordinates). We furthermore show that these constraints have important implications on how the embeddings can be used in practice. In particular, we find that when the epistemic pooling principle is satisfied, in most cases it is impossible to verify the satisfaction of propositional formulas using linear scoring functions, with two exceptions: (i) max-pooling with embeddings that are upper-bounded and (ii) Hadamard pooling with non-negative embeddings. This finding helps to clarify, among others, why Graph Neural Networks sometimes under-perform in reasoning tasks. Finally, we also study an extension of the epistemic pooling principle to weighted epistemic states, which are important in the context of non-monotonic reasoning, where max-pooling emerges as the most suitable operator. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
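The two exceptional operators identified above, max-pooling and Hadamard pooling, look as follows on plain Python lists (an illustration of the operators themselves, not of the epistemic pooling analysis):

```python
def max_pool(vectors):
    """Coordinate-wise maximum of equal-length vectors."""
    return [max(xs) for xs in zip(*vectors)]

def hadamard_pool(vectors):
    """Coordinate-wise (Hadamard) product of equal-length vectors."""
    out = vectors[0][:]
    for v in vectors[1:]:
        out = [a * b for a, b in zip(out, v)]
    return out

# Two toy "embeddings" to be aggregated into one:
e1, e2 = [0.0, 1.0, 0.5], [1.0, 0.2, 0.0]
print(max_pool([e1, e2]))       # [1.0, 1.0, 0.5]
print(hadamard_pool([e1, e2]))  # [0.0, 0.2, 0.0]
```

Note how the Hadamard result requires non-negative coordinates to stay interpretable as accumulated evidence, which is exactly the kind of constraint the paper studies.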
26. Heterogeneous-branch integration framework: Introducing first-order predicate logic in Logical Reasoning Question Answering.
- Author
-
Yue, Jianyu, Bi, Xiaojun, and Chen, Zheng
- Subjects
- *
NATURAL language processing , *FIRST-order logic , *PREDICATE (Logic) , *PROPOSITION (Logic) , *QUESTION (Logic) - Abstract
The logical reasoning question-answering is a critical task in natural language processing, as it equips models with human-like logical reasoning intelligence. Existing approaches focus on extracting and leveraging the hidden logical structures within text. However, previous works explore partial logical relationships and neglect the holistic extraction within the text. Moreover, they struggle to fully model logical connections, including long-distance dependencies and local topology information. To address these issues, we propose a novel heterogeneous-branch integration framework. Our framework is based on first-order predicate logic theory and consists of three primary components. First, we construct two heterogeneous logical graphs to model logical relationships within and between propositions. Second, we propose a novel Graph-Masked Transformer with a novel graph-masked multi-head attention mechanism to enable distant node interactions and local sparse relationship modeling. Third, we propose a novel multi-branch fusion module to integrate information from multiple sources and generate answer predictions. The proposed heterogeneous-branch integration framework outperforms the VDGN method by 2.73% in accuracy on the ReClor dataset and 2.15% on the LogiQA dataset. Our code and models will be made available at https://github.com/starry-y/HBI. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
27. Towards logical foundations for probabilistic computation.
- Author
-
Antonelli, Melissa, Dal Lago, Ugo, and Pistone, Paolo
- Subjects
- *
PROPOSITION (Logic) - Abstract
The overall purpose of the present work is to lay the foundations for a new approach to bridge logic and probabilistic computation. To this aim we introduce extensions of classical and intuitionistic propositional logic with counting quantifiers, that is, quantifiers that measure to which extent a formula is true. The resulting systems, called cCPL and iCPL, respectively, admit a natural semantics, based on the Borel σ-algebra of the Cantor space, together with a sound and complete proof system. Our main results consist in relating cCPL and iCPL with some central concepts in the study of probabilistic computation. On the one hand, the validity of cCPL-formulae in prenex form characterizes the corresponding level of Wagner's hierarchy of counting complexity classes, closely related to probabilistic complexity. On the other hand, proofs in iCPL correspond, in the sense of Curry and Howard, to typing derivations for a randomized extension of the λ-calculus, so that counting quantifiers reveal the probability of termination of the underlying probabilistic programs. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
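A counting quantifier as described above measures the extent to which a formula is true; for a formula over finitely many atoms this reduces to the fraction of satisfying assignments, which a short sketch can compute (an illustration of the semantic idea only, not of the cCPL/iCPL proof systems):

```python
from itertools import product
from fractions import Fraction

def truth_measure(formula, n):
    """Fraction of the 2^n assignments to n propositional atoms that
    satisfy `formula` -- the measure a counting quantifier inspects when
    the formula depends on finitely many coordinates of the Cantor space."""
    sat = sum(formula(bits) for bits in product([False, True], repeat=n))
    return Fraction(sat, 2 ** n)

# p OR q is true in 3 of the 4 assignments
print(truth_measure(lambda b: b[0] or b[1], 2))  # 3/4
```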
28. Classes of propositional UMU formulas and their extensions to minimal unsatisfiable formulas.
- Author
-
Kleine Büning, Hans
- Subjects
- *
PROPOSITION (Logic) , *SIMPLICITY - Abstract
The class UMU consists of propositional formulas which are a union of minimal unsatisfiable formulas (MU formulas). The structural complexity of UMU formulas depends essentially on the type and degree of intertwining of the MU subformulas. Starting from the class of formulas that consist only of clause- (or variable-) disjoint minimal unsatisfiable subformulas, we study various UMU subclasses given by weakening these conditions. Generalizing an idea from [6], we investigate a characterization of UMU formulas by whether they allow transformation into an MU formula by adding literals and/or clauses. It can be shown that the simplicity of constructing such extensions correlates with the degree of intertwining of the MU subformulas. For UMU formulas, however, we can give only extensions that have an exponential size in the worst case. The question of the existence of short MU combinations of MU-formulas by adding literals and clauses remains open. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
29. A logical framework to study concept-learning biases in the presence of multiple explanations.
- Author
-
Abriola, Sergio, Tano, Pablo, Romano, Sergio, and Figueira, Santiago
- Subjects
- *
PROPOSITION (Logic) , *EYE tracking , *CONCEPT learning , *EXPLANATION , *ATTENTION control - Abstract
When people seek to understand concepts from an incomplete set of examples and counterexamples, there is usually an exponentially large number of classification rules that can correctly classify the observed data, depending on which features of the examples are used to construct these rules. A mechanistic approximation of human concept-learning should help to explain how humans prefer some rules over others when there are many that can be used to correctly classify the observed data. Here, we exploit the tools of propositional logic to develop an experimental framework that controls the minimal rules that are simultaneously consistent with the presented examples. For example, our framework allows us to present participants with concepts consistent with a disjunction and also with a conjunction, depending on which features are used to build the rule. Similarly, it allows us to present concepts that are simultaneously consistent with two or more rules of different complexity and using different features. Importantly, our framework fully controls which minimal rules compete to explain the examples and is able to recover the features used by the participant to build the classification rule, without relying on supplementary attention-tracking mechanisms (e.g. eye-tracking). We exploit our framework in an experiment with a sequence of such competitive trials, illustrating the emergence of various transfer effects that bias participants' prior attention to specific sets of features during learning. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
30. Propositional Satisfiability Logic via Ant Colony Optimization in Hopfield Neural Network.
- Author
-
Kho, L. C., Kasihmuddin, M. S. M., Mansor, M. A., and Sathasivam, S.
- Subjects
- *
ANT algorithms , *HOPFIELD networks , *PROPOSITION (Logic) , *COST functions , *BIOLOGICALLY inspired computing , *ANTS , *COMBINATORIAL optimization - Abstract
Minimizing the cost function that corresponds to propositional logic is vital to ensure that the learning phase of HNN occurs optimally. In that regard, an optimal and non-biased algorithm is required to ensure that HNN always converges to the global solution. Ant Colony Optimization (ACO) is a population-based, nature-inspired algorithm for solving various combinatorial optimization problems. ACO simulates the behaviour of real ants that forage for food and communicate through pheromone density. In this work, ACO is used to minimize the cost function that corresponds to the logical rule in a Hopfield Neural Network. ACO utilizes pheromone density to find the optimal path that leads to a zero cost function without consuming additional learning iterations. The performance of all learning models is evaluated based on various performance metrics. Results collected from computer simulation imply that ACO outperforms the conventional learning model in minimizing the logical cost function. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
31. Four-valued expansions of Dunn-Belnap's logic (I): Basic characterizations
- Author
-
Alexej P. Pynko
- Subjects
propositional logic ,logical matrix ,dunn-belnap's logic ,expansion ,[bounded] distributive/de morgan lattice ,equality determinant ,Logic ,BC1-199 - Abstract
Basic results of the paper are that any four-valued expansion L4 of Dunn-Belnap's logic DB4 is defined by a unique (up to isomorphism) conjunctive matrix ℳ4 with exactly two distinguished values over an expansion 𝔄4 of a De Morgan non-Boolean four-valued diamond, but by no matrix with either less than four values or a single [non-]distinguished value, and has no proper extension satisfying Variable Sharing Property (VSP). We then characterize L4's having a theorem / inconsistent formula, satisfying VSP and being [inferentially] maximal / subclassical / maximally paraconsistent, in particular, algebraically through ℳ4/𝔄4's (not) having certain submatrices/subalgebras. Likewise, [providing 𝔄4 is regular / has no three-element subalgebra] L4 has a proper consistent axiomatic extension if[f] ℳ4 has a proper paraconsistent / two-valued submatrix [in which case the logic of this submatrix is the only proper consistent axiomatic extension of L4 and is relatively axiomatized by the Excluded Middle law axiom]. As a generic tool (applicable, in particular, to both classically-negative and implicative expansions of DB4), we also prove that the lattice of axiomatic extensions of the logic of an implicative matrix ℳ with equality determinant is dual to the distributive lattice of lower cones of the set of all submatrices of ℳ with non-distinguished values.
- Published
- 2020
- Full Text
- View/download PDF
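The four-valued Dunn-Belnap matrix discussed above, with its two distinguished values, can be sketched in Python using the standard told-true/told-false encoding of the four values (an illustration of DB4's connectives, not of the paper's expansion results):

```python
# Dunn-Belnap values encoded as (told-true, told-false) pairs:
#   T = told true only, B = told both, N = told neither, F = told false only.
T, B, N, F = (1, 0), (1, 1), (0, 0), (0, 1)
NAMES = {T: "T", B: "B", N: "N", F: "F"}

def neg(a):         return (a[1], a[0])                # swap the two reports
def conj(a, b):     return (a[0] & b[0], a[1] | b[1])  # meet in the truth order
def disj(a, b):     return (a[0] | b[0], a[1] & b[1])  # join in the truth order
def designated(a):  return a[0] == 1   # the two distinguished values: T and B

print(NAMES[conj(B, N)])  # F: told-true fails, told-false holds
print(NAMES[disj(B, N)])  # T
print(designated(B))      # True
```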
32. Propositional logic concept for fault diagnosis in complex systems
- Author
-
Yunus Bicen
- Subjects
Propositional logic ,Power system ,Transformer ,Fault ,Diagnosis ,Engineering (General). Civil engineering (General) ,TA1-2040 - Abstract
A great number of monitoring technologies have been developed, especially for complex systems in critical zones such as electric power substations and nuclear energy systems. However, there is no single instruction or standard for fault-focused on-line/off-line monitoring applications, owing to the pace of technological development. Field experts have difficulty choosing which test and measurement systems should be used at which stage of a complex system. In this study, a propositional logic-based concept is presented which field experts can use to manage this process. In this concept, test and measurement systems can be grouped according to priority order. Based on the results of the graded groups, the suspected fault is verified against its cause of occurrence. The applicability of the proposed concept is illustrated by creating possible failure scenarios on a transformer. The theoretically validated concept can be used for even more fault situations, and also in other complex systems with a large number of T&M systems where very different fault conditions can occur.
- Published
- 2020
- Full Text
- View/download PDF
33. A logical characterization of multi-adjoint algebras.
- Author
-
Cornejo, M. Eugenia, Fariñas del Cerro, Luis, and Medina, Jesús
- Subjects
- *
MATHEMATICAL logic , *ALGEBRA , *PROPOSITION (Logic) , *COMPLETENESS theorem , *FUZZY logic , *LOGIC , *AXIOMS - Abstract
This paper introduces a logical characterization of multi-adjoint algebras with a twofold contribution. On the one hand, the study of multi-adjoint algebras, from a logical perspective, will allow us to discover both the core and new features of these algebras. On the other hand, the axiomatization of multi-adjoint algebras will be useful to take advantage of the properties of the logical connectives considered in the corresponding deductive system. The mechanism considered to carry out the mentioned axiomatization follows the one given by Petr Hájek for residuated lattices. Specifically, the paper presents the bounded poset logic (BPL) as an axiomatization of a bounded poset, since this algebraic structure is the most simple structure from which a multi-adjoint algebra is defined. In the following, the language of BPL is enriched with a family of pairs, composed of a conjunctor and an implication, and its axiomatic system is endowed with new axioms, giving rise to the multi-adjoint logic (ML). The soundness and completeness of BPL and ML are proven. Finally, a comparison between the axiomatization of the multi-adjoint logic and the one given for the BL logic is introduced. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
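A minimal sketch of the adjoint pairs that multi-adjoint algebras are built from, here instantiated (as an assumption for illustration, not the paper's axiomatization) with the Gödel conjunctor on [0,1] and its residuated implication:

```python
# One adjoint pair on [0,1]: the Gödel conjunctor and its residuated
# implication, checked against the adjoint property
#     x & y <= z   iff   x <= (y -> z)
godel_and = lambda x, y: min(x, y)
godel_imp = lambda y, z: 1.0 if y <= z else z

grid = [i / 10 for i in range(11)]
assert all((godel_and(x, y) <= z) == (x <= godel_imp(y, z))
           for x in grid for y in grid for z in grid)
print("adjoint property holds on the grid")
```

A multi-adjoint algebra carries a whole family of such conjunctor/implication pairs over one bounded poset, which is what the BPL/ML axiomatization in the entry above captures syntactically.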
34. TrueBiters, an educational game to practice the truth tables of propositional logic: Development, evaluation, and lessons learned
- Author
-
Olga De Troyer, Renny Lindberg, and Pejman Sajjadi
- Subjects
Educational games ,Serious games ,Player-centered design ,Propositional logic ,Truth tables ,Multiple intelligences ,Special aspects of education ,LC8-6691 - Abstract
For years, the logic course in the first year of our Bachelor program in Computer Science, dealing with propositional and predicate logic, has suffered from poor pass rates. The students perceive the formal and abstract mechanisms of logic as difficult and awkward to deal with. Many students do not study the course material on a regular basis, which results in poor mastery of the basic concepts and principles. Consequently, students often fall behind as the course progresses. Previous attempts to remedy this procrastination behavior had little success. Since educational games are commended as an enjoyable way to foster learning, we decided in 2016 to develop an educational game for the course. This game, called TrueBiters, is a two-player competitive game inspired by an existing card game. We adapted this card game to propositional logic and digitized it as an app. The development was done in an iterative way: each version was evaluated, and using the received user feedback and evaluation results the game was improved and reevaluated. Based on the results of the evaluations we can conclude that the game is well suited for its target audience, i.e., logically-mathematically intelligent people, and is a good supplement to, and even replacement for, some of the traditional face-to-face exercise sessions. However, we also learned that a proper embedding into the course is needed to ensure that all students actually use the game and benefit from playing it. In this paper we present the game, explain and motivate its evolution, discuss the evaluations performed, and present lessons learned.
- Published
- 2019
- Full Text
- View/download PDF
35. Logical derivation search with assumption traceability
- Author
-
Adomas Birštunas and Elena Reivytytė
- Subjects
propositional logic ,traceability ,loop checking ,Mathematics ,QA1-939 - Abstract
In this paper the authors study the problem of traceability of assumptions in logical derivation. The essence of this task is to trace which assumptions from the available knowledge base are necessary to derive a certain conclusion. The paper presents a new derivation procedure for propositional logic which ensures the traceability feature. For a derivable conclusion formula, the procedure also returns the smallest set of assumptions that suffices to derive the conclusion. The procedure was verified using the authors' implementation.
- Published
- 2021
- Full Text
- View/download PDF
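The smallest-assumption-set feature described in the entry above can be sketched semantically by brute force over subsets; the paper's actual procedure is proof-theoretic with loop checking, so this is only an illustration of the specification:

```python
from itertools import combinations, product

def entails(assumptions, conclusion, n):
    """Semantic entailment over n atoms, checked by truth-table enumeration:
    every assignment satisfying all assumptions must satisfy the conclusion."""
    return all(conclusion(v) for v in product([False, True], repeat=n)
               if all(a(v) for a in assumptions))

def smallest_support(assumptions, conclusion, n):
    """Smallest subset of the knowledge base that still derives the
    conclusion (brute force over subsets, smallest first)."""
    for k in range(len(assumptions) + 1):
        for subset in combinations(range(len(assumptions)), k):
            if entails([assumptions[i] for i in subset], conclusion, n):
                return set(subset)
    return None  # conclusion not derivable even from the whole base

# KB: a0 = p, a1 = p -> q, a2 = r; deriving q needs only {a0, a1}
kb = [lambda v: v[0], lambda v: (not v[0]) or v[1], lambda v: v[2]]
print(smallest_support(kb, lambda v: v[1], 3))  # {0, 1}
```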
36. On structures of regular standard contradictions in propositional logic.
- Author
-
He, Xingxing, Li, Yingfang, and Feng, Yanghe
- Subjects
- *
PROPOSITION (Logic) , *CONTRADICTION , *LINEAR orderings - Abstract
The contradiction separation (CS) based automated deduction is an extension of the binary resolution principle for deductive inference rules, whose CSE prover and combined versions have shown unique performance among others in CADE ATP system competitions since their first appearance in 2018. The contradiction constructing method is one of the most important factors determining the efficiency of CS based deduction, and is more complex than finding complementary pairs in binary resolution. To obtain structures and concrete forms of contradictions, this paper examines two types of standard contradictions, i.e., the regular standard contradictions and the standard contradictions for non-unit clause sets. The concepts of the regular standard contradiction and its extended form are presented, and some of their properties are discussed. In order to obtain simpler contradiction separation clauses in every deductive step, four types of structures for standard contradictions, obtained by adding related literals to the regular standard contradiction and its extended form, are derived. Two structures for the standard contradictions for non-unit clause sets are obtained. Two unified algorithms for constructing standard contradictions are also devised. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
37. On the role of logical separability in knowledge compilation.
- Author
-
Qiu, Junming, Li, Wenqing, Fang, Liangda, Guan, Quanlong, Xiao, Zhanhao, Lai, Zhao-Rong, and Dong, Qian
- Subjects
- *
DUAL-task paradigm , *KNOWLEDGE base , *PROPOSITION (Logic) - Abstract
Knowledge compilation is an alternative solution to address demanding reasoning tasks with high complexity via converting knowledge bases into a suitable target language. The notion of logical separability, proposed by Levesque, offers a general explanation for the tractability of clausal entailment for two remarkable languages: decomposable negation normal form and prime implicates. It is interesting to explore what role logical separability plays in problem tractability. In this paper, we apply the notion of logical separability to a number of reasoning problems within the context of propositional logic: satisfiability checking (CO), clausal entailment checking (CE), model counting (CT), model enumeration (ME) and forgetting (FO), as well as their dual tasks, contributing to several recursive procedures. We provide the corresponding logical separability based properties: CO-logical separability, CE-logical separability, CT-logical separability, ME-logical separability and their duals. Based on these properties, we then identify four novel normal forms: CO-LSNNF, CE-LSNNF, CT-LSNNF and ME-LSNNF, as well as their dual languages. We show that each of them is the necessary and sufficient condition under which the corresponding procedure is correct. We finally integrate the above normal forms into the knowledge compilation map. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
38. Generation and Use of Hints and Feedback in a Hilbert-Style Axiomatic Proof Tutor.
- Author
-
Lodder, Josje, Heeren, Bastiaan, Jeuring, Johan, and Neijenhuis, Wendy
- Subjects
PSYCHOLOGICAL feedback ,EVIDENCE ,TUTORS & tutoring ,STUDENTS - Abstract
This paper describes logax, an interactive tutoring tool that gives hints and feedback to a student who stepwise constructs a Hilbert-style axiomatic proof in propositional logic. logax generates proofs to calculate hints and feedback. We compare these generated proofs with expert proofs and student solutions, and conclude that the quality of the generated proofs is comparable to that of expert proofs. logax recognizes most steps that students take when constructing a proof. Even if a student diverges from the generated solution, logax still provides hints, including next steps or reachable subgoals, and feedback. With a few improvements in the design of the set of buggy rules, logax will cover about 80% of the mistakes made by students. The hints help students to complete the exercises. [ABSTRACT FROM AUTHOR]
- Published
- 2021
- Full Text
- View/download PDF
39. Crime prevention in terms of criminal intent criteria in white-collar crime : A propositional analysis
- Author
-
Alalehto, Tage
- Published
- 2018
- Full Text
- View/download PDF
40. FOUR-VALUED EXPANSIONS OF DUNN-BELNAP'S LOGIC (I): BASIC CHARACTERIZATIONS.
- Author
-
Pynko, Alexej P.
- Subjects
- *
DISTRIBUTIVE lattices , *LOGIC , *BOOLEAN matrices , *INCONSISTENCY (Logic) - Abstract
Basic results of the paper are that any four-valued expansion L4 of Dunn-Belnap's logic DB4 is defined by a unique (up to isomorphism) conjunctive matrix M4 with exactly two distinguished values over an expansion A4 of a De Morgan non-Boolean four-valued diamond, but by no matrix with either less than four values or a single [non-]distinguished value, and has no proper extension satisfying Variable Sharing Property (VSP). We then characterize L4's having a theorem / inconsistent formula, satisfying VSP and being [inferentially] maximal / subclassical / maximally paraconsistent, in particular, algebraically through M4/A4's (not) having certain submatrices/subalgebras. Likewise, [providing A4 is regular / has no three-element subalgebra] L4 has a proper consistent axiomatic extension if[f] M4 has a proper paraconsistent / two-valued submatrix [in which case the logic of this submatrix is the only proper consistent axiomatic extension of L4 and is relatively axiomatized by the Excluded Middle law axiom]. As a generic tool (applicable, in particular, to both classically-negative and implicative expansions of DB4), we also prove that the lattice of axiomatic extensions of the logic of an implicative matrix M with equality determinant is dual to the distributive lattice of lower cones of the set of all submatrices of M with non-distinguished values. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
41. Simulating Strong Practical Proof Systems with Extended Resolution.
- Author
-
Kiesl, Benjamin, Rebola-Pardo, Adrián, Heule, Marijn J. H., and Biere, Armin
- Subjects
PROOF complexity ,GENERALIZATION ,MATHEMATICAL models ,BOOLEAN algebra ,POLYNOMIALS - Abstract
Proof systems for propositional logic provide the basis for decision procedures that determine the satisfiability status of logical formulas. While the well-known proof system of extended resolution—introduced by Tseitin in the sixties—allows for the compact representation of proofs, modern SAT solvers (i.e., tools for deciding propositional logic) are based on different proof systems that capture practical solving techniques in an elegant way. The most popular of these proof systems is likely DRAT, which is considered the de-facto standard in SAT solving. Moreover, just recently, the proof system DPR has been proposed as a generalization of DRAT that allows for short proofs without the need of new variables. Since every extended-resolution proof can be regarded as a DRAT proof and since every DRAT proof is also a DPR proof, it was clear that both DRAT and DPR generalize extended resolution. In this paper, we show that—from the viewpoint of proof complexity—these two systems are no stronger than extended resolution. We do so by showing that (1) extended resolution polynomially simulates DRAT and (2) DRAT polynomially simulates DPR. We implemented our simulations as proof-transformation tools and evaluated them to observe their behavior in practice. Finally, as a side note, we show how Kullmann's proof system based on blocked clauses (another generalization of extended resolution) is related to the other systems. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
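Tseitin's extension rule, the basis of the proof systems compared above, can be illustrated by brute force: adding the clauses defining a fresh variable preserves satisfiability. This is a toy sketch of the rule itself, not a DRAT or DPR checker:

```python
from itertools import product

def satisfiable(clauses, n):
    """Brute-force SAT over atoms 1..n; clauses are sets of signed ints."""
    for bits in product([False, True], repeat=n):
        val = lambda lit: bits[abs(lit) - 1] == (lit > 0)
        if all(any(val(l) for l in c) for c in clauses):
            return True
    return False

# Extension rule: introduce a fresh variable (atom 3) defined as
# 3 <-> (1 AND 2).  Its defining clauses are
#   (3 or -1 or -2), (-3 or 1), (-3 or 2)
# and adding them to any formula preserves satisfiability.
f = [{1, 2}, {-1, 2}]
ext = f + [{3, -1, -2}, {-3, 1}, {-3, 2}]
print(satisfiable(f, 2), satisfiable(ext, 3))  # True True
```

Repeated application of this rule is what lets extended resolution compress proofs, and the paper shows DRAT and DPR add no power beyond it in the proof-complexity sense.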
42. A software implementation and case study application of Lempp’s propositional model of conflict resolution
- Author
-
Lempp, Frieder
- Published
- 2017
- Full Text
- View/download PDF
43. Teoria kategorii i niektóre jej logiczne aspekty
- Author
-
Mariusz Stopa
- Subjects
category theory ,topos theory ,categorical logic ,propositional logic ,intuitionistic logic ,non-classical logic ,Philosophy (General) ,B1-5802 - Abstract
This article is intended for philosophers and logicians as a short partial introduction to category theory (CT) and its peculiar connection with logic. First, we consider CT itself. We give a brief insight into its history, introduce some basic definitions and present examples. In the second part, we focus on categorical topos semantics for propositional logic. We give some properties of logic in toposes, which, in general, is an intuitionistic logic. We next present two families of toposes whose tautologies are identical with those of classical propositional logic. The relatively extensive bibliography is given in order to support further studies.
- Published
- 2018
44. Symbolic AI for XAI: Evaluating LFIT Inductive Programming for Explaining Biases in Machine Learning
- Author
-
Alfonso Ortega, Julian Fierrez, Aythami Morales, Zilong Wang, Marina de la Cruz, César Luis Alonso, and Tony Ribeiro
- Subjects
explainable artificial intelligence ,inductive logic programming ,fair recruitment ,fair income level ,propositional logic ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
Machine learning methods are growing in relevance for biometrics and personal information processing in domains such as forensics, e-health, recruitment, and e-learning. In these domains, white-box (human-readable) explanations of systems built on machine learning methods become crucial. Inductive logic programming (ILP) is a subfield of symbolic AI aimed to automatically learn declarative theories about the processing of data. Learning from interpretation transition (LFIT) is an ILP technique that can learn a propositional logic theory equivalent to a given black-box system (under certain conditions). The present work takes a first step to a general methodology to incorporate accurate declarative explanations to classic machine learning by checking the viability of LFIT in a specific AI application scenario: fair recruitment based on an automatic tool generated with machine learning methods for ranking Curricula Vitae that incorporates soft biometric information (gender and ethnicity). We show the expressiveness of LFIT for this specific problem and propose a scheme that can be applicable to other domains. To check its ability to cope with other domains regardless of the machine learning paradigm used, we also ran a preliminary test of the expressiveness of LFIT, feeding it a real dataset of adult incomes taken from the US census, in which we consider income level as a function of the remaining attributes to verify whether LFIT can provide a logical theory that supports and explains to what extent higher incomes are biased by gender and ethnicity.
- Published
- 2021
- Full Text
- View/download PDF
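The learning-from-interpretation-transitions idea in the abstract above can be illustrated with a toy sketch: given complete state transitions of a Boolean system, collect, for each variable, the states after which it becomes true, as a DNF of literals. This is not the actual LFIT algorithm (which generalizes and minimizes rules); all names here are illustrative.

```python
from itertools import product

def learn_rules(transitions, n_vars):
    """Toy learning from transitions: for each variable, gather the
    (complete) predecessor states after which it is true, as DNF terms."""
    rules = {}
    for i in range(n_vars):
        terms = []
        for state, nxt in transitions:
            if nxt[i]:
                # encode the whole predecessor state as a conjunction of literals
                term = tuple((f"x{j}" if state[j] else f"~x{j}")
                             for j in range(n_vars))
                terms.append(term)
        rules[f"x{i}"] = terms
    return rules

# Example system: x0' = x1, x1' = not x0, over all 4 states
trans = [((a, b), (b, 1 - a)) for a, b in product((0, 1), repeat=2)]
rules = learn_rules(trans, 2)
```

Each learned rule is a disjunction of fully specified states; a real ILP learner would then generalize these terms (e.g., here `x0'` depends only on `x1`).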
45. The complete set of minimal simple graphs that support unsatisfiable 2-CNFs.
- Author
-
Karve, Vaibhav and Hirani, Anil N.
- Subjects
- *
COMPLETE graphs , *MULTIGRAPH , *MATROIDS , *NORMAL forms (Mathematics) - Abstract
A propositional logic sentence in conjunctive normal form that has clauses of length at most two (a 2-CNF) can be associated with a multigraph in which the vertices correspond to the variables and the edges to clauses. We show that every 2-CNF that has been reduced under the application of certain tautologies is equisatisfiable to a 2-CNF whose associated multigraph is, in fact, a simple graph. Our main result is a complete characterization of graphs that can support unsatisfiable 2-CNF sentences. We show that a simple graph can support an unsatisfiable reduced 2-CNF sentence if and only if it contains one of four specific small graphs as a topological minor. Equivalently, all reduced 2-CNF sentences supported on a given simple graph are satisfiable if and only if all subdivisions of those four graphs are forbidden as subgraphs of the original graph. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
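The clause-to-graph correspondence discussed above is closely related to the standard implication-graph construction for 2-SAT: each clause (a ∨ b) contributes the edges ¬a → b and ¬b → a, and the formula is unsatisfiable iff some variable shares a strongly connected component with its negation. A minimal sketch of that standard construction (not the paper's multigraph characterization, whose vertices are variables rather than literals):

```python
def two_sat(n_vars, clauses):
    """Decide 2-SAT via the implication graph and Kosaraju's SCC algorithm.
    Literals: +i means x_i, -i means not x_i (1-based)."""
    N = 2 * n_vars
    def node(lit):  # literal -> graph node (even = positive, odd = negative)
        return 2 * (abs(lit) - 1) + (0 if lit > 0 else 1)
    adj = [[] for _ in range(N)]
    radj = [[] for _ in range(N)]
    for a, b in clauses:                 # (a or b): not-a -> b, not-b -> a
        adj[node(-a)].append(node(b)); radj[node(b)].append(node(-a))
        adj[node(-b)].append(node(a)); radj[node(a)].append(node(-b))
    # Pass 1: order nodes by DFS finish time on the forward graph.
    seen, order = [False] * N, []
    for s in range(N):
        if seen[s]:
            continue
        seen[s] = True
        stack = [(s, iter(adj[s]))]
        while stack:
            v, it = stack[-1]
            advanced = False
            for w in it:
                if not seen[w]:
                    seen[w] = True
                    stack.append((w, iter(adj[w])))
                    advanced = True
                    break
            if not advanced:
                order.append(v)
                stack.pop()
    # Pass 2: label SCCs on the reverse graph in reverse finish order.
    comp, c = [-1] * N, 0
    for s in reversed(order):
        if comp[s] != -1:
            continue
        comp[s] = c
        stack = [s]
        while stack:
            v = stack.pop()
            for w in radj[v]:
                if comp[w] == -1:
                    comp[w] = c
                    stack.append(w)
        c += 1
    # Satisfiable iff no variable is in the same SCC as its negation.
    return all(comp[2 * i] != comp[2 * i + 1] for i in range(n_vars))

# x1 or x2, not-x1 or x2, x1 or not-x2: satisfiable (x1 = x2 = true)
sat = two_sat(2, [(1, 2), (-1, 2), (1, -2)])
```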
46. Truth Diagrams Versus Extant Notations for Propositional Logic.
- Author
-
Cheng, Peter C.-H.
- Subjects
CHARTS, diagrams, etc. ,LOGIC ,PROPOSITIONAL calculus ,REPRESENTATIONS of graphs ,TRUTH - Abstract
Truth diagrams (TDs) are introduced as a novel graphical representation for propositional logic (PL). To demonstrate their epistemic efficacy, a set of 28 concepts is proposed that any comprehensive representation for PL should encompass. TDs address all the criteria, whereas seven other existing representations for PL provide only partial coverage. These existing representations are: the linear formula notation, truth tables, a PL-specific interpretation of Venn diagrams, Frege's conceptual notation, diagrams from Wittgenstein's Tractatus, Peirce's alpha graphs, and Gardner's shuttle diagrams. The comparison of the representations succeeds in distinguishing ideas that are fundamental to PL from features of common PL representations that are somewhat arbitrary. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
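Of the extant notations compared above, the truth table is the easiest to mechanize: enumerate all 2^n assignments and evaluate the formula on each. A minimal sketch (function and variable names are illustrative):

```python
from itertools import product

def truth_table(variables, formula):
    """Enumerate all 2^n assignments of the variables and evaluate
    formula (a function from a dict of truth values to bool) on each."""
    rows = []
    for values in product((False, True), repeat=len(variables)):
        env = dict(zip(variables, values))
        rows.append((env, formula(env)))
    return rows

# (p -> q), written as (not p) or q
table = truth_table(["p", "q"], lambda e: (not e["p"]) or e["q"])
```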
47. Pool resolution is NP-hard to recognize
- Author
-
Buss, Samuel R.
- Subjects
Mathematics ,Algebra ,Mathematics, general ,Mathematical Logic and Foundations ,Resolution ,Proof search ,Computational complexity ,Propositional logic ,NP-completeness ,Satisfiability - Abstract
A pool resolution proof is a dag-like resolution proof which admits a depth-first traversal tree in which no variable is used as a resolution variable twice on any branch. The problem of determining whether a given dag-like resolution proof is a valid pool resolution proof is shown to be NP-complete.
- Published
- 2009
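The single resolution inference step underlying dag-like (and pool) resolution proofs can be sketched as below; note this only shows one step, since recognizing a valid pool resolution proof as a whole is NP-complete per the abstract above. Clause representation and names are illustrative.

```python
def resolve(c1, c2, var):
    """Resolve two clauses (frozensets of signed literals, e.g. {'x', '~y'})
    on variable var; requires var in one clause and ~var in the other."""
    pos, neg = var, "~" + var
    if pos in c1 and neg in c2:
        return frozenset((c1 - {pos}) | (c2 - {neg}))
    if neg in c1 and pos in c2:
        return frozenset((c1 - {neg}) | (c2 - {pos}))
    raise ValueError("clauses do not clash on " + var)

# {x, y} and {~x, z} resolve on x to {y, z}
resolvent = resolve(frozenset({"x", "y"}), frozenset({"~x", "z"}), "x")
```

Deriving the empty clause by repeated applications of this step constitutes a resolution refutation; the pool restriction additionally constrains which variables may be reused along a branch of the proof's traversal tree.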
48. TrueBiters, an educational game to practice the truth tables of propositional logic: Development, evaluation, and lessons learned.
- Author
-
De Troyer, Olga, Lindberg, Renny, and Sajjadi, Pejman
- Subjects
EDUCATIONAL games ,CARD games ,PREDICATE (Logic) ,POKEMON Go ,LOGIC ,EDUCATION - Abstract
For years, the logic course in the first year of our Bachelor program in Computer Science, which deals with propositional and predicate logic, has suffered from poor pass rates. Students perceive the formal and abstract mechanisms of logic as difficult and awkward to deal with. Many students do not study the course material on a regular basis, which results in poor mastery of the basic concepts and principles; consequently, students often fall behind as the course progresses. Previous attempts to remedy this procrastination behavior had little success. Since educational games are commended as an enjoyable way to foster learning, we decided in 2016 to develop an educational game for the course. This game, called TrueBiters, is a two-player competitive game inspired by an existing card game, which we adapted to propositional logic and digitized as an app. The development was done iteratively: each version was evaluated, and the game was improved and re-evaluated based on the user feedback and evaluation results received. From the results of the evaluations, we conclude that the game is well suited to its target audience, i.e., logically-mathematically intelligent people, and is a good supplement to, and even replacement for, some of the traditional face-to-face exercise sessions. However, we also learned that proper embedding into the course is needed to ensure that all students actually use the game and benefit from playing it. In this paper we present the game, explain and motivate its evolution, discuss the evaluations performed, and present lessons learned. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
49. On Computing All Abductive Explanations from a Propositional Horn Theory.
- Author
-
Eiter, Thomas and Makino, Kazuhisa
- Subjects
ALGORITHMS ,MONOTONE operators ,POLYNOMIALS ,ALGEBRA ,COMPUTER logic ,MATHEMATICAL logic - Abstract
Abduction is a fundamental mode of reasoning with applications in many areas of AI and Computer Science. The computation of abductive explanations is an important computational problem, which is at the core of early systems such as the ATMS and Clause Management Systems and is intimately related to prime implicate generation in propositional logic. Many algorithms have been devised for computing some abductive explanation, and the complexity of the problem has been well studied. However, little attention has been paid to the problem of computing multiple explanations, and in particular all explanations for an abductive query. We fill this gap and consider the computation of all explanations of an abductive query from a propositional Horn theory, or of a polynomial subset of them. Our study pays particular attention to the form of the query, ranging from a literal to a compound formula, to whether explanations are based on a set of abducible literals and to the representation of the Horn theory, either by a Horn conjunctive normal form (CNF) or model-based in terms of its characteristic models. For these combinations, we present either tractability results in terms of polynomial total-time algorithms, intractability results in terms of nonexistence of such algorithms (unless P = NP), or semi-tractability results in terms of solvability in quasi-polynomial time, established by polynomial-time equivalence to the problem of dualizing a monotone CNF expression. Our results complement previous results in the literature, and refute a longstanding conjecture by Selman and Levesque. They elucidate the complexity of generating all abductive explanations and shed light on related problems such as generating sets of restricted prime implicates of a Horn theory. The algorithms for tractable cases can be readily applied for generating a polynomial subset of explanations in polynomial time. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
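The notions in the abstract above can be illustrated with a deliberately brute-force sketch for definite Horn theories (no negation, so consistency is automatic): an explanation is a subset-minimal set E of abducibles with T ∪ E entailing the query. This exponential-time toy only illustrates the definitions; it is not one of the paper's polynomial total-time algorithms, and all names are illustrative.

```python
from itertools import combinations

def horn_closure(rules, facts):
    """Forward chaining: rules are (frozenset(body), head) pairs."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in rules:
            if body <= derived and head not in derived:
                derived.add(head)
                changed = True
    return derived

def explanations(rules, abducibles, query):
    """Enumerate all subset-minimal E of abducibles with rules + E |= query,
    by checking subsets in order of increasing size (brute force)."""
    found = []
    for k in range(len(abducibles) + 1):
        for subset in combinations(sorted(abducibles), k):
            e = set(subset)
            if any(f <= e for f in found):
                continue  # a smaller explanation is contained in e: not minimal
            if query in horn_closure(rules, e):
                found.append(frozenset(e))
    return found

# Theory: a & b -> q, c -> q; abducibles {a, b, c}; query q
rules = [(frozenset({"a", "b"}), "q"), (frozenset({"c"}), "q")]
exps = explanations(rules, {"a", "b", "c"}, "q")
```

Here the minimal explanations are {c} and {a, b}; generating all of them efficiently for large theories is exactly the problem whose complexity the paper maps out.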
50. Resolution Lower Bounds for the Weak Pigeonhole Principle.
- Author
-
Raz, Ran
- Subjects
COMPUTATIONAL complexity ,PROOF theory ,THEORY ,MATHEMATICAL logic ,PROPOSITION (Logic) - Abstract
We prove that any Resolution proof for the weak pigeonhole principle, with n holes and any number of pigeons, is of length Ω(2^(n^ε)) (for some global constant ε > 0). One corollary is that a certain propositional formulation of the statement NP ⊄ P/poly does not have short Resolution proofs. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
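For reference, the pigeonhole principle with m pigeons and n holes is standardly formulated in propositional logic as below (a standard encoding, not quoted from the paper); the weak version takes m much larger than n, and the lower bound above concerns Resolution refutations of its negation. Here p_{i,j} asserts that pigeon i sits in hole j.

```latex
\mathrm{PHP}^{m}_{n} \colon\quad
\bigwedge_{i=1}^{m} \bigvee_{j=1}^{n} p_{i,j}
\;\longrightarrow\;
\bigvee_{j=1}^{n} \, \bigvee_{1 \le i < i' \le m} \bigl( p_{i,j} \wedge p_{i',j} \bigr)
```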