1,207 results
Search Results
2. Formal Versus Rigorous Mathematics: How to Get Your Papers Published.
- Author
-
Rosenthal, Erik and Beckert, Bernhard
- Abstract
This talk will consider rigorous mathematics and the nature of proof. It begins with an historical perspective and follows the development of formal mathematics. The talk will conclude with examples demonstrating that understanding the relationship between formal mathematics and rigorous proof can assist with both the discovery and the quality of real proofs of real results. Keywords: rigor, formal mathematics, nature of proof. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
3. Mizar's Soft Type System.
- Author
-
Hutchison, David, Kanade, Takeo, Kittler, Josef, Kleinberg, Jon M., Mattern, Friedemann, Mitchell, John C., Naor, Moni, Nierstrasz, Oscar, Pandu Rangan, C., Steffen, Bernhard, Sudan, Madhu, Terzopoulos, Demetri, Tygar, Doug, Vardi, Moshe Y., Weikum, Gerhard, Schneider, Klaus, Brandt, Jens, and Wiedijk, Freek
- Abstract
In Mizar, unlike in most other proof assistants, the types are not part of the foundations of the system. Mizar is based on untyped set theory, which means that in Mizar expressions are typed but the values of those expressions are not. In this paper we present the Mizar type system as a collection of type inference rules. We will interpret Mizar types as soft types, by translating Mizar's type judgments into sequents of untyped first order predicate logic. We will then prove that the Mizar type system is correct with respect to this translation in the sense that each derivable type judgment translates to a provable sequent. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
4. Cost-Based Fragmentation for Distributed Complex Value Databases.
- Author
-
Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Ma, Hui
- Abstract
The major purpose of the design of distributed databases is to improve system performance and to increase system reliability. Fragmentation and allocation play important roles in the development of a cost-efficient system. This paper addresses the problem of fragmentation in the context of complex value databases, which cover the common aspects of object-oriented databases, object-relational databases and XML. In this paper, we present a cost-based approach for horizontal and vertical fragmentation. Our approach is based on a cost model that takes the structure of complex value databases into account. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
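The entry above concerns a cost-driven fragmentation design; as background, here is a minimal sketch of the two textbook fragmentation operations it builds on. Names and data are illustrative only, and this is not the paper's cost model:

```python
def horizontal_fragment(rows, predicates):
    # Horizontal fragmentation: partition the rows of a relation using
    # mutually exclusive selection predicates, one fragment per predicate.
    return [[r for r in rows if p(r)] for p in predicates]

def vertical_fragment(rows, attr_groups, key):
    # Vertical fragmentation: project the relation onto attribute groups,
    # repeating the key in every fragment so the original can be rejoined.
    return [[{a: r[a] for a in (key,) + g} for r in rows] for g in attr_groups]

customers = [
    {"id": 1, "city": "Berlin", "balance": 120},
    {"id": 2, "city": "Sydney", "balance": 45},
]
h = horizontal_fragment(customers, [lambda r: r["city"] == "Berlin",
                                    lambda r: r["city"] != "Berlin"])
v = vertical_fragment(customers, [("city",), ("balance",)], "id")
```

A cost-based approach such as the paper's would choose the predicates and attribute groups by estimating query costs over the complex value structure, rather than taking them as given.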
5. Research Issues in Active Conceptual Modeling of Learning: Summary of Panel Discussions in Two Workshops (May 2006) and (November 2006).
- Author
-
Chen, Peter P., Wong, Leah Y., Delcambre, Lois, Akoka, Jacky, and Sølvberg, Arne
- Abstract
The SPAWAR Systems Center (SSC San Diego) of the U.S. Navy hosted two workshops on Active Conceptual Modeling of Learning (ACM-L). The first workshop was held at SSC San Diego on May 10-12, 2006 to introduce the Science & Technology (S&T) Initiative and identify a Research and Development agenda for the technology development investigation. Eleven invited researchers in Conceptual Modeling presented position papers on the proposed S&T Initiative. The second workshop was held on November 8, 2006 at the 25th International Conference on Conceptual Modeling, ER 2006, 6-9 November 2006, in Tucson, Arizona. Complementary to the May workshop, the November workshop was a forum for international researchers and practitioners to present papers in response to a call for papers and to exchange ideas from various perspectives on the subject. This paper describes research issues identified by participants of the two ACM-L workshops. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
6. Ready to Preorder: Get Your BCCSP Axiomatization for Free!
- Author
-
Mossakowski, Till, Montanari, Ugo, Haveraaen, Magne, Aceto, Luca, and Fokkink, Wan
- Abstract
This paper contributes to the study of the equational theory of the semantics in van Glabbeek's linear time - branching time spectrum over the language BCCSP, a basic process algebra for the description of finite synchronization trees. It offers an algorithm for producing a complete (respectively, ground-complete) equational axiomatization of any behavioral congruence lying between ready simulation equivalence and partial traces equivalence from a complete (respectively, ground-complete) inequational axiomatization of its underlying precongruence—that is, of the precongruence whose kernel is the equivalence. The algorithm preserves finiteness of the axiomatization when the set of actions is finite. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
7. A Framework for Incorporating Trust into Formal Systems Development.
- Author
-
Jones, Cliff B., Liu, Zhiming, Woodcock, Jim, Degerlund, Fredrik, and Sere, Kaisa
- Abstract
Formal methods constitute a means of developing reliable and correctly behaving software based on a specification. In scenarios where information technology is used as a foundation to enable human communication, this is, however, not always enough. Successful interaction between humans often depends on the concept of trust, which is different from program correctness. In this paper, we present a framework for integrating trust into a formal development process, allowing for the construction of formally correct programs for communication, embracing trust as a central concept. We present a coordination language for use with action systems, taking a modular approach of separating trust aspects from other functionality. We also believe that our work can be adapted to modelling other aspects besides trust. Throughout the paper, we employ a case study as a testbed for our concepts. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
8. Domain Theory: Practice and Theories A Discussion of Possible Research Topics.
- Author
-
Jones, Cliff B., Liu, Zhiming, Woodcock, Jim, and Bjørner, Dines
- Abstract
By a domain we mean a universe of discourse. Typical examples are (partially) man-made universes of discourse - such as Air Traffic, Airports, Financial Services (banks, insurance companies, securities trading [brokers, traders, stock exchanges]), Health Care (hospitals etc.), Secure IT Systems (according to Intl. ISO/IEC Standard 17799), The Market (consumers, retailers, wholesalers, producers, "the supply chain"), Transportation (road, air, sea and/or rail transport), etc. We shall outline how one might describe such (infrastructure component) domains, informally and formally - what the current descriptional limitations appear to be, and, hence, the prospects for future research as well as practice. The current paper is based on Part IV, Chaps. 8-16 of [3]. The volume is one of [1,2,3]. The aim of this paper is to suggest a number of areas of domain theory and methodology research. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
9. A Proposal for Combining Formal Concept Analysis and Description Logics for Mining Relational Data.
- Author
-
Carbonell, Jaime G., Siekmann, Jörg, Kuznetsov, Sergei O., Schmidt, Stefan, Rouane, Mohamed Hacene, Huchard, Marianne, Napoli, Amedeo, and Valtchev, Petko
- Abstract
Recent advances in data and knowledge engineering have emphasized the need for formal concept analysis (fca) tools taking into account structured data. There are a few adaptations of the classical fca methodology for handling contexts holding on complex data formats, e.g. graph-based or relational data. In this paper, relational concept analysis (rca) is proposed, as an adaptation of fca for analyzing objects described both by binary and relational attributes. The rca process takes as input a collection of contexts and of inter-context relations, and yields a set of lattices, one per context, whose concepts are linked by relations. Moreover, a way of representing the concepts and relations extracted with rca is proposed in the framework of a description logic. The rca process has been implemented within the Galicia platform, offering new and efficient tools for knowledge and software engineering. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
10. Improving Saddleback Search: A Lesson in Algorithm Design.
- Author
-
Uustalu, Tarmo and Bird, Richard S.
- Abstract
Over the past twenty-five years or so, saddleback search has been used as a paradigm of how methods of formal program construction can quickly lead to a simple and effective algorithm for searching an ordered table. In this paper we revisit the problem and show that saddleback search is not in fact the best algorithm when one dimension of the table is much smaller than the other. The paper is structured in the form of a classroom discussion involving a teacher and four very clever students. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
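For readers unfamiliar with the algorithm the entry above revisits, here is the classic O(rows + cols) saddleback search over a table sorted in both dimensions — a generic sketch for context, not the improved algorithm the paper derives:

```python
def saddleback(table, x):
    # Classic saddleback search over a table whose rows and columns are
    # both sorted ascending.  Start at the top-right corner; each
    # comparison discards either one row or one column.
    if not table or not table[0]:
        return None
    i, j = 0, len(table[0]) - 1
    while i < len(table) and j >= 0:
        v = table[i][j]
        if v == x:
            return (i, j)
        if v < x:
            i += 1   # everything at or left of (i, j) in row i is < x
        else:
            j -= 1   # everything at or below (i, j) in column j is > x
    return None

grid = [[1, 4, 7],
        [2, 5, 8],
        [3, 6, 9]]
```

The paper's observation is that when one dimension is much smaller than the other, a binary-search-based strategy beats this linear corner walk.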
11. Decomposing Interactions.
- Author
-
Johnson, Michael, Vene, Varmo, and Bowles, Juliana Küster Filipe
- Abstract
In UML 2.0, sequence diagrams have been considerably extended and are now fundamentally better structured. Interactions in sequence diagrams can be structured using so-called interaction fragments, including alt (alternative behaviour), par (parallel behaviour), neg (forbidden behaviour), assert (mandatory behaviour) and ref (reference another diagram). The operator ref in particular greatly improves the way diagrams can be decomposed. In previous work we have given a semantics to a subset of sequence diagrams using labelled event structures, a true-concurrent model that naturally captures alternative and parallel behaviour. In this paper, we expand that work to address refinement and show how to obtain a refined model by means of a powerful categorical construction over two categories of labelled event structures. The underlying motivation for this work is reasoning and verification of complex scenario-based inter-object behavioural models. We conclude the paper with a discussion on future work. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
12. Interactive Verification of Medical Guidelines.
- Author
-
Misra, Jayadev, Nipkow, Tobias, Sekerinski, Emil, Schmitt, Jonathan, Hoffmann, Alwin, Balser, Michael, Reif, Wolfgang, and Marcos, Mar
- Abstract
In the medical domain, there is a tendency to standardize health care by providing medical guidelines as summary of the best evidence concerning a particular topic. Based on the assumption that guidelines are similar to software, we try to carry over techniques from software engineering to guideline development. In this paper, we show how to apply formal methods, namely interactive verification, to improve the quality of guidelines. As an example, we have worked on a guideline from the American Academy of Pediatrics for the management of jaundice in newborns. Contributions of this paper are as follows: (I) a formalized model of a nontrivial example guideline, (II) an approach to verify properties of medical guidelines interactively, and (III) verification of a first example property. This work has been partially funded by the European Commission's IST program, under contract number IST-FP6-508794 Protocure II. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
13. An MDA Approach Towards Integrating Formal and Informal Modeling Languages.
- Author
-
Fitzgerald, John, Hayes, Ian J., Tarlecki, Andrzej, Kim, Soon-Kyeong, Burger, Damian, and Carrington, David
- Abstract
The Model Driven Architecture (MDA) involves automated transformations between software models defined in different languages at different abstraction levels. This paper takes an MDA approach to integrate a formal modeling language (Object-Z) with an informal modeling language (UML) via model transformation. This paper shows how formal and informal modeling languages can be cooperatively used in the MDA framework and how the transformations between models in these languages can be achieved using an MDA development environment. The MDA model transformation techniques allow us to have a reusable transformation between formal and informal modeling languages. The integrated approach provides an effective V&V technique for the MDA. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
14. Using Unified Modeling Language for Conceptual Modelling of Knowledge-Based Systems.
- Author
-
Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Abdullah, Mohd Syazwan
- Abstract
This paper discusses extending the Unified Modelling Language by means of a profile for modelling knowledge-based systems in the context of the Model Driven Architecture (MDA) framework. The profile is implemented using the eXecutable Modelling Framework (XMF) Mosaic tool. A case study from the health care domain demonstrates the practical use of this profile, with the prototype implemented in Java Expert System Shell (Jess). The paper also discusses the possible mapping of the profile elements to the platform specific model (PSM) of Jess and provides some discussion on the Production Rule Representation (PRR) standardisation work. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
15. Towards Automated Reasoning on ORM Schemes.
- Author
-
Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Jarrar, Mustafa
- Abstract
The goal of this article is to formalize Object Role Modeling (ORM) using the $\mathcal{DLR}$ description logic. This would enable automated reasoning on the formal properties of ORM diagrams, such as detecting constraint contradictions and implications. In addition, the expressive, methodological, and graphical capabilities of ORM make it a good candidate for use as a graphical notation for most description logic languages. In this way, industrial experts who are not IT savvy will still be able to build and view axiomatized theories (such as ontologies, business rules, etc.) without needing to know the logic or reasoning foundations underpinning them. Our formalization in this paper is structured as 29 formalization rules that map all ORM primitives and constraints into $\mathcal{DLR}$, plus 2 exceptions for complex cases. To this end, we illustrate the implementation of our formalization as an extension to DogmaModeler, which automatically maps ORM into DIG and uses Racer as a background reasoning engine to reason about ORM diagrams. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
16. Relational Data Tailoring Through View Composition.
- Author
-
Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Bolchini, Cristiana
- Abstract
This paper presents a methodology to derive views over a relational database by applying a sequence of appropriately defined operations to the global schema. Such tailoring and composition process aims at offering personalized views over the database schema, so as to improve its ability to support the new needs of customers, support evolutionary software development, and fix existing legacy database design problems. The process is driven by the designer's knowledge of the possible operational contexts, in terms of the various dimensions that contribute to determine which portions of the global schema are relevant with respect to the different actors and situations. We formally introduce some operators, defined on sets of relations, which tailor the schema and combine the intermediate views to derive different final views, suitable for the different envisioned situations. The application to a case study is also presented, to better clarify the proposed approach. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
17. Using Ontological Modeling in a Context-Aware Summarization System to Adapt Text for Mobile Devices.
- Author
-
Chen, Peter P., Wong, Leah Y., Garcia, Luís Fernando Fortes, de Lima, José Valdeni, and Loh, Stanley
- Abstract
This paper presents a context-aware text summarizer based on ontologies intended to be used for adapting information to mobile devices. The system generates summaries from texts according to the profile of the user and the context where he/she is at the moment. Context is determined by spatial and temporal localization. Ontologies are used to identify which parts of the texts are related to the user's profile and to the context. Ontologies are structured as hierarchies of concepts, and concepts are represented by keywords with associated weights. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
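The abstract above represents ontology concepts as weighted keywords; the sketch below shows one plausible way such a representation can score a passage for relevance. The function name, weights, and tokenization are hypothetical illustrations, not the system's actual implementation:

```python
def concept_score(text, concept_keywords):
    # Score a passage against one ontology concept: sum the weights of
    # the concept's keywords that occur in the passage.
    words = {w.strip(".,;:!?").lower() for w in text.split()}
    return sum(weight for kw, weight in concept_keywords.items()
               if kw.lower() in words)

# Hypothetical concept from a sports branch of an ontology hierarchy.
sports = {"match": 0.9, "goal": 0.7, "stadium": 0.5}
score = concept_score("The match ended with a late goal.", sports)
```

Passages scoring highest against the concepts selected by the user's profile and current context would then be kept in the summary.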
18. Actively Evolving Conceptual Models for Mini-World and Run-Time Environment Changes.
- Author
-
Chen, Peter P., Wong, Leah Y., Krishna, P. Radha, and Karlapalem, Kamalakar
- Abstract
Run-time application environments are affected by changes in the mini-world or in technology. A large number of applications are process driven. For robust applications that can evolve over time, there is a need for a methodology that implicitly handles changes at various levels, from the mini-world to the run-time environment, through layers of models and systems. In this paper, we present the ER* methodology for evolving applications. In the context of this paper, the role of two-way active behaviour and template-driven development of applications is presented. This methodology facilitates capturing active behaviour from run-time transactions and provides a means of using this knowledge to guide subsequent application design and its evolution. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
19. Parametric (Co)Iteration vs. Primitive Direcursion.
- Author
-
Mossakowski, Till, Montanari, Ugo, Haveraaen, Magne, and Glimming, Johan
- Abstract
Freyd showed that in certain CPO-categories, locally continuous functors have minimal invariants, which possess a structure that he termed dialgebra. This gives rise to a category of dialgebras and homomorphisms, where the minimal invariants are initial, inducing a powerful recursion scheme (direcursion) on a cpo. In this paper, we identify a problem appearing when translating (co)iterative functions (on a fixed parameterised datatype) to direcursion (on the same datatype), and present a solution to this problem as a recursion scheme (primitive direcursion), generalising and symmetrising primitive (co)recursion for endofunctors. To this end, we give a uniform technique for translating (co)iterative maps into direcursive maps. This immediately gives a plethora of examples of direcursive functions, improving on the situation in the literature where only a few examples have appeared. Moreover, a technical trick proposed in a POPL paper is avoided for the translated maps. We conclude the paper by applying the results to a denotational semantics of Abadi and Cardelli's typed object calculus, and linking them to previous work on higher-order coalgebra and to bisimulations. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
20. Higher Dimensional Trees, Algebraically.
- Author
-
Mossakowski, Till, Montanari, Ugo, Haveraaen, Magne, Ghani, Neil, and Kurz, Alexander
- Abstract
In formal language theory, James Rogers published a series of innovative papers generalising strings and trees to higher dimensions. Motivated by applications in linguistics, his goal was to smoothly extend the core theory of the formal languages of strings and trees to higher dimensions. Rogers' definitions rely on a specific representation of higher dimensional trees. This paper presents an alternative approach which focusses more on their universal properties and is based upon category theory, algebras, coalgebras and containers. Our approach reveals that Rogers' trees are canonical constructions which are also particularly beautiful. We also provide new theoretical results concerning higher dimensional trees. Finally, we provide evidence for our devout conviction that clean mathematical theories provide the basis for clean implementations by indicating how our abstract presentation will make computing with higher dimensional trees easier. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
21. A Randomized Algorithm for BBCSPs in the Prover-Verifier Model.
- Author
-
Jones, Cliff B., Liu, Zhiming, Woodcock, Jim, and Subramani, K.
- Abstract
In this paper we introduce a Prover-Verifier model for analyzing the computational complexity of a class of Constraint Satisfaction problems termed Binary Boolean Constraint Satisfaction problems (BBCSPs). BBCSPs represent an extremely general class of constraint satisfaction problems and find applications in a wide variety of domains including constraint programming, puzzle solving and program testing. We establish that each instance of a BBCSP admits a coin-flipping Turing Machine that halts in time polynomial in the size of the input. The prover P in the Prover-Verifier model is endowed with very limited powers; in particular, it has no memory and it can only pose restricted queries to the verifier. The verifier on the other hand is both omniscient in that it is cognizant of all the problem details and insincere in that it does not have to decide a priori on the intended proof. However, the verifier must stay consistent in its responses. Inasmuch as our provers will be memoryless and our verifiers will be asked for extremely simple certificates, our work establishes the existence of a simple, randomized algorithm for BBCSPs. Our model itself serves as a basis for the design of zero-knowledge machine learning algorithms in that the prover ends up learning the proof desired by the verifier. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
22. Compensable Programs.
- Author
-
Jones, Cliff B., Liu, Zhiming, Woodcock, Jim, and He, Jifeng
- Abstract
Transaction-based services are increasingly being applied in solving many universal interoperability problems. Compensation is one typical feature for long-running transactions. This paper presents a design matrix model for specifying the behaviour of compensable programs and provides new healthiness conditions to capture these new programming features. The new model for handling exception and compensation is built as a conservative extension of the standard relational model, in the sense that the algebraic laws presented in [14] remain valid. The paper also shows that the design matrix model is a retract of the design model. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
23. Approximating a Behavioural Pseudometric Without Discount for Probabilistic Systems.
- Author
-
Seidl, Helmut, van Breugel, Franck, Sharma, Babita, and Worrell, James
- Abstract
Desharnais, Gupta, Jagadeesan and Panangaden introduced a family of behavioural pseudometrics for probabilistic transition systems. These pseudometrics are a quantitative analogue of probabilistic bisimilarity. Distance zero captures probabilistic bisimilarity. Each pseudometric has a discount factor, a real number in the interval (0, 1]. The smaller the discount factor, the more the future is discounted. If the discount factor is one, then the future is not discounted at all. Desharnais et al. showed that the behavioural distances can be calculated up to any desired degree of accuracy if the discount factor is smaller than one. In this paper, we show that the distances can also be approximated if the future is not discounted. A key ingredient of our algorithm is Tarski's decision procedure for the first order theory over real closed fields. By exploiting the Kantorovich-Rubinstein duality theorem we can restrict to the existential fragment for which more efficient decision procedures exist. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
24. The Complexity of Generalized Satisfiability for Linear Temporal Logic.
- Author
-
Seidl, Helmut, Bauland, Michael, Schneider, Thomas, Schnoor, Henning, and Schnoor, Ilka
- Abstract
In a seminal paper from 1985, Sistla and Clarke showed that satisfiability for Linear Temporal Logic (LTL) is either NP-complete or PSPACE-complete, depending on the set of temporal operators used. If, in contrast, the set of propositional operators is restricted, the complexity may decrease. This paper undertakes a systematic study of satisfiability for LTL formulae over restricted sets of propositional and temporal operators. Since every propositional operator corresponds to a Boolean function, there exist infinitely many propositional operators. In order to systematically cover all possible sets of them, we use Post's lattice. With its help, we determine the computational complexity of LTL satisfiability for all combinations of temporal operators and all but two classes of propositional functions. Each of these infinitely many problems is shown to be either PSPACE-complete, NP-complete, or in P. Keywords: computational complexity, linear temporal logic. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
25. Separation Logic for Small-Step cminor.
- Author
-
Schneider, Klaus, Brandt, Jens, Appel, Andrew W., and Blazy, Sandrine
- Abstract
cminor is a mid-level imperative programming language; there are proved-correct optimizing compilers from C to cminor and from cminor to machine language. We have redesigned cminor so that it is suitable for Hoare Logic reasoning and we have designed a Separation Logic for cminor. In this paper, we give a small-step semantics (instead of the big-step of the proved-correct compiler) that is motivated by the need to support future concurrent extensions. We detail a machine-checked proof of soundness of our Separation Logic. This is the first large-scale machine-checked proof of a Separation Logic w.r.t. a small-step semantics. The work presented in this paper has been carried out in the Coq proof assistant. It is a first step towards an environment in which concurrent cminor programs can be verified using Separation Logic and also compiled by a proved-correct compiler with formal end-to-end correctness guarantees. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
26. A Temporal Logic of Robustness.
- Author
-
Carbonell, Jaime G., Siekmann, Jörg, Konev, Boris, Wolter, Frank, French, Tim, Mc Cabe-Dansted, John C., and Reynolds, Mark
- Abstract
It can be desirable to specify policies that require a system to achieve some outcome even if a certain number of failures occur. This paper proposes a logic, RoCTL*, which extends CTL* with operators from Deontic logic, and a novel operator referred to as "Robustly". This novel operator acts as a variety of path quantifier, allowing us to consider paths which deviate from the desired behaviour of the system. Unlike most path quantifiers, the Robustly operator must be evaluated over a path rather than just a state; the Robustly operator quantifies over paths produced from the current path by altering a single step. The Robustly operator roughly represents the phrase "even if an additional failure occurs now or in the future". This paper examines the expressivity of this new logic, motivates its use and shows that it is decidable. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
27. Towards Efficient Satisfiability Checking for Boolean Algebra with Presburger Arithmetic.
- Author
-
Carbonell, Jaime G., Siekmann, Jörg, Pfenning, Frank, Kuncak, Viktor, and Rinard, Martin
- Abstract
Boolean Algebra with Presburger Arithmetic (BAPA) is a decidable logic that combines 1) Boolean algebra of sets of uninterpreted elements (BA) and 2) Presburger arithmetic (PA). BAPA can express relationships between integer variables and cardinalities of unbounded sets. In combination with other decision procedures and theorem provers, BAPA is useful for automatically verifying quantitative properties of data structures. This paper examines QFBAPA, the quantifier-free fragment of BAPA. The computational complexity of QFBAPA satisfiability was previously unknown; previous QFBAPA algorithms have non-deterministic exponential time complexity due to an explosion in the number of introduced integer variables. This paper shows, for the first time, how to avoid such exponential explosion. We present an algorithm for checking satisfiability of QFBAPA formulas by reducing them to formulas of quantifier-free PA, with only an O(n log n) increase in formula size. We prove the correctness of our algorithm using a theorem about sparse solutions of integer linear programming problems. This is the first proof that QFBAPA satisfiability is in NP and therefore NP-complete. We implemented our algorithm in the context of the Jahob verification system. Our preliminary experiments suggest that our algorithm, although not necessarily better for proving formula unsatisfiability, is more effective in detecting formula satisfiability than previous approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
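The Venn-region reduction behind the exponential explosion this abstract discusses can be sketched for two sets. This is a naive brute-force version of the standard encoding (one non-negative integer per Venn region), not the paper's O(n log n) reduction; the cardinality bound and search loop are our own illustration.

```python
from itertools import product

def satisfiable(constraints, max_card=10):
    """Naive QFBAPA check for two sets A, B over a finite universe.

    Each Venn region gets a non-negative integer cardinality variable:
    a_only = |A \\ B|, b_only = |B \\ A|, both = |A & B|.
    `constraints` is a predicate over those three variables.
    This is the exponential-in-#sets reduction the paper improves on.
    """
    for a_only, b_only, both in product(range(max_card + 1), repeat=3):
        if constraints(a_only, b_only, both):
            return (a_only, b_only, both)
    return None

# |A| = 2, |B| = 3, |A & B| = 1, |A | B| = 4  -- satisfiable
sat = satisfiable(lambda ao, bo, b: ao + b == 2 and bo + b == 3
                  and b == 1 and ao + bo + b == 4)

# |A| = 2, |B| = 3, A & B empty, |A | B| = 4  -- unsatisfiable (2 + 3 != 4)
unsat = satisfiable(lambda ao, bo, b: ao + b == 2 and bo + b == 3
                    and b == 0 and ao + bo + b == 4)

print(sat)    # (1, 2, 1)
print(unsat)  # None
```

With n sets this encoding introduces 2^n region variables, which is exactly the blowup the paper's sparse-solution theorem avoids.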
28. Formalization of Continuous Probability Distributions.
- Author
-
Carbonell, Jaime G., Siekmann, Jörg, Pfenning, Frank, Hasan, Osman, and Tahar, Sofiène
- Abstract
Continuous probability distributions are widely used to mathematically describe random phenomena in engineering and physical sciences. In this paper, we present a methodology that can be used to formalize any continuous random variable for which the inverse of the cumulative distribution function can be expressed in a closed mathematical form. Our methodology is primarily based on the Standard Uniform random variable, the classical cumulative distribution function properties and the Inverse Transform method. The paper includes the higher-order-logic formalization details of these three components in the HOL theorem prover. To illustrate the practical effectiveness of the proposed methodology, we present the formalization of Exponential, Uniform, Rayleigh and Triangular random variables. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
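The Inverse Transform method underlying this formalization can be rendered executably (in Python rather than HOL; the Exponential case matches one of the paper's examples, but the code itself is our own illustration, not the paper's formalization):

```python
import math
import random

def inverse_transform(inv_cdf, rng=random.random):
    """Sample X = F^{-1}(U) with U ~ StandardUniform(0,1).

    If F is the CDF of the target distribution, X has that
    distribution -- the Inverse Transform method the paper
    formalizes in higher-order logic.
    """
    return inv_cdf(rng())

# Exponential(rate): F(x) = 1 - exp(-rate*x), so F^{-1}(u) = -ln(1-u)/rate
def exp_inv_cdf(rate):
    return lambda u: -math.log(1.0 - u) / rate

random.seed(0)
samples = [inverse_transform(exp_inv_cdf(2.0)) for _ in range(100_000)]
print(sum(samples) / len(samples))  # sample mean; should approach 1/rate = 0.5
```

The same `inverse_transform` helper covers Uniform, Rayleigh and Triangular variables too, since each has a closed-form inverse CDF.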
29. Local Proofs for Global Safety Properties.
- Author
-
Hutchison, David, Kanade, Takeo, Kittler, Josef, Kleinberg, Jon M., Mattern, Friedemann, Mitchell, John C., Naor, Moni, Nierstrasz, Oscar, Pandu Rangan, C., Steffen, Bernhard, Sudan, Madhu, Terzopoulos, Demetri, Tygar, Doug, Vardi, Moshe Y., Weikum, Gerhard, Damm, Werner, Hermanns, Holger, Cohen, Ariel, and Namjoshi, Kedar S.
- Abstract
This paper explores the concept of locality in proofs of global safety properties of asynchronously composed, multi-process programs. Model checking on the full state space is often infeasible due to state explosion. A local proof, in contrast, is a collection of per-process invariants, which together imply the global safety property. Local proofs can be compact, but a central problem is that local reasoning is incomplete. In this paper, we present a "completion" algorithm, which gradually exposes facts about the internal state of components, until either a local proof or a real error is discovered. Experiments show that local reasoning can have significantly better performance than a reachability computation. Moreover, for some parameterized protocols, a local proof can be used to show correctness for all instances. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
30. Improvements to the Tableau Prover PITP.
- Author
-
Carbonell, Jaime G., Siekmann, Jörg, Olivetti, Nicola, Avellone, Alessandro, Fiorino, Guido, and Moscato, Ugo
- Abstract
In this paper we discuss the new version of PITP, a procedure for deciding propositional intuitionistic logic, which is currently the best-performing propositional prover on the ILTP library. The changes in strategy and implementation make the new version of PITP faster and capable of deciding more formulas than its predecessor. We give a short account of both the old optimizations and the changes in strategy with respect to the previous version. We use the ILTP library and randomly generated formulas to compare the implementation described in this paper with other provers (including the old version of PITP). [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
31. Extended Linear Scan: An Alternate Foundation for Global Register Allocation.
- Author
-
Hutchison, David, Kanade, Takeo, Kittler, Josef, Kleinberg, Jon M., Mattern, Friedemann, Mitchell, John C., Naor, Moni, Nierstrasz, Oscar, Rangan, C. Pandu, Steffen, Bernhard, Sudan, Madhu, Terzopoulos, Demetri, Tygar, Doug, Vardi, Moshe Y., Weikum, Gerhard, Krishnamurthi, Shriram, Odersky, Martin, Sarkar, Vivek, and Barik, Rajkishore
- Abstract
In this paper, we extend past work on Linear Scan register allocation, and propose two Extended Linear Scan (ELS) algorithms that retain the compile-time efficiency of past Linear Scan algorithms while delivering performance that can match or surpass that of Graph Coloring. Specifically, this paper makes the following contributions:
- We highlight three fundamental theoretical limitations in using Graph Coloring as a foundation for global register allocation, and introduce a basic Extended Linear Scan algorithm, ELS0, which addresses all three limitations for the problem of Spill-Free Register Allocation.
- We introduce the ELS1 algorithm, which extends ELS0 to obtain a greedy algorithm for the problem of Register Allocation with Total Spills.
- Finally, we present experimental results to compare the Graph Coloring and Extended Linear Scan algorithms. Our results show that the compile-time speedups for ELS1 relative to GC were significant, varying from 15× to 68×. In addition, the resulting execution time improved by up to 5.8%, with an average improvement of 2.3%.
Together, these results show that Extended Linear Scan is promising as an alternate foundation for global register allocation, compared to Graph Coloring, due to its compile-time scalability without loss of execution time performance. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
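For context, the basic Linear Scan algorithm that ELS extends can be sketched as follows. This is a minimal rendering of Poletto and Sarkar's classic scheme, not the paper's ELS0/ELS1; the interval data and spill policy are illustrative.

```python
def linear_scan(intervals, num_regs):
    """Classic linear scan register allocation.

    `intervals` maps each variable to its live interval (start, end).
    Returns {var: register-number-or-'SPILL'}.
    """
    alloc = {}
    active = []                      # (end, var) pairs currently live
    free = list(range(num_regs))
    for var, (start, end) in sorted(intervals.items(), key=lambda kv: kv[1][0]):
        for e, v in sorted(active):  # expire intervals ending before `start`
            if e >= start:
                break
            active.remove((e, v))
            free.append(alloc[v])
        if free:
            alloc[var] = free.pop()
            active.append((end, var))
        else:                        # spill the interval with the furthest end
            last_end, victim = max(active)
            if last_end > end:
                alloc[var] = alloc[victim]
                alloc[victim] = 'SPILL'
                active.remove((last_end, victim))
                active.append((end, var))
            else:
                alloc[var] = 'SPILL'
    return alloc

live = {'a': (0, 4), 'b': (1, 3), 'c': (2, 6), 'd': (5, 7)}
print(linear_scan(live, num_regs=2))  # 'c' is spilled; 'd' reuses a freed register
```

The single left-to-right pass over sorted intervals is what gives Linear Scan its compile-time advantage over building and coloring an interference graph.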
32. Type Safety for FJ and FGJ.
- Author
-
Barkaoui, Kamel, Cavalcanti, Ana, Cerone, Antonio, Wang, Shuling, Long, Quan, and Qiu, Zongyan
- Abstract
Mainly concerned with type safety, Featherweight Java (FJ) is a well-known minimal core for Java and Generic Java. However, the type system of FJ omits the treatment of downcasts. In this paper we propose a stronger type system for FJ and FGJ. To deal with the cast problems, we introduce some special techniques for types, and also strengthen the types for expressions and methods in terms of the type declaration notations. Supported by the type system and our techniques, we can ensure properties stronger than the ones proved in Igarashi et al.'s original FJ paper. Examples making the above contributions clear are illustrated throughout the paper. Furthermore, a case study on design patterns showing the advantages of our results is given. Keywords: Featherweight Java, Downcast, Type Safety, Observer Pattern. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
33. Catching and Identifying Bugs in Register Allocation.
- Author
-
Yi, Kwangkeun, Huang, Yuqiang, Childers, Bruce R., and Soffa, Mary Lou
- Abstract
Although there are many register allocation algorithms that work well, it can be difficult to correctly implement these algorithms. As a result, it is common for bugs to remain in the register allocator, even after the compiler is released. The register allocator may run, but bugs can cause it to produce incorrect output code. The output program may even execute properly on some test data, but errors can remain. In this paper, we propose novel data flow analyses to statically check that the output code from the register allocator is correct in terms of its data dependences. The approach is accurate, fast, and can identify and report error locations and types. No false alarms are produced. The paper describes our approach, called SARAC, and a tool, called ra-analyzer, that statically checks a register allocation and reports the errors it finds. The tool has an average compile-time overhead of only 8% and a modest average memory overhead of 85KB. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
34. A Practical String Analyzer by the Widening Approach.
- Author
-
Kobayashi, Naoki, Tae-Hyoung Choi, Oukseh Lee, Hyunha Kim, and Kyung-Goo Doh
- Abstract
The static determination of approximated values of string expressions has many potential applications. For instance, approximated string values may be used to check the validity and security of generated strings, as well as to collect useful string properties. Previous string analysis efforts have focused primarily on maximizing the precision of regular approximations of strings. These methods have not been completely satisfactory due to the difficulties in dealing with heap variables and context sensitivity. In this paper, we present an abstract-interpretation-based solution that employs a heuristic widening method. The presented solution is implemented and compared to JSA. In most cases, our solution gives results as precise as those produced by previous methods, and it makes the additional contribution of easily dealing with heap variables and context sensitivity in a very natural way. We anticipate the employment of our method in practical applications. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
35. Automatic Testing of Higher Order Functions.
- Author
-
Kobayashi, Naoki, Koopman, Pieter, and Plasmeijer, Rinus
- Abstract
This paper tackles a problem often overlooked in the functional programming community: testing. Fully automatic test tools like QuickCheck and G∀ST can test first-order functions successfully. Higher-order functions (HOFs) are an essential and distinguishing part of functional languages, but testing HOFs automatically is still troublesome, since it requires the generation of functions as test arguments for the HOF under test. The functions that result from the higher-order function also need to be identified. If a counterexample is found, the generated and resulting functions should be printed, but that is impossible in most functional programming languages. Yet bugs in HOFs do occur and are usually more subtle due to the high abstraction level. In this paper we present an effective and efficient technique to test higher-order functions by using intermediate data types. Such a data type mimics and controls the structure of the function to be generated. A simple additional function transforms this data structure into the function needed. We use a continuation-based parser library as the main example of the tests. Our automatic testing method for HOFs reveals errors in a library that had been used for a couple of years without problems. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
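The intermediate-data-type idea can be sketched outside the paper's Clean/G∀ST setting: because generated "functions" are plain data, they can be enumerated and, crucially, printed when a counterexample is found. The expression grammar and property below are our own illustration, not the paper's library.

```python
import itertools

# An intermediate data type mimicking functions of type int -> int.
EXPRS = [('id',), ('const', 0), ('const', 1), ('add', 1), ('mul', 2), ('neg',)]

def to_fun(expr):
    """Transform the data structure into the function it denotes."""
    tag = expr[0]
    if tag == 'id':    return lambda x: x
    if tag == 'const': return lambda x: expr[1]
    if tag == 'add':   return lambda x: x + expr[1]
    if tag == 'mul':   return lambda x: x * expr[1]
    if tag == 'neg':   return lambda x: -x

def check(prop, inputs=range(-3, 4)):
    """Test a property of generated function arguments; return a
    printable counterexample (as data) or None."""
    for e1, e2 in itertools.product(EXPRS, repeat=2):
        f, g = to_fun(e1), to_fun(e2)
        for x in inputs:
            if not prop(f, g, x):
                return (e1, e2, x)
    return None

# A deliberately false claim: function composition commutes.
cex = check(lambda f, g, x: f(g(x)) == g(f(x)))
print(cex)  # a printable counterexample, e.g. (('const', 0), ('const', 1), -3)
```

Printing `cex` directly is the point: the raw lambdas `f` and `g` have no useful printed form, but the data terms that generated them do.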
36. XML Querying Using Ontological Information.
- Author
-
Alferes, José Júlio, Bailey, James, May, Wolfgang, Schwertel, Uta, Svensson, Hans Eric, and Wilk, Artur
- Abstract
The paper addresses the problem of using semantic annotations in XML documents for better querying of XML data. We assume that the annotations refer to an ontology defined in OWL (Web Ontology Language). The intention is then to combine syntactic querying techniques on XML documents with OWL ontology reasoning to filter out semantically irrelevant answers. The solution presented in this paper is an extension of the declarative rule-based XML query and transformation language Xcerpt. The extension allows Xcerpt rules to interface with an ontology reasoner, making it possible to use Xcerpt to filter extracted XML data using ontological information. It also allows retrieving ontological information by sending semantic queries to a reasoner. The prototype implementation uses DIG (Description Logic interface) for communication with the OWL reasoner RacerPro, where the ontology queries are answered. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
37. Reasoning with Temporal Constraints in RDF.
- Author
-
Alferes, José Júlio, Bailey, James, May, Wolfgang, Schwertel, Uta, Hurtado, Carlos, and Vaisman, Alejandro
- Abstract
Time management is a key feature needed in any query language for web and semistructured data. However, only recently has this been addressed by the Semantic Web community, through the study of temporal extensions to RDF (Resource Description Framework). In this paper we show that the ability of the RDF data model to handle unknown resources by means of blank nodes naturally yields a rich framework for temporal reasoning in RDF. That is, even without knowing the interval of validity of some statements, we can still entail useful knowledge from temporal RDF databases. To take advantage of this, we incorporate a class of temporal constraints over anonymous timestamps based on Allen's interval algebra. We show that testing entailment in temporal graphs with the proposed constraints reduces to closure computation and mapping discovery, that is, an extended form of the standard approach for testing entailment in non-temporal RDF graphs. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
38. Towards More Precise Typing Rules for Xcerpt.
- Author
-
Alferes, José Júlio, Bailey, James, May, Wolfgang, Schwertel, Uta, and Drabent, Włodzimierz
- Abstract
In previous papers we presented a type system for a substantial fragment of the Web query language Xcerpt. It is a descriptive type system: the typing of a program is an approximation of its semantics. The type system was expressed by means of rules, which could be seen as a comprehensible abstraction of a typing algorithm. That system treats some query terms in a rather simplistic way. As a result the approximations produced for them are rather imprecise. In this paper we provide an improved type system, producing more precise results. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
39. Combining Safe Rules and Ontologies by Interfacing of Reasoners.
- Author
-
Alferes, José Júlio, Bailey, James, May, Wolfgang, Schwertel, Uta, Aßmann, Uwe, Henriksson, Jakob, and Małuszyński, Jan
- Abstract
The paper presents a framework for hybrid combination of rule languages with constraint languages, including but not restricted to Description-Logic-based ontology languages. It shows how reasoning in a combined language can be done by interfacing reasoners of the component languages. A prototype system based on the presented principle integrates Datalog with OWL by interfacing XSB Prolog [2] with a DIG-compliant [1] DL reasoner (e.g. Racer [17]). [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
40. Proofs of Randomized Algorithms in Coq.
- Author
-
Uustalu, Tarmo, Audebaud, Philippe, and Paulin-Mohring, Christine
- Abstract
Randomized algorithms are widely used either for finding efficiently approximated solutions to complex problems, for instance primality testing, or for obtaining good average behavior, for instance in distributed computing. Proving properties of such algorithms requires subtle reasoning about both the algorithmic and probabilistic aspects of the programs. Providing tools for the mechanization of such reasoning is consequently an important issue. Our paper presents a new method for proving properties of randomized algorithms in a proof assistant based on higher-order logic. It is based on the monadic interpretation of randomized programs as probability distributions [1]. It requires neither the definition of an operational semantics for the language nor the development of a complex formalization of measure theory, but uses only functionals and algebraic properties of the unit interval. Using this model, we show the validity of general rules for estimating the probability that a randomized algorithm satisfies certain properties, in particular in the case of general recursive functions. We apply this theory to formally prove a program implementing a Bernoulli distribution from coin flips, and the termination of a random walk. All the theories and results presented in this paper have been fully formalized and proved in the Coq proof assistant [2]. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
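The Bernoulli-from-coin-flips program the abstract mentions can be rendered as an executable sketch, in Python rather than Coq. The bit-comparison formulation below is our own rendering of the standard algorithm, not the paper's code.

```python
import random

def bernoulli(p, coin=lambda: random.getrandbits(1)):
    """Sample Bernoulli(p) from fair coin flips.

    Lazily compares a uniform bit stream U with the binary expansion
    of p, returning the truth of U < p.  Terminates with probability 1
    and uses 2 flips on average -- the kind of recursive randomized
    program whose correctness the paper proves in Coq.
    """
    while True:
        p *= 2
        pb = 1 if p >= 1 else 0   # next bit of p's binary expansion
        if pb:
            p -= 1
        ub = coin()               # next bit of U ~ Uniform[0,1)
        if ub != pb:
            return ub < pb        # first differing bit decides U < p

random.seed(1)
n = 100_000
freq = sum(bernoulli(0.3) for _ in range(n)) / n
print(freq)  # empirically close to 0.3
```

The loop has no a-priori bound on its length, which is exactly why reasoning about such programs needs the probabilistic termination rules the paper establishes.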
41. Towards a Reference Ontology for Business Models.
- Author
-
Embley, David W., Olivé, Antoni, Ram, Sudha, Andersson, Birger, Bergholtz, Maria, Edirisuriya, Ananda, Ilayperuma, Tharaka, Johannesson, Paul, Gordijn, Jaap, Grégoire, Bertrand, Schmitt, Michael, Dubois, Eric, Abels, Sven, Hahn, Axel, Wangler, Benkt, and Weigand, Hans
- Abstract
Ontologies are viewed as increasingly important tools for structuring domains of interest. In this paper we propose a reference ontology of business models using concepts from three established business model ontologies: REA, BMO, and e3-value. The basic concepts in the reference ontology concern actors, resources, and the transfer of resources between actors. Most of the concepts in the reference ontology are taken from one of the original ontologies, but we have also introduced a number of additional concepts, primarily related to resource transfers between business actors. The purpose of the proposed ontology is to increase the understanding of the original ontologies as well as the relationships between them, and also to seek opportunities to complement and improve on them. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
42. A Framework for Integrating XML Transformations.
- Author
-
Embley, David W., Olivé, Antoni, Ram, Sudha, Ce Dong, and Bailey, James
- Abstract
XML is the de facto standard for representing and exchanging data on the World Wide Web, and XSLT is a primary language for XML transformation. Integration of XML data is an increasingly important problem and many methods have been developed. In this paper, we study the related and more difficult problem of how to integrate XSLT programs. Program integration can be particularly important for server-side XSLT applications, where it is necessary to generate a global XSLT program that combines some initial XSLT programs and is required to operate over a newly integrated XML database. This global program should inherit as much functionality from the initial XSLT programs as possible, since designing a brand new global XSLT program from scratch could be expensive, slow and error-prone, especially when the initial XSLT programs are large and/or complicated. However, it is a challenging task to develop methods to support XSLT integration. Difficulties such as template identification, unmapped template processing and template equivalence all need to be resolved. In this paper, we propose a framework for semi-automatic integration of XSLT programs. Our method makes use of static analysis techniques for XSLT and consists of four key steps: i) Pattern Specialization, ii) Template Translation, iii) Lost Template Processing and iv) Program Integration. We are not aware of any previous work that deals with integrating XML transformations. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
43. A Quantitative Summary of XML Structures.
- Author
-
Embley, David W., Olivé, Antoni, Ram, Sudha, Zi Lin, Bingsheng He, and Byron Choi
- Abstract
Statistical summaries in relational databases mainly focus on the distribution of data values and have been found useful for various applications, such as query evaluation and data storage. As XML has been widely used, e.g. for online data exchange, the need for corresponding statistical summaries in XML has become evident. While relational techniques may be applicable to the data values in XML documents, novel techniques are required for summarizing the structures of XML documents. In this paper, we propose metrics for major structural properties of XML documents, in particular nestings of entities and one-to-many relationships. Our technique differs from existing ones in that we generate a quantitative summary of an XML structure. Using our approach, we illustrate that some popular real-world and synthetic XML benchmark datasets are indeed highly skewed, hardly hierarchical, and contain few recursions. We hope this preliminary finding sheds light on improving the design of XML benchmarking and experimentation. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
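A quantitative structural summary of the kind described can be sketched with two simple metrics, maximum nesting depth and per-tag fan-out (a rough proxy for one-to-many relationships). The paper's exact metrics may differ; this is an illustrative sketch.

```python
import xml.etree.ElementTree as ET
from collections import Counter

def structure_summary(xml_text):
    """Summarize an XML document's structure quantitatively:
    maximum nesting depth and average number of children per tag."""
    root = ET.fromstring(xml_text)
    max_depth = 0
    fanout = Counter()   # parent tag -> total number of children
    count = Counter()    # tag -> number of occurrences
    stack = [(root, 1)]
    while stack:
        node, depth = stack.pop()
        max_depth = max(max_depth, depth)
        count[node.tag] += 1
        fanout[node.tag] += len(node)
        for child in node:
            stack.append((child, depth + 1))
    avg_fanout = {t: fanout[t] / count[t] for t in count}
    return max_depth, avg_fanout

doc = """<library>
  <book><title>A</title><author>X</author><author>Y</author></book>
  <book><title>B</title><author>Z</author></book>
</library>"""
depth, fo = structure_summary(doc)
print(depth)       # 3: library > book > title/author
print(fo['book'])  # 2.5 children per <book> on average
```

Aggregating such numbers over a whole dataset is what lets one say, as the abstract does, that a benchmark is "highly skewed and hardly hierarchical".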
44. Schema-Mediated Exchange of Temporal XML Data.
- Author
-
Embley, David W., Olivé, Antoni, Ram, Sudha, Dyreson, Curtis, Snodgrass, Richard T., Currim, Faiz, and Currim, Sabah
- Abstract
When web servers publish data formatted in XML, only the current state of the data is (generally) published. But data evolves over time as it is updated. Capturing that evolution is vital to recovering past versions, tracking changes, and evaluating temporal queries. This paper presents a system to build a temporal data collection, which records the history of each published datum rather than just its current state. The key to exchanging temporal data is providing a temporal schema to mediate the interaction between the publisher and the reader. The schema describes how to construct a temporal data collection by "gluing" individual states into an integrated history. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
45. The DeltaGrid Abstract Execution Model: Service Composition and Process Interference Handling.
- Author
-
Embley, David W., Olivé, Antoni, Ram, Sudha, Yang Xiao, Urban, Susan D., and Ning Liao
- Abstract
This paper introduces the DeltaGrid abstract execution model as a foundation for building a semantically robust execution environment for concurrent processes executing over Delta-Enabled Grid Services (DEGS). A DEGS is a Grid Service with an enhanced capability to capture incremental data changes, known as deltas, associated with service execution in the context of global processes. The abstract model contains a service composition model that provides multi-level protection against service execution failure, thus maximizing the forward recovery of a process. The model also contains a process recovery model to handle the possible impact caused by failure recovery of a process on other concurrently executing processes, using data dependencies derived from a global execution history and user-defined correctness criteria. This paper presents the abstract execution model and demonstrates its use. We also outline future directions for incorporating application exception handling and building a simulation framework for the DeltaGrid system. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
46. A Compositional Semantics of Plan Revision in Intelligent Agents.
- Author
-
Johnson, Michael, Vene, Varmo, Riemsdijk, M. Birna, and Meyer, John-Jules Ch.
- Abstract
This paper revolves around the so-called plan revision rules of the agent programming language 3APL. These rules can be viewed as a generalization of procedures. This generalization, however, results in the semantics of programs of the 3APL language no longer being compositional, which gives rise to problems when trying to define a proof system for the language. In this paper we define a restricted version of plan revision rules which extends procedures but does have a compositional semantics, as we formally show. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
47. Proving Safety Properties of an Aircraft Landing Protocol Using I/O Automata and the PVS Theorem Prover: A Case Study.
- Author
-
Misra, Jayadev, Nipkow, Tobias, Sekerinski, Emil, Umeno, Shinya, and Lynch, Nancy
- Abstract
This paper presents an assertional-style verification of the aircraft landing protocol of NASA's SATS (Small Aircraft Transportation System) concept [1] using the I/O automata framework and the PVS theorem prover. We reconstructed the mathematical model of the landing protocol presented in [2] as an I/O automaton. In addition, we translated the I/O automaton into a corresponding PVS specification, and conducted a verification of the safety properties of the protocol using the assertional proof technique and the PVS theorem prover. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
48. Timed Testing with TorX.
- Author
-
Fitzgerald, John, Hayes, Ian J., Tarlecki, Andrzej, Bohnenkamp, Henrik, and Belinfante, Axel
- Abstract
TorX is a specification-based, on-the-fly testing tool that tests for ioco conformance of implementations w.r.t. a formal specification. This paper describes an extension of TorX that allows testing not only for functional correctness, but also for correctness w.r.t. timing properties expressed in the specification. An implementation then passes a timed test if it passes according to ioco, and if the occurrence times of outputs or of quiescence signals are legal according to the specification. The specifications are described by means of non-deterministic safety timed automata. This paper describes the basic algorithms for ioco, the necessary modifications to standard safety timed automata to make them usable as an input formalism, a test-derivation algorithm from timed automata, and the concrete algorithms implemented in TorX for timed testing. Finally, practical concerns with respect to timed testing are discussed. Keywords: Model-based on-the-fly Testing, Timed Automata, Real-Time Testing, TorX, Tools. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
49. Automatic Hidden-Web Table Interpretation by Sibling Page Comparison.
- Author
-
Hutchison, David, Kanade, Takeo, Kittler, Josef, Kleinberg, Jon M., Mattern, Friedemann, Mitchell, John C., Naor, Moni, Nierstrasz, Oscar, Pandu Rangan, C., Steffen, Bernhard, Sudan, Madhu, Terzopoulos, Demetri, Tygar, Doug, Vardi, Moshe Y., Weikum, Gerhard, Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Tao, Cui
- Abstract
The longstanding problem of automatic table interpretation still eludes us. Its solution would not only aid table processing applications such as large-volume table conversion, but would also aid in solving related problems such as information extraction and semi-structured data management. In this paper, we offer a conceptual-modeling solution for the common special case in which so-called sibling pages are available. The sibling pages we consider are pages on the hidden web, commonly generated from underlying databases. We compare them to identify and connect nonvarying components (category labels) and varying components (data values). We tested our solution using more than 2,000 tables in source pages from three different domains: car advertisements, molecular biology, and geopolitical information. Experimental results show that the system can successfully identify sibling tables, generate structure patterns, interpret tables using the generated patterns, and automatically adjust the structure patterns, if necessary, as it processes a sequence of hidden-web pages. For these activities, the system was able to achieve an overall F-measure of 94.5%. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
50. A Conceptual Model for Multidimensional Analysis of Documents.
- Author
-
Hutchison, David, Kanade, Takeo, Kittler, Josef, Kleinberg, Jon M., Mattern, Friedemann, Mitchell, John C., Naor, Moni, Nierstrasz, Oscar, Pandu Rangan, C., Steffen, Bernhard, Sudan, Madhu, Terzopoulos, Demetri, Tygar, Doug, Vardi, Moshe Y., Weikum, Gerhard, Parent, Christine, Schewe, Klaus-Dieter, Storey, Veda C., Thalheim, Bernhard, and Ravat, Franck
- Abstract
Data warehousing and OLAP are mainly used for the analysis of transactional data. Nowadays, with the evolution of the Internet and the development of semi-structured data exchange formats (such as XML), it is possible to consider entire fragments of data, such as documents, as analysis sources. As a consequence, an adapted multidimensional analysis framework needs to be provided. In this paper, we introduce an OLAP multidimensional conceptual model without facts. This model is based on the unique concept of dimensions and is adapted for multidimensional document analysis. We also provide a set of manipulation operations. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF