207 results for "Implicit computational complexity"
Search Results
2. Implicit recursion-theoretic characterizations of counting classes.
- Author
-
Dal Lago, Ugo, Kahle, Reinhard, and Oitavem, Isabel
- Subjects
- *
TURING machines , *COMPUTATIONAL complexity , *COUNTING , *FUNCTION algebras , *RECURSIVE functions , *POLYNOMIAL time algorithms - Abstract
We give recursion-theoretic characterizations of the counting class #P, the class of those functions which count the number of accepting computations of non-deterministic Turing machines working in polynomial time. Moreover, we characterize in a recursion-theoretic manner all the levels {#P_k}_{k ∈ N} of the counting hierarchy of functions FCH, which result from allowing queries to functions of the previous level, and FCH itself as a whole. This is done in the style of Bellantoni and Cook's safe recursion, and it places #P in the context of implicit computational complexity. Namely, it relates #P with the implicit characterizations of FPTIME (Bellantoni and Cook, Comput Complex 2:97–110, 1992) and FPSPACE (Oitavem, Math Log Q 54(3):317–323, 2008), by exploiting the features of the tree-recursion scheme of FPSPACE. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
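To make the Bellantoni and Cook safe-recursion discipline mentioned in the abstract above concrete, here is a minimal Python sketch (an editorial illustration, not code from the paper): arguments are split into normal and safe positions, recursion may only decompose normal arguments, and recursive results may only be used in safe positions.

    def cat(normal_x: str, safe_y: str) -> str:
        """cat(x; y): prepend the bits of x to y, by safe recursion on the normal x."""
        if normal_x == "":
            return safe_y
        bit, rest = normal_x[0], normal_x[1:]
        return bit + cat(rest, safe_y)  # recursive value used only in a safe position

    def double(normal_x: str) -> str:
        """double(x;) = cat(x; x): legal, since a normal argument may also be passed safely."""
        return cat(normal_x, normal_x)

    # Iterating double along x, i.e. exp("") = "1"; exp(b + x) = double(exp(x)),
    # would feed the recursive result into a *normal* position and is therefore
    # not definable: this is how the scheme confines definable functions to FPTIME.

    if __name__ == "__main__":
        print(cat("101", "11"), double("101"))  # 10111 101101

The counting-class characterizations in the paper refine this discipline further, so the sketch only conveys the underlying normal/safe idea.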
3. Types for Complexity of Parallel Computation in Pi-calculus.
- Author
-
BAILLOT, PATRICK and GHYSELEN, ALEXIS
- Subjects
- *
PROGRAMMING languages , *CALCULUS , *FRACTIONAL calculus - Abstract
Type systems as a technique to analyse or control programs have been extensively studied for functional programming languages. In particular, some systems allow one to extract from a typing derivation a complexity bound on the program. We explore how to extend such results to parallel complexity in the setting of pi-calculus, considered as a communication-based model for parallel computation. Two notions of time complexity are given: the total computation time without parallelism (the work) and the computation time under maximal parallelism (the span). We define operational semantics to capture those two notions and present two type systems from which one can extract a complexity bound on a process. The type systems are inspired both by sized types and by input/output types, with additional temporal information about communications. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
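The work/span distinction drawn in the abstract above can be illustrated outside the pi-calculus; the Python sketch below (an editorial illustration under the simplifying assumption of a fork-join task tree, not the paper's process language) computes both measures for a toy parallel computation.

    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Task:
        cost: int
        children: List["Task"] = field(default_factory=list)  # run in parallel after this task

    def work(t: Task) -> int:
        """Total computation time without parallelism: sum of all costs."""
        return t.cost + sum(work(c) for c in t.children)

    def span(t: Task) -> int:
        """Computation time under maximal parallelism: longest dependency chain."""
        return t.cost + max((span(c) for c in t.children), default=0)

    if __name__ == "__main__":
        t = Task(1, [Task(3), Task(2, [Task(2)])])
        print(work(t), span(t))  # 8 5

The type systems in the paper extract such bounds statically from a process, with temporal information on channels playing the role of the explicit tree used here.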
4. The Power of Non-determinism in Higher-Order Implicit Complexity : Characterising Complexity Classes Using Non-deterministic Cons-Free Programming
- Author
-
Kop, Cynthia, Simonsen, Jakob Grue, and Yang, Hongseok, editor
- Published
- 2017
- Full Text
- View/download PDF
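The "cons-free" restriction in the title refers to programs that may inspect their input but never construct new data; a minimal Python sketch of the style (an editorial illustration with lists encoded as nested pairs, not code from the chapter):

    def member(x, xs):
        """Cons-free membership test: only destructs the input list, never allocates."""
        while xs is not None:
            head, tail = xs
            if head == x:
                return True
            xs = tail
        return False

    # By contrast, a function that rebuilds a list, e.g. reverse via (head, acc),
    # allocates fresh pairs ("cons") and falls outside the discipline.

    if __name__ == "__main__":
        xs = (1, (2, (3, None)))
        print(member(2, xs), member(5, xs))  # True False

Varying the discipline (first-order vs. higher-order, deterministic vs. non-deterministic, as in this chapter) is what pins down the different complexity classes.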
5. Polynomial time in untyped elementary linear logic.
- Author
-
Laurent, Olivier
- Subjects
- *
POLYNOMIAL time algorithms , *LOGIC , *TARDINESS , *COMPUTATIONAL complexity - Abstract
We show how to represent polynomial time computation in an untyped version of proof-nets for elementary linear logic. This follows previous work by P. Baillot, which was developed in a typed and affine setting; we describe how these two properties can be adapted. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
6. Combining linear logic and size types for implicit complexity.
- Author
-
Baillot, Patrick and Ghyselen, Alexis
- Subjects
- *
POLYNOMIAL time algorithms , *LOGIC , *LINEAR systems , *SELF-expression , *SIZE , *PROGRAMMABLE controllers - Abstract
Several type systems have been proposed to statically control the time complexity of lambda-calculus programs and characterize complexity classes such as FPTIME or FEXPTIME. A first line of research stems from linear logic and restricted versions of its !-modality controlling duplication. An instance of this is light linear logic for polynomial time computation [5]. A second approach relies on the idea of tracking the size increase between input and output and, together with a restricted recursion scheme, deducing time complexity bounds. This second approach is illustrated for instance by non-size-increasing types [8]. However, both approaches suffer from limitations. The first one, that of linear logic, has a limited intensional expressivity, that is to say some natural polynomial time programs are not typable. As to the second approach, it is essentially linear; more precisely, it does not allow for a non-linear use of functional arguments. In the present work we incorporate both approaches into a common type system, in order to overcome their respective constraints. The source language we consider is a lambda-calculus with data-types and iteration, that is to say a variant of Gödel's system T. Our goal is to design a system for this language allowing one both to handle non-linear functional arguments and to keep a good intensional expressivity. We illustrate our methodology by choosing the system of elementary linear logic (ELL) and combining it with a system of linear size types. We discuss the expressivity of this new type system, called sEAL, and prove that it gives a characterization of the complexity classes FPTIME and 2^k-FEXPTIME, for k ≥ 0. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
7. Implicit Computational Complexity of Subrecursive Definitions and Applications to Cryptographic Proofs.
- Author
-
Baillot, Patrick, Barthe, Gilles, and Dal Lago, Ugo
- Subjects
CRYPTOGRAPHY ,MACHINE theory ,ALGORITHMS ,MATHEMATICAL variables ,ITERATIVE methods (Mathematics) - Abstract
We define a call-by-value variant of Gödel's system T with references, and equip it with a linear dependent type and effect system, called dℓT, that can estimate the time complexity of programs, as a function of the size of their inputs. We prove that the type system is intensionally sound, in the sense that it over-approximates the complexity of executing the programs on a variant of the CEK abstract machine. Moreover, we define a sound and complete type inference algorithm which critically exploits the subrecursive nature of dℓT. Finally, we demonstrate the usefulness of dℓT for analyzing the complexity of cryptographic reductions by providing an upper bound for the constructed adversary of the Goldreich–Levin theorem. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
8. Characterizing Polynomial and Exponential Complexity Classes in Elementary Lambda-Calculus
- Author
-
Baillot, Patrick, De Benedetti, Erika, Ronchi Della Rocca, Simona, Diaz, Josep, editor, Lanese, Ivan, editor, and Sangiorgi, Davide, editor
- Published
- 2014
- Full Text
- View/download PDF
9. Complexity Information Flow in a Multi-threaded Imperative Language
- Author
-
Marion, Jean-Yves, Péchoux, Romain, Gopal, T. V., editor, Agrawal, Manindra, editor, Li, Angsheng, editor, and Cooper, S. Barry, editor
- Published
- 2014
- Full Text
- View/download PDF
10. Type-Based Complexity Analysis for Fork Processes
- Author
-
Hainry, Emmanuel, Marion, Jean-Yves, Péchoux, Romain, and Pfenning, Frank, editor
- Published
- 2013
- Full Text
- View/download PDF
11. A New Order-Theoretic Characterisation of the Polytime Computable Functions
- Author
-
Avanzini, Martin, Eguchi, Naohi, Moser, Georg, Jhala, Ranjit, editor, and Igarashi, Atsushi, editor
- Published
- 2012
- Full Text
- View/download PDF
12. A Formalization of Polytime Functions
- Author
-
Heraud, Sylvain, Nowak, David, van Eekelen, Marko, editor, Geuvers, Herman, editor, Schmaltz, Julien, editor, and Wiedijk, Freek, editor
- Published
- 2011
- Full Text
- View/download PDF
13. Interaction Graphs.
- Author
-
Seiller, Thomas
- Subjects
GRAPHIC methods ,SEMANTICS ,COMPLEXITY (Philosophy) ,MACHINE theory ,COMPUTER science - Abstract
This article exhibits a series of semantic characterisations of sublinear nondeterministic complexity classes. These results fall into the general domain of logic-based approaches to complexity theory and so-called implicit computational complexity (icc), i.e., descriptions of complexity classes without reference to specific machine models. In particular, it relates strongly to icc results based on linear logic, since the semantic framework considered stems from work on the latter. Moreover, the obtained characterisations are of a geometric nature: each class is characterised by a specific action of a group by measure-preserving maps. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
14. Characterizing polynomial and exponential complexity classes in elementary lambda-calculus.
- Author
-
Baillot, Patrick, De Benedetti, Erika, and Ronchi Della Rocca, Simona
- Subjects
- *
POLYNOMIALS , *CALCULUS , *LAMBDA meson , *ALGORITHMS , *COMPLEXITY (Philosophy) - Abstract
In this paper an implicit characterization of the complexity classes k-EXP and k-FEXP, for k ≥ 0, is given by a type assignment system for a stratified λ-calculus, where types for programs are witnesses of the corresponding complexity class. Types are formulae of Elementary Linear Logic (ELL), and the hierarchy of complexity classes k-EXP is characterized by a hierarchy of types. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
15. Some Complexity and Expressiveness Results on Multimodal and Stratified Proof Nets
- Author
-
Roversi, Luca, Vercelli, Luca, Berardi, Stefano, editor, Damiani, Ferruccio, editor, and de’Liguoro, Ugo, editor
- Published
- 2009
- Full Text
- View/download PDF
16. Taming Modal Impredicativity: Superlazy Reduction
- Author
-
Dal Lago, Ugo, Roversi, Luca, Vercelli, Luca, Artemov, Sergei, editor, and Nerode, Anil, editor
- Published
- 2009
- Full Text
- View/download PDF
17. Linear, Polynomial or Exponential? Complexity Inference in Polynomial Time
- Author
-
Ben-Amram, Amir M., Jones, Neil D., Kristiansen, Lars, Beckmann, Arnold, editor, Dimitracopoulos, Costas, editor, and Löwe, Benedikt, editor
- Published
- 2008
- Full Text
- View/download PDF
18. mwp-Analysis Improvement and Implementation: Realizing Implicit Computational Complexity
- Author
-
Aubert, Clément, Rubiano, Thomas, Rusch, Neea, and Seiller, Thomas
- Abstract
Implicit Computational Complexity (ICC) drives better understanding of complexity classes, but it also guides the development of resources-aware languages and static source code analyzers. Among the methods developed, the mwp-flow analysis [Jones and Lars Kristiansen, 2009] certifies polynomial bounds on the size of the values manipulated by an imperative program. This result is obtained by bounding the transitions between states instead of focusing on states in isolation, as most static analyzers do, and is not concerned with termination or tight bounds on values. Those differences, along with its built-in compositionality, make the mwp-flow analysis a good target for determining how ICC-inspired techniques diverge compared with more traditional static analysis methods. This paper’s contributions are three-fold: we fine-tune the internal machinery of the original analysis to make it tractable in practice; we extend the analysis to function calls and leverage its machinery to compute the result of the analysis efficiently; and we implement the resulting analysis as a lightweight tool to automatically perform data-size analysis of C programs. This documented effort prepares and enables the development of certified complexity analysis, by transforming a costly analysis into a tractable program that furthermore decorrelates the problem of deciding if a bound exists from the problem of computing it.
- Published
- 2022
- Full Text
- View/download PDF
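The kind of bound certified by the mwp-flow analysis discussed above can be illustrated with two small loops; the Python sketch below (an editorial illustration, not the tool's input syntax or output) contrasts a program whose variable values stay polynomially bounded in the inputs with one where they do not.

    def poly_growth(x: int, n: int) -> int:
        # r <= n * x after the loop: every variable's final size is polynomially
        # bounded in the input sizes, which is the property the mwp analysis certifies.
        r = 0
        for _ in range(n):
            r = r + x
        return r

    def exp_growth(x: int, n: int) -> int:
        # x doubles every iteration, so its final value is x * 2**n: no polynomial
        # bound in the inputs exists, and an mwp-style analysis reports no bound.
        for _ in range(n):
            x = x + x
        return x

As the abstract notes, the analysis bounds value growth only; it says nothing about termination or about how tight the bound is.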
19. Types for Complexity of Parallel Computation in Pi-Calculus
- Author
-
Baillot, Patrick and Ghyselen, Alexis
- Subjects
Implicit Computational Complexity, Pi-calculus, Process calculus, Process Calculi, Size Types, Type Systems, Operational semantics, Time complexity, Functional programming, Parallel computing, Complexity Analysis, Computer science, Computation, Type (model theory), Process (computing), Parallelism (grammar), Software - Abstract
Type systems as a technique to analyse or control programs have been extensively studied for functional programming languages. In particular, some systems allow one to extract from a typing derivation a complexity bound on the program. We explore how to extend such results to parallel complexity in the setting of the pi-calculus, considered as a communication-based model for parallel computation. Two notions of time complexity are given: the total computation time without parallelism (the work) and the computation time under maximal parallelism (the span). We define operational semantics to capture those two notions, and present two type systems from which one can extract a complexity bound on a process. The type systems are inspired both by size types and by input/output types, with additional temporal information about communications.
- Published
- 2021
20. Substructural Verification and Computational Feasibility
- Author
-
Leivant, Daniel, Baeza-Yates, Ricardo, editor, Montanari, Ugo, editor, and Santoro, Nicola, editor
- Published
- 2002
- Full Text
- View/download PDF
21. Linear Ramified Higher Type Recursion and Parallel Complexity
- Author
-
Aehlig, Klaus, Johannsen, Jan, Schwichtenberg, Helmut, Terwijn, Sebastiaan A., Kahle, Reinhard, editor, Schroeder-Heister, Peter, editor, and Stärk, Robert, editor
- Published
- 2001
- Full Text
- View/download PDF
22. Extending the Implicit Computational Complexity Approach to the Sub-elementary Time-Space Classes
- Author
-
Covino, Emanuele, Pani, Giovanni, Caporaso, Salvatore, Bongiovanni, Giancarlo, editor, Petreschi, Rossella, editor, and Gambosi, Giorgio, editor
- Published
- 2000
- Full Text
- View/download PDF
23. A tier-based typed programming language characterizing Feasible Functionals
- Author
-
Kapron, Bruce M., Péchoux, Romain, Marion, Jean-Yves, and Hainry, Emmanuel
- Subjects
implicit computational complexity, tiering, type system, type-2, type inference, Oracle, Time complexity, Decidability, Imperative programming, Feasible functionals, BFF, Programming language, Logic in Computer Science, Programming Languages, Theoretical Computer Science, General Computer Science - Abstract
The class of Basic Feasible Functionals BFF_2 is the type-2 counterpart of the class FP of type-1 functions computable in polynomial time. Several characterizations have been suggested in the literature, but none of these present a programming language with a type system guaranteeing this complexity bound. We give a characterization of BFF_2 based on an imperative language with oracle calls using a tier-based type system whose inference is decidable. Such a characterization should make it possible to link higher-order complexity with programming theory. The low complexity (cubic in the size of the program) of the type inference algorithm contrasts with the intractability of the aforementioned methods and does not overly constrain the expressive power of the language.
- Published
- 2022
- Full Text
- View/download PDF
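The tier-based discipline mentioned in the abstract above can be conveyed by a small imperative example; the Python sketch below is an editorial illustration, with tiers indicated only in comments and the paper's exact typing rules and oracle mechanism not reproduced.

    def tiered_add(x: int, y: int) -> int:
        # x   : tier 1, drives the loop and is only ever decreased
        # acc : tier 0, computed data that is free to grow
        acc = y
        while x > 0:          # loop guard depends on the tier-1 variable
            acc = acc + 1     # tier-0 update: allowed
            x = x - 1         # tier-1 variable strictly decreases
            # x = acc        <- would be rejected: a tier-0 value may not flow into tier 1
        return acc

    if __name__ == "__main__":
        print(tiered_add(3, 4))  # 7

In the paper this idea is combined with oracle calls so that well-typed programs compute exactly the type-2 Basic Feasible Functionals.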
24. mwp-Analysis Improvement and Implementation: Realizing Implicit Computational Complexity
- Author
-
Aubert, Clément, Rubiano, Thomas, Rusch, Neea, and Seiller, Thomas
- Subjects
Program Verification, Implicit Computational Complexity, Automatic Complexity Analysis, Static Program Analysis, Logic in Computer Science, Formal Languages and Automata Theory, Mathematics - Logic, Theory of computation → Logic and verification, Theory of computation → Complexity theory and logic, Software and its engineering → Automated static analysis - Abstract
Implicit Computational Complexity (ICC) drives better understanding of complexity classes, but it also guides the development of resources-aware languages and static source code analyzers. Among the methods developed, the mwp-flow analysis [Jones and Lars Kristiansen, 2009] certifies polynomial bounds on the size of the values manipulated by an imperative program. This result is obtained by bounding the transitions between states instead of focusing on states in isolation, as most static analyzers do, and is not concerned with termination or tight bounds on values. Those differences, along with its built-in compositionality, make the mwp-flow analysis a good target for determining how ICC-inspired techniques diverge compared with more traditional static analysis methods. This paper’s contributions are three-fold: we fine-tune the internal machinery of the original analysis to make it tractable in practice; we extend the analysis to function calls and leverage its machinery to compute the result of the analysis efficiently; and we implement the resulting analysis as a lightweight tool to automatically perform data-size analysis of C programs. This documented effort prepares and enables the development of certified complexity analysis, by transforming a costly analysis into a tractable program that furthermore decorrelates the problem of deciding if a bound exists from the problem of computing it. (LIPIcs, Vol. 228, 7th International Conference on Formal Structures for Computation and Deduction (FSCD 2022), pages 26:1-26:23)
- Published
- 2022
- Full Text
- View/download PDF
25. BUILD YOUR OWN CLARITHMETIC II: SOUNDNESS.
- Author
-
JAPARIDZE, GIORGI
- Subjects
NUMBER theory ,COMPUTATIONAL complexity ,MATHEMATICAL proofs ,MATHEMATICAL formulas ,PROBLEM solving - Abstract
Clarithmetics are number theories based on computability logic. Formulas of these theories represent interactive computational problems, and their "truth" is understood as existence of an algorithmic solution. Various complexity constraints on such solutions induce various versions of clarithmetic. The present paper introduces a parameterized/schematic version CLA11^{P1,P2,P3}_{P4}. By tuning the three parameters P1, P2, P3 in an essentially mechanical manner, one automatically obtains sound and complete theories with respect to a wide range of target tricomplexity classes, i.e., combinations of time (set by P3), space (set by P2) and so called amplitude (set by P1) complexities. Sound in the sense that every theorem T of the system represents an interactive number-theoretic computational problem with a solution from the given tricomplexity class and, furthermore, such a solution can be automatically extracted from a proof of T. And complete in the sense that every interactive number-theoretic problem with a solution from the given tricomplexity class is represented by some theorem of the system. Furthermore, through tuning the 4th parameter P4, at the cost of sacrificing recursive axiomatizability but not simplicity or elegance, the above extensional completeness can be strengthened to intensional completeness, according to which every formula representing a problem with a solution from the given tricomplexity class is a theorem of the system. This article is published in two parts. The previous Part I has introduced the system and proved its completeness, while the present Part II is devoted to proving soundness. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
26. The role of polymorphism in the characterisation of complexity by soft types.
- Author
-
Chrząszcz, Jacek and Schubert, Aleksy
- Subjects
- *
COMPUTATIONAL complexity , *ARITHMETIC mean , *MATHEMATICAL simplification , *SET theory , *PROGRAMMING languages , *INFERENTIAL statistics - Abstract
Soft type assignment systems STA, STA+, and STA_B characterise, by means of reduction of terms, computation in the complexity classes PTIME, NP, and PSPACE, respectively. All these systems are inspired by linear logic and include polymorphism similar to that of System F. We show that the presence of polymorphism makes typechecking and type inference undecidable. We also show that reductions in decidable monomorphic versions of these systems capture the same complexity classes in a way sufficient for traditional complexity theory. The translations we propose show, in addition, that for the monomorphic systems to serve as a programming language some metalanguage support is required, since the program which operates on data has a form and type which depend on the size of the input. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
27. Computation by interaction for space-bounded functional programming.
- Author
-
Dal Lago, Ugo and Schöpp, Ulrich
- Subjects
- *
MATHEMATICAL bounds , *VECTOR spaces , *FUNCTIONAL programming languages , *COMPUTATIONAL complexity , *UNDIRECTED graphs , *GRAPH theory - Abstract
When programming with sublinear space constraints one often needs to use special implementation techniques even for simple tasks, such as function composition. In this paper, we study how such implementation techniques can be supported in a functional programming language. Our approach is based on modelling computation by interaction using the Int construction of Joyal, Street & Verity. We apply this construction to a term model of a first-order programming language and use the resulting structure to derive the functional programming language IntML. IntML can be understood as a programming language simplification of Stratified Bounded Affine Logic. We formulate IntML by means of a type system inspired by Baillot & Terui's Dual Light Affine Logic. We show that it captures the complexity classes FLOGSPACE and NFLOGSPACE. We illustrate its expressiveness by showing how typical graph algorithms, such as a test for acyclicity in undirected graphs, can be represented. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
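The "special implementation techniques even for simple tasks, such as function composition" alluded to in the abstract can be pictured as recomputation on demand; the Python sketch below (an editorial illustration of the idea, not IntML code) composes two functions without ever materialising the intermediate result.

    def compose_on_demand(f_at, g, n):
        """g consumes a length-n view of f's output; each element is recomputed, not stored."""
        class View:
            def __getitem__(self, i):
                return f_at(i)      # recompute element i of f's output on demand
            def __len__(self):
                return n
        return g(View())

    if __name__ == "__main__":
        data = "10110"
        f_at = lambda i: 1 if data[i] == "1" else 0              # f: the i-th bit of the input
        g = lambda bits: sum(bits[i] for i in range(len(bits)))  # g: number of ones
        print(compose_on_demand(f_at, g, len(data)))             # 3, with no stored copy of f's output

Interactive semantics such as the Int construction make this request/answer pattern the meaning of composition, which is what lets IntML capture FLOGSPACE and NFLOGSPACE.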
28. Bounded Combinatory Logic and lower complexity.
- Author
-
Redmond, Brian F.
- Subjects
- *
MATHEMATICAL bounds , *COMBINATORY logic , *COMPUTATIONAL complexity , *SET theory , *POLYNOMIAL time algorithms - Abstract
We introduce a stratified version of Combinatory Logic in which there are two classes of terms called player and opponent such that the class of player terms is strictly contained in the class of opponent terms. We show that the system characterizes Polynomial Time in a similar way to Soft Linear Logic. With the addition of explicit "lazy" products to the player terms and various notions of distributivity, we obtain a characterization of Polynomial Space. This paper is an expanded version of the abstract presented at DICE 2013. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
29. Higher-order interpretations and program complexity.
- Author
-
Baillot, Patrick and Dal Lago, Ugo
- Subjects
- *
COMPUTATIONAL complexity , *GENERALIZATION , *POLYNOMIAL time algorithms , *PATHS & cycles in graph theory , *LINEAR systems - Abstract
Polynomial interpretations and their generalizations like quasi-interpretations have been used in the setting of first-order functional languages to design criteria ensuring statically some complexity bounds on programs [10] . This fits in the area of implicit computational complexity, which aims at giving machine-free characterizations of complexity classes. In this paper, we extend this approach to the higher-order setting. For that we consider simply-typed term rewriting systems [35] , we define higher-order polynomial interpretations for them, and we give a criterion ensuring that a program can be executed in polynomial time. In order to obtain a criterion flexible enough to validate interesting programs using higher-order primitives, we introduce a notion of polynomial quasi-interpretations, coupled with a simple termination criterion based on linear types and path-like orders. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
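For readers unfamiliar with first-order polynomial interpretations, which the abstract above lifts to the higher-order setting, here is a minimal Python check of one such interpretation (an editorial illustration; the rules, interpretation, and bound are standard textbook choices, not taken from the paper).

    # Rewrite rules for addition on unary numbers:
    #   add(0, y)    -> y
    #   add(s(x), y) -> s(add(x, y))
    # Candidate interpretation over positive integers:
    ZERO = lambda: 1
    S    = lambda x: x + 1
    ADD  = lambda x, y: 2 * x + y

    def strictly_decreasing(limit: int = 50) -> bool:
        """Sample check that each rule's left side exceeds its right side."""
        for x in range(1, limit):
            for y in range(1, limit):
                if not ADD(ZERO(), y) > y:            # rule 1
                    return False
                if not ADD(S(x), y) > S(ADD(x, y)):   # rule 2
                    return False
        return True

    if __name__ == "__main__":
        print(strictly_decreasing())  # True: the interpretation orients both rules

Because every rewrite step strictly decreases a polynomially bounded measure, derivation lengths, and hence runtime, are polynomially bounded; the paper's contribution is making this style of argument work for simply-typed, higher-order programs via quasi-interpretations.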
30. Introduction to clarithmetic II.
- Author
-
Japaridze, Giorgi
- Subjects
- *
AXIOMS , *COMPUTABILITY logic , *MATHEMATICAL proofs , *POLYNOMIALS , *SPACETIME - Abstract
The earlier paper “Introduction to clarithmetic I” constructed an axiomatic system of arithmetic based on computability logic, and proved its soundness and extensional completeness with respect to polynomial time computability. The present paper elaborates three additional sound and complete systems in the same style and sense: one for polynomial space computability, one for elementary recursive time (and/or space) computability, and one for primitive recursive time (and/or space) computability. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
31. Proceedings Third Joint Workshop on Developments in Implicit Computational complExity and Foundational & Practical Aspects of Resource Analysis
- Author
-
Brian F. Redmond
- Subjects
Exponential complexity, Theoretical computer science, Computer science, Resource analysis, Hierarchical control system, Implicit computational complexity, Joint (audio engineering), Linear logic
- Published
- 2019
- Full Text
- View/download PDF
32. Proceedings Third Joint Workshop on Developments in Implicit Computational complExity and Foundational & Practical Aspects of Resource Analysis
- Author
-
Michael Schaper, Martin Avanzini, and Georg Moser
- Subjects
Theoretical computer science, Computer science, Resource analysis, Probabilistic logic, Implicit computational complexity, Joint (building), Modular design
- Published
- 2019
- Full Text
- View/download PDF
33. An extended and more practical mwp flow analysis
- Author
-
Aubert, Clément, Rubiano, Thomas, Rusch, Neea, and Seiller, Thomas
- Subjects
Program Verification, Implicit Computational Complexity, Automatic Complexity Analysis, Static Program Analysis, Computational Complexity, Logic in Computer Science, Formal Languages and Automata Theory, Software Engineering, Programming Languages, Mathematical Logic - Abstract
We improve and refine a method for certifying that the values' sizes computed by an imperative program will be bounded by polynomials in the program's inputs' sizes. Our work "tames" the non-determinism of the original analysis, and offers an innovative way of completing the analysis when a non-polynomial growth is found. We furthermore enrich the analyzed language by adding function definitions and calls, allowing the analyses of different libraries to be composed and offering more modularity in general. The implementation of our improved method, discussed in a tool paper (https://hal.archives-ouvertes.fr/hal-03269121), also required reasoning about the efficiency of some of the needed operations on the matrices produced by the analysis. It is our hope that this work will enable and facilitate static analysis of source code to guarantee its correctness with respect to resource usage.
- Published
- 2021
34. Light combinators for finite fields arithmetic.
- Author
-
Canavese, D., Cesena, E., Ouchary, R., Pedicini, M., and Roversi, L.
- Subjects
- *
COMBINATORICS , *FINITE fields , *ARITHMETIC , *COMPUTATIONAL complexity , *POLYNOMIAL time algorithms , *FUNCTIONAL programming (Computer science) - Abstract
This work completes the definition of a library which provides the basic arithmetic operations in binary finite fields as a set of functional terms with very specific features. Such functional terms have types in Typeable Functional Assembly (TFA). TFA is an extension of Dual Light Affine Logic (DLAL). DLAL is a type assignment designed under the prescriptions of Implicit Computational Complexity (ICC), which characterises polynomial time costing computations. We plan to exploit the functional programming patterns of the terms in the library to implement cryptographic primitives whose running-time efficiency can be obtained with as little hand-made tuning as possible. We propose the library as a benchmark. It fixes a kind of lower bound on the difficulty of writing potentially interesting low-cost programs inside languages that can express only computations with predetermined complexity. In principle, every known and future ICC-compliant programming language for polynomially costing computations should supply a simplification over the encoding of the library we present, or some set of combinators of comparable interest and difficulty. We finally report on the applicative outcome of our library, a reward we get by programming in the very restrictive scenario that TFA provides: the term of TFA which encodes inversion in binary fields suggested to us a variant of a known and efficient imperative implementation of the inversion itself given by Fong. Our variant can outperform Fong's implementation of inversion on specific hardware architectures. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
35. On bounding space usage of streams using interpretation analysis.
- Author
-
Gaboardi, Marco and Péchoux, Romain
- Subjects
- *
MATHEMATICAL bounds , *COMPUTATIONAL complexity , *COMPUTER software , *REWRITING systems (Computer science) , *FUNCTIONAL programming (Computer science) - Abstract
Interpretation methods are important tools in implicit computational complexity. They have been proved particularly useful to statically analyze and to limit the complexity of programs. However, most of these studies have been so far applied in the context of term rewriting systems over finite data. In this paper, we show how interpretations can also be used to study properties of lazy first-order functional programs over streams. In particular, we provide some interpretation criteria useful to ensure two kinds of stream properties: space upper bounds and input/output upper bounds . Our space upper bounds criteria ensure global and local upper bounds on the size of each output stream element expressed in terms of the maximal size of the input stream elements. The input/output upper bounds criteria consider instead the relations between the number of elements read from the input stream and the number of elements produced on the output stream. This contribution can be seen as a first step in the development of a methodology aiming at using interpretation properties to ensure space safety properties of programs working on streams. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
36. ON THE SYSTEM CL12 OF COMPUTABILITY LOGIC.
- Author
-
JAPARIDZE, GIORGI
- Subjects
COMPUTABILITY logic ,COMPUTABLE functions ,CONSTRUCTIVE mathematics ,MATHEMATICAL analysis ,MATHEMATICAL logic - Abstract
Computability logic (CoL) is a long-term project for redeveloping logic on the basis of a constructive game semantics, with games seen as abstract models of interactive computational problems. Among the fragments of CoL successfully axiomatized so far is CL12 - a conservative extension of classical first-order logic, whose language augments that of classical logic with the so called choice ("constructive") sorts of quantifiers and connectives. This system has already found fruitful applications as a logical basis for constructive and complexity-bound versions of Peano arithmetic, such as arithmetics for polynomial time computability, polynomial space computability, and beyond. The present paper introduces a third, indispensable complexity measure for interactive computations termed amplitude complexity, and establishes the adequacy (soundness/completeness) of CL12 and the associated Logical Consequence mechanism with respect to (simultaneously) A amplitude, S space and T time computability under certain minimal conditions on the triples (A, S, T) of function classes. This result very substantially broadens the potential application areas of CL12, even when time and/or space complexity is the only concern. It would be sufficient to point out that, for instance, now CL12 can be reliably used as a logical basis of systems for logarithmic space or exponential time computabilities - something that the earlier-known crude adequacy results for CL12 were too weak to allow us to do. This paper is self-contained, and targets readers with no prior familiarity with the subject. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
37. Real or natural number interpretation and their effect on complexity.
- Author
-
Bonfante, Guillaume, Deloup, Florian, and Henrot, Antoine
- Subjects
- *
COMPUTATIONAL complexity , *MATHEMATICAL bounds , *ARCHIMEDEAN property , *NATURAL numbers , *POLYNOMIALS - Abstract
Interpretation methods have been introduced in the 70s by Lankford [1] in rewriting theory to prove termination. Actually, as shown by Bonfante et al. [2] , an interpretation of a program induces a bound on its complexity. However, Lankford's original analysis depends deeply on the Archimedean property of natural numbers. This goes against the fact that finding a real interpretation can be solved by Tarski's decision procedure over the reals (as described by Dershowitz in [3] ), and consequently interpretations are usually chosen over the reals rather than over the integers. Doing so, one cannot use anymore the (good) properties of the natural (well-)ordering of N used to bound the complexity of programs. We prove that one may take benefit from the best of both worlds: the complexity analysis still holds even with real numbers. The reason lies in a deep algebraic property of polynomials over the reals. We illustrate this by two characterizations, one of polynomial time and one of polynomial space. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
38. Realizability models for a linear dependent PCF.
- Author
-
Brunel, Aloïs and Gaboardi, Marco
- Subjects
- *
COMPUTATIONAL complexity , *PROGRAMMING languages , *LINEAR programming , *PROOF theory , *SEMANTICS - Abstract
Recently, Dal Lago and Gaboardi have proposed a type system, named dℓPCF, as a framework for implicit computational complexity. dℓPCF is a non-standard type system for PCF programs which is relatively complete with respect to quantitative properties thanks to the use of linear types inspired by Bounded Linear Logic and dependent types à la Dependent ML. In this work, we adapt the framework of quantitative realizability and obtain a model for dℓPCF. The quantitative realizability model aims at a better understanding of dℓPCF type decorations and at giving an abstract semantic proof of intensional soundness. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
39. A new order-theoretic characterisation of the polytime computable functions.
- Author
-
Avanzini, Martin, Eguchi, Naohi, and Moser, Georg
- Subjects
- *
MATHEMATICAL functions , *COMPUTATIONAL complexity , *ASYMPTOTIC expansions , *AUTOMATION , *REWRITING systems (Computer science) - Abstract
We propose a new order-theoretic characterisation of the class of polytime computable functions. To this avail we define the small polynomial path order ( sPOP ⁎ for short). This termination order entails a new syntactic method to analyse the innermost runtime complexity of term rewrite systems fully automatically: for any rewrite system compatible with sPOP ⁎ that employs recursion up to depth d , the (innermost) runtime complexity is polynomially bounded of degree d . This bound is tight. Thus we obtain a direct correspondence between a syntactic (and easily verifiable) condition of a program and the asymptotic worst-case complexity of the program. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
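The depth-to-degree correspondence stated in the abstract above ("recursion up to depth d" yields runtime of degree d) can be seen on a familiar example; the Python sketch below is an editorial illustration on lists rather than on a term rewrite system, using insertion sort, whose two nested layers of recursion match its quadratic runtime.

    def insert(x, xs):
        """Depth-1 recursion: linear in the length of xs."""
        if not xs:
            return [x]
        head, *tail = xs
        return [x] + xs if x <= head else [head] + insert(x, tail)

    def isort(xs):
        """Recursion that calls insert at each step: depth 2, quadratic runtime."""
        if not xs:
            return []
        head, *tail = xs
        return insert(head, isort(tail))

    if __name__ == "__main__":
        print(isort([3, 1, 2, 1]))  # [1, 1, 2, 3]

A rewrite system for insertion sort compatible with sPOP* would be classified at recursion depth 2, matching the quadratic innermost runtime bound.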
40. On the expressivity of elementary linear logic: Characterizing Ptime and an exponential time hierarchy.
- Author
-
Baillot, Patrick
- Subjects
- *
LINEAR systems , *EXPONENTIAL functions , *MATHEMATICAL bounds , *MATHEMATICAL proofs , *COMPUTATIONAL complexity - Abstract
Elementary linear logic is a simple variant of linear logic due to Girard which characterizes, in the proofs-as-programs approach, the class of elementary functions, that is to say functions computable in time bounded by a tower of exponentials of fixed height. Other systems like light and soft linear logics have then been defined to characterize in a similar way the more interesting complexity class of polynomial time functions, but at the price of either a more complicated syntax or of more sophisticated encodings. These logical systems can serve as the basis of type systems for λ-calculus ensuring polynomial time complexity bounds on well-typed terms. This paper aims at reviving interest in elementary linear logic by showing that, despite its simplicity, it can capture smaller complexity classes than that of elementary functions. For that we carry out a detailed analysis of its normalization procedure, and study the complexity of functions represented by a given type. We then show that by considering a slight variant of this system, with type fixpoints and free weakening (elementary affine logic with fixpoints), we can characterize the complexity of functions of type !W ⊸ !^{k+2}B, where W and B are respectively types for binary words and booleans. The key point is a sharper study of the normalization bounds. We characterize in this way the class P of polynomial time predicates, and more generally the hierarchy of classes k-EXP, for k ≥ 0, where k-EXP is the union of DTIME(2_k(n^i)) for i ≥ 1, with 2_k denoting a tower of exponentials of height k. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
41. An abstract approach to stratification in linear logic.
- Author
-
Boudes, Pierre, Mazza, Damiano, and Tortora de Falco, Lorenzo
- Subjects
- *
LINEAR systems , *COMPUTATIONAL complexity , *MATHEMATICAL bounds , *EXPONENTIAL functions , *SEMANTICS - Abstract
We study the notion of stratification, as used in subsystems of linear logic with low complexity bounds on the cut-elimination procedure (the so-called “light” subsystems), from an abstract point of view, introducing a logical system in which stratification is handled by a separate modality. This modality, which is a generalization of the paragraph modality of Girard's light linear logic, arises from a general categorical construction applicable to all models of linear logic. We thus learn that stratification may be formulated independently of exponential modalities; when it is forced to be connected to exponential modalities, it yields interesting complexity properties. In particular, from our analysis stem three alternative reformulations of Baillot and Mazza's linear logic by levels: one geometric, one interactive, and one semantic. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
42. A higher-order characterization of probabilistic polynomial time.
- Author
-
Dal Lago, Ugo and Parisen Toldin, Paolo
- Subjects
- *
POLYNOMIAL time algorithms , *SET theory , *ERROR probability , *COMPUTATIONAL complexity , *PROBABILITY theory - Abstract
We present RSLR, an implicit higher-order characterization of the class PP of those problems which can be decided in probabilistic polynomial time with error probability smaller than 1/2. Analogously, a (less implicit) characterization of the class BPP can be obtained. RSLR is an extension of Hofmann's SLR with a probabilistic primitive, which enjoys basic properties such as subject reduction and confluence. Polynomial time soundness of RSLR is obtained by syntactical means, as opposed to the standard literature on SLR-derived systems, which use semantics in an essential way. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
43. Λ-Symsym: An Interactive Tool for Playing with Involutions and Types
- Author
-
Honsell, F., Lenisa, M., and Scagnetto, I.
- Subjects
Game semantics, Implicit computational complexity, Involutions, Lambda calculus, Linear logic, Theory of computation → Linear logic - Abstract
We present the web portal Λ-symsym, available at http://158.110.146.197:31780/automata/, for experimenting with game semantics of λ^!-calculus, and its normalizing elementary sub-calculus, the λ^{EAL}-calculus. The λ^!-calculus is a generalization of the λ-calculus obtained by introducing a modal operator !, giving rise to a pattern β-reduction. Its sub-calculus corresponds to an applicatively closed class of terms normalizing in an elementary number of steps, in which all elementary functions can be encoded. The game model which we consider is the Geometry of Interaction model I introduced by Abramsky to study reversible computations, consisting of partial involutions over a very simple language of moves. Given a λ^!- or a λ^{EAL}-term, M, Λ-symsym provides: - an abstraction algorithm A^!, for compiling M into a term, A^!(M), of the linear combinatory logic CL^{!}, or the normalizing combinatory logic CL^{EAL}; - an interpretation algorithm [[ ]]^I yielding a specification of the partial involution [[A^!(M)]]^I in the model I; - an algorithm, I2T, for synthesizing from [[A^!(M)]]^I a type, I2T([[A^!(M)]]^I), in a multimodal, intersection type assignment discipline, ⊢_!. - an algorithm, T2I, for synthesizing a specification of a partial involution from a type in ⊢_!, which is an inverse to the former. We conjecture that ⊢_! M : I2T([[A^!(M)]]^I). Λ-symsym permits to investigate experimentally the fine structure of I, and hence the game semantics of the λ^!- and λ^{EAL}-calculi. For instance, we can easily verify that the model I is a λ^!-algebra in the case of strictly linear λ-terms, by checking all the necessary equations, and find counterexamples in the general case. We make this tool available for readers interested to play with games (-semantics). The paper builds on earlier work by the authors, the type system being an improvement., LIPIcs, Vol. 188, 26th International Conference on Types for Proofs and Programs (TYPES 2020), pages 7:1-7:18
- Published
- 2021
- Full Text
- View/download PDF
44. Probabilistic Recursion Theory and Implicit Computational Complexity.
- Author
-
DAL LAGO, Ugo, ZUPPIROLI, Sara, and GABBRIELLI, Maurizio
- Subjects
RECURSION theory ,COMPUTATIONAL complexity ,MATHEMATICAL functions ,CRYPTOGRAPHY ,COMPUTER systems - Abstract
We show that probabilistic computable functions, i.e., those functions outputting distributions and computed by probabilistic Turing machines, can be characterized by a natural generalization of Church and Kleene's partial recursive functions. The obtained algebra, following Leivant, can be restricted so as to capture the notion of a polytime sampleable distribution, a key concept in average-case complexity and cryptography. [ABSTRACT FROM AUTHOR]
- Published
- 2014
- Full Text
- View/download PDF
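The generalization described in the abstract above essentially adds a random-bit primitive to a Kleene-style function algebra; a minimal Python sketch of the flavour (an editorial illustration, not the paper's formal algebra):

    import random

    def coin() -> int:
        """The probabilistic base function: one fair random bit."""
        return random.randint(0, 1)

    def heads(n: int) -> int:
        """Defined by primitive recursion on n, using coin at each step."""
        return 0 if n == 0 else coin() + heads(n - 1)

    if __name__ == "__main__":
        print(heads(10))  # one sample from Binomial(10, 1/2)

Functions built this way output distributions rather than single values; restricting the algebra in Leivant's style is what carves out the polytime-sampleable ones.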
45. Theory of Higher Order Interpretations and Application to Basic Feasible Functions
- Author
-
Hainry, Emmanuel and Péchoux, Romain
- Subjects
Implicit computational complexity, basic feasible functionals, Logic in Computer Science, Computational Complexity, Programming Languages - Abstract
Logical Methods in Computer Science, Volume 16, Issue 4 (ISSN 1860-5974). Interpretation methods and their restrictions to polynomials have been deeply used to control the termination and complexity of first-order term rewrite systems. This paper extends interpretation methods to a pure higher order functional language. We develop a theory of higher order functions that is well-suited for the complexity analysis of this programming language. The interpretation domain is a complete lattice and, consequently, we express program interpretation in terms of a least fixpoint. As an application, by bounding interpretations by higher order polynomials, we characterize Basic Feasible Functions at any order.
- Published
- 2020
- Full Text
- View/download PDF
46. Linear dependent types in a call-by-value scenario.
- Author
-
Dal Lago, Ugo and Petit, Barbara
- Subjects
- *
LINEAR dependence (Mathematics) , *SYSTEMS design , *MACHINE theory , *COMPUTATIONAL complexity , *MATHEMATICAL inequalities , *LINEAR systems - Abstract
Abstract: Linear dependent types were introduced recently (Dal Lago and Gaboardi, 2012) [26] as a formal system that allows one to precisely capture both the extensional behavior and the time complexity of λ-terms, when the latter are evaluated by Krivine's abstract machine. In this work, we show that the same paradigm can be applied to call-by-value computation. A system of linear dependent types for Plotkin's PCF is introduced, called dℓPCFv, whose types reflect the complexity of evaluating terms in the CEK machine. dℓPCFv is proved to be sound, but also relatively complete: every true statement about the extensional and intensional behavior of terms can be derived, provided all true index term inequalities can be used as assumptions. [Copyright Elsevier]
- Published
- 2014
- Full Text
- View/download PDF
47. Complexité implicite : bilan et perspectives [Implicit computational complexity: assessment and perspectives]
- Author
-
Péchoux, Romain
- Subjects
Implicit computational complexity, Complexité implicite
- Published
- 2020
48. Implicit Computational Complexity: past and future
- Author
-
Péchoux, Romain
- Subjects
Implicit computational complexity, Complexité implicite
- Published
- 2020
49. A study on relationships between some subrecursive function classes and complexity classes [Project Paper]
- Subjects
Subrecursion, Implicit computational complexity, Logtime hierarchy, Function algebra, Polynomial-time functions, Complexity classes, Grzegorczyk hierarchy - Abstract
Supervisor: Hajime Ishihara; Graduate School of Advanced Science and Technology; Master's degree (Information Science)
- Published
- 2020
50. Reversible computing and implicit computational complexity
- Author
-
Lars Kristiansen
- Subjects
Theoretical computer science ,Computer science ,Computation ,Complexity class ,Reversible computing ,Implicit computational complexity ,Link (knot theory) ,Software - Abstract
We argue that there is a link between implicit computational complexity theory and reversible computation. We introduce inherently reversible programming languages which capture the complexity classes etime and . Furthermore, we discuss and analyze higher-order versions of our reversible programming languages.
- Published
- 2022
- Full Text
- View/download PDF
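The reversible primitives such a language is built from can be pictured with two invertible updates; a minimal Python sketch (an editorial illustration, not the article's languages):

    def cnot(x: bool, y: bool):
        """Controlled-not: (x, y) -> (x, y XOR x); it is its own inverse."""
        return x, y ^ x

    def add_in_place(x: int, y: int):
        """(x, y) -> (x, y + x): reversible, since x is kept alongside the sum."""
        return x, y + x

    def sub_in_place(x: int, y: int):
        """The inverse of add_in_place."""
        return x, y - x

    if __name__ == "__main__":
        assert cnot(*cnot(True, False)) == (True, False)
        assert sub_in_place(*add_in_place(3, 10)) == (3, 10)
        print("both updates invert cleanly")

Erasing information (for example, overwriting y with a constant) has no inverse; ruling such steps out is what ties the inherently reversible languages in the article to specific complexity classes.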