19 results on '"Recursive Data Structures"'
Search Results
2. PList-based Divide and Conquer Parallel Programming
- Author
-
Virginia Niculescu, Darius Bufnea, and Adrian Sterca
- Subjects
-
parallel computation, divide and conquer, recursive data structures, framework, Computer software, QA76.75-76.765
- Abstract
This paper details an extension of a Java parallel programming framework – JPLF. The JPLF framework is a programming framework that helps programmers build parallel programs using existing building blocks. The framework is based on PowerLists and PList theories and it naturally supports multi-way Divide and Conquer. By using this framework, the programmer is exempted from dealing with all the complexities of writing parallel programs from scratch. This extension to the JPLF framework adds PList support to the framework and so enlarges the applicability of the framework to a larger set of parallel solvable problems. Using this extension, we may apply more flexible data division strategies. In addition, the length of the input lists no longer has to be a power of two – as required by the PowerLists theory. In this paper we unveil new applications that emphasize the new class of computations that can be executed within the JPLF framework. We also give a detailed description of the data structures and functions involved in the PList extension of the JPLF, and extended performance experiments are described and analyzed. (An illustrative sketch follows this record.)
- Published
- 2020
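The abstract above describes multi-way divide and conquer over lists whose length need not be a power of two. The sketch below is a hypothetical illustration using plain Java fork/join, not the actual JPLF or PList API; the names PListSum, ARITY, and THRESHOLD are invented for the example. It shows only the general shape of such a computation: split a list into several roughly equal parts, solve the parts in parallel, and combine the results.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

// Hypothetical multi-way divide-and-conquer sum; not the JPLF API.
public class PListSum extends RecursiveTask<Long> {
    private static final int ARITY = 4;         // multi-way split factor (illustrative)
    private static final int THRESHOLD = 1_000; // below this size, compute sequentially
    private final List<Long> data;

    PListSum(List<Long> data) { this.data = data; }

    @Override
    protected Long compute() {
        if (data.size() <= THRESHOLD) {
            long sum = 0;
            for (long x : data) sum += x;
            return sum;
        }
        // Divide: split into ARITY roughly equal sublists; no power-of-two requirement.
        int chunk = (data.size() + ARITY - 1) / ARITY;
        List<PListSum> parts = new ArrayList<>();
        for (int i = 0; i < data.size(); i += chunk) {
            parts.add(new PListSum(data.subList(i, Math.min(i + chunk, data.size()))));
        }
        parts.forEach(PListSum::fork);
        // Conquer: combine the partial results.
        long total = 0;
        for (PListSum p : parts) total += p.join();
        return total;
    }

    public static void main(String[] args) {
        List<Long> xs = java.util.stream.LongStream.rangeClosed(1, 1_000_000).boxed().toList();
        System.out.println(ForkJoinPool.commonPool().invoke(new PListSum(xs))); // 500000500000
    }
}
```

A real PList-style framework would abstract the split and combine operations as parameters instead of hard-coding a sum.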
3. PList-based Divide and Conquer Parallel Programming.
- Author
-
Niculescu, Virginia, Bufnea, Darius, and Sterca, Adrian
- Subjects
-
PARALLEL programming, DATA structures, JAVA programming language
- Abstract
This paper details an extension of a Java parallel programming framework -- JPLF. The JPLF framework is a programming framework that helps programmers build parallel programs using existing building blocks. The framework is based on PowerLists and PList theories and it naturally supports multi-way Divide and Conquer. By using this framework, the programmer is exempted from dealing with all the complexities of writing parallel programs from scratch. This extension of the JPLF framework adds PList support to the framework and so enlarges the applicability of the framework to a larger set of parallel solvable problems. Using this extension, we may apply more flexible data division strategies, and the length of the input lists no longer has to be a power of two -- as required by the PowerLists theory. In this paper we unveil new applications that emphasize the new class of computations that can be executed within the JPLF framework. We also give a detailed description of the data structures and functions involved in the PList extension of the JPLF, and extended performance experiments are described and analyzed.
- Published
- 2020
- Full Text
- View/download PDF
4. Introduction
- Author
-
Buhr, Peter A.
- Published
- 2016
- Full Text
- View/download PDF
5. Separation Logic with Monadic Inductive Definitions and Implicit Existentials
- Author
-
Tatsuta, Makoto, Kimura, Daisuke, Feng, Xinyu, editor, and Park, Sungwoo, editor
- Published
- 2015
- Full Text
- View/download PDF
6. High-Performance Derivative Computations using CoDiPack.
- Author
-
Sagebaum, Max, Albring, Tim, and Gauger, Nicolas R.
- Subjects
-
DATA structures, AUTOMATIC differentiation, MINIMAL design
- Abstract
There are several AD tools available that all implement different strategies for the reverse mode of AD. The most common strategies are primal value taping (implemented e.g. by ADOL-C) and Jacobian taping (implemented e.g. by Adept and dco/c++). Particularly for Jacobian taping, recent advances using expression templates make it very attractive for large scale software. However, the current implementations are either closed source or miss essential features and flexibility. Therefore, we present the new AD tool CoDiPack (Code Differentiation Package) in this paper. It is specifically designed for minimal memory consumption and optimal runtime, such that it can be used for the differentiation of large scale software. An essential part of the design of CoDiPack is the modular layout and the recursive data structures which not only allow the efficient implementation of the Jacobian taping approach but will also enable other approaches like the primal value taping or new research ideas. We finally present the performance values of CoDiPack on a generic PDE example and on the SU2 code. (An illustrative sketch of the taping idea follows this record.)
- Published
- 2019
- Full Text
- View/download PDF
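The abstract contrasts primal value taping and Jacobian taping. CoDiPack itself is a C++ expression-template library; the Java classes below (Tape, Entry, JacobianTapingDemo) are invented for this sketch and are not its API. The sketch only illustrates the Jacobian taping idea: every recorded operation stores the indices of its arguments together with the local partial derivatives, and one reverse sweep over the tape accumulates the adjoints.

```java
import java.util.ArrayList;
import java.util.List;

// Invented sketch of Jacobian taping for reverse-mode AD; not CoDiPack's API.
final class Tape {
    record Entry(int lhs, int[] args, double[] partials) {}
    private final List<Entry> entries = new ArrayList<>();
    private final List<Double> values = new ArrayList<>();

    int variable(double v) { values.add(v); return values.size() - 1; }

    private int push(double value, int[] args, double[] partials) {
        int idx = variable(value);
        entries.add(new Entry(idx, args, partials));
        return idx;
    }

    // Primal operations that also record their local Jacobian entries.
    int add(int a, int b) { return push(values.get(a) + values.get(b),
            new int[]{a, b}, new double[]{1.0, 1.0}); }
    int mul(int a, int b) { return push(values.get(a) * values.get(b),
            new int[]{a, b}, new double[]{values.get(b), values.get(a)}); }
    int sin(int a) { return push(Math.sin(values.get(a)),
            new int[]{a}, new double[]{Math.cos(values.get(a))}); }

    // Reverse sweep: seed the output adjoint, then propagate backwards over the tape.
    double[] gradient(int output) {
        double[] adj = new double[values.size()];
        adj[output] = 1.0;
        for (int i = entries.size() - 1; i >= 0; i--) {
            Entry e = entries.get(i);
            for (int k = 0; k < e.args().length; k++) {
                adj[e.args()[k]] += e.partials()[k] * adj[e.lhs()];
            }
        }
        return adj;
    }
}

public class JacobianTapingDemo {
    public static void main(String[] args) {
        Tape tape = new Tape();
        int x = tape.variable(2.0), y = tape.variable(3.0);
        int f = tape.add(tape.sin(tape.mul(x, y)), x);   // f(x, y) = sin(x * y) + x
        double[] adj = tape.gradient(f);
        // Expected: df/dx = y*cos(x*y) + 1, df/dy = x*cos(x*y)
        System.out.println("df/dx = " + adj[x] + ", df/dy = " + adj[y]);
    }
}
```

Primal value taping, by contrast, stores the primal values and operation identifiers and recomputes the local derivatives during the reverse sweep.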
7. PList-based Divide and Conquer Parallel Programming
- Author
-
Darius Bufnea, Adrian Sterca, and Virginia Niculescu
- Subjects
-
lcsh:Computer software, Divide and conquer algorithms, Java, parallel computation, divide and conquer, recursive data structures, framework, Computer science, Data division, 020206 networking & telecommunications, 020207 software engineering, 02 engineering and technology, Extension (predicate logic), Parallel computing, Data structure, Set (abstract data type), lcsh:QA76.75-76.765, Scratch, 0202 electrical engineering, electronic engineering, information engineering, Electrical and Electronic Engineering, Programmer, computer, Software, computer.programming_language
- Abstract
This paper details an extension of a Java parallel programming framework – JPLF. The JPLF framework is a programming framework that helps programmers build parallel programs using existing building blocks. The framework is based on PowerLists and PList theories and it naturally supports multi-way Divide and Conquer. By using this framework, the programmer is exempted from dealing with all the complexities of writing parallel programs from scratch. This extension to the JPLF framework adds PList support to the framework and so enlarges the applicability of the framework to a larger set of parallel solvable problems. Using this extension, we may apply more flexible data division strategies. In addition, the length of the input lists no longer has to be a power of two – as required by the PowerLists theory. In this paper we unveil new applications that emphasize the new class of computations that can be executed within the JPLF framework. We also give a detailed description of the data structures and functions involved in the PList extension of the JPLF, and extended performance experiments are described and analyzed.
- Published
- 2020
- Full Text
- View/download PDF
8. Reflections on the Design of Parallel Programming Frameworks
- Author
-
Adrian Sterca, Frédéric Loulergue, and Virginia Niculescu (Universitatea Babeş-Bolyai [Cluj-Napoca]; Northern Arizona University [Flagstaff]; Laboratoire d'Informatique Fondamentale d'Orléans (LIFO), Université d'Orléans (UO) and Institut National des Sciences Appliquées Centre Val de Loire (INSA CVL))
- Subjects
-
020203 distributed computing, [INFO.INFO-PL]Computer Science [cs]/Programming Languages [cs.PL], Frameworks, Computer science, Reliability (computer networking), Separation of concerns, Software Engineering, Context (language use), 02 engineering and technology, Parallel computing, computer.software_genre, Design Patterns, Software quality, Parallel Programming, Software framework, Sequential programming, 020204 information systems, Software design pattern, 0202 electrical engineering, electronic engineering, information engineering, Separation of Concerns, Recursive Data Structures, computer
- Abstract
Parallel programming is much more complex and difficult than sequential programming, and it is therefore more challenging to achieve the same software quality in a parallel context. High-level parallel programming models, if implemented as software frameworks, can increase productivity and reliability. Such a framework must satisfy important requirements such as extensibility and adaptability to different platforms, and this paper reflects on these requirements and their relation to the software engineering methodologies that could put them into practice. All of these are exemplified on a Java framework, JPLF: a high-level parallel programming approach based on the PowerLists model and its associated theories that respects the analysed requirements. The design of JPLF is analysed by explaining the design choices and highlighting the design patterns and design principles applied.
- Published
- 2021
- Full Text
- View/download PDF
9. Decision procedures for term algebras with integer constraints
- Author
-
Zhang, Ting, Sipma, Henny B., and Manna, Zohar
- Subjects
-
MATHEMATICS, ALGEBRA, MATHEMATICAL analysis, PROGRAMMING languages
- Abstract
Term algebras can model recursive data structures, which are widely used in programming languages. To verify programs we must be able to reason about these structures. However, as programming languages often involve multiple data domains, decision procedures for a single theory are usually not applicable in program verification. An important class of mixed constraints consists of combinations of data structures with integer constraints on the size of the data structures. Such constraints can express memory safety properties such as absence of memory overflow and out-of-bound array access, which are crucial for program correctness. In this paper we extend the theory of term algebras with the length function, which maps a term to its size, resulting in a combined theory of term algebras and Presburger arithmetic. This arithmetic extension provides a natural but tight coupling between the two theories, and hence general-purpose combination methods like Nelson-Oppen are not applicable. We present decision procedures for quantifier-free theories in structures with an infinite constant domain and with a finite constant domain. We also present a quantifier elimination procedure for the extended first-order theory that can remove a block of existential quantifiers in one step. (An illustrative sketch follows this record.)
- Published
- 2006
- Full Text
- View/download PDF
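A minimal sketch of the objects involved, assuming nothing beyond the abstract: a term algebra of cons-lists as a recursive Java type, together with the length function that maps a term to its size. The names Term, Nil, Cons, and withinBound are invented for illustration; constraints mixing such terms with integer bounds on their length are the kind of formula the combined theory is meant to decide.

```java
// Invented illustration of a term algebra with a length function (Java 21+).
sealed interface Term permits Nil, Cons {}
record Nil() implements Term {}
record Cons(int head, Term tail) implements Term {}

public class TermLength {
    // length : Term -> non-negative integer (the Presburger side of the combination)
    static int length(Term t) {
        return switch (t) {
            case Nil n -> 0;
            case Cons c -> 1 + length(c.tail());
        };
    }

    // A memory-safety style constraint: is the size of the structure within a bound?
    static boolean withinBound(Term t, int bound) {
        return length(t) <= bound;
    }

    public static void main(String[] args) {
        Term t = new Cons(1, new Cons(2, new Cons(3, new Nil())));
        System.out.println(length(t));          // 3
        System.out.println(withinBound(t, 4));  // true
    }
}
```

A constraint such as length(t) <= 4 ties a term variable directly to Presburger arithmetic, which is the tight coupling the abstract refers to.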
10. A Framework to Capture Dynamic Data Structures in Pointer-Based Codes.
- Author
-
Corbera, Francisco, Asenjo, Rafael, and Zapata, Emilio L.
- Subjects
-
COMPUTER architecture, COMPILING (Electronic computers), DATA structures, MATHEMATICAL optimization, GRAPH theory, COMPUTER science
- Abstract
To successfully exploit all the possibilities of current computer/multicomputer architectures, optimizing compiler techniques are a must. However, for codes based on pointers and dynamic data structures, these optimization techniques necessarily have to be carried out after identifying the characteristics and properties of the data structures used in the code. In this paper, we describe the framework and the analyzer we have implemented to capture complex data structures generated, traversed, and modified in codes based on pointers. Our method assigns a Reduced Set of Reference Shape Graphs (RSRSG) to each statement to approximate the shape of the data structure after the execution of that statement. With the properties and operations that define the behavior of our RSRSG, the method can accurately detect complex recursive data structures such as a doubly linked list of pointers to trees where the leaves point to additional lists. Several experiments are carried out with real codes to validate the capabilities of our analyzer. (A sketch of such a structure follows this record.)
- Published
- 2004
- Full Text
- View/download PDF
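For concreteness, a hypothetical Java rendering (not from the paper, which targets C-style pointer codes) of the kind of structure the abstract says the analyzer can detect: a doubly linked list whose nodes point to trees, with tree leaves pointing to further lists. All class names are invented.

```java
// Invented Java rendering of the shape described in the abstract.
class LeafList {                 // the "additional lists" hanging off tree leaves
    int value;
    LeafList next;
}

class TreeNode {
    TreeNode left, right;
    LeafList leafData;           // non-null only at leaves
}

class DoublyLinkedNode {
    DoublyLinkedNode prev, next; // doubly linked backbone
    TreeNode tree;               // each list node owns a tree
}

public class ShapeExample {
    public static void main(String[] args) {
        DoublyLinkedNode a = new DoublyLinkedNode(), b = new DoublyLinkedNode();
        a.next = b; b.prev = a;                      // link the backbone both ways
        a.tree = new TreeNode();
        a.tree.left = new TreeNode();
        a.tree.left.leafData = new LeafList();       // a leaf pointing to an extra list
        a.tree.left.leafData.value = 42;
        System.out.println(a.next.prev.tree.left.leafData.value); // 42
    }
}
```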
11. Eliminating dead code on recursive data
- Author
-
Liu, Yanhong A. and Stoller, Scott D.
- Subjects
-
RECURSIVE functions, CONSTRAINT satisfaction
- Abstract
This paper describes a powerful method for dead-code analysis and elimination in the presence of recursive data constructions. We describe partially dead recursive data using liveness patterns based on general regular tree grammars extended with the notion of live and dead, and we formulate the analysis as computing liveness patterns at all program points based on constraints constructed from the program and programming language semantics. The analysis yields the most precise program-based grammars that satisfy the constraints. The analysis algorithm takes cubic time in terms of the size of the program in the worst case but is very efficient in practice, as shown by our prototype implementation. The analysis results are used to identify and eliminate dead code. The framework for representing and analyzing properties of recursive data structures using general regular tree grammars applies to other analyses as well. (An illustrative example follows this record.)
- Published
- 2003
- Full Text
- View/download PDF
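A small, invented Java example of the "partially dead recursive data" the abstract targets; it illustrates the phenomenon, not the paper's grammar-based algorithm. The caller consumes only the spine of a list, so the values stored in the cells, and the work that produces them, are dead.

```java
// Invented example of partially dead recursive data.
final class Cell {
    final int value;   // dead under the liveness pattern "only the spine is live"
    final Cell next;   // live: needed to compute the length
    Cell(int value, Cell next) { this.value = value; this.next = next; }
}

public class DeadDataDemo {
    static int expensive(int n) {          // work whose result is never observed
        int acc = 0;
        for (int i = 0; i < 1_000; i++) acc += i * n;
        return acc;
    }

    static Cell build(int n) {
        return n == 0 ? null : new Cell(expensive(n), build(n - 1));
    }

    static int length(Cell c) {            // traverses only the 'next' fields
        return c == null ? 0 : 1 + length(c.next);
    }

    public static void main(String[] args) {
        System.out.println(length(build(100)));  // 100; every 'value' field is dead
    }
}
```

A liveness-pattern analysis of the kind described above could detect that only the list spine reaches the output and eliminate the calls to expensive.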
12. Parallelizing graph construction operations in programs with cyclic graphs
- Author
-
Hwang, Yuan-Shin
- Subjects
-
GRAPH theory, PARALLEL processing
- Abstract
Pointer analysis has been an active research field in recent years. Combining pointer analysis and dependence analysis can identify parallelism in programs with pointers or graphs, which are in turn represented by dynamic pointer-linked data structures. After parallelism is identified, programs can be parallelized and then executed on parallel multiprocessor systems to achieve high performance. The compiler analysis techniques that have been proposed so far can only recognize parallelism in the traversal references of programs with cyclic graphs by analyzing the traversal patterns on graphs. Consequently, only the graph traversal operations are parallelized, while the graph construction operations are left to execute sequentially. Although this approach can usually parallelize the most time-consuming part of these programs (the graph traversal operations account for over 90% of total execution time in most programs), the rest of the program, which accounts for less than 10%, will dominate execution on multiprocessor systems according to Amdahl's law. This paper presents a technique that can identify parallelism in the construction operations of programs even with cyclic graphs. This technique is designed to facilitate automatic parallelization of scientific computational applications that construct cyclic graphs to represent adjacency lists, which in turn describe the interactions of simulated objects. Experimental results show this approach can parallelize the construction phase of such programs and achieve good speedup on multiprocessor systems. (An illustrative sketch follows this record.)
- Published
- 2002
- Full Text
- View/download PDF
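An invented Java sketch of the situation the abstract addresses, not the paper's compiler technique: constructing the adjacency lists of a cyclic graph in parallel. Each edge insertion touches only the lists of its two endpoints, so with thread-safe per-node lists the construction phase itself can run in parallel.

```java
import java.util.Arrays;
import java.util.List;
import java.util.concurrent.ConcurrentLinkedQueue;
import java.util.stream.IntStream;

// Invented sketch: parallel construction of adjacency lists for a cyclic graph.
public class ParallelGraphBuild {
    public static void main(String[] args) {
        int n = 1_000;
        // One thread-safe adjacency list per node.
        List<ConcurrentLinkedQueue<Integer>> adj = IntStream.range(0, n)
                .mapToObj(i -> new ConcurrentLinkedQueue<Integer>())
                .toList();

        // Edge set of a ring: the resulting graph contains a cycle by construction.
        int[][] edges = IntStream.range(0, n)
                .mapToObj(i -> new int[]{i, (i + 1) % n})
                .toArray(int[][]::new);

        // Insert edges in parallel; both directions, so the graph is undirected.
        Arrays.stream(edges).parallel().forEach(e -> {
            adj.get(e[0]).add(e[1]);
            adj.get(e[1]).add(e[0]);
        });

        System.out.println(adj.get(0)); // neighbours of node 0, e.g. [1, 999]
    }
}
```

The hard part the paper tackles is proving at compile time that such insertions into a pointer-linked, possibly cyclic structure are independent; the sketch simply assumes it.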
13. New Shape Analysis and Interprocedural Techniques for Automatic Parallelization of C Codes.
- Author
-
Corbera, Francisco, Asenjo, Rafael, and Zapata, Emilio
- Subjects
-
DATA structures, COMPUTER programming, C (Computer program language), ELECTRONIC file management, DATABASE management, ELECTRONIC data processing
- Abstract
Automatic parallelization of codes with complex data structures is becoming very important. These complex, and often recursive, data structures are widely used in scientific computing. Shape analysis is one of the key steps in the automatic parallelization of such codes. In this paper we extend the Static Shape Graph (SSG) method to enable the successful and accurate detection of complex doubly-linked structures. We have also included an interprocedural analysis in our framework. These techniques have been implemented in a compiler, which has been validated for several C codes. In particular, we present the results the compiler achieves for the C sparse LU factorization algorithm and for a recursive code generating a tree data structure. The output SSG perfectly describes the complex data structure used in these codes.
- Published
- 2002
- Full Text
- View/download PDF
14. Rewrite-Based Satisfiability Procedures for Recursive Data Structures.
- Author
-
Bonacina, Maria Paola and Echenim, Mnacho
- Subjects
-
FOUNDATIONS of geometry, MATHEMATICS, COMPUTATIONAL linguistics, DATA structures, ELECTRONIC data processing
- Abstract
If a rewrite-based inference system is guaranteed to terminate on the axioms of a theory T and any set of ground literals, then any theorem-proving strategy based on that inference system is a rewrite-based decision procedure for T-satisfiability. In this paper, we consider the class of theories defining recursive data structures, that might appear out of reach for this approach, because they are defined by an infinite set of axioms. We overcome this obstacle by designing a problem reduction that allows us to prove a general termination result for all these theories. We also show that the theorem-proving strategy decides satisfiability problems in any combination of these theories with other theories decided by the rewrite-based approach. (A sketch of a typical axiomatization follows this record.)
- Published
- 2007
- Full Text
- View/download PDF
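For context, a commonly used axiomatization of such a recursive data structure theory (LISP-style lists); this is a standard textbook-style presentation, not taken from the paper. The acyclicity axioms form an infinite family, which is the obstacle the paper's problem reduction is designed to overcome.

```latex
% Standard axioms for the theory of cons-lists (illustrative; not the paper's notation).
\begin{align*}
  &\mathit{car}(\mathit{cons}(x, y)) = x  && \text{(projection)}\\
  &\mathit{cdr}(\mathit{cons}(x, y)) = y  && \text{(projection)}\\
  &\mathit{cons}(x, y) \neq \mathit{nil}  && \text{(distinctness)}\\
  &t[x] \neq x \quad \text{for every } \mathit{cons}\text{-term } t[x]
    \text{ properly containing } x        && \text{(acyclicity schema)}
\end{align*}
```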
15. Compile-time dynamic and recursive data structures in Modelica
- Author
-
Matthias Hellerer, Fabian Buse, Dirk Zimmer, and Bernhard Bachmann
- Subjects
-
060201 languages & linguistics, Raumfahrt-Systemdynamik, Modelica, Programming language, Computer science, Dymola, 06 humanities and the arts, 02 engineering and technology, computer.software_genre, Data structure, dynamic data structures, Rendering (computer graphics), recursive data structures, 0602 languages and literature, 0202 electrical engineering, electronic engineering, information engineering, 020201 artificial intelligence & image processing, Dynamic data structures, computer, Implementation, language enhancement, Compile time
- Abstract
The current Modelica Standard (v3.3) does not support dynamic or recursive data structures. For many applications this constitutes a serious restriction, rendering certain implementations either impossible or possible only through elaborate and inelegant constructs. In this paper we show that support for dynamic and recursive data structures can be implemented in the Modelica IDE Dymola using a variety of advanced constructs. This demonstrates the viability in principle of the proposed inclusion of these data structures in the Modelica Standard.
- Published
- 2017
16. Eliminating dead code on recursive data
- Author
-
Scott D. Stoller and Yanhong A. Liu
- Subjects
-
Theoretical computer science, Dead code, Computer science, Liveness, Dead-code elimination, Program analysis, 020207 software engineering, Recursive partitioning, 02 engineering and technology, Parsing expression grammar, Data structure, Dead code elimination, Recursive language, Constraints, 0202 electrical engineering, electronic engineering, information engineering, Regular-tree grammars, 020201 artificial intelligence & image processing, Recursive data structures, Algorithm, Slicing, Software
- Abstract
This paper describes a powerful method for dead-code analysis and elimination in the presence of recursive data constructions. We describe partially dead recursive data using liveness patterns based on general regular tree grammars extended with the notion of live and dead, and we formulate the analysis as computing liveness patterns at all program points based on constraints constructed from the program and programming language semantics. The analysis yields the most precise program-based grammars that satisfy the constraints. The analysis algorithm takes cubic time in terms of the size of the program in the worst case but is very efficient in practice, as shown by our prototype implementation. The analysis results are used to identify and eliminate dead code. The framework for representing and analyzing properties of recursive data structures using general regular tree grammars applies to other analyses as well.
- Published
- 2003
- Full Text
- View/download PDF
17. Rewrite-based satisfiability procedures for recursive data structures
- Author
-
Maria Paola Bonacina and Mnacho Echenim
- Subjects
-
Infinite set, Class (set theory), Reduction (recursion theory), Theoretical computer science, General Computer Science, Rewrite-based inference systems, rewrite-based theorem proving, superposition, Data structure, Satisfiability, Theoretical Computer Science, Set (abstract data type), TheoryofComputation_MATHEMATICALLOGICANDFORMALLANGUAGES, recursive data structures, Satisfiability modulo theories, Algorithm, Axiom, Mathematics, Computer Science (all)
- Abstract
If a rewrite-based inference system is guaranteed to terminate on the axioms of a theory T and any set of ground literals, then any theorem-proving strategy based on that inference system is a rewrite-based decision procedure for T-satisfiability. In this paper, we consider the class of theories defining recursive data structures, that might appear out of reach for this approach, because they are defined by an infinite set of axioms. We overcome this obstacle by designing a problem reduction that allows us to prove a general termination result for all these theories. We also show that the theorem-proving strategy decides satisfiability problems in any combination of these theories with other theories decided by the rewrite-based approach.
- Published
- 2007
18. Recursive data structures
- Author
-
Hoare, C. A. R.
- Published
- 1975
- Full Text
- View/download PDF
19. An Array Algebra for Database Applications
- Author
-
Orman, Levent
- Published
- 1985