1,653 results
Search Results
2. Information extraction from images of paper-based maps
- Author
-
J. Alemany and Rangachar Kasturi
- Subjects
Syntax (programming languages) ,Computer science ,Image processing ,computer.software_genre ,Query language ,Set (abstract data type) ,Information extraction ,Digital image processing ,Data mining ,Lisp ,User interface ,computer ,Software ,computer.programming_language - Abstract
The design of a system to extract information automatically from paper-based maps and answer queries related to spatial features and structure of geographic data is considered. The foundation of such a system is a set of image-analysis algorithms for extracting spatial features. Efficient algorithms to detect symbols, identify and track various types of lines, follow closed contours, compute distances, find shortest paths, etc. from simplified map images have been developed. A query processor analyzes the queries presented by the user in a predefined syntax, controls the operation of the image processing algorithms, and interacts with the user. The query processor is written in Lisp and calls image-analysis routines written in Fortran.
- Published
- 1988
3. Reply to Comments on "An Interval Logic for Real-Time System Specification"
- Author
-
Bellini, Pierfrancesco, Nesi, Paolo, and Rogai, Davide
- Subjects
FORMAL methods (Computer science) ,STRUCTURED techniques of electronic data processing ,SYSTEMS design ,SYSTEMS development ,COMPUTER science ,SYSTEM analysis ,SOFTWARE compatibility - Abstract
This reply addresses the Comments on "An Interval Logic for Real-Time System Specification," which present some remarks on the comparison examples between TILCO and other logics and note some slips in the related examples. This paper gives evidence that such issues have no impact on the validity of the TILCO theory of paper [1] and provides some further clarifications about some aspects of the comparison. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
4. Some Critical Comments on the Paper 'An Optimal Approach to Fault Tolerant Software Systems Design' by Gannon and Shapiro
- Author
-
J.L. Lloyd, P.A. Lee, and Santosh K. Shrivastava
- Subjects
Computer errors ,Computer science ,business.industry ,Fault tolerance ,Fault detection and isolation ,Reliability engineering ,Software fault tolerance ,Redundancy (engineering) ,Software design ,Systems design ,Software system ,Software engineering ,business ,Software - Published
- 1981
5. The Second Software Engineering Standards Applications Workshop Call for Papers
- Author
-
G. H. MacEwen
- Subjects
Computer science ,business.industry ,Software engineering ,business ,Software - Published
- 1982
6. Guest Editors' Introduction to the Special Issue on the International Conference on Software Maintenance and Evolution.
- Author
-
Gyimóthy, Tibor and Rajlich, Václav
- Subjects
SOFTWARE maintenance ,CONFERENCES & conventions ,ASSOCIATIONS, institutions, etc. ,COMPUTER software ,REVERSE engineering ,COMPUTER programming ,COMPUTER science - Abstract
The article provides an introduction to this special issue related to software maintenance and evolution. The topics discussed in this issue were developed from papers presented at the International Conference on Software Maintenance and Evolution (ICSM), which was held in Budapest, Hungary from September 26-29, 2005; the articles focus on software maintenance and change. Topics such as reverse engineering, documenting, and programming computer software are discussed.
- Published
- 2006
- Full Text
- View/download PDF
7. Editorial: A Message from the New Editor-in-Chief.
- Author
-
Kramer, Jeff
- Subjects
SOFTWARE engineering ,HIGH technology industries ,PERIODICALS ,COMPUTER engineering ,COMPUTER science ,PROFESSIONAL peer review ,COMPUTER software industry - Abstract
An editorial is presented by Jeff Kramer, the new Editor-in-Chief of the "IEEE Transactions on Software Engineering" (TSE) journal, regarding the publication's future. A vision statement is presented for the new issues and applications that face the computer engineering industry, and plans are offered to strengthen the Editorial Board. Three proposals are offered to improve the quality of TSE papers, including inviting submissions, being flexible with innovative papers that require further improvements, and the use of case studies.
- Published
- 2006
- Full Text
- View/download PDF
8. Equality to Equals and Unequals: A Revisit of the Equivalence and Nonequivalence Criteria in Class-Level Testing of Object-Oriented Software.
- Author
-
Chen, Huo Yan and Tse, T.H.
- Subjects
OBJECT-oriented programming ,OBJECT-oriented methods (Computer science) ,COMPUTER software testing ,ALGEBRA ,SOFTWARE engineering - Abstract
Algebraic specifications have been used in the testing of object-oriented programs and have received much attention since the 1990s. It is generally believed that class-level testing based on algebraic specifications involves two independent aspects: the testing of equivalent and nonequivalent ground terms. Researchers have cited intuitive examples to illustrate the philosophy that even if an implementation satisfies all the requirements specified by the equivalence of ground terms, it may still fail to satisfy some of the requirements specified by the nonequivalence of ground terms. Thus, both the testing of equivalent ground terms and the testing of nonequivalent ground terms have been considered as significant and cannot replace each other. In this paper, we present an innovative finding that, given any canonical specification of a class with proper imports, a complete implementation satisfies all the observationally equivalent ground terms if and only if it satisfies all the observationally nonequivalent ground terms. As a result, these two aspects of software testing cover each other and can therefore replace each other. These findings provide a deeper understanding of software testing based on algebraic specifications, rendering the theory more elegant and complete. We also highlight a couple of important practical implications of our theoretical results. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
9. A Simple Experiment in Top-Down Design.
- Author
-
Comer, Douglas and Halstead, Maurice H.
- Subjects
COMPUTER software ,COMPUTER systems ,SOFTWARE engineering ,COMPUTER programming ,COMPUTER science - Abstract
In this paper we: 1) discuss the need for quantitatively reproducible experiments in the study of top-down design; 2) propose the design and writing of tutorial papers as a suitably general and inexpensive vehicle; 3) suggest the software science parameters as appropriate metrics; 4) report two experiments validating the use of these metrics on outlines and prose; and 5) demonstrate that the experiments tended toward the same optimal modularity. The last point appears to offer a quantitative approach to the estimation of the total length or volume (and the mental effort required to produce it) from an early stage of the top-down design process. If results of these experiments are validated elsewhere, then they will provide basic guidelines for the design process. [ABSTRACT FROM AUTHOR]
- Published
- 1979
10. Foreword.
- Author
-
Wong, Harry K. T.
- Subjects
DATABASE management ,STATISTICS ,ARTIFICIAL intelligence ,EXPERT systems ,SOFTWARE engineering ,COMPUTER science - Abstract
The note introduces a special section containing research papers on statistical and scientific database (SSDB) management, which appeared in the October 1985 issue of the periodical "IEEE Transactions on Software Engineering." The collection of papers reflects the diversity of approaches to the solutions of SSDB management, including physical storage organization and the application of artificial intelligence expert system techniques. The first paper is an introduction to the key research issues of SSDB management. The second paper is on data compression. The third paper is on data management support of computational statistical data. The fourth paper is on query facilities for SSDB. The last paper is on applying the production systems techniques in artificial intelligence for performing estimation.
- Published
- 1985
11. A Static Analysis of the NAG Library.
- Author
-
Hennell, Michael A. and Prudom, J. Alan
- Subjects
FORTRAN ,PROGRAMMING languages ,COMPILERS (Computer programs) ,APPLICATION program interfaces ,COMPUTER software ,COMPUTER science - Abstract
This paper reports results obtained from a static analysis of the NAG Mark 4 Fortran numerical algorithms library. The analysis consists of two components. We first record the gross features, such as the frequency with which language constructs occur, and interpret our findings in terms of the NAG coding standards and the properties of the Fortran compilers on which the library normally runs. The second component is a more detailed analysis of the routines' logical structure. This part of the paper is an attempt to determine some of the fundamental characteristics of good software. [ABSTRACT FROM AUTHOR]
- Published
- 1980
12. The Third International Conference on Data Engineering.
- Author
-
Wah, Benjamin W.
- Subjects
SOFTWARE engineering ,CONFERENCES & conventions ,COMPUTER science ,COMPUTER engineering ,ARTIFICIAL intelligence - Abstract
The Third International Conference on Data Engineering was held in Los Angeles, California on February 2 to 6, 1987. The conference was designed as an international forum for bringing together researchers, developers, managers, strategic planners and other users with an interest in the research, design and development of data engineering methodologies, strategies and systems. Its scope includes computer science, artificial intelligence, electrical engineering, and computer engineering. It featured papers from all major areas of data engineering, including database design and modeling, performance evaluation, algorithms, integrity, security, fault tolerance, query language, artificial intelligence approaches, knowledge bases, database machines, distributed databases and data engineering applications.
- Published
- 1988
13. Synthesis of Partial Behavior Models from Properties and Scenarios.
- Author
-
Uchitel, Sebastian, Brunet, Greg, and Chechik, Marsha
- Subjects
COMPUTER software development ,STAKEHOLDERS ,EMAIL systems ,ONLINE education ,COMPUTER science ,SOCIAL computing - Abstract
Synthesis of behavior models from software development artifacts such as scenario-based descriptions or requirements specifications helps reduce the effort of model construction. However, the models favored by existing synthesis approaches are not sufficiently expressive to describe both universal constraints provided by requirements and existential statements provided by scenarios. In this paper, we propose a novel synthesis technique that constructs behavior models in the form of Modal Transition Systems (MTS) from a combination of safety properties and scenarios. MTSs distinguish required, possible, and proscribed behavior, and their elaboration not only guarantees the preservation of the properties and scenarios used for synthesis but also supports further elicitation of new requirements. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
14. Guest Editor's Introduction Experimental Computer Science.
- Author
-
Iyer, Ravi K.
- Subjects
COMPUTER science ,TECHNOLOGY ,COMPUTERS ,SOFTWARE engineering ,COMPUTER software ,COMPUTER systems ,ELECTRONIC systems - Abstract
Experimental research in computer science is a relatively new, yet fast developing area. It is encouraging to see that there is substantial research going on in this important area. A study on experimental evaluation of a reusability-oriented parallel programming environment is given. There is a significant amount of experimental research in the area of computer dependability.
- Published
- 1990
15. Ambiguity in Processing Boolean Queries on TDMS Tree Structures: A Study of Four Different Philosophies.
- Author
-
Hardgrave, W Terry
- Subjects
QUERY languages (Computer science) ,DATA structures ,ELECTRONIC data processing ,DATABASES ,COMPUTER science ,DATABASE management - Abstract
This paper defines and demonstrates four philosophies for processing queries on tree structures; shows that the data semantics of queries should be described by designating sets of nodes from which values for attributes may be returned to the data consumer; shows that the data semantics of database processing can be specified totally independent of any machine, file structure, or implementation; shows that set theory is a natural and effective vehicle for analyzing the semantics of queries on tree structures; and finally, shows that Bolts is an adequate formalism for conveying the semantics of tree structure processing. [ABSTRACT FROM AUTHOR]
- Published
- 1980
16. The Effectiveness of Software Diversity in a Large Population of Programs.
- Author
-
Van der Meulen, Meine J. P. and Revilla, Miguel A.
- Subjects
COMPUTER reliability ,COMPUTER programming ,FAULT tolerance (Engineering) ,PROGRAMMING languages ,COMPUTER simulation ,COMPUTER system failures ,COMPUTER science - Abstract
In this paper, we first present an exploratory analysis of the aspects of multiple-version software diversity using 36,123 programs written to the same specification. We do so within the framework of the theories of Eckhardt and Lee and Littlewood and Miller. We analyze programming faults made, explore failure regions and difficulty functions, and show how effective 1-out-of-2 diversity is and how language diversity increases this effectiveness. The second part of this paper generalizes the findings about 1-out-of-2 diversity and its special case, language diversity, by performing statistical analyses of 89,402 programs written to 60 specifications. Most observations in the exploratory analysis are confirmed; however, although the benefit of language diversity can be observed, its effectiveness appears to be low. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
17. Constructing Interaction Test Suites for Highly-Configurable Systems in the Presence of Constraints: A Greedy Approach.
- Author
-
Cohen, Myra B., Dwyer, Matthew B., and Jiangfan Shi
- Subjects
COMBINATORICS ,CONSTRAINT satisfaction ,PROGRAMMING languages ,ADAPTIVE computing systems ,COMPUTER algorithms ,COMPUTER science - Abstract
Researchers have explored the application of combinatorial interaction testing (CIT) methods to construct samples to drive systematic testing of software system configurations. Applying CIT to highly-configurable software systems is complicated by the fact that, in many such systems, there are constraints between specific configuration parameters that render certain combinations invalid. Many CIT algorithms lack a mechanism to avoid these invalid combinations. In recent work, automated constraint solving methods have been combined with search-based CIT construction methods to address the constraint problem with promising results. However, these techniques can incur a nontrivial overhead. In this paper, we build upon our previous work to develop a family of greedy CIT sample generation algorithms that exploit calculations made by modern Boolean satisfiability (SAT) solvers to prune the search space of the CIT problem. We perform a comparative evaluation of the cost effectiveness of these algorithms on four real-world highly-configurable software systems and on a population of synthetic examples that share the characteristics of these systems. In combination, our techniques reduce the cost of CIT in the presence of constraints to 30 percent of the cost of widely used unconstrained CIT methods without sacrificing the quality of the solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
18. Toward the Reverse Engineering of UML Sequence Diagrams for Distributed Java Software.
- Author
-
Briand, Lionel C., Labiche, Yvan, and Leduc, Johanne
- Subjects
REVERSE engineering ,METHODOLOGY ,UNIFIED modeling language ,SOURCE code ,COMPUTER software ,QUALITY assurance ,CHARTS, diagrams, etc. ,COMPUTER science - Abstract
This paper proposes a methodology and instrumentation infrastructure toward the reverse engineering of UML (Unified Modeling Language) sequence diagrams from dynamic analysis. One motivation is, of course, to help people understand the behavior of systems with no (complete) documentation. However, such reverse-engineered dynamic models can also be used for quality assurance purposes. They can, for example, be compared with design sequence diagrams, and the conformance of the implementation to the design can thus be verified. Furthermore, discrepancies can also suggest failures in meeting the specifications. Due to size constraints, this paper focuses on the distribution aspects of the methodology we propose. We formally define our approach using metamodels and consistency rules. The instrumentation is based on Aspect-Oriented Programming in order to alleviate the effort overhead usually associated with source code instrumentation. A case study is discussed to demonstrate the applicability of the approach on a concrete example. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
19. Design by Contract to Improve Software Vigilance.
- Author
-
Traon, Yves Le, Baudry, Benoit, and Jézéquel, Jean-Marc
- Subjects
SOFTWARE measurement ,OBJECT-oriented methods (Computer science) ,COMPUTER science ,DIAGNOSIS ,QUALITY control ,COMPUTER software quality control ,COMPUTER programming ,MATHEMATICAL models - Abstract
Design by contract is a lightweight technique for embedding elements of formal specification (such as invariants, pre- and postconditions) into an object-oriented design. When contracts are made executable, they can play the role of embedded, online oracles. Executable contracts allow components to be responsive to erroneous states and, thus, may help in detecting and locating faults. In this paper, we define vigilance as the degree to which a program is able to detect an erroneous state at runtime. Diagnosability represents the effort needed to locate a fault once it has been detected. In order to estimate the benefit of using design by contract, we formalize both notions of vigilance and diagnosability as software quality measures. The main steps of measure elaboration are given, from informal definitions of the factors to be measured to the mathematical model of the measures. As is the standard in this domain, the parameters are then fixed through actual measures, based on a mutation analysis in our case. Several measures are presented that reveal and estimate the contribution of contracts to the overall quality of a system in terms of vigilance and diagnosability. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
20. An Empirical Investigation of the Key Factors for Success in Software Process Improvement.
- Author
-
Dybå, Tore
- Subjects
COMPUTER systems ,COMPUTER software ,ORGANIZATION ,INFORMATION technology ,COMPUTER networks ,COMPUTER science - Abstract
Understanding how to implement software process improvement (SPI) successfully is arguably the most challenging issue facing the SPI field today. The SPI literature contains many case studies of successful companies and descriptions of their SPI programs. However, the research efforts to date are limited and inconclusive and without adequate theoretical and psychometric justification. This paper extends and integrates models from prior research by performing an empirical investigation of the key factors for success in SPI. A quantitative survey of 120 software organizations was designed to test the conceptual model and hypotheses of the study. The results indicate that success depends critically on six organizational factors, which explained more than 50 percent of the variance in the outcome variable. The main contribution of the paper is to increase the understanding of the influence of organizational issues by empirically showing that they are at least as important as technology for succeeding with SPI and, thus, to provide researchers and practitioners with important new insights regarding the critical factors of success in SPI. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
21. Retargeting Sequential Image-Processing Programs for Data Parallel Execution.
- Author
-
Baumstark, Jr., Lewis B. and Wills, Linda M.
- Subjects
SOFTWARE compatibility ,COMPUTER software ,COMPUTER systems ,COMPUTER architecture ,COMPUTER science - Abstract
New compact, low-power implementation technologies for processors and imaging arrays can enable a new generation of portable video products. However, software compatibility with large bodies of existing applications written in C prevents more efficient, higher performance data parallel architectures from being used in these embedded products. If this software could be automatically retargeted explicitly for data parallel execution, product designers could incorporate these architectures into embedded products. The key challenge is exposing the parallelism that is inherent in these applications but that is obscured by artifacts imposed by sequential programming languages. This paper presents a recognition-based approach for automatically extracting a data parallel program model from sequential image processing code and retargeting it to data parallel execution mechanisms. The explicitly parallel model presented, called multidimensional data flow (MDDF), captures a model of how operations on data regions (e.g., rows, columns, and tiled blocks) are composed and interact. To extract an MDDF model, a partial recognition technique is used that focuses on identifying array access patterns in loops, transforming only those program elements that hinder parallelization, while leaving the core algorithmic computations intact. The paper presents results of retargeting a set of production programs to a representative data parallel processor array to demonstrate the capacity to extract parallelism using this technique. The retargeted applications yield a potential execution throughput limited only by the number of processing elements, exceeding thousands of instructions per cycle in massively parallel implementations. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
22. Toward Formalizing Domain Modeling Semantics in Language Syntax.
- Author
-
Evermann, Joerg and Wand, Yair
- Subjects
INFORMATION resources ,PROGRAMMING languages ,UNIFIED modeling language ,COMPUTER programming ,COMPUTER software development ,COMPUTER science - Abstract
Information Systems are situated in and are representations of some business or organizational domain. Hence, understanding the application domain is critical to the success of information systems development. To support domain understanding, the application domain is represented in conceptual models. The correctness of conceptual models can affect the development outcome and prevent costly rework during later development stages. This paper proposes a method to restrict the syntax of a modeling language to ensure that only possible configurations of a domain can be modeled, thus increasing the likelihood of creating correct domain models. The proposed method, based on domain ontologies, captures relationships among domain elements via constraints on the language metamodel, thus restricting the set of statements about the domain that can be generated with the language. In effect, this method creates domain specific modeling languages from more generic ones. The method is demonstrated using the Unified Modeling Language (UML). Specifically, it is applied to the subset of UML dealing with object behavior and its applicability is demonstrated on a specific modeling example. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
23. Dynamic Coupling Measurement for Object-Oriented Software.
- Author
-
Arisholm, Erik, Briand, Lionel C., and Føyen, Audun
- Subjects
OBJECT-oriented methods (Computer science) ,COMPUTER software ,COMPUTER science ,ELECTRONIC systems ,DATA transmission systems ,COMPUTER systems - Abstract
The relationships between coupling and external quality factors of object-oriented software have been studied extensively for the past few years. For example, several studies have identified clear empirical relationships between class-level coupling and class fault-proneness. A common way to define and measure coupling is through structural properties and static code analysis. However, because of polymorphism, dynamic binding, and the common presence of unused ("dead") code in commercial software, the resulting coupling measures are imprecise as they do not perfectly reflect the actual coupling taking place among classes at runtime. For example, when using static analysis to measure coupling, it is difficult and sometimes impossible to determine what actual methods can be invoked from a client class if those methods are overridden in the subclasses of the server classes. Coupling measurement has traditionally been performed using static code analysis, because most of the existing work was done on nonobject-oriented code and because dynamic code analysis is more expensive and complex to perform. For modern software systems, however, this focus on static analysis can be problematic because although dynamic binding existed before the advent of object-orientation, its usage has increased significantly in the last decade. This paper describes how coupling can be defined and precisely measured based on dynamic analysis of systems. We refer to this type of coupling as dynamic coupling. An empirical evaluation of the proposed dynamic coupling measures is reported in which we study the relationship of these measures with the change proneness of classes. Data from maintenance releases of a large Java system are used for this purpose. Preliminary results suggest that some dynamic coupling measures are significant indicators of change proneness and that they complement existing coupling measures based on static analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
24. Correction to "A Practical View of Software Measurement and Implementation Experiences Within Motorola".
- Author
-
Leveson, N.C.
- Subjects
SOFTWARE engineering ,COMPUTER multitasking ,MATHEMATICAL logic ,COMPUTER programming ,ELECTRONIC data processing ,COMPUTER science - Abstract
In the paper, "Temporal Logic-Based Deadlock Analysis for Ada," reference [9] was duplicated as reference [10], causing the remaining references to be misnumbered by one. The correctly numbered reference list is given below in its entirety. All reference citations in the paper's text are correct as published.
- Published
- 1993
25. Editorial: Program Transformations.
- Author
-
Balzer, Robert and Cheatham Jr., Thomas E.
- Subjects
PROGRAM transformation ,COMPUTER software ,SOFTWARE engineering ,COMPUTER programming ,COMPUTER science - Abstract
The field of transformations is developing a methodology for manipulating programs, mapping them from one form into another. Transforming a program will preserve some of its properties and alter others. While many choices could be made about which properties to preserve, the transformation field has focused on altering the performance characteristics of programs while preserving their 'semantics.' The papers published in the January 1981 issue of the periodical "IEEE Transactions on Software Engineering" are necessarily limited. Many of the important topics are not treated. The papers were selected to present a flavor for some aspects of the transformation field and to illustrate some research.
- Published
- 1981
26. Guest Editorial.
- Author
-
Ramamoorthy, C. V.
- Subjects
SOFTWARE engineering ,PERIODICALS ,COMPUTER science ,CONFERENCES & conventions ,APPLICATION software ,COMPUTER software - Abstract
Some papers from the 1978 computer science conference, the COMPSAC 78 Conference, are presented in this issue of the "IEEE Transactions on Software Engineering". Computer science, of which software engineering is a very important component, is a multifaceted discipline, and computer applications are providing the major avenues of its growth, just as the physical and biological sciences have been the beacon lights of the growth of the mathematical sciences. The papers in this issue and those presented at COMPSAC 78 portray this emerging thrust of software application experience.
- Published
- 1980
27. Understanding Code Mobility.
- Author
-
Fuggetta, Alfonso, Picco, Gian Pietor, and Vigna, Giovanni
- Subjects
APPLICATION software ,INTERNET ,SCALABILITY ,COMPUTER networks ,INDUSTRIALIZATION ,COMPUTER architecture ,TECHNOLOGY ,COMPUTER science - Abstract
The technologies, architectures, and methodologies traditionally used to develop distributed applications exhibit a variety of limitations and drawbacks when applied to large scale distributed settings (e.g., the Internet). In particular, they fail in providing the desired degree of configurability, scalability, and customizability. To address these issues, researchers are investigating a variety of innovative approaches. The most promising and intriguing ones are those based on the ability of moving code across the nodes of a network, exploiting the notion of mobile code. As an emerging research field, code mobility is generating a growing body of scientific literature and industrial developments. Nevertheless, the field is still characterized by the lack of a sound and comprehensive body of concepts and terms. As a consequence, it is rather difficult to understand, assess, and compare the existing approaches. In turn, this limits our ability to fully exploit them in practice, and to further promote the research work on mobile code. Indeed, a significant symptom of this situation is the lack of a commonly accepted and sound definition of the term "mobile code" itself. This paper presents a conceptual framework for understanding code mobility. The framework is centered around a classification that introduces three dimensions: technologies, design paradigms, and applications. The contribution of the paper is two-fold. First, it provides a set of terms and concepts to understand and compare the approaches based on the notion of mobile code. Second, it introduces criteria and guidelines that support the developer in the identification of the classes of applications that can leverage off of mobile code, in the design of these applications, and finally, in the selection of the most appropriate implementation technologies. The presentation of the classification is intertwined with a review of the state of the art in the field. Finally, the use of the classification is exemplified in a case study. Keywords: mobile code, mobile agent, distributed application, design paradigm. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
28. A Procedure for Analyzing Unbalanced Datasets.
- Author
-
Kitchenham, Barbara
- Subjects
SOFTWARE measurement ,BEST practices ,MATHEMATICAL statistics ,EXPERIMENTAL design ,SOFTWARE engineering ,SOFTWARE productivity ,COMPUTER science - Abstract
This paper describes a procedure for analyzing unbalanced datasets that include many nominal- and ordinal-scale factors. Such datasets are often found in company datasets used for benchmarking and productivity assessment. The two major problems caused by lack of balance are that the impact of factors can be concealed and that spurious impacts can be observed. These effects are examined with the help of two small artificial datasets. The paper proposes a method of forward pass residual analysis to analyze such datasets. The analysis procedure is demonstrated on the artificial datasets and then applied to the COCOMO dataset. The paper ends with a discussion of the advantages and limitations of the analysis procedure. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
29. A Multiframe Model for Real-Time Tasks.
- Author
-
Mok, Aloysius K. and Deji Chen
- Subjects
PRODUCTION scheduling ,COMPUTER science ,REAL-time programming ,COMPUTER algorithms ,COMPUTER systems ,MULTIMEDIA computer applications ,JOB descriptions - Abstract
The well-known periodic task model of Liu and Layland [10] assumes a worst-case execution time bound for every task and may be too pessimistic if the worst-case execution time of a task is much longer than the average. In this paper, we give a multiframe real-time task model which allows the execution time of a task to vary from one instance to another by specifying the execution time of a task in terms of a sequence of numbers. We investigate the schedulability problem for this model for the preemptive fixed priority scheduling policy. We show that a significant improvement in the utilization bound can be established in our model. [ABSTRACT FROM AUTHOR]
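The contrast the abstract draws, between charging a single worst-case bound every period and letting execution time cycle through a sequence of frames, can be sketched in a few lines (a hypothetical illustration with invented task parameters, not the paper's actual schedulability test):

```python
# Hypothetical sketch of the multiframe idea: each task's execution time
# cycles through a list of frames instead of one worst-case bound.
# Classic analysis charges max(frames) every period; the multiframe
# model can exploit the lower average demand.
def peak_utilization(tasks):
    # tasks: list of (frames, period) pairs
    return sum(max(frames) / period for frames, period in tasks)

def average_utilization(tasks):
    return sum(sum(frames) / len(frames) / period for frames, period in tasks)
```

For two invented tasks, ([3, 1, 1], 10) and ([2, 2], 8), the worst-case view charges 0.55 of the processor while the average demand is only about 0.42; that gap is what an improved multiframe utilization bound can exploit.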
- Published
- 1997
- Full Text
- View/download PDF
30. Mechanically Verifying Concurrent Programs with the Boyer-Moore Prover.
- Author
-
Goldschlag, David M.
- Subjects
SEMANTICS ,ALGORITHMS ,ENCODING ,COMPUTER software ,COMPUTER science - Abstract
This paper describes a proof system suitable for the mechanical verification of concurrent programs. Mechanical verification, which uses a computer program to validate a formal proof, increases one's confidence in the correctness of the validated proof. This proof system is based on Unity [12], and may be used to specify and verify both safety and liveness properties. However, it is defined with respect to an operational semantics of the transition system model of concurrency. Proof rules are simply theorems of this operational semantics. This methodology makes a clear distinction between the theorems in the proof system and the logical inference rules and syntax which define the underlying logic. Since this proof system essentially encodes Unity in another sound logic, and this encoding has been mechanically verified, this encoding proves the soundness of this formalization of Unity. This proof system has been mechanically verified by the Boyer-Moore prover, a computer program mechanizing the Boyer-Moore logic [7]. This proof system has been used to mechanically verify the correctness of a distributed algorithm that computes the minimum node value in a tree. This paper also describes this algorithm and its correctness theorems, and presents the key lemmas that aided the mechanical verification. The mechanized proof closely resembles a hand proof, but is longer, since all concepts are defined from first principles. This proof system is suitable for the mechanical verification of a wide class of programs, since the underlying prover, though automatic, is guided by the user. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
31. Debugging Larch Shared Language Specifications.
- Author
-
Garland, Stephen J., Guttag, John V., and Horning, James J.
- Subjects
PROGRAMMING languages ,DEBUGGING ,AUTOMATION ,ELECTRONIC data processing ,ARTIFICIAL languages ,COMPUTER science - Abstract
The Larch family of specification languages supports a two-tiered definitional approach to specification. Each specification has components written in two languages: one designed for a specific programming language and another independent of any programming language. The former are called Larch interface languages, and the latter the Larch Shared Language (LSL). The Larch style of specification emphasizes brevity and clarity rather than executability. To make it possible to test specifications without executing or implementing them, Larch permits specifiers to make claims about logical properties of specifications and to check these claims at specification time. Since these claims are undecidable in the general case, it is impossible to build a tool that will automatically certify claims about arbitrary specifications. However, it is feasible to build tools that assist specifiers in checking claims as they debug specifications. This paper describes the checkability designed into LSL and discusses two tools that help perform the checking. This paper is a revised and expanded version of a paper presented at the April 1990 IFIP Working Conference on Programming Concepts and Methods [7]. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
32. Mechanizing CSP Trace Theory in Higher Order Logic.
- Author
-
Camilleri, Albert John
- Subjects
CSP (Computer program language) ,PROGRAMMING languages ,HUMAN error ,ALGEBRA ,DISTRIBUTED computing ,COMPUTER science - Abstract
The process algebra CSP is widely used for formal reasoning in the areas of concurrency, communication, and distributed systems. Mathematical proof plays a key role in CSP reasoning, but despite this, little mechanical proof support has been developed for CSP to facilitate the exercise and eliminate the risk of human error. In this paper we describe how a mechanized tool for reasoning about CSP can be developed by customizing an existing general-purpose theorem prover based on higher order logic. We investigate how the trace semantics of CSP operators can be mechanized in higher order logic, and show how the laws associated with these operators can be proved from their semantic definitions. The resulting system is one in which natural-deduction style proofs can be conducted using the standard CSP laws. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
33. The Specification and Verified Decomposition of System Requirements Using CSP.
- Author
-
Moore, Andrew P.
- Subjects
COMPUTER systems ,SYSTEMS design ,SYNCHRONIZATION ,SEQUENTIAL processing (Computer science) ,SYSTEM analysis ,COMPUTER science - Abstract
An important principle of building trustworthy systems is to rigorously analyze the critical requirements early in the development process, even before starting system design. Existing proof methods for systems of communicating processes focus on the bottom-up composition of component-level specifications into system-level specifications. Trustworthy system development requires, instead, the top-down derivation of component requirements from the critical system requirements. This paper describes a formal method for decomposing the requirements of a system into requirements of its component processes and a minimal, possibly empty, set of synchronization requirements. The Trace Model of Hoare's Communicating Sequential Processes (CSP) is the basis for the formal method. We apply the method to an abstract voice transmitter and describe the role that the EHDM verification system plays in the transmitter's decomposition. In combination with other verification techniques, we expect that the method defined here will promote the development of more trustworthy systems. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
34. Programmer-Transparent Coordination of Recovering Concurrent Processes: Philosophy and Rules for Efficient Implementation.
- Author
-
Kim, K. H.
- Subjects
COMPUTER programming ,ELECTRONIC data processing ,COMPUTER science ,COMPUTER systems ,COMPUTER engineering ,ALGORITHMS - Abstract
A new approach to coordination of cooperating concurrent processes, each capable of error detection and recovery, is presented. Error detection, rollback, and retry in a process are specified by a well-structured language construct called a recovery block. Recovery points of processes must be properly coordinated to prevent a disastrous avalanche of process rollbacks. In contrast to the previously studied approaches that require the program designer to coordinate the recovery block structures of interacting processes (thereby coordinating the recovery points of processes), the new approach relieves the program designer of that burden. It instead relies upon an intelligent processor system (that runs processes) capable of establishing and discarding the recovery points of interacting processes in a well-coordinated manner such that 1) a process never makes two consecutive rollbacks without making a retry between the two, and 2) every process rollback becomes a minimum-distance rollback. Following the discussion of the underlying philosophy of the new approach, basic rules for reducing storage and time overhead in such a processor system are discussed. Throughout this paper examples are drawn from systems in which processes communicate through monitors. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
35. A Distributed Drafting Algorithm for Load Balancing.
- Author
-
Ni, Lionel M., Chong-Wei Xu, and Gendreau, Thomas B.
- Subjects
DISTRIBUTED computing ,ALGORITHMS ,COMPUTER networks ,DISTRIBUTED operating systems (Computers) ,SOFTWARE engineering ,COMPUTER science - Abstract
It is desirable for the load in a distributed system to be balanced evenly. A dynamic process migration protocol is needed in order to achieve load balancing in a user transparent manner. A distributed algorithm for load balancing which is network topology independent is proposed in this paper. Different network topologies and low-level communications protocols affect the choice of only some system design parameters. The "drafting" algorithm attempts to compromise two contradictory goals: maximize the processor utilization and minimize the communication overhead. The main objective of this paper is to describe the dynamic process migration protocol based on the proposed drafting algorithm. A sample distributed system is used to further illustrate the drafting algorithm and to show how to define system design parameters. The system performance is measured by simulation experiments based on the sample system. [ABSTRACT FROM AUTHOR]
- Published
- 1985
36. Statistical and Scientific Database Issues.
- Author
-
Shoshani, Arie and Wong, Harry K. T.
- Subjects
DATABASE management ,MULTIDIMENSIONAL databases ,USER interfaces ,DATABASES ,ELECTRONIC data processing ,COMPUTER science - Abstract
The purpose of this paper is to summarize the research issues of statistical and scientific databases (SSDB's). It organizes the issues into four major groups: physical organization and access methods, operators, logical organization and user interfaces, and miscellaneous issues. It emphasizes the differences between SSDB's and traditional database applications, and motivates the need for new and innovative techniques for the support of SSDB's. In addition to describing current work in this field, it discusses open research areas and proposes possible approaches to their solution. [ABSTRACT FROM AUTHOR]
- Published
- 1985
37. A Successful Software Development.
- Author
-
Wong, Carolyn
- Subjects
COMPUTER software development ,COMMAND & control systems ,STRUCTURED programming ,STRUCTURED walkthrough (Computer science) ,SOFTWARE engineering ,COMPUTER science - Abstract
In 1980, System Development Corporation (SDC) delivered software for a modern air defense system (ADS) for a foreign country. Development of the ADS software was a successful SDC project in which all products were delivered within budget and within an ambitious 25-month schedule. This paper describes SDC's approach and experience in developing ADS software. SDC's software development approach included the first-time use of an off-the-shelf operating system for a major air defense system, the application of a selective set of modern software development techniques, and use of a matrix management structure. SDC's successful application on ADS of a commercial operating system, a higher order language, a Program Design Language, a Program Production Library, structured walk-throughs, structured programming techniques, incremental build implementation and test procedures, interactive development, and word processing is described. A discussion of the advantages realized and difficulties encountered in the ADS matrix management structure is presented. The paper concludes with a summary of how SDC will develop software on future projects as a result of its experience on ADS. [ABSTRACT FROM AUTHOR]
- Published
- 1984
38. Software Science Revisited: A Critical Analysis of the Theory and Its Empirical Support.
- Author
-
Shen, Vincent Y., Conte, Samuel D., and Dunsmore, H. E.
- Subjects
COMPUTER software ,COMPUTER science ,COMPUTER training ,SOFTWARE engineering - Abstract
The theory of software science was developed by the late M. H. Halstead of Purdue University during the early 1970's. It was first presented in unified form in the monograph Elements of Software Science, published by Elsevier North-Holland in 1977. Since it claimed to apply scientific methods to the very complex and important problem of software production, and since experimental evidence supplied by Halstead and others seemed to support the theory, it drew widespread attention from the computer science community. Some researchers have raised serious questions about the underlying theory of software science. At the same time, experimental evidence supporting some of the metrics continues to be presented. This paper is a critique of the theory as presented by Halstead and a review of experimental results concerning software science metrics published since 1977. [ABSTRACT FROM AUTHOR]
- Published
- 1983
39. The Implementation of Run-Time Diagnostics in Pascal.
- Author
-
Fischer, Charles N. and Leblanc, Richard J.
- Subjects
PASCAL (Computer program language) ,COMPILERS (Computer programs) ,PROGRAMMING languages ,COMPUTER programming ,COMPUTER software ,COMPUTER science - Abstract
This paper considers the role of run-time diagnostic checking in enforcing the rules of the Pascal programming language. Run-time diagnostic checks must be both complete (covering all language requirements) and efficient. Further, such checks should be implemented so that the cost of enforcing the correct use of a given construct is borne by users of that construct. This paper describes simple and efficient mechanisms currently in use with a diagnostic Pascal compiler that monitor the run-time behavior of such sensitive Pascal constructs as pointers, variant records, reference (i.e., var) parameters, and with statements. The use of these mechanisms with related constructs in other languages is considered. Language modifications that simplify run-time checking are also noted. [ABSTRACT FROM AUTHOR]
- Published
- 1980
40. Compile Time Memory Allocation for Parallel Processes.
- Author
-
Bochmann, Gregor V.
- Subjects
DYNAMIC storage allocation (Computer science) ,COMPUTER programming ,COMPUTER science ,MATHEMATICAL analysis ,SOFTWARE engineering - Abstract
This paper discusses the problem of allocating storage for the activation records of procedure calls within a system of parallel processes. A compile time storage allocation scheme is given, which determines the relative address within the memory segment of a process for the activation records of all procedures called by the process. This facilitates the generation of an efficient run-time code. The allocation scheme applies to systems in which data and procedures can be shared among several processes. However, recursive procedure calls are not supported. [ABSTRACT FROM AUTHOR]
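With recursion excluded, the call graph is acyclic, so every activation record can be given a fixed offset at compile time. A speculative sketch of such a scheme (the frame sizes, call graph, and propagation rule are invented for illustration, not Bochmann's algorithm):

```python
# Speculative sketch: with no recursion the call graph is a DAG, so each
# procedure's activation record can get a fixed offset equal to the largest
# (caller offset + caller frame size) over all of its call sites.
def assign_offsets(frame_size, calls, roots):
    offsets = {r: 0 for r in roots}
    changed = True
    while changed:  # simple fixpoint; terminates because offsets only grow
        changed = False
        for caller, callee in calls:
            if caller in offsets:
                cand = offsets[caller] + frame_size[caller]
                if offsets.get(callee, -1) < cand:
                    offsets[callee] = cand
                    changed = True
    return offsets
```

For example, with frames main=8, f=4, g=4 and calls main→f, main→g, f→g, the procedure g ends up at offset 12 so that its record never overlaps a live caller's record on any call chain.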
- Published
- 1978
41. Cover2.
- Subjects
COMPUTER software periodicals ,SOFTWARE engineering ,PERIODICAL editors ,COMPUTER science ,MEMBERSHIP in associations, institutions, etc. ,PUBLICATIONS ,SOCIETIES
- Published
- 2011
- Full Text
- View/download PDF
42. Editorial: A New Editor-in-Chief and the State of TSE.
- Author
-
Knight, John
- Subjects
COMPUTER software industry ,SOFTWARE engineering ,COLLEGE teachers ,COMPUTER science ,ENGINEERING ,INDUSTRIAL costs - Abstract
The editorial introduces the new Editor-in-Chief of the "IEEE Transactions on Software Engineering" journal, Jeff Kramer, professor in the Department of Computing at the Imperial College of Science, Technology, and Medicine in London, England. The status of the journal is discussed, including gratitude for the creativity displayed in the 341 papers received in 2005. The future of the software industry is discussed, along with difficulties in improving reliability and lowering production cost.
- Published
- 2006
- Full Text
- View/download PDF
43. Practical Mutation Testing at Scale: A view from Google
- Author
-
René Just, Goran Petrovic, Marko Ivankovic, and Gordon Fraser
- Subjects
Code review ,Source lines of code ,business.industry ,Computer science ,Code coverage ,Software development ,Python (programming language) ,computer.software_genre ,Machine learning ,Test suite ,Mutation testing ,Artificial intelligence ,business ,computer ,Software ,Codebase ,computer.programming_language - Abstract
Mutation analysis assesses a test suite's adequacy by measuring its ability to detect small artificial faults, systematically seeded into the tested program. Mutation analysis is considered one of the strongest test-adequacy criteria. Mutation testing builds on top of mutation analysis and is a testing technique that uses mutants as test goals to create or improve a test suite. Mutation testing has long been considered intractable because the sheer number of mutants that can be created represents an insurmountable problem, both in terms of human and computational effort. This has hindered the adoption of mutation testing as an industry standard. For example, Google has a codebase of two billion lines of code and more than 150,000,000 tests are executed on a daily basis. The traditional approach to mutation testing does not scale to such an environment; even existing solutions to speed up mutation analysis are insufficient to make it computationally feasible at such a scale. To address these challenges, this paper presents a scalable approach to mutation testing based on the following main ideas: (1) mutation testing is done incrementally, mutating only changed code during code review, rather than the entire code base; (2) mutants are filtered, removing mutants that are likely to be irrelevant to developers, and limiting the number of mutants per line and per code review process; (3) mutants are selected based on the historical performance of mutation operators, further eliminating irrelevant mutants and improving mutant quality. This paper empirically validates the proposed approach by analyzing its effectiveness in a code-review-based setting, used by more than 24,000 developers on more than 1,000 projects. The results show that the proposed approach produces orders of magnitude fewer mutants and that context-based mutant filtering and selection improve mutant quality and actionability.
Overall, the proposed approach represents a mutation testing framework that seamlessly integrates into the software development workflow and is applicable to industrial settings of any size.
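The three ideas enumerated in the abstract can be caricatured in a few lines (a hypothetical sketch, not Google's implementation; the operator names and history scores below are invented):

```python
# Hypothetical sketch of ideas (1)-(3): mutate only lines changed in the
# review, cap mutants per line, and rank mutation operators by their
# historical usefulness to developers.
def select_mutants(changed_lines, operators, history, per_line_cap=1):
    # history maps operator -> fraction of its past mutants developers acted on
    ranked = sorted(operators, key=lambda op: history.get(op, 0.0), reverse=True)
    return [(line, op) for line in changed_lines for op in ranked[:per_line_cap]]
```

Because selection is restricted to the diff and capped per line, the number of mutants shown in a review stays bounded regardless of the size of the surrounding codebase, which is the scalability lever the paper describes.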
- Published
- 2022
44. Mutation Analysis for Cyber-Physical Systems: Scalable Solutions and Results in the Space Domain
- Author
-
Cornejo Olivares, Oscar Eduardo, Pastore, Fabrizio, Briand, Lionel, European Space Agency - ESA [sponsor], European Research Council - ERC [sponsor], NSERC Discovery [sponsor], and Interdisciplinary Centre for Security, Reliability and Trust (SnT) > Software Verification and Validation Lab (SVV Lab) [research center]
- Subjects
FOS: Computer and information sciences ,Computer science [C05] [Engineering, computing & technology] ,mutation testing ,business.industry ,Computer science ,Software development ,Cyber-physical system ,embedded software ,020207 software engineering ,02 engineering and technology ,cyber-physical systems ,Sciences informatiques [C05] [Ingénierie, informatique & technologie] ,Software Engineering (cs.SE) ,Computer Science - Software Engineering ,Embedded software ,Software ,Software quality assurance ,0202 electrical engineering, electronic engineering, information engineering ,Test suite ,space software ,Software verification and validation ,Software system ,business ,Software engineering - Abstract
On-board embedded software developed for spaceflight systems (space software) must adhere to stringent software quality assurance procedures. For example, verification and validation activities are typically performed and assessed by third-party organizations. To further minimize the risk of human mistakes, space agencies, such as the European Space Agency (ESA), are looking for automated solutions for the assessment of software testing activities, which play a crucial role in this context. Over the years, mutation analysis has shown to be a promising solution for the automated assessment of test suites; it consists of measuring the quality of a test suite in terms of the percentage of injected faults leading to a test failure. A number of optimization techniques, addressing scalability and accuracy problems, have been proposed to facilitate the industrial adoption of mutation analysis. However, to date, these scalability and accuracy problems prevent space agencies from enforcing mutation analysis in space software development. In this paper, we enhance mutation analysis optimization techniques to enable their applicability to embedded software and propose a pipeline that successfully integrates them to address the scalability and accuracy issues described above. Further, we report on the largest study involving embedded software systems in the mutation analysis literature. Our research is part of a research project funded by ESA ESTEC involving private companies (GomSpace Luxembourg and LuxSpace) in the space sector. These industry partners provided the case studies reported in this paper; they include an on-board software system managing a microsatellite currently on-orbit, a set of libraries used in deployed cubesats, and a mathematical library certified by ESA. Accepted for publication in IEEE Transactions on Software Engineering.
- Published
- 2022
45. Deep Learning Based Vulnerability Detection: Are We There Yet?
- Author
-
Saikat Chakraborty, Yangruibo Ding, Rahul Krishna, and Baishakhi Ray
- Subjects
FOS: Computer and information sciences ,Data collection ,Computer science ,business.industry ,Deep learning ,media_common.quotation_subject ,Machine learning ,computer.software_genre ,Data modeling ,Software Engineering (cs.SE) ,Computer Science - Software Engineering ,Program analysis ,Software security assurance ,False positive paradox ,Data deduplication ,Artificial intelligence ,business ,Function (engineering) ,computer ,Software ,media_common - Abstract
Automated detection of software vulnerabilities is a fundamental problem in software security. Existing program analysis techniques suffer from either high false positives or high false negatives. Recent progress in Deep Learning (DL) has resulted in a surge of interest in applying DL for automated vulnerability detection. Several recent studies have demonstrated promising results, achieving an accuracy of up to 95% at detecting vulnerabilities. In this paper, we ask, "how well do the state-of-the-art DL-based techniques perform in a real-world vulnerability prediction scenario?". To our surprise, we find that their performance drops by more than 50%. A systematic investigation of what causes such a precipitous performance drop reveals that existing DL-based vulnerability prediction approaches suffer from challenges with the training data (e.g., data duplication, unrealistic distribution of vulnerable classes, etc.) and with the model choices (e.g., simple token-based models). As a result, these approaches often do not learn features related to the actual cause of the vulnerabilities. Instead, they learn unrelated artifacts from the dataset (e.g., specific variable/function names, etc.). Leveraging these empirical findings, we demonstrate how a more principled approach to data collection and model design, based on realistic settings of vulnerability prediction, can lead to better solutions. The resulting tools perform significantly better than the studied baselines: up to a 33.57% boost in precision and a 128.38% boost in recall compared to the best performing model in the literature. Overall, this paper elucidates the potential issues of existing DL-based vulnerability prediction systems and draws a roadmap for future DL-based vulnerability prediction research. In that spirit, we make available all the artifacts supporting our results: https://git.io/Jf6IA. Under review at IEEE Transactions on Software Engineering.
- Published
- 2022
46. Empirically Evaluating the Effect of the Physics of Notations on Model Construction
- Author
-
Mohamed El-Attar
- Subjects
Comprehension ,Process (engineering) ,business.industry ,Computer science ,Empirical evidence ,Software engineering ,business ,Notation ,Software - Abstract
In 2009, Moody introduced nine principles for evaluating, improving and designing cognitively effective notations, called the Physics of Notations [49], motivating many research works ever since and being cited more than 1,250 times at the time of writing this paper. Many research works have adopted the nine principles of the Physics of Notations to improve existing notations or devise new notations. Modeling is a two-step process with the goal of communicating a mental concept from a model constructor (step one) to a model reader (step two). A subset of the research works utilizing the Physics of Notations have empirically validated the cognitive effectiveness of the new notations with their readers. However, there is no empirical evidence on the effect of using Physics of Notations-enabled notations in model construction. This is a serious matter to investigate, as model construction naturally precedes model comprehension. Poorly constructed models can at best be poorly comprehended by their readers, with dire consequences in downstream development activities. This paper reports on three experiments that use software engineering professionals as subjects. The experiments investigate the effect of using notations that adhere to the Physics of Notations principles on model construction efforts. The results do not indicate an outright advantage for model constructors who utilize Physics of Notations-enabled notations in comparison to using the original versions of these notations.
- Published
- 2022
47. Editorial.
- Author
-
Ramamoorthy, C. V.
- Subjects
PERIODICAL editors ,COLLEGE teachers ,SOFTWARE engineering ,COMPUTER science - Abstract
The note announces the addition of Jack Mostow and Peter Ng to the Editorial Board of the periodical "IEEE Transactions on Software Engineering." Jack Mostow is a professor of computer science at Rutgers University and was previously associated with the University of Southern California's Information Sciences Institute as a research scientist in the area of artificial intelligence. Meanwhile, Peter Ng is a professor of computer science at the University of Missouri in Columbia and is also Chairman of its Department of Computer Science. Their appointments were approved by the Editorial Board and the Publications Board, Transactions Advisory Committee and Executive Committee of the IEEE Computer Society.
- Published
- 1985
48. GEA: A Goal-Driven Approach to Discovering Early Aspects.
- Author
-
Lee, Jonathan and Hsu, Kuo-Hsun
- Subjects
FUZZY logic ,MATHEMATICAL logic ,FUZZY systems ,USE cases (Systems engineering) ,SYSTEMS engineering - Abstract
Aspect-oriented software development has become an important development and maintenance approach to software engineering across the requirements, design and implementation phases. However, discovering early aspects from requirements, for a better integration of crosscutting concerns into a target system, is still not well addressed in existing works. In this paper, we propose a Goal-driven Early Aspect approach (called GEA) to discovering early aspects by means of a clustering algorithm in which relationships among goals and use cases are utilized to explore similarity degrees for clustering goals, and total interaction degrees are devised to check the validity of the formation of each cluster. Introducing early aspects not only enhances goal-driven requirements modeling to manage crosscutting concerns, but also provides modularity insights into the analysis and design of software development. Moreover, relationships among goals, represented numerically, are more informative for discovering early aspects and more easily processed computationally than qualitative terms. The proposed approach is illustrated using two problem domains: a meeting scheduler system and a course enrollment system. An experiment is also conducted to evaluate the benefits of the proposed approach, with a Mann-Whitney U-test showing that the difference between using GEA and not using GEA is statistically significant. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
49. A Survey on the Use of Computer Vision to Improve Software Engineering Tasks
- Author
-
Mohammad Bajammal, Andrea Stocco, Ali Mesbah, and Davood Mazinanian
- Subjects
Source code ,Computer science ,business.industry ,media_common.quotation_subject ,Perspective (graphical) ,Field (computer science) ,Variety (cybernetics) ,Software ,Computer vision ,Artificial intelligence ,Software analysis pattern ,business ,Software engineering ,media_common - Abstract
Software engineering (SE) research has traditionally revolved around engineering the source code. However, novel approaches that analyze software through computer vision have been increasingly adopted in SE. These approaches allow analyzing the software from a different complementary perspective other than the source code, and they are used to either complement existing source code-based methods, or to overcome their limitations. The goal of this manuscript is to survey the use of computer vision techniques in SE with the aim of assessing their potential in advancing the field of SE research. We examined an extensive body of literature from top-tier SE venues, as well as venues from closely related fields (machine learning, computer vision, and human-computer interaction). Our inclusion criteria targeted papers applying computer vision techniques that address problems related to any area of SE. We collected an initial pool of 2,716 papers, from which we obtained 66 final relevant papers covering a variety of SE areas. We analyzed what computer vision techniques have been adopted or designed, for what reasons, how they are used, what benefits they provide, and how they are evaluated. Our findings highlight that visual approaches have been adopted in a wide variety of SE tasks, predominantly for effectively tackling software analysis and testing challenges in the web and mobile domains. The results also show a rapid growth trend of the use of computer vision techniques in SE research.
- Published
- 2022
50. Efficient Summary Reuse for Software Regression Verification
- Author
-
Liming Cai, Fei He, and Qianshan Yu
- Subjects
Software ,Computer science ,business.industry ,Linux kernel ,Context (language use) ,Software system ,Software regression ,Reuse ,Software engineering ,business ,Regression ,Counterexample - Abstract
Software systems evolve throughout their life cycles, producing many revisions over time. Verifying each revision from scratch is impractical; regression verification instead reuses intermediate results from previous verification runs. This paper studies regression verification via summary reuse: not only procedure summaries but also loop summaries are reused. The paper proposes a fully automatic regression verification technique in the context of CEGAR, and a lazy counterexample analysis technique is developed to improve the efficiency of summary reuse. We performed extensive experiments on two large sets of industrial programs (3,675 revisions of 488 Linux kernel device drivers). Results show that our summary reuse technique saves 84% to 93% of the analysis time of regression verification.
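The reuse idea can be pictured with a toy cache keyed by a hash of each procedure body (an illustrative sketch only; the paper's summaries are logical formulas rather than verdict strings, and `verify_proc` below stands in for the real CEGAR analysis):

```python
import hashlib

# Toy sketch of summary reuse across revisions: a summary computed for a
# procedure body is reused when the body's hash is unchanged in the new
# revision; only changed procedures are re-analyzed.
def verify_revision(procedures, cache, verify_proc):
    reused, analyzed = 0, 0
    for name, body in procedures.items():
        key = hashlib.sha256(body.encode()).hexdigest()
        if name in cache and cache[name][0] == key:
            reused += 1  # summary still valid, skip re-analysis
        else:
            cache[name] = (key, verify_proc(body))
            analyzed += 1
    return reused, analyzed
```

Run on a revision where most procedures are untouched, almost all work is skipped, which mirrors the 84% to 93% time savings the abstract reports for the real technique.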
- Published
- 2022