319 results
Search Results
2. Guest Editors' Introduction to the Special Issue on the International Conference on Software Maintenance and Evolution.
- Author
-
Gyimóthy, Tibor and Rajlich, Václav
- Subjects
SOFTWARE maintenance; CONFERENCES & conventions; ASSOCIATIONS, institutions, etc.; COMPUTER software; REVERSE engineering; COMPUTER programming; COMPUTER science
- Abstract
The article provides an introduction to this special issue related to software maintenance and evolution. The topics discussed in this issue were developed from papers presented at the International Conference on Software Maintenance and Evolution (ICSM), which was held in Budapest, Hungary from September 26-29, 2005; the articles focus on software maintenance and change. Topics such as reverse engineering, documenting, and programming computer software are discussed.
- Published
- 2006
- Full Text
- View/download PDF
3. Editorial: A Message from the New Editor-in-Chief.
- Author
-
Kramer, Jeff
- Subjects
SOFTWARE engineering; HIGH technology industries; PERIODICALS; COMPUTER engineering; COMPUTER science; PROFESSIONAL peer review; COMPUTER software industry
- Abstract
An editorial is presented by Jeff Kramer, the new Editor-in-Chief of the "IEEE Transactions on Software Engineering" (TSE) journal, regarding the publication's future. A vision statement is presented for the new issues and applications that face the computer engineering industry, and plans are offered to strengthen the Editorial Board. Three proposals are offered to improve the quality of TSE papers, including inviting submissions, being flexible with innovative papers that require further improvements, and the use of case studies.
- Published
- 2006
- Full Text
- View/download PDF
4. A Simple Experiment in Top-Down Design.
- Author
-
Comer, Douglas and Halstead, Maurice H.
- Subjects
COMPUTER software; COMPUTER systems; SOFTWARE engineering; COMPUTER programming; COMPUTER science
- Abstract
In this paper we: 1) discuss the need for quantitatively reproducible experiments in the study of top-down design; 2) propose the design and writing of tutorial papers as a suitably general and inexpensive vehicle; 3) suggest the software science parameters as appropriate metrics; 4) report two experiments validating the use of these metrics on outlines and prose; and 5) demonstrate that the experiments tended toward the same optimal modularity. The last point appears to offer a quantitative approach to the estimation of the total length or volume (and the mental effort required to produce it) from an early stage of the top-down design process. If results of these experiments are validated elsewhere, then they will provide basic guidelines for the design process. [ABSTRACT FROM AUTHOR]
- Published
- 1979
5. Equality to Equals and Unequals: A Revisit of the Equivalence and Nonequivalence Criteria in Class-Level Testing of Object-Oriented Software.
- Author
-
Chen, Huo Yan and Tse, T.H.
- Subjects
OBJECT-oriented programming; OBJECT-oriented methods (Computer science); COMPUTER software testing; ALGEBRA; SOFTWARE engineering
- Abstract
Algebraic specifications have been used in the testing of object-oriented programs and have received much attention since the 1990s. It is generally believed that class-level testing based on algebraic specifications involves two independent aspects: the testing of equivalent and nonequivalent ground terms. Researchers have cited intuitive examples to illustrate the philosophy that even if an implementation satisfies all the requirements specified by the equivalence of ground terms, it may still fail to satisfy some of the requirements specified by the nonequivalence of ground terms. Thus, both the testing of equivalent ground terms and the testing of nonequivalent ground terms have been considered as significant and cannot replace each other. In this paper, we present an innovative finding that, given any canonical specification of a class with proper imports, a complete implementation satisfies all the observationally equivalent ground terms if and only if it satisfies all the observationally nonequivalent ground terms. As a result, these two aspects of software testing cover each other and can therefore replace each other. These findings provide a deeper understanding of software testing based on algebraic specifications, rendering the theory more elegant and complete. We also highlight a couple of important practical implications of our theoretical results. [ABSTRACT FROM PUBLISHER]
- Published
- 2013
- Full Text
- View/download PDF
6. Foreword.
- Author
-
Wong, Harry K. T.
- Subjects
DATABASE management; STATISTICS; ARTIFICIAL intelligence; EXPERT systems; SOFTWARE engineering; COMPUTER science
- Abstract
The note introduces a special section containing research papers on statistical and scientific database (SSDB) management, which appeared in the October 1985 issue of the periodical "IEEE Transactions on Software Engineering." The collection of papers reflects the diversity of approaches to the solutions of SSDB management, including physical storage organization and the application of artificial intelligence expert system techniques. The first paper is an introduction to the key research issues of SSDB management. The second paper is on data compression. The third paper is on data management support of computational statistical data. The fourth paper is on query facilities for SSDB. The last paper is on applying production system techniques from artificial intelligence for performing estimation.
- Published
- 1985
7. A Static Analysis of the NAG Library.
- Author
-
Hennell, Michael A. and Prudom, J. Alan
- Subjects
FORTRAN; PROGRAMMING languages; COMPILERS (Computer programs); APPLICATION program interfaces; COMPUTER software; COMPUTER science
- Abstract
This paper reports results obtained from a static analysis of the NAG Mark 4 Fortran numerical algorithms library. The analysis consists of two components. We first record the gross features, such as the frequency with which language constructs occur, and interpret our findings in terms of the NAG coding standards and the properties of the Fortran compilers on which the library normally runs. The second component is a more detailed analysis of the routines' logical structure. This part of the paper is an attempt to determine some of the fundamental characteristics of good software. [ABSTRACT FROM AUTHOR]
- Published
- 1980
8. The Third International Conference on Data Engineering.
- Author
-
Wah, Benjamin W.
- Subjects
SOFTWARE engineering; CONFERENCES & conventions; COMPUTER science; COMPUTER engineering; ARTIFICIAL intelligence
- Abstract
The Third International Conference on Data Engineering was held in Los Angeles, California on February 2 to 6, 1987. The conference was designed as an international forum for bringing together researchers, developers, managers, strategic planners and other users with an interest in the research, design and development of data engineering methodologies, strategies and systems. Its scope includes computer science, artificial intelligence, electrical engineering, and computer engineering. It featured papers from all major areas of data engineering, including database design and modeling, performance evaluation, algorithms, integrity, security, fault tolerance, query language, artificial intelligence approaches, knowledge bases, database machines, distributed databases and data engineering applications.
- Published
- 1988
9. Synthesis of Partial Behavior Models from Properties and Scenarios.
- Author
-
Uchitel, Sebastian, Brunet, Greg, and Chechik, Marsha
- Subjects
COMPUTER software development; STAKEHOLDERS; EMAIL systems; ONLINE education; COMPUTER science; SOCIAL computing
- Abstract
Synthesis of behavior models from software development artifacts such as scenario-based descriptions or requirements specifications helps reduce the effort of model construction. However, the models favored by existing synthesis approaches are not sufficiently expressive to describe both universal constraints provided by requirements and existential statements provided by scenarios. In this paper, we propose a novel synthesis technique that constructs behavior models in the form of Modal Transition Systems (MTS) from a combination of safety properties and scenarios. MTSs distinguish required, possible, and proscribed behavior, and their elaboration not only guarantees the preservation of the properties and scenarios used for synthesis but also supports further elicitation of new requirements. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
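The required/possible/proscribed distinction described in entry 9 can be made concrete with a small data structure. The following minimal Python sketch uses our own names and a trivial consistency check; it illustrates the MTS idea only, not the authors' synthesis algorithm.

    class MTS:
        # Required transitions must be implemented; "maybe" transitions are
        # merely possible; any transition absent from `possible` is proscribed.
        def __init__(self):
            self.required = set()  # (state, action, state) triples
            self.possible = set()

        def add_required(self, s, a, t):
            self.required.add((s, a, t))
            self.possible.add((s, a, t))  # required behavior is also possible

        def add_maybe(self, s, a, t):
            self.possible.add((s, a, t))

        def is_consistent(self):
            # Every required transition must also be possible.
            return self.required <= self.possible

    m = MTS()
    m.add_required(0, "send", 1)  # behavior demanded by a scenario
    m.add_maybe(1, "retry", 0)    # behavior left open for later elaboration
    assert m.is_consistent()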
10. Guest Editor's Introduction Experimental Computer Science.
- Author
-
Iyer, Ravi K.
- Subjects
COMPUTER science; TECHNOLOGY; COMPUTERS; SOFTWARE engineering; COMPUTER software; COMPUTER systems; ELECTRONIC systems
- Abstract
Experimental research in computer science is a relatively new, yet fast developing area. It is encouraging to see that there is substantial research going on in this important area. A study on the experimental evaluation of a reusability-oriented parallel programming environment is given. There is a significant amount of experimental research in the area of computer dependability.
- Published
- 1990
11. Ambiguity in Processing Boolean Queries on TDMS Tree Structures: A Study of Four Different Philosophies.
- Author
-
Hardgrave, W Terry
- Subjects
QUERY languages (Computer science); DATA structures; ELECTRONIC data processing; DATABASES; COMPUTER science; DATABASE management
- Abstract
This paper defines and demonstrates four philosophies for processing queries on tree structures; shows that the data semantics of queries should be described by designating sets of nodes from which values for attributes may be returned to the data consumer; shows that the data semantics of database processing can be specified totally independent of any machine, file structure, or implementation; shows that set theory is a natural and effective vehicle for analyzing the semantics of queries on tree structures; and finally, shows that Bolts is an adequate formalism for conveying the semantics of tree structure processing. [ABSTRACT FROM AUTHOR]
- Published
- 1980
12. The Effectiveness of Software Diversity in a Large Population of Programs.
- Author
-
Van der Meulen, Meine J. P. and Revilla, Miguel A.
- Subjects
COMPUTER reliability; COMPUTER programming; FAULT tolerance (Engineering); PROGRAMMING languages; COMPUTER simulation; COMPUTER system failures; COMPUTER science
- Abstract
In this paper, we first present an exploratory analysis of the aspects of multiple-version software diversity using 36,123 programs written to the same specification. We do so within the framework of the theories of Eckhardt and Lee and Littlewood and Miller. We analyze programming faults made, explore failure regions and difficulty functions, and show how effective 1-out-of-2 diversity is and how language diversity increases this effectiveness. The second part of this paper generalizes the findings about 1-out-of-2 diversity and its special case language diversity by performing statistical analyses of 89,402 programs written to 60 specifications. Most observations in the exploratory analysis are confirmed; however, although the benefit of language diversity can be observed, its effectiveness appears to be low. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
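The 1-out-of-2 effectiveness question in entry 12 reduces to comparing the single-version failure probability with the probability that both members of a pair fail on the same demand. The Python sketch below shows the bookkeeping on made-up failure data; it is not the paper's dataset or its statistical model.

    import itertools

    # Each program is summarized by the set of demands (test inputs) it fails
    # on; a 1-out-of-2 pair fails only where both members fail. Data invented.
    failure_sets = {
        "p1": {1, 2}, "p2": {2, 3}, "p3": set(), "p4": {1}, "p5": {2},
    }
    DEMANDS = 100  # size of the (hypothetical) demand space

    programs = list(failure_sets.values())
    single = sum(len(f) for f in programs) / (len(programs) * DEMANDS)

    pairs = list(itertools.combinations(programs, 2))
    pair_pfd = sum(len(a & b) for a, b in pairs) / (len(pairs) * DEMANDS)

    print(f"average single-version pfd: {single:.4f}")    # 0.0120
    print(f"average 1-out-of-2 pfd:     {pair_pfd:.4f}")  # 0.0040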
13. Constructing Interaction Test Suites for Highly-Configurable Systems in the Presence of Constraints: A Greedy Approach.
- Author
-
Cohen, Myra B., Dwyer, Matthew B., and Jiangfan Shi
- Subjects
COMBINATORICS; CONSTRAINT satisfaction; PROGRAMMING languages; ADAPTIVE computing systems; ALGORITHMS; COMPUTER science
- Abstract
Researchers have explored the application of combinatorial interaction testing (CIT) methods to construct samples to drive systematic testing of software system configurations. Applying CIT to highly-configurable software systems is complicated by the fact that, in many such systems, there are constraints between specific configuration parameters that render certain combinations invalid. Many CIT algorithms lack a mechanism to avoid these invalid combinations. In recent work, automated constraint solving methods have been combined with search-based CIT construction methods to address the constraint problem with promising results. However, these techniques can incur a nontrivial overhead. In this paper, we build upon our previous work to develop a family of greedy CIT sample generation algorithms that exploit calculations made by modern Boolean satisfiability (SAT) solvers to prune the search space of the CIT problem. We perform a comparative evaluation of the cost effectiveness of these algorithms on four real-world highly-configurable software systems and on a population of synthetic examples that share the characteristics of these systems. In combination, our techniques reduce the cost of CIT in the presence of constraints to 30 percent of the cost of widely used unconstrained CIT methods without sacrificing the quality of the solutions. [ABSTRACT FROM AUTHOR]
- Published
- 2008
- Full Text
- View/download PDF
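The core ingredients of entry 13, pairwise coverage, a greedy construction, and constraint handling, fit in a few lines at toy scale. In the sketch below a plain Python predicate stands in for the SAT solver, and the parameters, domains, and constraint are all invented.

    from itertools import combinations, product

    # Toy greedy pairwise (2-way) test-suite generator with one constraint.
    domains = {"os": ["linux", "win"], "db": ["pg", "mysql"], "cache": ["on", "off"]}

    def valid(cfg):
        # Stand-in for a SAT check: mysql unsupported on linux in this toy system.
        return not (cfg["db"] == "mysql" and cfg["os"] == "linux")

    params = sorted(domains)

    def pairs_of(cfg):
        return {((p, cfg[p]), (q, cfg[q])) for p, q in combinations(params, 2)}

    all_cfgs = [dict(zip(params, vs)) for vs in product(*(domains[p] for p in params))]
    valid_cfgs = [c for c in all_cfgs if valid(c)]
    # Only pairs achievable in some valid full configuration need covering.
    uncovered = set().union(*(pairs_of(c) for c in valid_cfgs))

    suite = []
    while uncovered:
        # Greedy step: pick the valid configuration covering the most new pairs.
        best = max(valid_cfgs, key=lambda c: len(pairs_of(c) & uncovered))
        suite.append(best)
        uncovered -= pairs_of(best)

    for cfg in suite:
        print(cfg)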
14. Toward the Reverse Engineering of UML Sequence Diagrams for Distributed Java Software.
- Author
-
Briand, Lionel C., Labiche, Yvan, and Leduc, Johanne
- Subjects
REVERSE engineering; METHODOLOGY; UNIFIED modeling language; SOURCE code; COMPUTER software; QUALITY assurance; CHARTS, diagrams, etc.; COMPUTER science
- Abstract
This paper proposes a methodology and instrumentation infrastructure toward the reverse engineering of UML (Unified Modeling Language) sequence diagrams from dynamic analysis. One motivation is, of course, to help people understand the behavior of systems with no (complete) documentation. However, such reverse-engineered dynamic models can also be used for quality assurance purposes. They can, for example, be compared with design sequence diagrams, and the conformance of the implementation to the design can thus be verified. Furthermore, discrepancies can also suggest failures in meeting the specifications. Due to size constraints, this paper focuses on the distribution aspects of the methodology we propose. We formally define our approach using metamodels and consistency rules. The instrumentation is based on Aspect-Oriented Programming in order to alleviate the effort overhead usually associated with source code instrumentation. A case study is discussed to demonstrate the applicability of the approach on a concrete example. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
15. Design by Contract to Improve Software Vigilance.
- Author
-
Traon, Yves Le, Baudry, Benoit, and Jézéquel, Jean-Marc
- Subjects
SOFTWARE measurement; OBJECT-oriented methods (Computer science); COMPUTER science; DIAGNOSIS; QUALITY control; COMPUTER software quality control; COMPUTER programming; MATHEMATICAL models
- Abstract
Design by contract is a lightweight technique for embedding elements of formal specification (such as invariants, pre- and postconditions) into an object-oriented design. When contracts are made executable, they can play the role of embedded, online oracles. Executable contracts allow components to be responsive to erroneous states and, thus, may help in detecting and locating faults. In this paper, we define vigilance as the degree to which a program is able to detect an erroneous state at runtime. Diagnosability represents the effort needed to locate a fault once it has been detected. In order to estimate the benefit of using Design by Contract, we formalize both notions of vigilance and diagnosability as software quality measures. The main steps of measure elaboration are given, from informal definitions of the factors to be measured to the mathematical model of the measures. As is the standard in this domain, the parameters are then fixed through actual measures, based on a mutation analysis in our case. Several measures are presented that reveal and estimate the contribution of contracts to the overall quality of a system in terms of vigilance and diagnosability. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
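The "executable contracts as embedded, online oracles" idea from entry 15 is easy to demonstrate. The decorator below is a minimal Python sketch, not the instrumentation or the vigilance/diagnosability measures defined in the paper.

    import functools

    def contract(pre=None, post=None):
        # Executable pre/postconditions acting as runtime oracles (illustrative).
        def wrap(fn):
            @functools.wraps(fn)
            def checked(*args, **kwargs):
                if pre is not None:
                    assert pre(*args, **kwargs), f"precondition of {fn.__name__} violated"
                result = fn(*args, **kwargs)
                if post is not None:
                    assert post(result, *args, **kwargs), f"postcondition of {fn.__name__} violated"
                return result
            return checked
        return wrap

    @contract(pre=lambda x: x >= 0, post=lambda r, x: abs(r * r - x) < 1e-6)
    def my_sqrt(x):
        return x ** 0.5

    my_sqrt(2.0)     # both checks pass
    # my_sqrt(-1.0)  # precondition violation: an erroneous state detected at runtime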
16. An Empirical Investigation of the Key Factors for Success in Software Process Improvement.
- Author
-
Dybå, Tore
- Subjects
COMPUTER systems; COMPUTER software; ORGANIZATION; INFORMATION technology; COMPUTER networks; COMPUTER science
- Abstract
Understanding how to implement software process improvement (SPI) successfully is arguably the most challenging issue facing the SPI field today. The SPI literature contains many case studies of successful companies and descriptions of their SPI programs. However, the research efforts to date are limited and inconclusive and without adequate theoretical and psychometric justification. This paper extends and integrates models from prior research by performing an empirical investigation of the key factors for success in SPI. A quantitative survey of 120 software organizations was designed to test the conceptual model and hypotheses of the study. The results indicate that success depends critically on six organizational factors, which explained more than 50 percent of the variance in the outcome variable. The main contribution of the paper is to increase the understanding of the influence of organizational issues by empirically showing that they are at least as important as technology for succeeding with SPI and, thus, to provide researchers and practitioners with important new insights regarding the critical factors of success in SPI. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
17. Retargeting Sequential Image-Processing Programs for Data Parallel Execution.
- Author
-
Baumstark, Jr., Lewis B. and Wills, Linda M.
- Subjects
SOFTWARE compatibility; COMPUTER software; COMPUTER systems; COMPUTER architecture; COMPUTER science
- Abstract
New compact, low-power implementation technologies for processors and imaging arrays can enable a new generation of portable video products. However, software compatibility with large bodies of existing applications written in C prevents more efficient, higher performance data parallel architectures from being used in these embedded products. If this software could be automatically retargeted explicitly for data parallel execution, product designers could incorporate these architectures into embedded products. The key challenge is exposing the parallelism that is inherent in these applications but that is obscured by artifacts imposed by sequential programming languages. This paper presents a recognition-based approach for automatically extracting a data parallel program model from sequential image processing code and retargeting it to data parallel execution mechanisms. The explicitly parallel model presented, called multidimensional data flow (MDDF), captures a model of how operations on data regions (e.g., rows, columns, and tiled blocks) are composed and interact. To extract an MDDF model, a partial recognition technique is used that focuses on identifying array access patterns in loops, transforming only those program elements that hinder parallelization, while leaving the core algorithmic computations intact. The paper presents results of retargeting a set of production programs to a representative data parallel processor array to demonstrate the capacity to extract parallelism using this technique. The retargeted applications yield a potential execution throughput limited only by the number of processing elements, exceeding thousands of instructions per cycle in massively parallel implementations. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
18. Toward Formalizing Domain Modeling Semantics in Language Syntax.
- Author
-
Evermann, Joerg and Wand, Yair
- Subjects
INFORMATION resources; PROGRAMMING languages; UNIFIED modeling language; COMPUTER programming; COMPUTER software development; COMPUTER science
- Abstract
Information Systems are situated in and are representations of some business or organizational domain. Hence, understanding the application domain is critical to the success of information systems development. To support domain understanding, the application domain is represented in conceptual models. The correctness of conceptual models can affect the development outcome and prevent costly rework during later development stages. This paper proposes a method to restrict the syntax of a modeling language to ensure that only possible configurations of a domain can be modeled, thus increasing the likelihood of creating correct domain models. The proposed method, based on domain ontologies, captures relationships among domain elements via constraints on the language metamodel, thus restricting the set of statements about the domain that can be generated with the language. In effect, this method creates domain specific modeling languages from more generic ones. The method is demonstrated using the Unified Modeling Language (UML). Specifically, it is applied to the subset of UML dealing with object behavior and its applicability is demonstrated on a specific modeling example. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
19. Dynamic Coupling Measurement for Object-Oriented Software.
- Author
-
Arisholm, Erik, Briand, Lionel C., and Føyen, Audun
- Subjects
OBJECT-oriented methods (Computer science); COMPUTER software; COMPUTER science; ELECTRONIC systems; DATA transmission systems; COMPUTER systems
- Abstract
The relationships between coupling and external quality factors of object-oriented software have been studied extensively for the past few years. For example, several studies have identified clear empirical relationships between class-level coupling and class fault-proneness. A common way to define and measure coupling is through structural properties and static code analysis. However, because of polymorphism, dynamic binding, and the common presence of unused ("dead") code in commercial software, the resulting coupling measures are imprecise as they do not perfectly reflect the actual coupling taking place among classes at runtime. For example, when using static analysis to measure coupling, it is difficult and sometimes impossible to determine what actual methods can be invoked from a client class if those methods are overridden in the subclasses of the server classes. Coupling measurement has traditionally been performed using static code analysis, because most of the existing work was done on non-object-oriented code and because dynamic code analysis is more expensive and complex to perform. For modern software systems, however, this focus on static analysis can be problematic because although dynamic binding existed before the advent of object-orientation, its usage has increased significantly in the last decade. This paper describes how coupling can be defined and precisely measured based on dynamic analysis of systems. We refer to this type of coupling as dynamic coupling. An empirical evaluation of the proposed dynamic coupling measures is reported in which we study the relationship of these measures with the change proneness of classes. Data from maintenance releases of a large Java system are used for this purpose. Preliminary results suggest that some dynamic coupling measures are significant indicators of change proneness and that they complement existing coupling measures based on static analysis. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF
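A trace-based view of entry 19's dynamic coupling can be sketched in a few lines: collect (caller class, callee class) events at runtime, then count distinct partners per class. The event trace and the import/export naming below are illustrative, not the paper's formal measure definitions.

    from collections import defaultdict

    trace = [  # (caller class, callee class) events, invented for illustration
        ("Order", "Customer"), ("Order", "Inventory"),
        ("Order", "Customer"), ("Invoice", "Customer"),
    ]

    servers = defaultdict(set)  # distinct classes each class calls (import)
    clients = defaultdict(set)  # distinct classes calling each class (export)
    for caller, callee in trace:
        if caller != callee:  # ignore self-calls
            servers[caller].add(callee)
            clients[callee].add(caller)

    for cls in sorted(set(servers) | set(clients)):
        print(f"{cls}: import coupling {len(servers[cls])}, "
              f"export coupling {len(clients[cls])}")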
20. Correction to "A Practical View of Software Measurement and Implementation Experiences Within Motorola".
- Author
-
Leveson, N.C.
- Subjects
SOFTWARE engineering; COMPUTER multitasking; MATHEMATICAL logic; COMPUTER programming; ELECTRONIC data processing; COMPUTER science
- Abstract
In the paper, "Temporal Logic-Based Deadlock Analysis for Ada," reference [9] was duplicated as reference [10], causing the remaining references to be misnumbered by one. The correctly numbered reference list is given below in its entirety. All reference citations in the paper's text are correct as published.
- Published
- 1993
21. Editorial: Program Transformations.
- Author
-
Balzer, Robert and Cheatham Jr., Thomas E.
- Subjects
PROGRAM transformation; COMPUTER software; SOFTWARE engineering; COMPUTER programming; COMPUTER science
- Abstract
The field of transformations is developing a methodology for manipulating programs, mapping them from one form into another. Transforming a program will preserve some of its properties and alter others. While many choices could be made about which properties to preserve, the transformation field has focused on altering the performance characteristics of programs while preserving their 'semantics.' The papers published in the January 1981 issue of the periodical "IEEE Transactions on Software Engineering" are necessarily limited. Many of the important topics are not treated. The papers were selected to present a flavor of some aspects of the transformation field and to illustrate some research.
- Published
- 1981
22. Guest Editorial.
- Author
-
Ramamoorthy, C. V.
- Subjects
SOFTWARE engineering; PERIODICALS; COMPUTER science; CONFERENCES & conventions; APPLICATION software; COMPUTER software
- Abstract
Some papers from the 1978 computer science conference, COMPSAC 78, are presented in this issue of the "IEEE Transactions on Software Engineering." Computer science, of which software engineering is a very important component, is a multifaceted discipline, and computer applications are providing the major avenues of its growth, just as the physical and biological sciences have been the beacon lights of the growth of the mathematical sciences. The papers in this issue and those presented at COMPSAC 78 portray this emerging thrust of software application experience.
- Published
- 1980
23. Debugging Larch Shared Language Specifications.
- Author
-
Garland, Stephen J., Guttag, John V., and Horning, James J.
- Subjects
PROGRAMMING languages; DEBUGGING; AUTOMATION; ELECTRONIC data processing; ARTIFICIAL languages; COMPUTER science
- Abstract
The Larch family of specification languages supports a two-tiered definitional approach to specification. Each specification has components written in two languages: one designed for a specific programming language and another independent of any programming language. The former are called Larch interface languages, and the latter the Larch Shared Language (LSL). The Larch style of specification emphasizes brevity and clarity rather than executability. To make it possible to test specifications without executing or implementing them, Larch permits specifiers to make claims about logical properties of specifications and to check these claims at specification time. Since these claims are undecidable in the general case, it is impossible to build a tool that will automatically certify claims about arbitrary specifications. However, it is feasible to build tools that assist specifiers in checking claims as they debug specifications. This paper describes the checkability designed into LSL and discusses two tools that help perform the checking. This paper is a revised and expanded version of a paper presented at the April 1990 IFIP Working Conference on Programming Concepts and Methods [7]. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
24. Mechanically Verifying Concurrent Programs with the Boyer-Moore Prover.
- Author
-
Goldschlag, David M.
- Subjects
SEMANTICS; ALGORITHMS; ENCODING; COMPUTER software; COMPUTER science
- Abstract
This paper describes a proof system suitable for the mechanical verification of concurrent programs. Mechanical verification, which uses a computer program to validate a formal proof, increases one's confidence in the correctness of the validated proof. This proof system is based on Unity [12], and may be used to specify and verify both safety and liveness properties. However, it is defined with respect to an operational semantics of the transition system model of concurrency. Proof rules are simply theorems of this operational semantics. This methodology makes a clear distinction between the theorems in the proof system and the logical inference rules and syntax which define the underlying logic. Since this proof system essentially encodes Unity in another sound logic, and this encoding has been mechanically verified, this encoding proves the soundness of this formalization of Unity. This proof system has been mechanically verified by the Boyer-Moore prover, a computer program mechanizing the Boyer-Moore logic [7]. This proof system has been used to mechanically verify the correctness of a distributed algorithm that computes the minimum node value in a tree. This paper also describes this algorithm and its correctness theorems, and presents the key lemmas that aided the mechanical verification. The mechanized proof closely resembles a hand proof, but is longer, since all concepts are defined from first principles. This proof system is suitable for the mechanical verification of a wide class of programs, since the underlying prover, though automatic, is guided by the user. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
25. Mechanizing CSP Trace Theory in Higher Order Logic.
- Author
-
Camilleri, Albert John
- Subjects
CSP (Computer program language); PROGRAMMING languages; HUMAN error; ALGEBRA; DISTRIBUTED computing; COMPUTER science
- Abstract
The process algebra CSP is widely used for formal reasoning in the areas of concurrency, communication, and distributed systems. Mathematical proof plays a key role in CSP reasoning, but despite this, little mechanical proof support has been developed for CSP to facilitate the exercise and eliminate the risk of human error. In this paper we describe how a mechanized tool for reasoning about CSP can be developed by customizing an existing general-purpose theorem prover based on higher order logic. We investigate how the trace semantics of CSP operators can be mechanized in higher order logic, and show how the laws associated with these operators can be proved from their semantic definitions. The resulting system is one in which natural-deduction style proofs can be conducted using the standard CSP laws. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
26. The Specification and Verified Decomposition of System Requirements Using CSP.
- Author
-
Moore, Andrew P.
- Subjects
COMPUTER systems; SYSTEMS design; SYNCHRONIZATION; SEQUENTIAL processing (Computer science); SYSTEM analysis; COMPUTER science
- Abstract
An important principle of building trustworthy systems is to rigorously analyze the critical requirements early in the development process, even before starting system design. Existing proof methods for systems of communicating processes focus on the bottom-up composition of component-level specifications into system-level specifications. Trustworthy system development requires, instead, the top-down derivation of component requirements from the critical system requirements. This paper describes a formal method for decomposing the requirements of a system into requirements of its component processes and a minimal, possibly empty, set of synchronization requirements. The Trace Model of Hoare's Communicating Sequential Processes (CSP) is the basis for the formal method. We apply the method to an abstract voice transmitter and describe the role that the EHDM verification system plays in the transmitter's decomposition. In combination with other verification techniques, we expect that the method defined here will promote the development of more trustworthy systems. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
27. Programmer-Transparent Coordination of Recovering Concurrent Processes: Philosophy and Rules for Efficient Implementation.
- Author
-
Kim, K. H.
- Subjects
COMPUTER programming; ELECTRONIC data processing; COMPUTER science; COMPUTER systems; COMPUTER engineering; ALGORITHMS
- Abstract
A new approach to coordination of cooperating concurrent processes, each capable of error detection and recovery, is presented. Error detection, rollback, and retry in a process are specified by a well-structured language construct called recovery block. Recovery points of processes must be properly coordinated to prevent a disastrous avalanche of process rollbacks. In contrast to the previously studied approaches that require the program designer to coordinate the recovery block structures of interacting processes (thereby coordinating the recovery points of processes), the new approach relieves the program designer of that burden. It instead relies upon an intelligent processor system (that runs processes) capable of establishing and discarding the recovery points of interacting processes in a well coordinated manner such that 1) a process never makes two consecutive rollbacks without making a retry between the two, and 2) every process rollback becomes a minimum-distance rollback. Following the discussion of the underlying philosophy of the new approach, basic rules for reducing storage and time overhead in such a processor system are discussed. Throughout this paper examples are drawn from systems in which processes communicate through monitors. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
28. Statistical and Scientific Database Issues.
- Author
-
Shoshani, Arie and Wong, Harry K. T.
- Subjects
DATABASE management; MULTIDIMENSIONAL databases; USER interfaces; DATABASES; ELECTRONIC data processing; COMPUTER science
- Abstract
The purpose of this paper is to summarize the research issues of statistical and scientific databases (SSDB's). It organizes the issues into four major groups: physical organization and access methods, operators, logical organization and user interfaces, and miscellaneous issues. It emphasizes the differences between SSDB's and traditional database applications, and motivates the need for new and innovative techniques for the support of SSDB's. In addition to describing current work in this field, it discusses open research areas and proposes possible approaches to their solution. [ABSTRACT FROM AUTHOR]
- Published
- 1985
29. A Distributed Drafting Algorithm for Load Balancing.
- Author
-
Ni, Lionel M., Chong-Wei Xu, and Gendreau, Thomas B.
- Subjects
DISTRIBUTED computing; ALGORITHMS; COMPUTER networks; DISTRIBUTED operating systems (Computers); SOFTWARE engineering; COMPUTER science
- Abstract
It is desirable for the load in a distributed system to be balanced evenly. A dynamic process migration protocol is needed in order to achieve load balancing in a user transparent manner. A distributed algorithm for load balancing which is network topology independent is proposed in this paper. Different network topologies and low-level communications protocols affect the choice of only some system design parameters. The "drafting" algorithm attempts to compromise two contradictory goals: maximize the processor utilization and minimize the communication overhead. The main objective of this paper is to describe the dynamic process migration protocol based on the proposed drafting algorithm. A sample distributed system is used to further illustrate the drafting algorithm and to show how to define system design parameters. The system performance is measured by simulation experiments based on the sample system. [ABSTRACT FROM AUTHOR]
- Published
- 1985
30. A Successful Software Development.
- Author
-
Wong, Carolyn
- Subjects
COMPUTER software development; COMMAND & control systems; STRUCTURED programming; STRUCTURED walkthrough (Computer science); SOFTWARE engineering; COMPUTER science
- Abstract
In 1980, System Development Corporation (SDC) delivered software for a modern air defense system (ADS) for a foreign country. Development of the ADS software was a successful SDC project where all products were delivered within budget and within an ambitious 25 month schedule. This paper describes SDC's approach and experience in developing ADS software. SDC's software development approach included the first-time use of an off-the-shelf operating system for a major air defense system, the application of a selective set of modern software development techniques, and use of a matrix management structure. SDC's successful application on ADS of a commercial operating system, a higher order language, a Program Design Language, a Program Production Library, structured walk-throughs, structured programming techniques, incremental build implementation and test procedures, interactive development, and word processing is described. A discussion of the advantages realized and difficulties encountered in the ADS matrix management structure is presented. The paper concludes with a summary of how SDC will develop software on future projects as a result of its experience on ADS. [ABSTRACT FROM AUTHOR]
- Published
- 1984
31. Software Science Revisited: A Critical Analysis of the Theory and Its Empirical Support.
- Author
-
Shen, Vincent Y., Conte, Samuel D., and Dunsmore, H. E.
- Subjects
COMPUTER software; COMPUTER science; COMPUTER training; SOFTWARE engineering
- Abstract
The theory of software science was developed by the late M. H. Halstead of Purdue University during the early 1970's. It was first presented in unified form in the monograph Elements of Software Science published by Elsevier North-Holland in 1977. Since it claimed to apply scientific methods to the very complex and important problem of software production, and since experimental evidence supplied by Halstead and others seemed to support the theory, it drew widespread attention from the computer science community. Some researchers have raised serious questions about the underlying theory of software science. At the same time, experimental evidence supporting some of the metrics continues to be presented. This paper is a critique of the theory as presented by Halstead and a review of experimental results concerning software science metrics published since 1977. [ABSTRACT FROM AUTHOR]
- Published
- 1983
32. The Implementation of Run-Time Diagnostics in Pascal.
- Author
-
Fischer, Charles N. and Leblanc, Richard J.
- Subjects
PASCAL (Computer program language); COMPILERS (Computer programs); PROGRAMMING languages; COMPUTER programming; COMPUTER software; COMPUTER science
- Abstract
This paper considers the role of run-time diagnostic checking in enforcing the rules of the Pascal programming language. Run-time diagnostic checks must be both complete (covering all language requirements) and efficient. Further, such checks should be implemented so that the cost of enforcing the correct use of a given construct is borne by users of that construct. This paper describes simple and efficient mechanisms currently in use with a diagnostic Pascal compiler that monitor the run-time behavior of such sensitive Pascal constructs as pointers, variant records, reference (i.e., var) parameters, and with statements. The use of these mechanisms with related constructs in other languages is considered. Language modifications that simplify run-time checking are also noted. [ABSTRACT FROM AUTHOR]
- Published
- 1980
33. Compile Time Memory Allocation for Parallel Processes.
- Author
-
Bochmann, Gregor V.
- Subjects
DYNAMIC storage allocation (Computer science); COMPUTER programming; COMPUTER science; MATHEMATICAL analysis; SOFTWARE engineering
- Abstract
This paper discusses the problem of allocating storage for the activation records of procedure calls within a system of parallel processes. A compile time storage allocation scheme is given, which determines the relative address within the memory segment of a process for the activation records of all procedures called by the process. This facilitates the generation of an efficient run-time code. The allocation scheme applies to systems in which data and procedures can be shared among several processes. However, recursive procedure calls are not supported. [ABSTRACT FROM AUTHOR]
- Published
- 1978
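The compile-time scheme of entry 33 admits a compact illustration: with recursion excluded, the call graph is acyclic, and an activation record can be placed just past the records of every possible caller, so procedures that are never simultaneously active share storage. The call graph, sizes, and placement rule below are a hypothetical reading of such a scheme, not Bochmann's actual algorithm.

    calls = {"main": ["f", "g"], "f": ["h"], "g": ["h"], "h": []}  # acyclic
    size = {"main": 32, "f": 16, "g": 24, "h": 8}  # activation-record sizes

    offset = {"main": 0}
    for proc in ["f", "g", "h"]:  # a topological order of the call graph
        callers = [p for p, callees in calls.items() if proc in callees]
        # Place the record past the end of every possible caller's record.
        offset[proc] = max(offset[c] + size[c] for c in callers)

    print(offset)  # {'main': 0, 'f': 32, 'g': 32, 'h': 56}; f and g overlap safely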
34. Understanding Code Mobility.
- Author
-
Fuggetta, Alfonso, Picco, Gian Pietor, and Vigna, Giovanni
- Subjects
APPLICATION software; INTERNET; SCALABILITY; COMPUTER networks; INDUSTRIALIZATION; COMPUTER architecture; TECHNOLOGY; COMPUTER science
- Abstract
The technologies, architectures, and methodologies traditionally used to develop distributed applications exhibit a variety of limitations and drawbacks when applied to large scale distributed settings (e.g., the Internet). In particular, they fail in providing the desired degree of configurability, scalability, and customizability. To address these issues, researchers are investigating a variety of innovative approaches. The most promising and intriguing ones are those based on the ability of moving code across the nodes of a network, exploiting the notion of mobile code. As an emerging research field, code mobility is generating a growing body of scientific literature and industrial developments. Nevertheless, the field is still characterized by the lack of a sound and comprehensive body of concepts and terms. As a consequence, it is rather difficult to understand, assess, and compare the existing approaches. In turn, this limits our ability to fully exploit them in practice, and to further promote the research work on mobile code. Indeed, a significant symptom of this situation is the lack of a commonly accepted and sound definition of the term "mobile code" itself. This paper presents a conceptual framework for understanding code mobility. The framework is centered around a classification that introduces three dimensions: technologies, design paradigms, and applications. The contribution of the paper is two-fold. First, it provides a set of terms and concepts to understand and compare the approaches based on the notion of mobile code. Second, it introduces criteria and guidelines that support the developer in the identification of the classes of applications that can leverage off of mobile code, in the design of these applications, and finally, in the selection of the most appropriate implementation technologies. The presentation of the classification is intertwined with a review of the state of the art in the field. Finally, the use of the classification is exemplified in a case study. Keywords: mobile code, mobile agent, distributed application, design paradigm. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
35. A Procedure for Analyzing Unbalanced Datasets.
- Author
-
Kitchenham, Barbara
- Subjects
SOFTWARE measurement; BEST practices; MATHEMATICAL statistics; EXPERIMENTAL design; SOFTWARE engineering; SOFTWARE productivity; COMPUTER science
- Abstract
This paper describes a procedure for analyzing unbalanced datasets that include many nominal- and ordinal-scale factors. Such datasets are often found in company datasets used for benchmarking and productivity assessment. The two major problems caused by lack of balance are that the impact of factors can be concealed and that spurious impacts can be observed. These effects are examined with the help of two small artificial datasets. The paper proposes a method of forward pass residual analysis to analyze such datasets. The analysis procedure is demonstrated on the artificial datasets and then applied to the COCOMO dataset. The paper ends with a discussion of the advantages and limitations of the analysis procedure. [ABSTRACT FROM AUTHOR]
- Published
- 1998
- Full Text
- View/download PDF
36. A Multiframe Model for Real-Time Tasks.
- Author
-
Mok, Aloysius K. and Deji Chen
- Subjects
PRODUCTION scheduling; COMPUTER science; REAL-time programming; ALGORITHMS; COMPUTER systems; MULTIMEDIA computer applications; JOB descriptions
- Abstract
The well-known periodic task model of Liu and Layland [10] assumes a worst-case execution time bound for every task and may be too pessimistic if the worst-case execution time of a task is much longer than the average. In this paper, we give a multiframe real-time task model which allows the execution time of a task to vary from one instance to another by specifying the execution time of a task in terms of a sequence of numbers. We investigate the schedulability problem for this model for the preemptive fixed priority scheduling policy. We show that a significant improvement in the utilization bound can be established in our model. [ABSTRACT FROM AUTHOR]
- Published
- 1997
- Full Text
- View/download PDF
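Entry 36's motivation is easy to quantify: under the classical worst-case assumption every job is charged its peak execution time, while the multiframe model lets execution times cycle through a known sequence. The sketch below contrasts the two utilization figures on invented task parameters; it is not Mok and Chen's schedulability test.

    # Each task is (frame_times, period); execution time cycles through frames.
    tasks = [([3, 1, 1], 5), ([2, 1], 4)]

    worst_case = sum(max(frames) / period for frames, period in tasks)
    average = sum(sum(frames) / len(frames) / period for frames, period in tasks)

    print(f"worst-case (every job at peak): {worst_case:.2f}")  # 1.10
    print(f"multiframe average utilization: {average:.2f}")     # 0.71

The gap between the two figures is the pessimism the multiframe model is designed to remove.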
37. Cover2.
- Subjects
COMPUTER software periodicals; SOFTWARE engineering; PERIODICAL editors; COMPUTER science; MEMBERSHIP in associations, institutions, etc.; PUBLICATIONS; SOCIETIES
- Published
- 2011
- Full Text
- View/download PDF
38. Editorial: A New Editor-in-Chief and the State of TSE.
- Author
-
Knight, John
- Subjects
COMPUTER software industry; SOFTWARE engineering; COLLEGE teachers; COMPUTER science; ENGINEERING; INDUSTRIAL costs
- Abstract
The editorial introduces the new Editor-in-Chief of the "IEEE Transactions on Software Engineering" journal, Jeff Kramer, professor in the Department of Computing at the Imperial College of Science, Technology, and Medicine in London, England. The status of the journal is discussed, including gratitude for the creativity displayed in the 341 papers received in 2005. The future of the software industry is discussed, along with difficulties in improving reliability and lowering production cost.
- Published
- 2006
- Full Text
- View/download PDF
39. Editorial.
- Author
-
Ramamoorthy, C. V.
- Subjects
PERIODICAL editors; COLLEGE teachers; SOFTWARE engineering; COMPUTER science
- Abstract
The note announces the addition of Jack Mostow and Peter Ng to the Editorial Board of the periodical "IEEE Transactions on Software Engineering." Jack Mostow is a professor of computer science at Rutgers University and was previously associated with the University of Southern California's Information Sciences Institute as a research scientist in the area of artificial intelligence. Meanwhile, Peter Ng is a professor of computer science at the University of Missouri in Columbia and is also Chairman of its Department of Computer Science. Their appointments were approved by the Editorial Board and the Publications Board, Transactions Advisory Committee and Executive Committee of the IEEE Computer Society.
- Published
- 1985
40. GEA: A Goal-Driven Approach to Discovering Early Aspects.
- Author
-
Lee, Jonathan and Hsu, Kuo-Hsun
- Subjects
FUZZY logic; MATHEMATICAL logic; FUZZY systems; USE cases (Systems engineering); SYSTEMS engineering
- Abstract
Aspect-oriented software development has become an important development and maintenance approach to software engineering across the requirements, design, and implementation phases. However, discovering early aspects from requirements for a better integration of crosscutting concerns into a target system is still not well addressed in the existing works. In this paper, we propose a Goal-driven Early Aspect approach (called GEA) to discovering early aspects by means of a clustering algorithm in which relationships among goals and use cases are utilized to explore similarity degrees of clustering goals, and total interaction degrees are devised to check the validity of the formation of each cluster. Introducing early aspects not only enhances goal-driven requirements modeling to manage crosscutting concerns, but also provides modularity insights into the analysis and design of software development. Moreover, relationships among goals represented numerically are more informative for discovering early aspects and easier to process computationally than qualitative terms. The proposed approach is illustrated using two problem domains: a meeting scheduler system and a course enrollment system. An experiment is also conducted to evaluate the benefits of the proposed approach, using a Mann-Whitney U-test to show that the difference between using GEA and not using GEA is statistically significant. [ABSTRACT FROM PUBLISHER]
- Published
- 2014
- Full Text
- View/download PDF
41. Comments on "Temporal Logic-Based Deadlock Analysis for Ada".
- Author
-
Young, Michal, Levine, David L., and Taylor, Richard N.
- Subjects
COMPUTER multitasking; SOFTWARE engineering; MATHEMATICAL logic; COMPUTER programming; ELECTRONIC data processing; COMPUTER science
- Abstract
Karam and Buhr describe a deadlock analysis technique based on enumeration of paths in a concurrent program. They also attempt to classify prior work on analysis techniques for concurrent systems, and to compare their technique to the prior work. Several flaws in the article "Temporal Logic-Based Deadlock Analysis for Ada" are pointed out. The characterization of operational and axiomatic proof methods is muddled and inaccurate. The classification of modeling techniques for concurrent systems confuses the distinction between state-based and event-based models with the essential distinction between explicit enumeration of behaviors and symbolic manipulation of properties. The statements about the limitations of linear-time temporal logic vis-à-vis nondeterminism are inaccurate.
- Published
- 1993
- Full Text
- View/download PDF
42. Understanding Exception Handling: Viewpoints of Novices and Experts.
- Author
-
Shah, Hina B., Görg, Carsten, and Harrold, Mary Jean
- Subjects
COMPUTER software development; COMPUTER software developers; SOFTWARE engineering; COMPUTER science; INFORMATION technology
- Abstract
Several recent studies indicate that many industrial applications exhibit poor quality in the design of exception-handling. To improve the quality of error-handling, we need to understand the problems and obstacles that developers face when designing and implementing exception-handling. In this paper, we present our research on understanding the viewpoint of developers--novices and experts--toward exception-handling. First, we conducted a study with novice developers in industry. The study results reveal that novices tend to ignore exceptions because of the complex nature of exception-handling. Then, we conducted a second study with experts in industry to understand their perspective on exception-handling. The study results show that, for experts, exception-handling is a crucial part in the development process. Experts also confirm the novices' approach of ignoring exception-handling and provide insights as to why novices do so. After analyzing the study data, we identified factors that influence experts' strategy selection process for handling exceptions and then built a model that represents a strategy selection process experts use to handle exceptions. Our model is based on interacting modules and fault scope. We conclude with some recommendations to help novices improve their understanding of exception-handling. [ABSTRACT FROM AUTHOR]
- Published
- 2010
- Full Text
- View/download PDF
43. Mutation Operators for Spreadsheets.
- Author
-
Abraham, Robin and Erwig, Martin
- Subjects
SOFTWARE engineering; ELECTRONIC spreadsheets; MUTATION testing of computer software; END-user computing; ERROR-correcting codes; COMPUTER science; RELIABILITY in engineering
- Abstract
Based on 1) research into mutation testing for general-purpose programming languages and 2) spreadsheet errors that have been reported in the literature, we have developed a suite of mutation operators for spreadsheets. We present an evaluation of the mutation adequacy of definition-use adequate test suites generated by a constraint-based automatic test-case generation system we have developed in previous work. The results of the evaluation suggest additional constraints that can be incorporated into the system to target mutation adequacy. In addition to being useful in mutation testing of spreadsheets, the operators can be used in the evaluation of error-detection tools and also for seeding spreadsheets with errors for empirical studies. We describe two case studies where the suite of mutation operators helped us carry out such empirical evaluations. The main contribution of this paper is a suite of mutation operators for spreadsheets that can be used for performing empirical evaluations of spreadsheet tools to indicate ways in which the tools can be improved. [ABSTRACT FROM AUTHOR]
- Published
- 2009
- Full Text
- View/download PDF
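Spreadsheet mutation operators of the kind entry 43 describes can be prototyped as string rewrites over formulas. The two operators below (relational-operator replacement and off-by-one constant perturbation) are our own illustrative picks, not the paper's operator suite.

    import re

    RELOPS = {"<=": ">", ">=": "<", "<>": "=", "<": ">=", ">": "<=", "=": "<>"}

    def mutate_relops(formula):
        # Replace one relational operator per mutant (skip the leading '=' marker).
        body = formula[1:]
        for m in re.finditer(r"<=|>=|<>|[<>=]", body):
            yield "=" + body[:m.start()] + RELOPS[m.group()] + body[m.end():]

    def mutate_constants(formula):
        # Perturb standalone integer constants by +/-1 (cell refs like B2 are kept).
        for m in re.finditer(r"(?<![\w.])\d+(?![\w.])", formula):
            for delta in (-1, 1):
                yield formula[:m.start()] + str(int(m.group()) + delta) + formula[m.end():]

    original = "=IF(B2>=100, B2*0.9, B2)"
    for mutant in list(mutate_relops(original)) + list(mutate_constants(original)):
        print(mutant)

Each mutant is then fed to the spreadsheet under test; a test suite is mutation-adequate to the degree that it distinguishes the mutants from the original.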
44. Comparison and Evaluation of Clone Detection Tools.
- Author
-
Bellon, Stefan, Koschke, Rainer, Antoniol, Giuliano, Krinke, Jens, and Merlo, Ettore
- Subjects
SOFTWARE engineering; SOURCE code; COMPUTER software; COMPUTER science; JAVA programming language; SOFTWARE measurement
- Abstract
Many techniques for detecting duplicated source code (software clones) have been proposed in the past. However, it is not yet clear how these techniques compare in terms of recall and precision as well as space and time requirements. This paper presents an experiment that evaluates six clone detectors based on eight large C and Java programs (altogether almost 850 KLOC). Their clone candidates were evaluated by one of the authors as an independent third party. The selected techniques cover the whole spectrum of the state of the art in clone detection. The techniques work on text, lexical and syntactic information, software metrics, and program dependency graphs. [ABSTRACT FROM AUTHOR]
- Published
- 2007
- Full Text
- View/download PDF
45. Analysis of Restart Mechanisms in Software Systems.
- Author
-
Moorsel, Aad P. A. van and Wolter, Katinka
- Subjects
ELECTRONIC systems; ALGORITHMS; COMPUTER systems; COMPUTER architecture; COMPUTER science; COMPUTER research
- Abstract
Restarts or retries are a common phenomenon in computing systems, for instance, in preventative maintenance, software rejuvenation, or when a failure is suspected. Typically, one sets a time-out to trigger the restart. We analyze and optimize time-out strategies for scenarios in which the expected required remaining time of a task is not always decreasing with the time invested in it. Examples of such tasks include the download of Web pages, randomized algorithms, distributed queries, and jobs subject to network or other failures. Assuming the independence of the completion time of successive tries, we derive computationally attractive expressions for the moments of the completion time, as well as for the probability that a task is able to meet a deadline. These expressions facilitate efficient algorithms to compute optimal restart strategies and are promising candidates for pragmatic online optimization. [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
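Entry 45's optimization can be reproduced at toy scale with the standard renewal argument: if retries are independent and each attempt is cut off at timeout tau, the expected completion time is E[min(X, tau)] / F(tau), where F is the single-attempt completion-time cdf. The sketch below assumes a lognormal F, which is our choice of workload, not necessarily the paper's.

    import math

    MU, SIGMA = 0.0, 1.5  # lognormal parameters for one attempt (assumed workload)

    def cdf(t):
        # Lognormal cdf via the error function.
        return 0.5 * (1 + math.erf((math.log(t) - MU) / (SIGMA * math.sqrt(2))))

    def expected_completion(tau, steps=2000):
        # E[min(X, tau)] = integral from 0 to tau of (1 - F(t)) dt (midpoint rule);
        # E[total] = E[min(X, tau)] / F(tau) by the renewal argument.
        h = tau / steps
        area = sum((1 - cdf((i + 0.5) * h)) * h for i in range(steps))
        return area / cdf(tau)

    candidates = [0.5, 1, 2, 4, 8, 16]
    best_tau = min(candidates, key=expected_completion)
    print(f"best timeout {best_tau} -> expected completion "
          f"{expected_completion(best_tau):.2f}")

With a heavy-tailed attempt distribution like this one, restarting early beats waiting out the tail, which is exactly the regime the abstract describes.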
46. Scenario-Based Assessment of Nonfunctional Requirements.
- Author
-
Gregoriades, Andreas and Sutcliffe, Alistair
- Subjects
SYSTEMS design; SYSTEM analysis; COMPUTER science; ELECTRONIC data processing; MATHEMATICAL optimization; SYSTEMS theory
- Abstract
This paper describes a method and a tool for validating nonfunctional requirements in complex socio-technical systems. The System Requirements Analyzer (SRA) tool validates system reliability and operational performance requirements using scenario-based testing. Scenarios are transformed into sequences of task steps and the reliability of human agents performing tasks with computerized technology is assessed using Bayesian Belief Network (BN) models. The tool tests system performance within an envelope of environmental variations and reports the number of tests that pass a benchmark threshold. The tool diagnoses problematic areas in scenarios representing pathways through system models, assists in the identification of their causes, and supports comparison of alternative requirements specifications and system designs. It is suitable for testing socio-technical systems where operational scenarios are sequential and deterministic, in domains where designs are incrementally modified so set up costs of the BNs can be defrayed over multiple tests. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
47. Efficient Relational Calculation for Software Analysis.
- Author
-
Beyer, Dirk, Noack, Andreas, and Lewerentz, Claus
- Subjects
COMPUTER software; COMPUTER systems; COMPUTER science; DATABASE management; SYSTEMS design
- Abstract
Calculating with graphs and relations has many applications in the analysis of software systems, for example, the detection of design patterns or patterns of problematic design and the computation of design metrics. These applications require an expressive query language, in particular, for the detection of graph patterns, and an efficient evaluation of the queries even for large graphs. In this paper, we introduce RML, a simple language for querying and manipulating relations based on predicate calculus, and CrocoPat, an interpreter for RML programs. RML is general because it enables the manipulation not only of graphs (i.e., binary relations), but of relations of arbitrary arity. CrocoPat executes RML programs efficiently because it internally represents relations as binary decision diagrams, a data structure that is well-known as a compact representation of large relations in computer-aided verification. We evaluate RML by giving example programs for several software analyses and CrocoPat by comparing its performance with calculators for binary relations, a Prolog system, and a relational database management system. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
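The style of analysis entry 47 targets can be mimicked with ordinary Python sets standing in for CrocoPat's binary decision diagrams. The sketch below runs a classic software-analysis query, finding modules on a dependency cycle via transitive closure, over an invented dependency relation (the RML language itself is not reproduced here).

    depends = {("ui", "core"), ("core", "db"), ("db", "core"), ("core", "util")}

    def transitive_closure(rel):
        closure = set(rel)
        while True:
            # Relational join: (a, b) and (b, d) imply (a, d).
            step = {(a, d) for (a, b) in closure for (c, d) in closure if b == c}
            if step <= closure:
                return closure
            closure |= step

    cyclic = sorted({a for (a, b) in transitive_closure(depends) if a == b})
    print("modules on a dependency cycle:", cyclic)  # ['core', 'db']

A BDD-backed engine like CrocoPat evaluates the same fixed-point query symbolically, which is what keeps it tractable on large program graphs.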
48. Class Point: An Approach for the Size Estimation of Object-Oriented Systems.
- Author
-
Costagliola, Gennaro, Ferrucci, Filomena, Tortora, Genoveffa, and Vitiello, Giuliana
- Subjects
COMPUTER systems; OBJECT-oriented programming; COMPUTER programming; SOFTWARE measurement; COMPUTER science; SOFTWARE engineering
- Abstract
In this paper, we present an FP-like approach, named Class Point, which was conceived to estimate the size of object-oriented products. In particular, two measures are proposed, which are theoretically validated showing that they satisfy well-known properties necessary for size measures. An initial empirical validation is also performed, meant to assess the usefulness and effectiveness of the proposed measures to predict the development effort of object-oriented systems. Moreover, a comparative analysis is carried out, taking into account several other size measures. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
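An FP-like count in the spirit of entry 48 weights each class by its type and complexity and sums the weights. The sketch below shows only the shape of such a computation; the categories, complexity levels, and every numeric weight are hypothetical, not the calibrated values from the paper.

    WEIGHTS = {  # (class type, complexity) -> weight; all numbers hypothetical
        ("problem_domain", "low"): 3, ("problem_domain", "high"): 6,
        ("data_management", "low"): 5, ("data_management", "high"): 8,
    }

    classes = [  # (name, class type, complexity), an invented design
        ("Order", "problem_domain", "high"),
        ("Customer", "problem_domain", "low"),
        ("OrderStore", "data_management", "high"),
    ]

    total = sum(WEIGHTS[kind, cx] for _, kind, cx in classes)
    print("unadjusted class points:", total)  # 6 + 3 + 8 = 17

The resulting size figure would then feed an effort-prediction model, which is the use the paper's empirical validation addresses.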
49. Guest Editors' Introduction to the Special Section on the First International Conference on the Quantitative Evaluation of Systems (QEST).
- Author
-
Franceschinis, Giuliana, Katoen, Joost-Pieter, and Woodside, Murray
- Subjects
CONFERENCES & conventions; MARKOV processes; STOCHASTIC processes; COMPUTER software development; COMPUTER software quality control; COMPUTER science; COMPUTER programming; EDUCATION
- Abstract
The article discusses the first International Conference on the Quantitative Evaluation of SysTems (QEST 2004), which was held at the University of Twente in the Netherlands. Topics such as modeling formalisms and methodologies, theory, tools, and case studies, as well as performance, dependability, and safety were discussed at the conference. The conference received over 80 paper submissions, of which 28 were selected for publication. The guest editors present brief descriptions of the papers in this issue, including "Backward Bisimulation in Markov Chain Model Checking" and "Analysis of Restart Mechanisms in Software Systems." [ABSTRACT FROM AUTHOR]
- Published
- 2006
- Full Text
- View/download PDF
50. Model-Based Performance Prediction in Software Development: A Survey.
- Author
-
Balsamo, Simonetta, Di Marco, Antinisca, Inverardi, Paola, and Simeoni, Marta
- Subjects
COMPUTER software development; COMPUTER architecture; COMPUTER science; MATHEMATICAL models; COMPUTER simulation; PERFORMANCE evaluation; SOFTWARE engineering
- Abstract
Over the last decade, a lot of research has been directed toward integrating performance analysis into the software development process. Traditional software development methods focus on software correctness, introducing performance issues later in the development process. This approach does not take into account the fact that performance problems may require considerable changes in design, for example, at the software architecture level, or even worse at the requirement analysis level. Several approaches were proposed in order to address early software performance analysis. Although some of them have been successfully applied, we are still far from seeing performance analysis integrated into ordinary software development. In this paper, we present a comprehensive review of recent research in the field of model-based performance prediction at software development time in order to assess the maturity of the field and point out promising research directions. [ABSTRACT FROM AUTHOR]
- Published
- 2004
- Full Text
- View/download PDF