27 results
Search Results
2. Correction to "A Practical View of Software Measurement and Implementation Experiences Within Motorola".
- Author
-
Leveson, N.C.
- Subjects
SOFTWARE engineering ,COMPUTER multitasking ,MATHEMATICAL logic ,COMPUTER programming ,ELECTRONIC data processing ,COMPUTER science - Abstract
In the paper, "Temporal Logic-Based Deadlock Analysis for Ada," reference [9] was duplicated as reference [10], causing the remaining references to be misnumbered by one. The correctly numbered reference list is given below in its entirety. All reference citations in the paper's text are correct as published.
- Published
- 1993
3. Debugging Larch Shared Language Specifications.
- Author
-
Garland, Stephen J., Guttag, John V., and Horning, James J.
- Subjects
PROGRAMMING languages ,DEBUGGING ,AUTOMATION ,ELECTRONIC data processing ,ARTIFICIAL languages ,COMPUTER science - Abstract
The Larch family of specification languages supports a two-tiered definitional approach to specification. Each specification has components written in two languages: one designed for a specific programming language and another independent of any programming language. The former are called Larch interface languages, and the latter the Larch Shared Language (LSL). The Larch style of specification emphasizes brevity and clarity rather than executability. To make it possible to test specifications without executing or implementing them, Larch permits specifiers to make claims about logical properties of specifications and to check these claims at specification time. Since these claims are undecidable in the general case, it is impossible to build a tool that will automatically certify claims about arbitrary specifications. However, it is feasible to build tools that assist specifiers in checking claims as they debug specifications. This paper describes the checkability designed into LSL and discusses two tools that help perform the checking. This paper is a revised and expanded version of a paper presented at the April 1990 IFIP Working Conference on Programming Concepts and Methods [7]. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
4. Programmer-Transparent Coordination of Recovering Concurrent Processes: Philosophy and Rules for Efficient Implementation.
- Author
-
Kim, K. H.
- Subjects
COMPUTER programming ,ELECTRONIC data processing ,COMPUTER science ,COMPUTER systems ,COMPUTER engineering ,ALGORITHMS - Abstract
A new approach to coordination of cooperating concurrent processes, each capable of error detection and recovery, is presented. Error detection, rollback, and retry in a process are specified by a well-structured language construct called a recovery block. Recovery points of processes must be properly coordinated to prevent a disastrous avalanche of process rollbacks. In contrast to the previously studied approaches that require the program designer to coordinate the recovery block structures of interacting processes (thereby coordinating the recovery points of processes), the new approach relieves the program designer of that burden. It instead relies upon an intelligent processor system (that runs processes) capable of establishing and discarding the recovery points of interacting processes in a well-coordinated manner such that 1) a process never makes two consecutive rollbacks without making a retry between the two, and 2) every process rollback becomes a minimum-distance rollback. Following the discussion of the underlying philosophy of the new approach, basic rules for reducing storage and time overhead in such a processor system are discussed. Throughout this paper examples are drawn from systems in which processes communicate through monitors. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
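The recovery block construct summarized in the abstract above can be sketched in a few lines. This is a minimal illustration, not the paper's notation: the names `recovery_block`, `alternates`, and `acceptance_test` are illustrative, and the "rollback" here is simply restarting from a deep-copied recovery point.

```python
import copy

def recovery_block(state, alternates, acceptance_test):
    """Minimal recovery-block sketch: try each alternate against a saved
    recovery point until one passes the acceptance test."""
    recovery_point = copy.deepcopy(state)      # establish recovery point
    for alternate in alternates:
        candidate = copy.deepcopy(recovery_point)  # rollback: restart from recovery point
        alternate(candidate)
        if acceptance_test(candidate):
            return candidate                   # commit; recovery point discarded
    raise RuntimeError("all alternates failed the acceptance test")

# Example: the primary routine is faulty; the alternate recovers.
def faulty_sort(s):
    pass                                       # primary "sorts" incorrectly (does nothing)

def fallback_sort(s):
    s["items"] = sorted(s["items"])

state = {"items": [3, 1, 2]}
result = recovery_block(state, [faulty_sort, fallback_sort],
                        lambda s: s["items"] == sorted(s["items"]))
print(result["items"])  # → [1, 2, 3]
```

The paper's contribution is coordinating such recovery points *across* interacting processes; this single-process sketch only shows the detect/rollback/retry cycle itself.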
5. Statistical and Scientific Database Issues.
- Author
-
Shoshani, Arie and Wong, Harry K. T.
- Subjects
DATABASE management ,MULTIDIMENSIONAL databases ,USER interfaces ,DATABASES ,ELECTRONIC data processing ,COMPUTER science - Abstract
The purpose of this paper is to summarize the research issues of statistical and scientific databases (SSDB's). It organizes the issues into four major groups: physical organization and access methods, operators, logical organization and user interfaces, and miscellaneous issues. It emphasizes the differences between SSDB's and traditional database applications, and motivates the need for new and innovative techniques for the support of SSDB's. In addition to describing current work in this field, it discusses open research areas and proposes possible approaches to their solution. [ABSTRACT FROM AUTHOR]
- Published
- 1985
6. Comments on "Temporal Logic-Based Deadlock Analysis for Ada".
- Author
-
Young, Michal, Levine, David L., and Taylor, Richard N.
- Subjects
COMPUTER multitasking ,SOFTWARE engineering ,MATHEMATICAL logic ,COMPUTER programming ,ELECTRONIC data processing ,COMPUTER science - Abstract
Karam and Buhr describe a deadlock analysis technique based on enumeration of paths in a concurrent program. They also attempt to classify prior work on analysis techniques for concurrent systems and to compare their technique to that prior work. Several flaws in the article "Temporal Logic-Based Deadlock Analysis for Ada" are pointed out. The characterization of operational and axiomatic proof methods is muddled and inaccurate. The classification of modeling techniques for concurrent systems confuses the distinction between state-based and event-based models with the essential distinction between explicit enumeration of behaviors and symbolic manipulation of properties. The statements about the limitations of linear-time temporal logic vis-à-vis nondeterminism are inaccurate.
- Published
- 1993
- Full Text
- View/download PDF
7. Scenario-Based Assessment of Nonfunctional Requirements.
- Author
-
Gregoriades, Andreas and Sutcliffe, Alistair
- Subjects
SYSTEMS design ,SYSTEM analysis ,COMPUTER science ,ELECTRONIC data processing ,MATHEMATICAL optimization ,SYSTEMS theory - Abstract
This paper describes a method and a tool for validating nonfunctional requirements in complex socio-technical systems. The System Requirements Analyzer (SRA) tool validates system reliability and operational performance requirements using scenario-based testing. Scenarios are transformed into sequences of task steps, and the reliability of human agents performing tasks with computerized technology is assessed using Bayesian Belief Network (BN) models. The tool tests system performance within an envelope of environmental variations and reports the number of tests that pass a benchmark threshold. The tool diagnoses problematic areas in scenarios representing pathways through system models, assists in the identification of their causes, and supports comparison of alternative requirements specifications and system designs. It is suitable for testing socio-technical systems where operational scenarios are sequential and deterministic, in domains where designs are incrementally modified so that setup costs of the BNs can be defrayed over multiple tests. [ABSTRACT FROM AUTHOR]
- Published
- 2005
- Full Text
- View/download PDF
8. Superviews: Virtual Integration of Multiple Databases.
- Author
-
Motro, Amihai
- Subjects
DATABASES ,COMPUTER files ,VIRTUAL machine systems ,ELECTRONIC data processing ,COMPUTER science ,SOFTWARE engineering - Abstract
An important advantage of a database system is that it provides each application with a custom view of the data. The issue addressed in this paper is how to provide such custom views to applications that access multiple databases. The paper describes a formal method that generates such superviews in an interactive process of schema editing operations. A mapping of the superview into the individual databases is derived from the editing process and is stored together with the superview as a virtual database. When this database is interrogated, the mapping is used to decompose each query into a set of queries against the individual databases and to recompose the answers to form an answer to the original query. As this process is transparent to the user, virtual databases may be regarded as a more general type of database. A prototype database system that allows users to construct virtual databases and interrogate them has been developed. [ABSTRACT FROM AUTHOR]
- Published
- 1987
9. On Satisfying Timing Constraints in Hard-Real-Time Systems.
- Author
-
Xu, Jia and Parnas, David Lorge
- Subjects
COMPUTER systems ,COMPUTER software ,SOFTWARE engineering ,EMBEDDED computer systems ,COMPUTER algorithms ,ELECTRONIC data processing ,COMPUTER science - Abstract
We explain why pre-run-time scheduling is essential if one wishes to guarantee that timing constraints will be satisfied in a large complex hard-real-time system. We examine some of the major concerns in pre-run-time scheduling and consider what formulations of mathematical scheduling problems can be used to address those concerns. This paper provides a guide to the available algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 1993
10. Semantically Extended Data Flow Diagrams: A Formal Specification Tool.
- Author
-
France, Robert B.
- Subjects
COMPUTER software development ,FLOW charts ,DATA flow computing ,SEMANTIC integration (Computer systems) ,ELECTRONIC data processing ,TRANSITION flow ,SOFTWARE verification ,COMPUTER science ,TECHNICAL specifications - Abstract
The popularity of the Data Flow Diagram (DFD) specification tool in industry seems to stem from its use of intuitively defined concepts and notation. The use of such concepts and notation gives flexibility to the tool and most often results in intuitively appealing specifications. The flexibility comes at a price—the lack of a formal basis for DFD concepts and notation prohibits its use as a formal specification tool. On the other hand, a problem commonly stated as a deterrent to the use of formal specification techniques in industry is the lack of intuitive appeal in the specifications, which has led some researchers to suggest that formal specifications be supplemented with less formal, more intuitive documentation of specified functionalities. In this paper we describe a method for associating a DFD with a formal specification. The intention is to enhance the use of the DFD as a formal specification tool, thus gaining a tool that can be used to document application functionality in an understandable manner and, at the same time, be capable of producing a formal specification that can be used to rigorously investigate semantic properties of the application. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
11. Formal Verification of Ada Programs.
- Author
-
Guaspari, David, Marceau, Carla, and Polak, Wolfgang
- Subjects
PROGRAMMING languages ,SOFTWARE verification ,ELECTRONIC data processing ,SEQUENTIAL processing (Computer science) ,COMPUTER software ,COMPUTER science - Abstract
This paper describes the Penelope verification editor and its formal basis. Penelope is a prototype system for the interactive development and verification of programs that are written in a rich subset of sequential Ada. Because it generates verification conditions incrementally, Penelope can be used to develop a program and its correctness proof in concert. If an already-verified program is modified, one can attempt to prove the modified version by replaying and modifying the original sequence of proof steps. Verification conditions are generated by predicate transformers whose logical soundness can be proven by establishing a precise formal connection between predicate transformation and denotational definitions in the style of continuation semantics. Penelope's specification language, Larch/Ada, belongs to the family of Larch interface languages. It scales up properly, in the sense that we can demonstrate the soundness of decomposing an implementation hierarchically and reasoning locally about the implementation of each node in the hierarchy. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
12. Using Larch to Specify Avalon/C++ Objects.
- Author
-
Wing, Jeannette M.
- Subjects
C++ ,PROGRAMMING languages ,COMPUTER software ,COMPUTER science ,COMPUTER programmers ,ELECTRONIC data processing - Abstract
This paper gives a formal specification of three base Avalon/C++ classes: recoverable, atomic, and subatomic. Programmers derive from class recoverable to define persistent objects, and from either class atomic or class subatomic to define atomic objects. The specifications, written in Larch, provide the means for showing that classes derived from the base classes implement objects that are persistent or atomic, and thus exemplify the applicability of an existing specification method to specifying "nonfunctional" properties. Writing these formal specifications for Avalon/C++'s built-in classes has helped to clarify places in the programming language where features interact, to make unstated assumptions explicit, and to characterize complex properties of objects. [ABSTRACT FROM AUTHOR]
- Published
- 1990
- Full Text
- View/download PDF
13. A Comparison of Some Structural Testing Strategies.
- Author
-
Ntafos, Simeon C.
- Subjects
SOFTWARE engineering ,TESTING ,COMPUTER science ,COMPUTER programming ,ELECTRONIC data processing ,INFORMATION science - Abstract
In this paper we compare a number of structural testing strategies in terms of their relative coverage of the program's structure and also in terms of the number of test cases needed to satisfy each strategy. We also discuss some of the deficiencies of such comparisons. [ABSTRACT FROM AUTHOR]
- Published
- 1988
- Full Text
- View/download PDF
14. Statistical Database Query Languages.
- Author
-
Ozsoyoglu, Gultekin and Ozsoyoglu, Zehra Meral
- Subjects
DATABASES ,DATABASE management ,QUERY languages (Computer science) ,ELECTRONIC data processing ,DATABASE design ,COMPUTER science - Abstract
Databases that are mainly used for statistical analysis are called statistical databases (SDB). A statistical database management system (SDBMS) may be defined as a database management system that provides capabilities 1) to model, store, and manipulate data in a manner suitable for the needs of SDB users, and 2) to apply statistical data analysis techniques that range from simple summary statistics to advanced procedures. This paper surveys the existing and proposed SDB data definition and data manipulation (i.e., query) languages. [ABSTRACT FROM AUTHOR]
- Published
- 1985
15. Qualified Data Flow Problems.
- Author
-
Holley, L. Howard and Rosen, Barry K.
- Subjects
DATA flow computing ,ELECTRONIC data processing ,COMPUTER software ,SOFTWARE engineering ,COMPUTER science ,COMPUTER programming - Abstract
It is known that not all paths are possible in the run time control flow of many programs. It is also known that data flow analysis cannot restrict attention to exactly those paths that are possible. It is, therefore, usual for analytic methods to consider all paths. Sharper information can be obtained by considering a recursive set of paths that is large enough to include all possible paths, but small enough to exclude many of the impossible ones. This paper presents a simple uniform methodology for sharpening data flow information by considering certain recursive path sets of practical importance. Associated with each control flow arc there is a relation on a finite set Q. The paths that qualify to be considered are (essentially) those for which the composition of the relations encountered is nonempty. For example, Q might be the set of all assignments of values to each of several bit variables used by a program to remember some facts about the past and branch accordingly in the future. Given any data flow problem together with qualifying relations on Q associated with the control flow arcs, we construct a new problem. Considering all paths in the new problem is equivalent to considering only qualifying paths in the old one. Preliminary experiments (with a small set of real programs) indicate that qualified analysis is feasible and substantially more informative than ordinary analysis. The methodology also has a beneficial feedback effect on the delicate task of passing from programs to meaningful data flow analysis problems. Even when all paths qualify, unusually sharp information can be obtained by passing from programs to problems in ways suggested by theorems proved here. [ABSTRACT FROM AUTHOR]
- Published
- 1981
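The qualification test described in the abstract above — a path qualifies when the composition of its arc relations on the finite set Q is nonempty — can be sketched directly with relations as sets of pairs. This is an illustrative toy (the names `qualifies`, `set_flag`, `need_set` are not from the paper), using the paper's own example of a bit variable that remembers a fact about the past.

```python
def compose(r1, r2):
    """Relational composition: (a, c) iff some b has (a, b) in r1 and (b, c) in r2."""
    return {(a, c) for (a, b) in r1 for (b2, c) in r2 if b == b2}

def qualifies(path_relations, identity):
    """A path qualifies iff the composition of its arc relations is nonempty."""
    acc = identity
    for r in path_relations:
        acc = compose(acc, r)
        if not acc:
            return False          # composition collapsed: impossible path
    return True

# Q models a single bit flag, values 0 and 1.
Q = {0, 1}
ident = {(q, q) for q in Q}
set_flag   = {(q, 1) for q in Q}  # arc whose code sets the flag
clear_flag = {(q, 0) for q in Q}  # arc whose code clears the flag
need_set   = {(1, 1)}             # branch arc taken only when flag == 1

print(qualifies([set_flag, need_set], ident))    # → True  (possible path)
print(qualifies([clear_flag, need_set], ident))  # → False (excluded path)
```

Excluding the second path is exactly the sharpening the paper describes: an all-paths analysis would have to consider it, while qualified analysis discards it.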
16. Overhead Storage Considerations and a Multilinear Method for Data File Compression.
- Author
-
Young, Tzay Y. and Liu, Philip S.
- Subjects
DATA compression ,CLUSTER analysis (Statistics) ,VIRTUAL storage (Computer science) ,ELECTRONIC file management ,COMPUTER science ,ELECTRONIC data processing - Abstract
The paper is concerned with the reduction of overhead storage, i.e., the stored compression/decompression (C/D) table, in field-level data file compression. A large C/D table can occupy a large fraction of main memory space during compression and decompression, and may cause excessive page swapping in virtual memory systems. A two-stage approach is studied, including the required additional C/D table decompression time. It appears that the approach has limitations and is not completely satisfactory. A multilinear compression method is proposed which is capable of reducing the overhead storage by a significant factor. Multilinear compression groups data items into several clusters and then compresses each cluster by a binary-field linear transformation. Algorithms for clustering and transformation are developed, and data compression examples are presented. [ABSTRACT FROM AUTHOR]
- Published
- 1980
17. An Approach to Formal Definitions and Proofs of Programming Principles.
- Author
-
Misra, Jayadev
- Subjects
COMPUTER programming ,COMPUTER algorithms ,ELECTRONIC data processing ,MATHEMATICAL analysis ,COMPUTER science ,COMPUTER software ,SOFTWARE engineering ,ENGINEERING - Abstract
A method for formal description of programming principles is presented in this paper. Programming principles, such as sequential search, can be defined and proven even in the absence of an application. We represent a principle as a program scheme which has partially interpreted functions in it. The functions must obey certain input constraints. Use of these ideas in program proving is illustrated with examples. [ABSTRACT FROM AUTHOR]
- Published
- 1978
18. Exact Analysis of Bernoulli Superposition of Streams Into a Least Recently Used Cache.
- Author
-
Levy, Hanoch and Morris, Robert J.T.
- Subjects
DATABASE management ,COMPUTER storage devices ,ELECTRONIC data processing ,COMPUTER architecture ,DATABASES ,COMPUTER science - Abstract
We present an exact analysis of the superposition of address streams into a cache buffer which is managed according to a least recently used (LRU) replacement policy. Each of the streams is characterized by a stack depth distribution, and we seek the cache hit ratio for each stream, when the combined, or superposed, stream is applied to a shared LRU cache. In this paper the combining process is taken to be a Bernoulli switching process. This problem arises in a number of branches of computer science, particularly in database systems and processor architecture. Previously, a number of approximation techniques of various complexities have been proposed for the solution of this problem. The main contribution of this paper is the description of an exact technique. We evaluate the performance of the exact and an approximate technique on realistic data, both in a lab environment and a large database installation. The results allow comparisons of the techniques, and provide insight into the validity of the Bernoulli switching assumption. [ABSTRACT FROM AUTHOR]
- Published
- 1995
- Full Text
- View/download PDF
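The setup in the abstract above — two address streams merged by a Bernoulli switch into a shared LRU cache, with a per-stream hit ratio as the quantity of interest — is easy to reproduce by simulation. The paper's contribution is an *exact* analysis; the sketch below is only a Monte Carlo stand-in with illustrative names (`simulate`, `switch_prob`) and uniformly random streams in place of general stack depth distributions.

```python
import random
from collections import OrderedDict

def simulate(streams, switch_prob, cache_size, n_refs, seed=0):
    """Monte Carlo sketch: merge two address streams by a Bernoulli switch
    and measure each stream's hit ratio in a shared LRU cache."""
    rng = random.Random(seed)
    cache = OrderedDict()                            # LRU order: most recent last
    hits, refs = [0, 0], [0, 0]
    for _ in range(n_refs):
        s = 0 if rng.random() < switch_prob else 1   # Bernoulli switching process
        addr = (s, rng.choice(streams[s]))           # address spaces kept disjoint
        refs[s] += 1
        if addr in cache:
            hits[s] += 1
            cache.move_to_end(addr)                  # LRU update on a hit
        else:
            cache[addr] = True
            if len(cache) > cache_size:
                cache.popitem(last=False)            # evict least recently used
    return [h / max(r, 1) for h, r in zip(hits, refs)]

# Stream 0 touches 4 addresses, stream 1 touches 16; sharing an 8-slot
# cache, the narrower stream should see the higher hit ratio.
ratios = simulate([list(range(4)), list(range(16))],
                  switch_prob=0.5, cache_size=8, n_refs=20000)
print(ratios)
```

Such a simulation is what the exact technique replaces: it converges slowly and must be rerun per configuration, whereas the analysis yields the hit ratios directly from the stack depth distributions.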
19. A Case Study Of CES: A Distributed Collaborative Editing System Implemented In Argus.
- Author
-
Greif, Irene, Seliger, Robert, and Weihl, William
- Subjects
COMPUTER networks ,ELECTRONIC data processing ,COMPUTER science ,PROGRAMMING languages ,SOFTWARE engineering ,COMPUTER programming - Abstract
As distributed configurations of high-powered workstations connected by networks become prevalent, tools for writing distributed programs take on increasing importance. This paper describes our experience implementing CES, a distributed Collaborative Editing System. CES was written in Argus, a language that was designed to support the construction of reliable distributed programs, and exhibits a number of requirements typical of distributed applications. Our experience illustrates numerous areas in which the support provided by Argus for meeting those requirements was quite helpful, but also identifies several areas in which the support provided by Argus was inadequate. Some of the problems arise because of the distinction in Argus (and in other systems) between locally and remotely accessible data and the mechanisms provided for implementing each. Others arise because of limitations of the mechanisms for building user-defined data types. We discuss the problems we encountered, including their implications for other systems. We also suggest solutions to the problems, or in some cases further research directed at finding solutions. [ABSTRACT FROM AUTHOR]
- Published
- 1992
- Full Text
- View/download PDF
20. Determining an Optimal Time Interval for Testing and Debugging Software.
- Author
-
Singpurwalla, Nozer D.
- Subjects
COMPUTER software ,DEBUGGING ,ELECTRONIC data processing ,COMPUTER programming ,RELIABILITY in engineering ,SOFTWARE engineering - Abstract
In this paper, we describe an approach for addressing the important problem of how long to test and debug software before it is released. Our approach is based on the principles of decision making under uncertainty, and involves a maximization of expected utility. We suggest two plausible forms for the utility function, one based on costs alone and the other involving the realized reliability of the software. Using results from the literature on probabilistic models for software failure, we outline, for the case of single state testing, the ensuing optimization problem which can be addressed using numerical techniques. The sensitivity of our results to the various input parameters is discussed and some directions for future research outlined. [ABSTRACT FROM AUTHOR]
- Published
- 1991
- Full Text
- View/download PDF
21. Chameleon: A System for Solving the Data-Translation Problem.
- Author
-
Mamrak, Sandra A., Kaelbling, Michael J., Nicholas, Charles K., and Share, Michael
- Subjects
ELECTRONIC data processing ,ELECTRONIC records ,RECORDS management ,COMPUTER software ,SOFTWARE engineering ,COMPUTER science - Abstract
There is a need for widespread exchange of electronic documents in domains as diverse as book publishing, automated offices, factories, and research laboratories. The variety of data representations, and the subsequent need for data translation, is a major obstacle to this exchange. This paper describes a comprehensive data translation system with the following characteristics: 1) it is derived from a formal model of the translation task; 2) it supports the building of translation tools; 3) it supports the use of translation tools; and 4) it is accessible to its targeted end-users. A software architecture to achieve the translation capability is fully implemented. Translators have been generated using the architecture, both by the original software developers and by industrial associates who have installed the architecture at their own sites. [ABSTRACT FROM AUTHOR]
- Published
- 1989
- Full Text
- View/download PDF
22. Time-Sensitive Cost Models in the Commercial MIS Environment.
- Author
-
Jeffery, D. Ross
- Subjects
MANAGEMENT information systems ,INFORMATION resources management ,DATABASE management ,ELECTRONIC data processing ,DATA warehousing ,COMPUTER science - Abstract
Current time-sensitive cost models suggest a significant impact on project effort if elapsed time compression or expansion is implemented. This paper reports an empirical study into the applicability of these models in the management information systems environment. It is found that elapsed time variation does not consistently affect project effort. This result is analyzed in terms of the theory supporting such a relationship, and an alternate relationship is suggested. [ABSTRACT FROM AUTHOR]
- Published
- 1987
23. A Conceptual Analysis of the Draco Approach to Constructing Software Systems.
- Author
-
Freeman, Peter
- Subjects
SOFTWARE engineering ,COMPUTER software development ,COMPUTER software reusability ,SOFTWARE architecture ,ELECTRONIC data processing ,COMPUTER science - Abstract
This paper analyzes the concepts of software construction embodied in the Draco approach. The analysis relates specific aspects of Draco to particular software engineering (SE) principles and suggests future research needed to extend the approach. The purpose of this analysis is to help researchers understand Draco better and thus to enhance future research. [ABSTRACT FROM AUTHOR]
- Published
- 1987
24. Some Aspects of the Verification of Loop Computations.
- Author
-
Misra, Jayadev
- Subjects
COMPUTER programming ,COMPUTER algorithms ,ELECTRONIC data processing ,MATHEMATICAL analysis ,SOFTWARE engineering ,COMPUTER science - Abstract
The problem of proving whether or not a loop computes a given function is investigated. We consider loops which have a certain ‘closure’ property and derive necessary and sufficient conditions for such a loop to compute a given function. It is argued that closure is a fundamental concept in program proving. Extensions of the basic result to proofs involving relations other than functional relations, which typically arise in nondeterministic loops, are explored. Several applications of these results are given, particularly in showing that certain classes of programs may be directly proven (their loop invariants generated) given only their input-output relationships. Implications of these results are discussed. [ABSTRACT FROM AUTHOR]
- Published
- 1978
25. Specifying and Verifying Requirements of Real-Time Systems.
- Author
-
Ravn, Anders P., Rischel, Hans, and Hansen, Kirsten Mark
- Subjects
COMPUTER systems ,COMPUTER engineering ,COMPUTER programming ,COMPUTER algorithms ,ELECTRONIC data processing ,COMPUTER science - Abstract
An approach to specification of requirements and verification of design for real-time systems is presented. A system is defined by a conventional mathematical model for a dynamic system where application specific states denote functions of real time. Specifications are formulas in duration calculus, a real-time interval logic, where predicates define durations of states. Requirements define safety and functionality constraints on the system or a component. A top-level design is given by a control law: a predicate that defines an automaton controlling the transition between phases of operation. Each phase maintains certain relations among the system states; this is analogous to the control functions known from conventional control theory. The top-level design is decomposed into an architecture for a distributed system with specifications for sensor, actuator, and program components. Programs control the distributed computation through synchronous events. Sensors and actuators relate events with system states. Verification is a deduction showing that a design implies requirements. [ABSTRACT FROM AUTHOR]
- Published
- 1993
- Full Text
- View/download PDF
26. Antisampling for Estimation: An Overview.
- Author
-
Rowe, Neil C.
- Subjects
EXPERT systems ,STATISTICAL sampling ,STATISTICS ,DATABASES ,COMPUTER science ,ELECTRONIC data processing - Abstract
We survey a new way to get quick estimates of the values of simple statistics (like count, mean, standard deviation, maximum, median, and mode frequency) on a large data set. This approach is a comprehensive attempt (apparently the first) to estimate statistics without any sampling. Our "antisampling" techniques have analogies to those of sampling, and exhibit similar estimation accuracy, but can be done much faster than sampling with large computer databases. Anti-sampling exploits computer science ideas from database theory and expert systems, building an auxiliary structure called a "database abstract." We make detailed comparisons to several different kinds of sampling. [ABSTRACT FROM AUTHOR]
- Published
- 1985
27. Closed Covers: To Verify Progress for Communicating Finite State Machines.
- Author
-
Gouda, Mohamed C.
- Subjects
ELECTRONIC data processing ,COMPUTER network protocols ,DATA structures ,ELECTRONIC file management ,COMPUTER science ,SOFTWARE engineering - Abstract
Consider communicating finite state machines which exchange messages over unbounded FIFO channels. We discuss a technique to verify that the communication between a given pair of such machines will progress indefinitely; this implies that the communication is free from deadlocks and unspecified receptions. The technique is based on finding a set of global states for the communicating pair such that the following two conditions (along with other conditions) are satisfied: 1) the initial global state is in that set; and 2) starting from any global state in that set, an "acyclic version" of the communicating pair must reach a global state in that set. We call such a set a closed cover, and show that the existence of a closed cover for a communicating pair is sufficient to guarantee indefinite communication progress. We also show that in many practical instances, if the communication is guaranteed to progress indefinitely, then the existence of a closed cover is necessary. [ABSTRACT FROM AUTHOR]
- Published
- 1984
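The setting of the abstract above — a pair of machines exchanging messages over FIFO channels, with deadlocks and unspecified receptions as the failure modes — can be made concrete with a small global-state enumeration. This sketch is *not* the paper's closed-cover technique (which avoids exhaustive enumeration); it only shows the global-state model and the two error conditions, with illustrative names (`explore`, `bound`) and a channel bound substituted for the paper's unbounded channels.

```python
from collections import deque

def explore(m1, m2, init, bound=4):
    """Enumerate global states (s1, s2, c12, c21) of two communicating
    machines with bounded FIFO channels; report deadlocks (no moves,
    both channels empty) and blocked states with pending messages
    (approximating unspecified receptions).
    m1/m2 map a local state to transitions ('!'|'?', msg, next_state)."""
    seen, frontier = set(), deque([init])
    deadlocks, unspecified = [], []
    while frontier:
        g = frontier.popleft()
        if g in seen:
            continue
        seen.add(g)
        s1, s2, c12, c21 = g
        moves = []
        for act, msg, nxt in m1.get(s1, []):
            if act == '!' and len(c12) < bound:          # machine 1 sends
                moves.append((nxt, s2, c12 + (msg,), c21))
            elif act == '?' and c21 and c21[0] == msg:   # machine 1 receives
                moves.append((nxt, s2, c12, c21[1:]))
        for act, msg, nxt in m2.get(s2, []):
            if act == '!' and len(c21) < bound:          # machine 2 sends
                moves.append((s1, nxt, c12, c21 + (msg,)))
            elif act == '?' and c12 and c12[0] == msg:   # machine 2 receives
                moves.append((s1, nxt, c12[1:], c21))
        if not moves:
            (deadlocks if not c12 and not c21 else unspecified).append(g)
        frontier.extend(moves)
    return deadlocks, unspecified

# A trivial request/reply pair: it cycles forever, so both lists are empty.
m1 = {'idle': [('!', 'req', 'wait')], 'wait': [('?', 'ack', 'idle')]}
m2 = {'idle': [('?', 'req', 'serve')], 'serve': [('!', 'ack', 'idle')]}
dl, ur = explore(m1, m2, ('idle', 'idle', (), ()))
print(dl, ur)  # → [] []
```

A closed cover certifies the same progress property from a small set of global states plus the acyclic-reachability condition, rather than by walking the (in general unbounded) global state space as this sketch does.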