334 results
Search Results
2. Methodological Standards in Accessibility Research on Motor Impairments: A Survey.
- Author
-
SARSENBAYEVA, ZHANNA, VAN BERKEL, NIELS, VELLOSO, EDUARDO, GONCALVES, JORGE, and KOSTAKOS, VASSILIS
- Subjects
COMPUTER science - Abstract
The design and evaluation of accessibility technology is a core component of the computer science landscape, aiming to ensure that digital innovations are accessible to all. One of the most prominent and long-lasting areas of accessibility research focuses on motor impairments--deficiencies that affect the ability to move, manipulate objects, and interact with the physical world. In this survey article, we present an extensive overview of the past two decades of research into accessibility for people with motor impairments. Following a structured selection process, we analyzed the study details as reported in 177 relevant papers. Based on this analysis, we critically assess user representation, measurement instruments, and the barriers that persist in accessibility research. Finally, we discuss future directions for accessibility research within the computer science domain. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
3. A Survey on the Usage of Eye-Tracking in Computer Programming.
- Author
-
OBAIDELLAH, UNAIZAH, AL HAEK, MOHAMMED, and CHENG, PETER C-H.
- Subjects
COMPUTER programming ,EYE tracking ,COMPUTER engineering ,COMPUTER science ,PROGRAMMING languages ,DATABASES - Abstract
Traditional quantitative research methods of data collection in programming, such as questionnaires and interviews, are the most common approaches for researchers in this field. However, in recent years, eye-tracking has been on the rise as a new method of collecting evidence of visual attention and the cognitive processes of programmers. Eye-tracking has been used by researchers in the field of programming to analyze and understand a variety of tasks such as comprehension and debugging. In this article, we focus on reporting how experiments that used eye-trackers in programming research are conducted, and the information that can be collected from these experiments. In this mapping study, we identify and report on 63 studies, published between 1990 and June 2017, collected and gathered via manual search on digital libraries and databases related to computer science and computer engineering. The five main areas of research interest are program comprehension and debugging, which have received increased interest in recent years, and non-code comprehension, collaborative programming, and requirements traceability research, which have the fewest publications, owing to possible limitations of eye-tracking technology in this type of experiment. We find that most of the participants in these studies were students and faculty members from institutions of higher learning, and while they performed programming tasks in a range of programming languages and programming representations, we find the Java language and Unified Modeling Language (UML) representation to be the most used materials. We also report on a range of eye-trackers and attention-tracking tools that have been utilized, and find Tobii eye-trackers to be the most used devices by researchers. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
4. Computer Vision-based Analysis of Buildings and Built Environments: A Systematic Review of Current Approaches.
- Author
-
STARZYŃSKA-GRZEŚ, MAŁGORZATA B., ROUSSEL, ROBIN, JACOBY, SAM, and ASADIPOUR, ALI
- Subjects
BUILT environment ,COMPUTER architecture ,EVIDENCE gaps ,COMPUTER science ,ARCHITECTURAL design ,COMPUTER vision - Abstract
Analysing 88 sources published from 2011 to 2021, this article presents a first systematic review of the computer vision-based analysis of buildings and the built environment. Its aim is to assess the potential of this research for architectural studies and the implications of a shift to a cross-disciplinary approach between architecture and computer science for research problems, aims, processes, and applications. To this end, the types of algorithms and data sources used in the reviewed studies are discussed with respect to architectural applications such as building classification, detail classification, qualitative environmental analysis, building condition survey, and building value estimation. Based on this, current research gaps and trends are identified, with two main research aims emerging. First, studies that use or optimise computer vision methods to automate time-consuming, labour-intensive, or complex tasks when analysing architectural image data. Second, work that explores the methodological benefits of machine learning approaches to overcome limitations of conventional analysis and to investigate new questions about the built environment by finding patterns and relationships among visual, statistical, and qualitative data. The growing body of research offers new methods to architectural and design studies, with the article identifying future challenges and directions of research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
5. A Survey on Automatic Detection of Hate Speech in Text.
- Author
-
FORTUNA, PAULA and NUNES, SÉRGIO
- Subjects
HATE speech ,AUTOMATIC speech recognition ,VIRTUAL communities ,NATURAL language processing ,COMPUTER science ,DIGITAL media - Abstract
The scientific study of hate speech, from a computer science point of view, is recent. This survey organizes and describes the current state of the field, providing a structured overview of previous approaches, including core algorithms, methods, and main features used. This work also discusses the complexity of the concept of hate speech, defined in many platforms and contexts, and provides a unifying definition. This area has an unquestionable potential for societal impact, particularly in online communities and digital media platforms. The development and systematization of shared resources, such as guidelines, annotated datasets in multiple languages, and algorithms, is a crucial step in advancing the automatic detection of hate speech. [ABSTRACT FROM AUTHOR]
- Published
- 2019
- Full Text
- View/download PDF
6. Privacy in the Genomic Era.
- Author
-
NAVEED, MUHAMMAD, AYDAY, ERMAN, CLAYTON, ELLEN W., FELLAY, JACQUES, GUNTER, CARL A., HUBAUX, JEAN-PIERRE, MALIN, BRADLEY A., and XIAOFENG WANG
- Subjects
NUCLEOTIDE sequencing ,GENOTYPES ,GENOMES ,GOVERNMENT policy ,COMPUTER science - Abstract
Genome sequencing technology has advanced at a rapid pace and it is now possible to generate highly detailed genotypes inexpensively. The collection and analysis of such data has the potential to support various applications, including personalized medical services. While the benefits of the genomics revolution are trumpeted by the biomedical community, the increased availability of such data has major implications for personal privacy; notably because the genome has certain essential features, which include (but are not limited to) (i) an association with traits and certain diseases, (ii) identification capability (e.g., forensics), and (iii) revelation of family relationships. Moreover, direct-to-consumer DNA testing increases the likelihood that genome data will be made available in less regulated environments, such as the Internet and for-profit companies. The problem of genome data privacy thus resides at the crossroads of computer science, medicine, and public policy. While computer scientists have addressed data privacy for various data types, there has been less attention dedicated to genomic data. Thus, the goal of this paper is to provide a systematization of knowledge for the computer science community. In doing so, we address some of the (sometimes erroneous) beliefs of this field and we report on a survey we conducted about genome data privacy with biomedical specialists. Then, after characterizing the genome privacy problem, we review the state-of-the-art regarding privacy attacks on genomic data and strategies for mitigating such attacks, as well as contextualizing these attacks from the perspective of medicine and public policy. This paper concludes with an enumeration of the challenges for genome data privacy and presents a framework to systematize the analysis of threats and the design of countermeasures as the field moves forward. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
7. Evaluating Domain Ontologies: Clarification, Classification, and Challenges.
- Author
-
MCDANIEL, MELINDA and STOREY, VEDA C.
- Subjects
ONTOLOGIES (Information retrieval) ,EVALUATION ,CLASSIFICATION ,HIERARCHICAL clustering (Cluster analysis) ,LATENT semantic analysis ,SEMANTIC Web ,COMPUTER science - Abstract
The number of applications being developed that require access to knowledge about the real world has increased rapidly over the past two decades. Domain ontologies, which formalize the terms being used in a discipline, have become essential for research in areas such as Machine Learning, the Internet of Things, Robotics, and Natural Language Processing, because they enable separate systems to exchange information. The quality of these domain ontologies, however, must be ensured for meaningful communication. Assessing the quality of domain ontologies for their suitability to potential applications remains difficult, even though a variety of frameworks and metrics have been developed for doing so. This article reviews domain ontology assessment efforts to highlight the work that has been carried out and to clarify the important issues that remain. These assessment efforts are classified into five distinct evaluation approaches and the state of the art of each described. Challenges associated with domain ontology assessment are outlined and recommendations are made for future research and applications. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
8. A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations.
- Author
-
KLEYKO, DENIS, RACHKOVSKIJ, DMITRI A., and OSIPOV, EVGENY
- Subjects
ARTIFICIAL intelligence ,BINARY codes ,COGNITIVE science ,TENSOR products ,COMPUTER science ,COGNITIVE computing - Abstract
This two-part comprehensive survey is devoted to a computing framework most commonly known under the names Hyperdimensional Computing and Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of computational models that use high-dimensional distributed representations and rely on the algebraic properties of their key operations to incorporate the advantages of structured symbolic representations and distributed vector representations. Notable models in the HDC/VSA family are Tensor Product Representations, Holographic Reduced Representations, Multiply-Add-Permute, Binary Spatter Codes, and Sparse Binary Distributed Representations, but there are other models too. HDC/VSA is a highly interdisciplinary field with connections to computer science, electrical engineering, artificial intelligence, mathematics, and cognitive science. This fact makes it challenging to create a thorough overview of the field. However, with a surge of new researchers joining the field in recent years, a comprehensive survey has become a pressing need. Therefore, this Part I surveys important aspects of the field, such as known computational models of HDC/VSA and transformations of various input data types to high-dimensional distributed representations. Part II of this survey [84] is devoted to applications, cognitive computing and architectures, as well as directions for future work. The survey is written to be useful for both newcomers and practitioners. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
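The algebraic properties of the key HDC/VSA operations that the abstract above refers to can be made concrete with a small sketch. This is a minimal pure-Python illustration in the Multiply-Add-Permute style (bipolar vectors, elementwise-multiply binding, majority-sign bundling); the role/filler names (`colour`, `red`, etc.) are invented for the example, not taken from the survey:

```python
import random

random.seed(0)
D = 10_000  # HDC/VSA relies on very high-dimensional vectors

def rand_hv():
    """Random bipolar hypervector."""
    return [random.choice((-1, 1)) for _ in range(D)]

def bind(a, b):
    """Binding by elementwise multiplication (as in Multiply-Add-Permute);
    self-inverse, so binding again with the same vector unbinds."""
    return [x * y for x, y in zip(a, b)]

def bundle(a, b):
    """Bundling (superposition) by elementwise majority sign."""
    return [(x + y > 0) - (x + y < 0) for x, y in zip(a, b)]

def sim(a, b):
    """Normalized dot-product similarity."""
    return sum(x * y for x, y in zip(a, b)) / D

# Encode the record {colour: red, shape: circle} as bundled role-filler pairs.
colour, shape, red, circle = rand_hv(), rand_hv(), rand_hv(), rand_hv()
record = bundle(bind(colour, red), bind(shape, circle))

# Unbinding with the 'colour' role recovers something close to 'red' only.
probe = bind(record, colour)
print(round(sim(probe, red), 2), round(sim(probe, circle), 2))
```

Because binding is self-inverse, multiplying the record by the `colour` role yields a vector noticeably similar to `red` and nearly orthogonal to `circle`, which is the structured-representation property the survey highlights.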
9. Weight-Sharing Neural Architecture Search: A Battle to Shrink the Optimization Gap
- Author
-
Wang Lanfei, Chang Jianlong, Bi Kaifeng, Zhang Xiaopeng, Xie Lingxi, Xiao An, Chen Xin, Chen Zhengsu, Tian Qi, Xu Yuhui, and Wei Longhui
- Subjects
FOS: Computer and information sciences, Computer Science - Machine Learning (cs.LG), Computer Science - Computer Vision and Pattern Recognition (cs.CV), Artificial intelligence, Architecture, General Computer Science, Theoretical Computer Science - Abstract
Neural architecture search (NAS) has attracted increasing attention in both academia and industry. In the early days, researchers mostly applied individual search methods, which sample and evaluate candidate architectures separately and thus incur heavy computational overheads. To alleviate the burden, weight-sharing methods were proposed, in which exponentially many architectures share weights in the same super-network and the costly training procedure is performed only once. These methods, though much faster, often suffer from instability. This paper provides a literature review on NAS, in particular the weight-sharing methods, and points out that the major challenge comes from the optimization gap between the super-network and the sub-architectures. From this perspective, we summarize existing approaches into several categories according to their efforts in bridging the gap, and analyze both the advantages and disadvantages of these methodologies. Finally, we share our opinions on the future directions of NAS and AutoML. Due to the expertise of the authors, this paper mainly focuses on the application of NAS to computer vision problems and may be biased towards the work in our group.
- Published
- 2021
10. k-Nearest Neighbour Classifiers - A Tutorial
- Author
-
Pádraig Cunningham and Sarah Jane Delany
- Subjects
Artificial Intelligence and Robotics, Machine Learning, k-Nearest Neighbour Classifiers, k-NN, Similarity, Speedup, Curse of dimensionality, Data Science, Python (programming language), General Computer Science, Theoretical Computer Science - Abstract
Perhaps the most straightforward classifier in the arsenal of Machine Learning techniques is the Nearest Neighbour Classifier—classification is achieved by identifying the nearest neighbours to a query example and using those neighbours to determine the class of the query. This approach to classification is of particular importance because issues of poor runtime performance are not such a problem these days with the computational power that is available. This article presents an overview of techniques for Nearest Neighbour classification focusing on: mechanisms for assessing similarity (distance), computational issues in identifying nearest neighbours, and mechanisms for reducing the dimension of the data. This article is the second edition of a paper previously published as a technical report [16]. Sections on similarity measures for time-series, retrieval speedup, and intrinsic dimensionality have been added. An Appendix is included, providing access to Python code for the key methods.
- Published
- 2021
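The nearest-neighbour mechanism summarized in the abstract above—find the k closest training examples, then vote—can be sketched in a few lines. This is a minimal illustration assuming Euclidean distance and majority voting; the toy data and function names are invented for the example, not drawn from the tutorial's own code:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by majority vote among its k nearest training
    examples. `train` is a list of (feature_vector, label) pairs."""
    dist = lambda a, b: math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    neighbours = sorted(train, key=lambda ex: dist(ex[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]

# Toy 2-D dataset: two clusters labelled 'a' and 'b'.
train = [((0, 0), 'a'), ((0, 1), 'a'), ((1, 0), 'a'),
         ((5, 5), 'b'), ((5, 6), 'b'), ((6, 5), 'b')]
print(knn_predict(train, (0.5, 0.5)))  # -> 'a'
print(knn_predict(train, (5.5, 5.5)))  # -> 'b'
```

The tutorial's later sections generalize exactly the pieces this sketch hard-codes: the distance function, the neighbour search, and the dimensionality of the feature vectors.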
11. Computer--Music Interfaces: A Survey.
- Author
-
Pennycook, Bruce W.
- Subjects
- *
COMPUTER composition , *COMPOSERS , *ENTERTAINERS , *COMPUTER music , *MUSIC , *COMPUTER science , *INTERFACE circuits - Abstract
This paper is a study of the unique problems posed by the use of computers by composers and performers of music. The paper begins with a presentation of the basic concepts involved in the musical interaction with computer devices, followed by a detailed discussion of three musical tasks: music manuscript preparation, music language interfaces for composition, and real-time performance interaction. Fundamental design principles are exposed through an examination of several early computer music systems, especially the Structured Sound Synthesis Project. A survey of numerous systems, based on the following categories, is presented: compositions and synthesis languages, graphics score editing, performance instruments, digital audio processing tools, and computer-aided instruction in music systems. An extensive reference list is provided for further study in the field. [ABSTRACT FROM AUTHOR]
- Published
- 1985
- Full Text
- View/download PDF
12. Functional Testing of Semiconductor Random Access Memories.
- Author
-
Abadir, Magdy S. and Reghbati, Hassan K.
- Subjects
- *
RANDOM access memory , *INTEGRATED circuit fault tolerance , *COMPUTER input-output equipment , *COMPUTER science , *SEMICONDUCTORS , *ALGORITHMS - Abstract
This paper presents an overview of the problem of testing semiconductor random access memories (RAMs). An important aspect of this test procedure is the detection of permanent faults that cause the memory to function incorrectly. Functional-level fault models are very useful for describing a wide variety of RAM faults. Several fault models are discussed throughout the paper, including stuck-at-0/1 faults, coupled-cell faults, and single-cell pattern-sensitive faults. Test procedures for these fault models are presented and their fault coverage and execution times are discussed. The paper is intended for the general computer science audience and presupposes no background in the hardware testing area. [ABSTRACT FROM AUTHOR]
- Published
- 1983
- Full Text
- View/download PDF
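The stuck-at-0/1 fault model discussed in the abstract above can be illustrated with a simple march-style test. This is a hedged sketch, not a procedure taken from the paper: the `read`/`write` harness and the fault-injected cell are invented for the example:

```python
def march_test(read, write, size):
    """March-style functional test: detects any single stuck-at-0/1 cell.
    `read(i)` / `write(i, bit)` access the memory under test."""
    for i in range(size):          # march up, writing 0s everywhere
        write(i, 0)
    for i in range(size):          # verify each 0, then flip the cell to 1
        if read(i) != 0:
            return False           # a stuck-at-1 cell fails here
        write(i, 1)
    for i in range(size):          # verify each 1
        if read(i) != 1:
            return False           # a stuck-at-0 cell fails here
    return True

# A tiny fault-injected RAM: cell 5 is stuck at 0.
mem = [0] * 16
write = lambda i, b: mem.__setitem__(i, 0 if i == 5 else b)
read = lambda i: mem[i]
print(march_test(read, write, 16))  # -> False (fault detected)

good = [0] * 16
print(march_test(good.__getitem__,
                 lambda i, b: good.__setitem__(i, b), 16))  # -> True
```

Its execution time is linear in memory size, which is why the paper's comparison of fault coverage against execution time matters: richer fault models (coupled cells, pattern sensitivity) need longer marches.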
13. Parallel Search of Strongly Ordered Game Trees.
- Author
-
Marsland, T. A. and Campbell, M.
- Subjects
- *
COMPUTER chess , *COMPUTER algorithms , *VIDEO games , *COMPUTER software , *MULTIPROCESSORS , *COMPUTER science - Abstract
The "alpha-beta" algorithm forms the basis of many programs that search game trees. A number of methods have been designed to improve the utility of the sequential version of this algorithm, especially for use in game-playing programs. These enhancements are based on the observation that alpha-beta is most effective when the best move in each position is considered early in the search. Trees that have this so-called "strong ordering" property are not only of practical importance but possess characteristics that can be exploited in both sequential and parallel environments. This paper draws upon experiences gained during the development of programs which search chess game trees. Over the past decade major enhancements to the alpha-beta algorithm have been developed by people building game-playing programs, and many of these methods will be surveyed and compared here. The balance of the paper contains a study of contemporary methods for searching chess game trees in parallel, using an arbitrary number of independent processors. To make efficient use of these processors, one must have a clear understanding of the basic properties of the trees actually traversed when alpha-beta cutoffs occur. This paper provides such insights and concludes with a brief description of our own refinement to a standard parallel search algorithm for this problem. [ABSTRACT FROM AUTHOR]
- Published
- 1982
- Full Text
- View/download PDF
14. Cache Memories.
- Author
-
Smith, Alan Jay
- Subjects
- *
CACHE memory , *PAGING (Computer science) , *INFORMATION storage & retrieval systems , *COMPUTER architecture , *DATA structures , *COMPUTER science - Abstract
Cache memories are used in modern, medium and high-speed CPUs to hold temporarily those portions of the contents of main memory which are (believed to be) currently in use. Since instructions and data in cache memories can usually be referenced in 10 to 25 percent of the time required to access main memory, cache memories permit the execution rate of the machine to be substantially increased. In order to function effectively, cache memories must be carefully designed and implemented. In this paper, we explain the various aspects of cache memories and discuss in some detail the design features and trade-offs. A large number of original, trace-driven simulation results are presented. Consideration is given to practical implementation questions as well as to more abstract design issues. Specific aspects of cache memories that are investigated include: the cache fetch algorithm (demand versus prefetch), the placement and replacement algorithms, line size, store-through versus copy-back updating of main memory, cold-start versus warm-start miss ratios, multicache consistency, the effect of input/output through the cache, the behavior of split data/instruction caches, and cache size. Our discussion includes other aspects of memory system architecture, including translation lookaside buffers. Throughout the paper, we use as examples the implementation of the cache in the Amdahl 470V/6 and 470V/7, the IBM 3081, 3033, and 370/168, and the DEC VAX 11/780. An extensive bibliography is provided. [ABSTRACT FROM AUTHOR]
- Published
- 1982
- Full Text
- View/download PDF
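The trace-driven simulation approach mentioned in the abstract above can be illustrated with a tiny replacement-policy model. This is a hedged sketch of a fully associative cache with LRU replacement, far simpler than the set-associative designs the paper studies; the trace is invented for the example:

```python
from collections import OrderedDict

def simulate_lru(trace, cache_lines):
    """Replay a memory-reference trace through a fully associative cache
    with LRU replacement; return the hit count (trace-driven simulation)."""
    cache = OrderedDict()   # block address -> None, kept in recency order
    hits = 0
    for addr in trace:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            if len(cache) >= cache_lines:
                cache.popitem(last=False)  # evict least recently used block
            cache[addr] = None
    return hits

trace = [1, 2, 3, 1, 2, 4, 1]   # block addresses referenced by the CPU
print(simulate_lru(trace, cache_lines=3))  # -> 3
```

Replaying real address traces through models like this, while varying line size, cache size, and the fetch and replacement algorithms, is exactly how the paper's miss-ratio comparisons are produced.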
15. Charging for Computing Resources.
- Author
-
Mckell, Lynn J., Hansen, James V., and Heitger, Lester E.
- Subjects
- *
INFORMATION resources management , *INFORMATION modeling , *RESOURCE allocation , *TRANSFER pricing , *COMPUTER input-output equipment , *COMPUTER science - Abstract
Modern computer configurations are often designed to share a host of resources among many users who may be simultaneously competing for their utilization. A substantive issue in computer management focuses on how to effectively allocate computing resources and subsequently charge for them in this competing environment. The issue is generally made more complex by the dependencies in resource availability stemming from the hardware configuration. This paper surveys significant charging mechanisms which have been proposed. The paper does not discuss specific commercial software available to implement charging approaches, but rather is written as a basic survey for readers with an interest in both computers and management science. An elementary understanding of probability and statistics will be helpful to the reader of this paper and essential to one who pursues the bibliography. [ABSTRACT FROM AUTHOR]
- Published
- 1979
- Full Text
- View/download PDF
16. Editorial Policy.
- Author
-
Goldberg, Adele
- Subjects
- *
INTELLECTUAL property , *SURVEYS , *AUTHOR-publisher relations , *COPYRIGHT , *AUTHORSHIP , *COMPUTER science - Abstract
The article focuses on the journal ACM Computing Surveys. The main differences between a survey paper and a tutorial paper published in the journal are emphasis and audience. A survey paper assumes its audience has a general knowledge of the field; a tutorial assumes its audience is inexpert. Most of the papers published in the journal combine the two approaches. The journal prefers papers that meet its minimum standards, of which the standards of first importance are technical accuracy, quality, and clarity. The format for typing manuscripts has been provided. The journal's policy is to own the copyrights in its technical publications. This protects its authors and their employers and facilitates the proper use of the material by others. Each author of an accepted paper is required to sign the journal's copyright form as a condition of publication. Private authors retain all proprietary rights other than copyright, such as patent rights. They retain the right to use any part of the article with proper acknowledgement to the journal.
- Published
- 1979
- Full Text
- View/download PDF
17. A Tutorial on ALGOL 68.
- Author
-
Tanenbaum, Andrew S.
- Subjects
- *
PROGRAMMING languages , *COMPUTER programmers , *ELECTRONIC data processing , *ARTIFICIAL languages , *COMPUTER science , *ALGOL (Computer program language) - Abstract
This paper is an introduction to the main features of ALGOL 68, emphasizing the novel features not found in many other programming languages. The topics, data types (modes), type conversion (coercion), generalized expressions (units), procedures, operators, the standard prelude, and input/output, form the basis of the paper. The approach is informal, relying heavily on many short examples. The paper applies to the Revised Report, published in 1975, rather than to the original report, published in 1969. [ABSTRACT FROM AUTHOR]
- Published
- 1976
- Full Text
- View/download PDF
18. Evolution of Data-Base Management Systems.
- Author
-
Fry, James P. and Sibley, Edgar K.
- Subjects
- *
DATABASE management , *COMPUTER science , *DATABASES , *METHODOLOGY , *INFORMATION science , *TECHNOLOGY - Abstract
This paper deals with the history and definitions common to data-base technology. It delimits the objectives of data-base management systems, discusses important concepts, defines terminology for use by other papers in this issue, traces the development of data-base systems methodology, gives a uniform example, and presents some trends and issues. [ABSTRACT FROM AUTHOR]
- Published
- 1976
- Full Text
- View/download PDF
19. Mitigating Bias in Algorithmic Systems—A Fish-eye View.
- Author
-
ORPHANOU, KALIA, OTTERBACHER, JAHNA, KLEANTHOUS, STYLIANI, BATSUREN, KHUYAGBAATAR, GIUNCHIGLIA, FAUSTO, BOGINA, VERONIKA, TAL, AVITAL SHULNER, HARTMAN, ALAN, and KUFLIK, TSVI
- Subjects
COMMUNITIES ,COMPUTER science ,SCIENTIFIC computing ,FAIRNESS ,EYE - Abstract
Mitigating bias in algorithmic systems is a critical issue drawing attention across communities within the information and computer sciences. Given the complexity of the problem and the involvement of multiple stakeholders—including developers, end users, and third-parties—there is a need to understand the landscape of the sources of bias, and the solutions being proposed to address them, from a broad, cross-domain perspective. This survey provides a “fish-eye view,” examining approaches across four areas of research. The literature describes three steps toward a comprehensive treatment—bias detection, fairness management, and explainability management—and underscores the need to work from within the system as well as from the perspective of stakeholders in the broader context. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
20. Object Detection Using Deep Learning Methods in Traffic Scenarios
- Author
-
Zhijun Hou and Azzedine Boukerche
- Subjects
Deep learning, Machine learning, Object detection, Feature extraction, Convolutional neural network, Computer science, General Computer Science, Theoretical Computer Science - Abstract
The recent boom of autonomous driving has made object detection in traffic scenes a hot topic of research. Designed to classify and locate instances in the image, this is a basic but challenging task in the computer vision field. With its powerful feature extraction abilities, which are vital for object detection, deep learning has expanded its application areas to this field during the past several years and thus achieved breakthroughs. However, even with such powerful approaches, traffic scenarios have their own specific challenges, such as real-time detection, changeable weather, and complex lighting conditions. This survey is dedicated to summarizing research and papers on applying deep learning to the transportation environment in recent years. More than 100 research papers are covered, and different aspects such as key generic object detection frameworks, categorized object detection applications in traffic scenarios, evaluation metrics, and classified datasets are included. Some open research fields are also provided. We believe that this is the first survey focusing on deep learning-based object detection in traffic scenarios.
- Published
- 2021
21. A Survey of Blockchain-Based Strategies for Healthcare
- Author
-
Jó Ueyama, Bruno S. Faiçal, Bhaskar Krishnamachari, and Erikson Júlio de Aguiar
- Subjects
Blockchain, Immutability, Health care, Medical records, Image sharing, Log management, Supply chain, Decentralization, Confidentiality, Data science, General Computer Science, Theoretical Computer Science - Abstract
Blockchain technology has been gaining visibility owing to its ability to enhance the security, reliability, and robustness of distributed systems. Several areas have benefited from research based on this technology, such as finance, remote sensing, data analysis, and healthcare. Data immutability, privacy, transparency, decentralization, and distributed ledgers are the main features that make blockchain an attractive technology. However, healthcare records that contain confidential patient data make this system very complicated because there is a risk of a privacy breach. This study surveys research into the applications of blockchain in the healthcare area. It sets out by discussing the management of medical information, as well as the sharing of medical records, image sharing, and log management. We also discuss papers that intersect with other areas, such as the Internet of Things, the management of information, the tracking of drugs along their supply chain, and aspects of security and privacy. Since there are other surveys of blockchain in healthcare, we analyze and compare both the positive and negative aspects of those papers. Finally, we examine the concepts of blockchain in the medical area, assessing their benefits and drawbacks and thus giving guidance to other researchers in the area. Additionally, we summarize the methods used in healthcare per application area and show their pros and cons.
- Published
- 2020
22. A Survey on Modality Characteristics, Performance Evaluation Metrics, and Security for Traditional and Wearable Biometric Systems
- Author
-
Arif I. Sarwat, Aditya Sundararajan, and Alexander P. Pons
- Subjects
FOS: Computer and information sciences, Computer Science - Cryptography and Security (cs.CR), Biometrics, Modality (human–computer interaction), Wearable computer, Identification (information), Data science, General Computer Science, Theoretical Computer Science - Abstract
Biometric research is directed increasingly towards Wearable Biometric Systems (WBS) for user authentication and identification. However, prior to engaging in WBS research, how their operational dynamics and design considerations differ from those of Traditional Biometric Systems (TBS) must be understood. While the current literature is cognizant of those differences, there is no effective work that summarizes the factors where TBS and WBS differ, namely, their modality characteristics, performance, security and privacy. To bridge the gap, this paper accordingly reviews and compares the key characteristics of modalities, contrasts the metrics used to evaluate system performance, and highlights the divergence in critical vulnerabilities, attacks and defenses for TBS and WBS. It further discusses how these factors affect the design considerations for WBS, the open challenges and future directions of research in these areas. In doing so, the paper provides a big-picture overview of the important avenues of challenges and potential solutions that researchers entering the field should be aware of. Hence, this survey aims to be a starting point for researchers in comprehending the fundamental differences between TBS and WBS before understanding the core challenges associated with WBS and its design.
- Published
- 2019
23. Methods and Tools for Policy Analysis
- Author
-
Christopher Williams, Seraphin Calo, Amani Abu Jabal, Dinesh C. Verma, Elisa Bertino, Alessandra Russo, Maryam Davari, Christian Makaya, and IBM United Kingdom Ltd
- Subjects
08 Information And Computing Sciences ,021110 strategic, defence & security studies ,Correctness ,General Computer Science ,Computer science ,0211 other engineering and technologies ,02 engineering and technology ,Change impact analysis ,Policy analysis ,Data science ,Drone ,Theoretical Computer Science ,Consistency (database systems) ,Similarity analysis ,0202 electrical engineering, electronic engineering, information engineering ,Information system ,Robot ,020201 artificial intelligence & image processing ,Information Systems - Abstract
Policy-based management of computer systems, computer networks, and devices is a critical technology, especially for present and future large-scale systems with autonomous devices such as robots and drones. Maintaining reliable policy systems requires efficient and effective analysis approaches to ensure that the policies satisfy critical properties, such as correctness and consistency. In this paper, we present an extensive overview of methods for policy analysis. We then survey policy analysis systems and frameworks that have been proposed and compare them along various dimensions. We conclude the paper by outlining novel research directions in the area of policy analysis.
- Published
- 2019
24. Editorial Policy...
- Author
-
Organick, Elliot I.
- Subjects
- *
SURVEYS , *AUTHOR-publisher relations , *AUTHORSHIP , *COMPUTER science , *ELECTRONIC data processing personnel - Abstract
The article introduces the March 1973 issue of the journal ACM Computing Surveys. The journal received various contributions from outstanding authors, mostly computer professionals, and helped computer people catch up on the latest ideas, concepts, and advances in the field. The effect of this change has created new opportunities and challenges for the authors of surveys. An increasing number of readers are capable of benefiting from tutorial and survey papers presented at advanced levels. The author describes the characteristics of a paper acceptable to the magazine. A good tutorial paper covering an area of the computer field that has recently become important and better understood will get top priority. The body of the paper should be interesting. The text of the survey can be brief, letting the bibliography that attends it, and the index into it, be as long and up-to-date as required. An excellent survey article is likely to be given preference over a tutorial that is rated only good.
- Published
- 1973
- Full Text
- View/download PDF
25. Interoperability and Portability Approaches in Inter-Connected Clouds
- Author
-
Karanjeet Singh Kahlon, Kiranbir Kaur, and Sandeep Sharma
- Subjects
Service (systems architecture) ,General Computer Science ,Computer science ,business.industry ,Vendor ,Interoperability ,020206 networking & telecommunications ,020207 software engineering ,Cloud computing ,02 engineering and technology ,Identity management ,Theoretical Computer Science ,Term (time) ,Set (abstract data type) ,World Wide Web ,Software portability ,0202 electrical engineering, electronic engineering, information engineering ,business - Abstract
Inter-connected cloud computing is a natural evolution of cloud computing. The numerous benefits of connecting clouds have attracted attention from academia as well as industry. Just as every new evolution faces challenges, inter-connected clouds have their own set of challenges, such as security, monitoring, authorization and identity management, vendor lock-in, and so forth. This article considers the vendor lock-in problem, which is a direct consequence of the lack of interoperability and portability. An extensive literature review, surveying more than 120 papers, was conducted to analyze and categorize the solutions suggested in the literature for the interoperability and portability issues of inter-connected clouds. After categorizing the solutions, the literature was mapped to a specific solution and a comparative analysis of the papers under the same solution was performed. The term "inter-connected clouds" is used generically in this article to refer to any collaboration of clouds, whether from the user side (multi-clouds or aggregated service by a broker) or the provider side (federated clouds or hybrid clouds). Lastly, two closely related issues (brokers and meta-scheduling) and the remaining challenges of inter-connected clouds are discussed.
- Published
- 2017
26. Wireless Body Area Network (WBAN)
- Author
-
Youssef Nasser, Imed Romdhani, Ahmed Al-Dubai, and Marwa Salayma
- Subjects
General Computer Science ,Computer science ,004 Data processing & computer science ,QA75 Electronic computers. Computer science ,Reliability (computer networking) ,02 engineering and technology ,01 natural sciences ,Theoretical Computer Science ,Wireless body area networks, QoS, medical, channel access, fading, WBAN standards ,Open research ,Body area network ,Centre for Distributed Computing, Networking and Security ,0202 electrical engineering, electronic engineering, information engineering ,Ambient intelligence ,business.industry ,Quality of service ,010401 analytical chemistry ,020206 networking & telecommunications ,Fault tolerance ,AI and Technologies ,0104 chemical sciences ,Variety (cybernetics) ,Health ,Key (cryptography) ,Networks ,business ,Computer network - Abstract
Wireless Body Area Network (WBAN) technology has been a key element of e-health for monitoring the human body. It enables new applications across several domains, including medicine, entertainment, and ambient intelligence. This survey paper places substantial emphasis on the concept and key features of WBAN technology. First, the WBAN concept is introduced and a review of key applications facilitated by this networking technology is provided. The study then explores the wide variety of communication standards and methods deployed in this technology. Because of the sensitivity and criticality of the data carried and handled by WBANs, fault tolerance is a critical issue and is widely discussed in this paper. Hence, this survey investigates thoroughly the reliability and fault-tolerance paradigms suggested for WBANs. Open research and challenging issues pertaining to fault tolerance, coexistence and interference management, and power consumption are also discussed, along with suggested trends in these areas.
- Published
- 2017
27. The Many Faces of Publish/Subscribe.
- Author
-
Eugster, Patrick Th., Felber, Pascal A., Guerraoui, Rachid, and Kermarrec, Anne-Marie
- Subjects
- *
COMPUTER programming , *DISTRIBUTED computing , *COMPUTER algorithms , *COMPUTER science , *COMMUNICATION , *SYNCHRONIZATION , *DISTRIBUTED shared memory , *COMPUTER memory management - Abstract
Well adapted to the loosely coupled nature of distributed interaction in large-scale applications, the publish/subscribe communication paradigm has recently received increasing attention. With systems based on the publish/subscribe interaction scheme, subscribers register their interest in an event, or a pattern of events, and are subsequently asynchronously notified of events generated by publishers. Many variants of the paradigm have recently been proposed, each variant being specifically adapted to some given application or network model. This paper factors out the common denominator underlying these variants: full decoupling of the communicating entities in time, space, and synchronization. We use these three decoupling dimensions to better identify commonalities and divergences with traditional interaction paradigms. The many variations on the theme of publish/subscribe are classified and synthesized. In particular, their respective benefits and shortcomings are discussed both in terms of interfaces and implementations. [ABSTRACT FROM AUTHOR]
- Published
- 2003
- Full Text
- View/download PDF
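The publish/subscribe interaction scheme this abstract describes, in which subscribers register interest in events and are later notified of events generated by publishers without the two parties knowing each other, can be sketched in a few lines. This is a hypothetical minimal broker for illustration only, not code from the paper; note that it dispatches synchronously, whereas the paradigm surveyed stresses asynchronous notification and time decoupling as well:

```python
# Minimal publish/subscribe broker sketch (hypothetical illustration).
# Subscribers register interest in a topic; publishers emit events to the
# broker without referencing subscribers directly (space decoupling).
from collections import defaultdict
from typing import Callable

class Broker:
    def __init__(self) -> None:
        self._subs: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def subscribe(self, topic: str, callback: Callable[[str], None]) -> None:
        # Register interest in all future events on this topic.
        self._subs[topic].append(callback)

    def publish(self, topic: str, event: str) -> None:
        # Notify every current subscriber; the publisher never names them.
        for cb in self._subs[topic]:
            cb(event)

received: list[str] = []
broker = Broker()
broker.subscribe("stocks", lambda e: received.append(e))
broker.publish("stocks", "AAPL up 2%")
print(received)  # ['AAPL up 2%']
```

A real implementation would add the synchronization decoupling the paper analyzes, e.g. by queueing events and delivering them from a separate thread.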
28. A Systematic Review on Cloud Testing
- Author
-
Boni García, Eda Marchetti, Micael Gallego, Antonia Bertolino, Guglielmo De Angelis, Francisco Gortázar, Francesca Lonetti, Comunidad de Madrid, Ministerio de Economía y Competitividad (España), and European Commission
- Subjects
Thesaurus (information retrieval) ,Telecomunicaciones ,General Computer Science ,Computer science ,business.industry ,Systematic literature review ,Testing ,systematic literature review ,020207 software engineering ,Cloud computing ,02 engineering and technology ,Cloud Computing ,Digital library ,Data science ,Field (computer science) ,testing ,Theoretical Computer Science ,Open research ,Systematic review ,Cloud testing ,Business intelligence ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,business - Abstract
A systematic literature review is presented that surveys the topic of cloud testing over the period 2012-2017. Cloud testing can refer either to testing cloud-based systems (testing of the cloud) or to leveraging the cloud for testing purposes (testing in the cloud): both approaches (and their combination into testing of the cloud in the cloud) have drawn research interest. An extensive paper search was conducted, by both automated query of popular digital libraries and snowballing, which resulted in a final selection of 147 primary studies. Over the course of the survey, a framework was incrementally derived that classifies cloud testing research along six main areas and their topics. The paper includes a detailed analysis of the selected primary studies to identify trends and gaps, as well as an extensive report of the state of the art as it emerges from answering the identified research questions. We find that cloud testing is an active research field, although not all topics have so far received enough attention, and conclude by presenting the most relevant open research challenges for each area of the classification framework. This paper describes research work mostly undertaken in the context of the European Project H2020 731535: ElasTest. This work has also been partially supported by: the Italian MIUR PRIN 2015 Project: GAUSS; the Regional Government of Madrid (CM) under project Cloud4BigData (S2013/ICE-2894) cofunded by FSE & FEDER; and the Spanish Government under project LERNIM (RTC-2016-4674-7) cofunded by the Ministry of Economy and Competitiveness, FEDER & AEI.
- Published
- 2019
- Full Text
- View/download PDF
29. Retargetable Code Generators.
- Author
-
Wulf, William A., Newcomer, Joe, Leverett, Bruce, Cattell, Rick, Knueven, Paul, Ganapathi, M., Hennessy, J. L., Fischer, C. N., Fraser, Christopher W., and Fischer, J. L.
- Subjects
- *
CODE generators , *HEURISTIC , *CODING theory , *SURVEYS , *COMPUTER science - Abstract
This paper presents responses of some surveyors regarding the article "Retargetable Code Generators," published in the December 1982 issue of the journal "ACM Computing Surveys" and the corresponding response of authors. The article identified the classified techniques for automated retargetable code generation and surveyed the work on these techniques. Among the work surveyed was that of Wulf's Production Quality Compiler-Compiler (PQCC) group. The letter written by William A. Wulf and colleagues in response to the survey points out that, "Operator mismatches invoke heuristic search." Authors of the article present their clarification to this point. Readers interested in detailed information are suggested to refer to the article "A Practical Toolkit for Making Portable Compilers," published in the September 1983 issue of the journal "Communications of the ACM."
- Published
- 1983
- Full Text
- View/download PDF
30. Distributed Query Processing.
- Author
-
Yu, C. T. and Chang, C. C.
- Subjects
- *
INFORMATION storage & retrieval systems , *ELECTRONIC data processing , *HEURISTIC programming , *ARTIFICIAL intelligence , *PROGRAM transformation , *COMPUTER science - Abstract
In this paper various techniques for optimizing queries in distributed databases are presented. Although no attempt is made to cover all proposed algorithms on this topic, quite a few ideas extracted from existing algorithms are outlined. It is hoped that large-scale experiments will be conducted to verify the usefulness of these ideas and that they will be integrated to construct a powerful algorithm for distributed query processing. [ABSTRACT FROM AUTHOR]
- Published
- 1984
- Full Text
- View/download PDF
31. File Servers for Network-Based Distributed Systems.
- Author
-
Svobodova, Liba
- Subjects
- *
INTERNET servers , *INFORMATION storage & retrieval systems , *DISTRIBUTED computing , *COMPUTER networks , *FILE conversion (Computer science) , *COMPUTER science - Abstract
A file server provides remote centralized storage of data to workstations connected to it via a communication network; it facilitates data sharing among autonomous workstations and support of inexpensive workstations that have limited or no secondary storage. Various characteristics of file servers and the corresponding implementation issues based on a survey of a number of experimental file servers are discussed and evaluated in this paper. Particular emphasis is placed on the problem of atomic update of data stored in a file server. The design issues related to the scope of atomic transactions and the granularity of data access supported by a file server are studied in detail. [ABSTRACT FROM AUTHOR]
- Published
- 1984
- Full Text
- View/download PDF
32. Program Transformation Systems.
- Author
-
Partsch, H. and Steinbrüggen, R.
- Subjects
- *
AUTOMATIC programming (Computer science) , *PROGRAMMING languages , *COMPUTER algorithms , *ARTIFICIAL intelligence , *SOFTWARE engineering , *COMPUTER science , *INFORMATION technology - Abstract
Interest is increasing in the transformational approach to programming and in mechanical aids for supporting the program development process. Available aids range from simple editorlike devices to rather powerful interactive transformation systems and even to automatic synthesis tools. This paper reviews and classifies transformation systems and is intended to acquaint the reader with the current state of the art and provide a basis for comparing the different approaches. It is also designed to provide easy access to specific details of the various methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 1983
- Full Text
- View/download PDF
33. Document Formatting Systems: Survey, Concepts, and Issues.
- Author
-
Furuta, Richard, Scofield, Jeffrey, and Shaw, Alan
- Subjects
- *
FORMATTING of information display systems , *WORD processing , *COMPUTER software , *TEXT editors (Computer programs) , *COMPUTER science , *SURVEYS - Abstract
Formatting systems are concerned with the physical layout of a document for hard- and soft-copy media. This paper characterizes the formatting problem and its relation to other aspects of document processing, describes and evaluates several representative and seminal systems, and discusses some issues and problems relevant to future systems. The emphasis is on topics related to the specification of document formats; these include the underlying document and processing models, functions performed by a formatter, the formatting language and user interface, variety of document objects, the integration of formatters with other document processing tasks, and implementation questions. [ABSTRACT FROM AUTHOR]
- Published
- 1982
- Full Text
- View/download PDF
34. Interactive Editing Systems: Part II.
- Author
-
Meyrowitz, Norman and Dam, Andries van
- Subjects
- *
WORD processing , *COMPUTER software , *TEXT editors (Computer programs) , *INFORMATION storage & retrieval systems , *COMPUTER science , *SURVEYS - Abstract
This article, Part II of a two-part series, surveys the state of the art of computer-based interactive editing systems. This paper is a survey intended for a varied audience, including the more experienced user and the editor-designer as well as the curious novice. It presents numerous examples of systems in both the academic and commercial arenas, covering line editors, screen editors, interactive editor/formatters, structure editors, syntax-directed editors, and commercial word-processing editors. We discuss pertinent issues in the field, and conclude with some observations about the future of interactive editing. The references for both parts are provided at the end of Part II. [ABSTRACT FROM AUTHOR]
- Published
- 1982
- Full Text
- View/download PDF
35. Data-Driven and Demand-Driven Computer Architecture.
- Author
-
Treleaven, Philip C., Brownbridge, David R., and Hopkins, Richard P.
- Subjects
- *
COMPUTER architecture , *COMPUTER systems , *COMPUTER science , *COMPUTER input-output equipment , *COMPUTER software , *PROGRAMMING languages - Abstract
Novel data-driven and demand-driven computer architectures are under development in a large number of laboratories in the United States, Japan, and Europe. These computers are not based on the traditional von Neumann organization; instead, they are attempts to identify the next generation of computer. Basically, in data-driven (e.g., data-flow) computers the availability of operands triggers the execution of the operation to be performed on them, whereas in demand-driven (e.g., reduction) computers the requirement for a result triggers the operation that will generate it. Although there are these two distinct areas of research, each laboratory has developed its own individual model of computation, stored program representation, and machine organization. Across this spectrum of designs there is, however, a significant sharing of concepts. The aim of this paper is to identify the concepts and relationships that exist both within and between the two areas of research. It does this by examining data-driven and demand-driven architecture at three levels: computation organization, (stored) program organization, and machine organization. Finally, a survey of various novel computer architectures under development is given. [ABSTRACT FROM AUTHOR]
- Published
- 1982
- Full Text
- View/download PDF
36. A Comparison of the Programming Languages C and PASCAL.
- Author
-
Feuer, Alan A. and Gehani, Narain H.
- Subjects
- *
PROGRAMMING languages , *ELECTRONIC data processing , *ARTIFICIAL languages , *PASCAL (Computer program language) , *COMPUTER programmers , *COMPUTER science - Abstract
The languages C and PASCAL are growing in popularity, particularly among programmers of small computers. In this paper we summarize and compare the two languages covering their design philosophies, their handling of data types, the programming facilities they provide, the impact of these facilities on the quality of programs, and how useful the facilities are for programming in a variety of application domains. [ABSTRACT FROM AUTHOR]
- Published
- 1982
37. Human Factors Studies of Database Query Languages: A Survey and Assessment.
- Author
-
Reisner, Phyllis
- Subjects
- *
PSYCHOLOGY , *DATABASE design , *DATABASE management , *QUERY languages (Computer science) , *PROGRAMMING languages , *SYNTAX (Grammar) , *COMPUTER science - Abstract
Empirical studies have been undertaken to measure the ease-of-use of a query language, compare two or more such languages for ease-of-use, study controversial issues in query language design, and provide feedback to designers for improving a language. Some primitive attempts at constructing abstract models related to query languages also exist. This paper discusses some of the techniques that have been used and results obtained. A primary goal is to show the reader unfamiliar with behavioral research what the results do and do not mean. [ABSTRACT FROM AUTHOR]
- Published
- 1981
- Full Text
- View/download PDF
38. The Logical Record Access Approach to Database Design.
- Author
-
Teorey, Toby J. and Fry, James P.
- Subjects
- *
DATABASE design , *DATABASE management , *ELECTRONIC data processing , *INFORMATION storage & retrieval systems , *SYSTEMS design , *COMPUTER science - Abstract
Database management systems have evolved to the point of general acceptance and wide application; however a major problem still facing the user is the effective utilization of these systems. Important to achieving effective database usability and responsiveness is the design of the database. This paper presents a practical stepwise database design methodology that derives a DBMS-processable database structure from a set of user information and processing requirements. Although the methodology emphasizes the logical design step, the activities of requirements analysis and physical design are also addressed. The methodology is illustrated with a detailed example. Performance trade-offs among multiple users of a single integrated database are considered, and the relationship between short-term design and design for flexibility to changing requirements is discussed. Many steps in the database design process can be assisted with proper use of computer modeling techniques and other tools, such as requirements analysis software. The example design problem and its solution steps serve to point out when and where current technology can be effectively used. [ABSTRACT FROM AUTHOR]
- Published
- 1980
- Full Text
- View/download PDF
39. Office Information Systems and Computer Science.
- Author
-
Ellis, Clarence A. and Nutt, Gary J.
- Subjects
- *
OFFICE practice automation , *MANAGEMENT information systems , *INFORMATION resources management , *INFORMATION storage & retrieval systems , *COMPUTER science , *AUTOMATION - Abstract
Automated office systems are emerging as an interdisciplinary research area with a strong computer science component. In this paper office information systems are defined as entities which perform document storage, retrieval, manipulation, and control within a distributed environment. Some state-of-the-art implementations are described. The research is related to different areas of computer science and several detailed examples are provided. [ABSTRACT FROM AUTHOR]
- Published
- 1980
- Full Text
- View/download PDF
40. The Ubiquitous B-Tree.
- Author
-
Comer, Douglas
- Subjects
- *
ELECTRONIC file management , *COMPUTER programming , *ELECTRONIC data processing , *INFORMATION storage & retrieval systems , *DATABASE management , *COMPUTER science - Abstract
B-trees have become, de facto, a standard for file organization. File indexes of users, dedicated database systems, and general-purpose access methods have all been proposed and implemented using B-trees. This paper reviews B-trees and shows why they have been so successful. It discusses the major variations of the B-tree, especially the B+-tree, contrasting the relative merits and costs of each implementation. It illustrates a general purpose access method which uses a B-tree. [ABSTRACT FROM AUTHOR]
- Published
- 1979
- Full Text
- View/download PDF
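The reason B-trees succeeded as a file organization, as Comer's abstract suggests, is high fan-out: each node holds many sorted keys, so a lookup descends through very few nodes and hence touches very few disk pages. The search step can be sketched as follows; this is a hypothetical toy illustration of the general idea, not code from the paper, and it omits insertion, deletion, and node splitting entirely:

```python
# Toy B-tree search sketch (hypothetical illustration). Each node stores a
# sorted key list and, unless it is a leaf, one child per key gap; high
# fan-out keeps the tree shallow, so few nodes are visited per lookup.
import bisect

class Node:
    def __init__(self, keys, children=None):
        self.keys = keys          # sorted list of keys in this node
        self.children = children  # None for leaf nodes

def search(node: Node, key) -> bool:
    # Binary-search within the node, then descend into the right child.
    i = bisect.bisect_left(node.keys, key)
    if i < len(node.keys) and node.keys[i] == key:
        return True
    if node.children is None:     # reached a leaf without finding the key
        return False
    return search(node.children[i], key)

# A small two-level tree with fan-out 3:
root = Node([20, 40], [Node([5, 10]), Node([25, 30]), Node([45, 50])])
print(search(root, 30))  # True
print(search(root, 7))   # False
```

In a real B-tree, each node would be sized to a disk block, so the fan-out is in the hundreds and the height stays at three or four levels even for very large files.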
41. A Survey of Resource Directive Decomposition in Mathematical Programming.
- Author
-
Molina, Francisco Walter
- Subjects
- *
MATHEMATICAL programming , *ALGORITHMS , *DECOMPOSITION method , *SYSTEM analysis , *OPERATIONS research , *PROGRAM transformation , *COMPUTER science - Abstract
Because of the natural way in which subsystems are cast into subproblems, resource- directive decomposition methods of mathematical programming problems have attracted considerable attention in recent years. A review of the specialized literature is presented in this paper, where the features and drawbacks of the most representative resource-directive methods are analyzed. To give an appropriate chronological and technical perspective, early general methods, such as the ones of Dantzig-Wolfe and Benders, are also included in the survey. [ABSTRACT FROM AUTHOR]
- Published
- 1979
- Full Text
- View/download PDF
42. A Conceptual Framework for Computer Architecture.
- Author
-
Reddi, S.S. and Feustel, E.A.
- Subjects
- *
COMPUTER science , *COMPUTER architecture , *ELECTRONIC data processing , *SYSTEMS design , *INFORMATION storage & retrieval systems , *COMPUTERS in architecture - Abstract
The purpose of this paper is to describe the concepts, definitions, and ideas of computer architecture and to suggest that architecture can be viewed as composed of three components: physical organization; control and flow of information; and representation, interpretation and transformation of information. This framework can accommodate diverse architectural concepts such as array processing, microprogramming, stack processing and tagged architecture. Architectures of some existing machines are considered and methods of associating architectural concepts with the components are established. Architecture design problems and trade-offs are discussed in terms of the proposed framework. [ABSTRACT FROM AUTHOR]
- Published
- 1976
- Full Text
- View/download PDF
43. CODASYL Data-Base Management Systems.
- Author
-
Taylor, Robert W. and Frank, Randall L.
- Subjects
- *
DATABASE management , *DATABASES , *DATA structures , *COMPUTER software , *COMPUTER science , *DATABASE design - Abstract
This paper presents in tutorial fashion the concepts, notation, and data-base languages that were defined by the CODASYL Data Description Language and Programming Language Committees. Data structure diagram notation is explained, and sample data-base definition is developed along with several sample programs. Advanced features of the languages are discussed, together with examples of their use. An extensive bibliography is included. [ABSTRACT FROM AUTHOR]
- Published
- 1976
- Full Text
- View/download PDF
44. Hierarchical Data-Base Management: A Survey.
- Author
-
Tsichritzis, D. C. and Lochovsky, F. H.
- Subjects
- *
DATABASE design , *DATABASE management , *COMPUTER science , *DATABASES , *PROGRAMMING languages , *RELATIONAL databases - Abstract
This survey paper discusses the facilities provided by hierarchical data-base management systems. The systems are based on the hierarchical data model which is defined as a special case of the network data model. Different methods used to access hierarchically organized data are outlined. Constructs and examples of programming languages are presented to illustrate the features of hierarchical systems. This is followed by a discussion of techniques for implementing such systems. Finally, a brief comparison is made between the hierarchical, the network, and the relational systems. [ABSTRACT FROM AUTHOR]
- Published
- 1976
- Full Text
- View/download PDF
45. New Programming Languages for Artificial Intelligence Research.
- Author
-
Bobrow, Daniel G. and Raphael, Bertram
- Subjects
- *
PROGRAMMING languages , *ARTIFICIAL intelligence , *COMPUTER programming , *SUBROUTINES (Computer programs) , *COMPUTER programmers , *COMPUTER science - Abstract
New directions in Artificial Intelligence research have led to the need for certain novel features to be embedded in programming languages. This paper gives an overview of the nature of these features, and their implementation in four principal families of AI languages: SAIL; PLANNER/CONNIVER; QLISP/INTERLISP; and POPLER/POP-2. The programming features described include: new data types and accessing mechanisms for stored expressions; more flexible control structures, including multiple processes and backtracking; pattern matching to allow comparison of a data item with a template, and extraction of labeled subexpressions; and deductive mechanisms which allow the programming system to carry out certain activities, including modifying the data base and deciding which subroutines to run next, using only constraints and guidelines set up by the programmer. [ABSTRACT FROM AUTHOR]
- Published
- 1974
- Full Text
- View/download PDF
46. In Memoriam.
- Author
-
Abiteboul, Serge, Kuper, Gabriel M., Mairson, Harry G., Shvartsman, Alexander A., and Vardi, Moshe Y.
- Subjects
- *
COMPUTER science , *SCIENTISTS , *CONSTRAINT databases , *DATABASES , *RESEARCH - Abstract
Authors of this article have mourned the passing of a creative and thoughtful colleague, Paris C. Kanellakis, who was respected for his many contributions, both technical and professional, to the computer science research community. In mourning Kanellakis' tragic death, his technical facility, his broad knowledge, his insight, his commitment and his humor are missed. To write a research paper with Kanellakis was also an opportunity to observe his indefatigable attention to detail and to engage in vigorous debate with his editorial voice. To his technical ability in solving problems, Kanellakis added a mature editorial voice. These authors have taken this opportunity to present some of Kanellakis' contributions to database theory, including deductive, object-oriented and constraint databases, as well as his work in fault-tolerant distributed computation and in type theory. In each of these areas, they have recognized Kanellakis' research contributions that were examples not merely of good problem solving, but also of insightful problem formulation.
- Published
- 1996
- Full Text
- View/download PDF
47. Fringe Analysis Revisited.
- Author
-
Baeza-Yates, Ricardo A.
- Subjects
- *
DATA structures , *ELECTRONIC data processing , *COMPUTER programming , *DATABASE management , *SEARCH engines , *ELECTRONIC information resource searching , *COMPUTER science - Abstract
Fringe analysis is a technique used to study the average behavior of search trees. In this paper we survey the main results regarding this technique, and we improve a previous asymptotic theorem. At the same time, we present new developments and applications of the theory that allow improvements in several bounds on the behavior of search trees. Our examples cover binary search trees, AVL-trees, 2-3 trees, and B-trees. [ABSTRACT FROM AUTHOR]
- Published
- 1995
- Full Text
- View/download PDF
48. Model-Based Object Recognition in Dense-Range Images A Review.
- Author
-
Arman, Farshid and Aggarwal, J. K.
- Subjects
- *
IMAGE processing , *COMPUTER vision , *COMPUTER-aided design , *PATTERN recognition systems , *MODELS & modelmaking , *COMPUTER science , *COMPUTERS - Abstract
The goal in computer vision systems is to analyze data collected from the environment and derive an interpretation to complete a specified task. Vision system tasks may be divided into data acquisition, low-level processing, representation, model construction, and matching subtasks. This paper presents a comprehensive survey of model-based vision systems using dense-range images. A comprehensive survey of the recent publications in each subtask pertaining to dense-range image object recognition is presented. [ABSTRACT FROM AUTHOR]
- Published
- 1993
- Full Text
- View/download PDF
49. Self-Stabilization.
- Author
-
Schneider, Marco
- Subjects
- *
FAULT-tolerant computing , *COMPUTER science , *STOCHASTIC convergence , *INTEGRATED circuit fault tolerance , *ERRORS , *RESEARCH , *TECHNOLOGY - Abstract
In 1973 Dijkstra introduced to computer science the notion of self-stabilization in the context of distributed systems. He defined a system as self-stabilizing when "regardless of its initial state, it is guaranteed to arrive at a legitimate state in a finite number of steps." A system which is not self-stabilizing may stay in an illegitimate state forever. Dijkstra's notion of self-stabilization, which originally had a very narrow scope of application, is proving to encompass a formal and unified approach to fault tolerance under a model of transient failures for distributed systems. In this paper we define self-stabilization, examine its significance in the context of fault tolerance, define the important research themes that have arisen from it, and discuss the relevant results. In addition to the issues arising from Dijkstra's original presentation as well as several related issues, we discuss methodologies for designing self-stabilizing systems, the role of compilers with respect to self-stabilization, and some of the factors that prevent self-stabilization. [ABSTRACT FROM AUTHOR]
- Published
- 1993
- Full Text
- View/download PDF
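Dijkstra's definition quoted in this abstract ("regardless of its initial state, it is guaranteed to arrive at a legitimate state in a finite number of steps") can be made concrete with his classic K-state token ring, where the legitimate states are those in which exactly one machine holds the privilege. The sketch below simulates that 1973/74 algorithm; it is an illustrative reconstruction, not code from Schneider's survey:

```python
# Dijkstra's K-state self-stabilizing token ring (illustrative sketch).
# n machines each hold a state in 0..K-1; from ANY initial configuration
# the ring converges to a state with exactly one privileged machine.
import random

def privileged(S, i):
    # Machine 0 is privileged when it equals its left neighbour (the last
    # machine); every other machine is privileged when it differs from it.
    if i == 0:
        return S[0] == S[-1]
    return S[i] != S[i - 1]

def step(S, K):
    # Fire one arbitrarily chosen privileged machine (one legal "move").
    movers = [i for i in range(len(S)) if privileged(S, i)]
    i = random.choice(movers)  # at least one machine is always privileged
    if i == 0:
        S[0] = (S[-1] + 1) % K
    else:
        S[i] = S[i - 1]

random.seed(1)
n, K = 5, 7                                  # K >= n guarantees stabilization
S = [random.randrange(K) for _ in range(n)]  # arbitrary, possibly illegitimate
for _ in range(100):                         # far more moves than needed for n=5
    step(S, K)
# Legitimate state reached: exactly one privilege circulates in the ring.
print(sum(privileged(S, i) for i in range(n)))  # 1
```

Once a legitimate state is reached it is closed under legal moves, so the single privilege simply circulates; this closure plus guaranteed convergence is exactly the fault-tolerance property the survey formalizes.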
50. Ten Mini-Languages: A Study of Topical Issues in Programming Languages.
- Author
-
Ledgard, Henry F.
- Subjects
- *
COMPUTER input design , *PROGRAMMING language semantics , *SYNTAX in programming languages , *COMPILERS (Computer programs) , *DATA structures , *COMPUTER science - Abstract
The proliferation of programming languages has raised many issues of language design, definition, and implementation. This paper presents a series of ten mini-languages, each of which exposes salient features found in existing programming languages. The value of the mini-languages lies in their brevity of description and the isolation of important linguistic features: in particular, the notions of assignment, transfer of control, functions, parameter passing, type checking, data structures, string manipulation, and input/output. The mini-languages may serve a variety of uses: notably, as a pedagogical tool for teaching programming languages, as a subject of study for the design of programming languages, and as a set of test cases for methods of language implementation or formal definition. [ABSTRACT FROM AUTHOR]
- Published
- 1971
- Full Text
- View/download PDF