4,878 results
Search Results
2. Computed Tomography of the Body : A Radiological and Clinical Approach
- Author
- Janet E. Husband and Ian Kelsey Fry
- Subjects
- Radiology, Computer science
- Published
- 1983
3. Fairness
- Author
- Nissim Francez
- Subjects
- Logic design, Software engineering, Computer science
- Abstract
The main purpose of this book is to bring together much of the research conducted in recent years in a subject I find both fascinating and important, namely fairness. Much of the reported research is still in the form of technical reports, theses and conference papers, and only a small part has already appeared in the formal scientific journal literature. Fairness is one of those concepts that can intuitively be explained very briefly, but bear a lot of consequences, both in theory and the practicality of programming languages. Scientists have traditionally been attracted to studying such concepts. However, a rigorous study of the concept needs a lot of detailed development, evoking much machinery of both mathematics and computer science. I am fully aware of the fact that this field of research still lacks maturity, as does the whole subject of theoretical studies of concurrency and nondeterminism. One symptom of this lack of maturity is the proliferation of models used by the research community to discuss these issues, a variety lacking the invariance property present, for example, in universal formalisms for sequential computing.
- Published
- 1986
4. Expert Systems in Auditing
- Author
- J C van Dijk, Paul Williams, and Kenneth A. Loparo
- Subjects
- Economic history, Accounting, Computer science, Artificial intelligence
- Abstract
This book provides an understanding of the concepts and objectives of expert systems. It is a practical guide, intended to help practitioners identify potential applications in their own practice and to understand the limitations of the technology. This should give the auditor a sound basis from which to direct, stimulate and control development efforts in his or her own practice. At the same time, it should give students in auditing a good grasp of the possibilities and limitations of the technology.
- Published
- 1990
5. Formal Models and Semantics
- Author
- Bozzano G Luisa
- Subjects
- Computer science
- Abstract
The second part of this Handbook presents a choice of material on the theory of automata and rewriting systems, the foundations of modern programming languages, logics for program specification and verification, and some chapters on the theoretic modelling of advanced information processing.
- Published
- 1990
6. Introduction To Theoretical Computer Science
- Author
- Xiwen Ma
- Subjects
- Computer science
- Abstract
The contents of this book are self-sufficient in the sense that no preliminary knowledge other than elementary set theory is needed and there are no complicated mathematical theorems in the book. A must for those entering the field.
- Published
- 1990
7. Mastering Business Microcomputing
- Author
- D.E. Avison
- Subjects
- Business information services, Computer science
- Abstract
Designed to give an overview of the business applications of microcomputers, this is an introduction for college and polytechnic students on computing and business courses. Managers of small or medium sized businesses or of departments in larger companies will also find the book helpful.
- Published
- 1990
8. Computer Epistemology
- Author
- Tibor Vamos
- Subjects
- Artificial intelligence, Computer science
- Abstract
This book is an essay on problems of epistemology (the theory of knowledge) relevant to computer science. It draws a continuous line from the earliest scientific approaches to epistemology, starting with the Greek Classics, to the recent practical and theoretical problems of computer modelling, and thereby to the appropriate application of computers to our present problems. Uncertainty, logic and language are the key issues along this road, leading to some new aspects of cognitive psychology and a unification of the different results into a modelling procedure. The book is not a textbook but a critical survey of usual and advertised methods, with an evaluation of them from the point of view of their applicability, reliability and limits. Probability, Bayesian, Dempster-Shafer, fuzzy and other approaches are treated in this way for uncertainty, as are different worlds' concepts, non-monotonic logic and other methods and views in logic. The emphasis in linguistics is put on the meta concept, and in cognitive applications on the pattern concept. Written mostly in an entertaining style, this book provides a more palatable reading of a profound subject.
- Published
- 1991
9. Sequence Analysis Primer
- Author
- Michael Gribskov and John Devereux
- Subjects
- Biotechnology, Computer science
- Abstract
Covers the basic computer analyses used for new DNA sequences and attempts to provide the researcher with the necessary background to understand and use these programs efficiently.
- Published
- 1991
10. Applications in Decision-aiding Software
- Author
- Stuart S. Nagel
- Subjects
- Computer science, Political science
- Abstract
Decision-aiding software is applied in this book to government, personal decisions, law, teaching, decision-analysis research, cross-national decision-making, business and politics.
- Published
- 1992
11. Computer Science and Operations Research: New Developments in Their Interfaces
- Author
- Osman Balci
- Subjects
- Operations research, Computer science, Electronic data processing
- Abstract
The interface of Operations Research and Computer Science - although elusive to a precise definition - has been a fertile area of both methodological and applied research. The papers in this book, written by experts in their respective fields, convey the current state-of-the-art in this interface across a broad spectrum of research domains which include optimization techniques, linear programming, interior point algorithms, networks, computer graphics in operations research, parallel algorithms and implementations, planning and scheduling, genetic algorithms, heuristic search techniques and data retrieval.
- Published
- 1992
12. Theoretical Studies in Computer Science
- Author
- Jeffrey D. Ullman
- Subjects
- Electronic data processing, Computer science
- Abstract
Theoretical Studies in Computer Science focuses on the field of theoretical computer science. This book discusses the context-free multi-languages, non-membership in certain families of context-free languages, and single tree grammars. The complexity of structural containment and equivalence, interface between language theory and database theory, and automata theory for database theoreticians are also deliberated. This text likewise covers the datalog linearization of chain queries, expressive power of query languages, and object identity and query equivalences. Other topics include the unified approach to data and meta-data modification for data/knowledge bases, polygon clipping algorithms, and convex polygon generator. This publication is intended for computer scientists and researchers interested in theoretical computer science.
- Published
- 1992
13. Advanced Topics in Shannon Sampling and Interpolation Theory
- Author
- Robert J. Marks II
- Subjects
- Chemistry--Mathematics, Engineering, Computer engineering, Computer science, Computer organization, Electronic data processing
- Abstract
Advanced Topics in Shannon Sampling and Interpolation Theory is the second volume of a textbook on signal analysis solely devoted to the topic of sampling and restoration of continuous time signals and images. Sampling and reconstruction are fundamental problems in any field that deals with real-time signals or images, including communication engineering, image processing, seismology, speech recognition, and digital signal processing. This second volume includes contributions from leading researchers in the field on such topics as Gabor's signal expansion, sampling in optical image formation, linear prediction theory, polar and spiral sampling theory, interpolation from nonuniform samples, an extension of Papoulis's generalized sampling expansion to higher dimensions, and applications of sampling theory to optics and to time-frequency representations. The exhaustive bibliography on Shannon sampling theory will make this an invaluable research tool as well as an excellent text for students planning further research in the field.
- Published
- 1993
14. Between Mind And Computer: Fuzzy Science And Engineering
- Author
- Pei Zhuang Wang and Kia Fock Loe
- Subjects
- Fuzzy sets, Fuzzy logic, Computer science, Electronic data processing
- Abstract
The “Fuzzy Explosion” emanating from Japan has compelled more people than ever to ponder the meaning and potential of fuzzy engineering. Scientists all over are now beginning to harness the power of fuzzy recognition and decision-making — reminiscent of the way the human mind works — in computer applications. In this book a blue-ribbon list of contributors discusses the latest developments in topics such as possibility logic programming, truth-valued flow inference, fuzzy neural-logic networks and default knowledge representation. This volume is the first in a series aiming to document advances in fuzzy set theory and its applications.
- Published
- 1993
15. Current Trends In Theoretical Computer Science: Essays And Tutorials
- Author
- Grzegorz Rozenberg, Arto Salomaa, Herbert Edelsbrunner, Hartmut Ehrig, Yuri Gurevich, and J Hartmanis
- Subjects
- Electronic data processing, Computer science
- Abstract
The book is a very up-to-date collection of articles in theoretical computer science, written by leading authorities in the field. The topics range from algorithms and complexity to algebraic specifications, and from formal languages and language-theoretic modeling to computational geometry. The material is based on columns and articles that have appeared in the EATCS Bulletin during the past two to three years. Although very recent research is discussed, the largely informal style of writing makes the book accessible to readers with little or no previous knowledge of the topics.
- Published
- 1993
16. Logic and Information Flow
- Author
- Jan van Eijck and Albert Visser
- Subjects
- Electronic data processing, Computer science, Natural language processing (Computer science), Logic, Symbolic and mathematical
- Abstract
The thirteen chapters written expressly for this book by logicians, theoretical computer scientists, philosophers, and semanticists address, from the perspective of mathematical logic, the problems of understanding and studying the flow of information through any information-processing system. The logic of information flow has applications in both computer science and natural language processing and is a growing area within mathematical and philosophical logic. Consequently, Logic and Information Flow will be of interest to theoretical computer scientists wanting information on up-to-date formalisms of dynamic logic, and their possible applications; logicians who wish to expand their discipline beyond the realm of sound reasoning in the narrow sense; and philosophers who are looking at the nature of information and action, and at the relation between those concepts. Foundations of Computing series
- Published
- 1994
17. Thinking Computers and Virtual Persons : Essays on the Intentionality of Machines
- Author
- Eric Dietrich
- Subjects
- Artificial intelligence, Computer science
- Abstract
Thinking Computers and Virtual Persons: Essays on the Intentionality of Machines explains how computations are meaningful and how computers can be cognitive agents like humans. This book focuses on the concept that cognition is computation. Organized into four parts encompassing 13 chapters, this book begins with an overview of the analogy between intentionality and phlogiston, the 17th-century principle of burning. This text then examines the objection to computationalism that it cannot prevent arbitrary attributions of content to the various data structures and representations involved in a computational process. Other chapters consider that the notion of original intentionality is incoherent. This book argues as well that the only way to build an intelligent machine is to build a neural network. The final chapter claims that an entire theoretical framework in cognitive psychology is incompatible with the view that human brains are computers of some sort. This book is a valuable resource for cognitive scientists.
- Published
- 1994
18. Advances in Computers
- Author
- Marvin Zelkowitz
- Subjects
- Computers, Electronic data processing, Computer science
- Abstract
Praise for the Series: 'Mandatory for academic libraries supporting computer science departments.' - CHOICE. Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
- Published
- 1995
19. Advances in Computers
- Author
- Marvin Zelkowitz
- Subjects
- Computers, Electronic digital computers, Computer science, Electronic data processing
- Abstract
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in computer hardware, software, theory, design, and applications. It has also provided contributors with a medium in which they can explore their subjects in greater depth and breadth than journal articles usually allow. As a result, many articles have become standard references that continue to be of significant, lasting value in this rapidly expanding field.
- Published
- 1995
20. Computing Perspectives
- Author
- Maurice V. Wilkes
- Subjects
- Microcomputers, Computers, Electronic data processing, Computer science, Computer engineering
- Abstract
In this insightful collection of essays, Maurice Wilkes shares his unique perspective on the development of computers and the current state of the art. These enlightening essays discuss the foundational ideas behind modern computing and provide a solid grounding for the appreciation of emerging computer technologies. Wilkes, one of the founders of computing, has made enormous contributions to the development of computers, including the design and construction of the EDSAC computer and early development of programming for a stored-program computer. He was responsible for the concept of microprogramming. Wilkes also wrote the first paper to appear on cache memories and was an early worker in the field of wide-bandwidth local area networks. In 1992 he was awarded the prestigious Kyoto Prize for Advanced Technology. These essays will be of interest to everyone involved with computers and how they arrived at their present state. Wilkes presents his perspectives with keen historical sensibility and engineering practicality. Readers are invited to consider these observations and form their own perspectives on the present state of the computer art.
- Published
- 1995
21. Discrete Structures, Logic, and Computability
- Author
- Hein, James L.
- Subjects
- Computer science, Data structures (Computer science), Logic, Symbolic and mathematical, Computable functions
- Published
- 1995
22. Advances in Computers
- Author
- Marvin Zelkowitz
- Subjects
- Computers, Electronic digital computers, Computer science, Electronic data processing
- Abstract
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in hardware and software and in computer theory, design, and applications. It has also provided contributors with a medium in which they can examine their subjects in greater depth and breadth than that allowed by standard journal articles. As a result, many articles have become standard references that continue to be of significant, lasting value despite the rapid growth taking place in the field.
- Published
- 1996
23. First-Order Logic and Automated Theorem Proving
- Author
- Melvin Fitting
- Subjects
- Logic, Symbolic and mathematical, Logic design, Computer science, Electronic data processing
- Abstract
There are many kinds of books on formal logic. Some have philosophers as their intended audience, some mathematicians, some computer scientists. Although there is a common core to all such books, they will be very different in emphasis, methods, and even appearance. This book is intended for computer scientists. But even this is not precise. Within computer science formal logic turns up in a number of areas, from program verification to logic programming to artificial intelligence. This book is intended for computer scientists interested in automated theorem proving in classical logic. To be more precise yet, it is essentially a theoretical treatment, not a how-to book, although how-to issues are not neglected. This does not mean, of course, that the book will be of no interest to philosophers or mathematicians. It does contain a thorough presentation of formal logic and many proof techniques, and as such it contains all the material one would expect to find in a course in formal logic covering completeness but not incompleteness issues. The first item to be addressed is: What are we talking about and why are we interested in it? We are primarily talking about truth as used in mathematical discourse, and our interest in it is, or should be, self-evident. Truth is a semantic concept, so we begin with models and their properties. These are used to define our subject.
- Published
- 1996
24. Numerical Bayesian Methods Applied to Signal Processing
- Author
- Joseph J.K. O Ruanaidh and William J. Fitzgerald
- Subjects
- Statistics, Computers, Computer science, Electronic data processing, Electronic digital computers
- Abstract
This book is concerned with the processing of signals that have been sampled and digitized. The fundamental theory behind Digital Signal Processing has been in existence for decades and has extensive applications to the fields of speech and data communications, biomedical engineering, acoustics, sonar, radar, seismology, oil exploration, instrumentation and audio signal processing, to name but a few [87]. The term 'Digital Signal Processing', in its broadest sense, could apply to any operation carried out on a finite set of measurements for whatever purpose. A book on signal processing would usually contain detailed descriptions of the standard mathematical machinery often used to describe signals. It would also motivate an approach to real world problems based on concepts and results developed in linear systems theory, which make use of some rather interesting properties of the time and frequency domain representations of signals. While this book assumes some familiarity with traditional methods, the emphasis is altogether quite different. The aim is to describe general methods for carrying out optimal signal processing.
- Published
- 1996
25. Advances in Computers
- Author
- Zelkowitz, Marvin V.
- Subjects
- Computers, Electronic data processing, Computer science
- Abstract
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in hardware and software and in computer theory, design, and applications. It has also provided contributors with a medium in which they can examine their subjects in greater depth and breadth than that allowed by standard journal articles. As a result, many articles have become standard references that continue to be of significant, lasting value despite the rapid growth taking place in the field.
- Published
- 1997
26. Great Ideas in Computer Science : A Gentle Introduction
- Author
- Alan W. Biermann
- Subjects
- Computer science
- Abstract
In Great Ideas in Computer Science: A Gentle Introduction, Alan Biermann presents the 'great ideas' of computer science that together comprise the heart of the field. He condenses a great deal of complex material into a manageable, accessible form. His treatment of programming, for example, presents only a few features of Pascal and restricts all programs to those constructions. Yet most of the important lessons in programming can be taught within these limitations. The student's knowledge of programming then provides the basis for understanding ideas in compilation, operating systems, complexity theory, noncomputability, and other topics. Whenever possible, the author uses common words instead of the specialized vocabulary that might confuse readers. Readers of the book will learn to write a variety of programs in Pascal, design switching circuits, study a variety of Von Neumann and parallel architectures, hand-simulate a computer, examine the mechanisms of an operating system, classify various computations as tractable or intractable, learn about noncomputability, and explore many of the important issues in artificial intelligence. This second edition has new chapters on simulation, operating systems, and networks. In addition, the author has upgraded many of the original chapters based on student and instructor comments, with a view toward greater simplicity and readability.
- Published
- 1997
27. Algorithms - ESA '98 : 6th Annual European Symposium, Venice, Italy, August 24-26, 1998, Proceedings
- Author
- Gianfranco Bilardi, Giuseppe F. Italiano, Andrea Pietracaprina, and Geppino Pucci
- Subjects
- Computer science, Algorithms, Computer science—Mathematics, Discrete mathematics, Computer networks, Artificial intelligence—Data processing, Probabilities
- Published
- 1998
28. Applications of Artificial Intelligence
- Author
- Zelkowitz, Marvin V.
- Subjects
- Computers, Electronic data processing, Computer science
- Abstract
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in hardware and software and in computer theory, design, and applications. It has also provided contributors with a medium in which they can examine their subjects in greater depth and breadth than that allowed by standard journal articles. As a result, many articles have become standard references that continue to be of significant, lasting value despite the rapid growth taking place in the field.Volume 47 contains seven chapters. The first four cover artificial intelligence, which is the use of technology to perform tasks generally assumed to require human thinking. These chapters present natural language processing, visualization, and self-replication as machine implementations of human activities. The remaining three chapters cover other recent advances that are important to the information processing field.
- Published
- 1998
29. Globalization, Growth and Marginalization
- Author
- A.S. Bhalla
- Subjects
- Development economics, Computer science, Economic development, International relations, Globalization
- Abstract
Globalization is defined in economic terms to mean freer flows of trade, foreign direct investment and finance, and liberalization of trade and investment policies. Impacts of globalization and information technology are examined in terms of growth and productivity, poverty and income distribution, and employment. Experiences of Africa, East and Southeast Asia, South Asia, and Latin America in the era of globalization are discussed. It is argued that benefits of freer trade and capital flows need to be managed carefully in order to minimise the costs and maximise gains.
- Published
- 1998
30. Transaction Management : Managing Complex Transactions and Sharing Distributed Databases
- Author
- D. Chorafas
- Subjects
- Management, International business enterprises, Computer science, Data structures (Computer science), Information theory
- Abstract
This book provides an essential update for experienced data processing professionals, transaction managers and database specialists who are seeking system solutions beyond the confines of traditional approaches. It provides practical advice on how to manage complex transactions and share distributed databases on client servers and the Internet. Based on extensive research in over 100 companies in the USA, Europe, Japan and the UK, topics covered include: the challenge of global transaction requirements within an expanding business perspective; how to handle long transactions and their constituent elements; possible benefits from object-oriented solutions; the contribution of knowledge engineering in transaction management; the Internet, the World Wide Web and transaction handling; systems software and transaction-processing monitors; OSF/1 and the Encina transaction monitor; active data transfers and remote procedure calls; serialization in a transaction environment; transaction locks, two-phase commit and deadlocks; improving transaction-oriented database management; and the successful development of an increasingly complex transaction environment.
- Published
- 1998
31. Advances in Computers
- Author
- Marvin Zelkowitz
- Subjects
- Computers, Electronic digital computers, Computer science, Electronic data processing
- Abstract
Since its first volume in 1960, Advances in Computers has presented detailed coverage of innovations in hardware and software and in computer theory, design, and applications. It has also provided contributors with a medium in which they can examine their subjects in greater depth and breadth than that allowed by standard journal articles. As a result, many articles have become standard references that continue to be of significant, lasting value despite the rapid growth taking place in the field.
- Published
- 1999
32. Complexity and Approximation : Combinatorial Optimization Problems and Their Approximability Properties
- Author
- Giorgio Ausiello, Pierluigi Crescenzi, Giorgio Gambosi, Viggo Kann, Alberto Marchetti-Spaccamela, and Marco Protasi
- Subjects
- Computational complexity, Computer science--Mathematics, Electronic data processing, Computer science, Computer software, Management information systems, Computer programs
- Abstract
In computer applications we are used to living with approximation. Various notions of approximation appear, in fact, in many circumstances. One notable example is the type of approximation that arises in numerical analysis or in computational geometry from the fact that we cannot perform computations with arbitrary precision and have to truncate the representation of real numbers. In other cases, we approximate complex mathematical objects by simpler ones: for example, we sometimes represent non-linear functions by means of piecewise linear ones. The need to solve difficult optimization problems is another reason that forces us to deal with approximation. In particular, when a problem is computationally hard (i.e., the only way we know to solve it is by making use of an algorithm that runs in exponential time), it may be practically unfeasible to try to compute the exact solution, because it might require months or years of machine time, even with the help of powerful parallel computers. In such cases, we may decide to restrict ourselves to computing a solution that, though not optimal, is nevertheless close to the optimum and may be determined in polynomial time. We call this type of solution an approximate solution and the corresponding algorithm a polynomial-time approximation algorithm. Most combinatorial optimization problems of great practical relevance are, indeed, computationally intractable in the above sense. In formal terms, they are classified as NP-hard optimization problems.
- Published
- 1999
33. How We Became Posthuman : Virtual Bodies in Cybernetics, Literature, and Informatics
- Author
- N. Katherine Hayles
- Subjects
- Electronic data processing, Cybernetics, Artificial intelligence, Computer science, Virtual reality in literature, Virtual reality
- Abstract
In this age of DNA computers and artificial intelligence, information is becoming disembodied even as the 'bodies' that once carried it vanish into virtuality. While some marvel at these changes, envisioning consciousness downloaded into a computer or humans 'beamed' Star Trek-style, others view them with horror, seeing monsters brooding in the machines. In How We Became Posthuman, N. Katherine Hayles separates hype from fact, investigating the fate of embodiment in an information age. Hayles relates three interwoven stories: how information lost its body, that is, how it came to be conceptualized as an entity separate from the material forms that carry it; the cultural and technological construction of the cyborg; and the dismantling of the liberal humanist 'subject' in cybernetic discourse, along with the emergence of the 'posthuman.' Ranging widely across the history of technology, cultural studies, and literary criticism, Hayles shows what had to be erased, forgotten, and elided to conceive of information as a disembodied entity. Thus she moves from the post-World War II Macy Conferences on cybernetics to the 1952 novel Limbo by cybernetics aficionado Bernard Wolfe; from the concept of self-making to Philip K. Dick's literary explorations of hallucination and reality; and from artificial life to postmodern novels exploring the implications of seeing humans as cybernetic systems. Although becoming posthuman can be nightmarish, Hayles shows how it can also be liberating. From the birth of cybernetics to artificial life, How We Became Posthuman provides an indispensable account of how we arrived in our virtual age, and of where we might go from here.
- Published
- 1999
34. Net Benefit : Guaranteed Electronic Markets: the Ultimate Potential of Online Trade
- Author
- W. Rowan
- Subjects
- International business enterprises, International economic relations, Computer science
- Abstract
This book explores the possible creation and impact of electronic markets underpinned by government. How far could electronic trade go? The author outlines a world in which open online marketplaces are routinely used to trade everything from office space to bicycle rental between individuals. Each transaction would be guaranteed by the system, not the reputation of the seller. Anyone could enter the market as an equal. The author argues that the electronic marketplaces of the future will have widespread and fundamental economic and social consequences. For more information about Guaranteed Electronic Markets visit the Gems Website at www.gems.org.uk
- Published
- 1999
35. Algorithm Theory - SWAT 2000 : 7th Scandinavian Workshop on Algorithm Theory Bergen, Norway, July 5-7, 2000 Proceedings
- Author
- Magnus M. Halldorsson
- Subjects
- Computer programming, Computer science, Algorithms, Artificial intelligence—Data processing, Computer networks, Discrete mathematics
- Published
- 2000
36. Computational Geometry : Algorithms and Applications
- Author
-
Mark de Berg, Marc van Krefeld, Mark Overmars, Otfried Cheong, Mark de Berg, Marc van Krefeld, Mark Overmars, and Otfried Cheong
- Subjects
- Geometry, Physics, Engineering, Computer graphics, Computer science, Geography, Computer software, Electronic data processing, Computer programs
- Abstract
Computational geometry emerged from the field of algorithm design and analysis in the late 1970s. It has grown into a recognized discipline with its own journals, conferences, and a large community of active researchers. The success of the field as a research discipline can be explained, on the one hand, by the beauty of the problems studied and the solutions obtained, and, on the other hand, by the many application domains (computer graphics, geographic information systems (GIS), robotics, and others) in which geometric algorithms play a fundamental role. For many geometric problems the early algorithmic solutions were either slow or difficult to understand and implement. In recent years a number of new algorithmic techniques have been developed that improved and simplified many of the previous approaches. In this textbook we have tried to make these modern algorithmic solutions accessible to a large audience. The book has been written as a textbook for a course in computational geometry, but it can also be used for self-study.
- Published
- 2000
37. Essentials of Electronic Testing for Digital, Memory and Mixed-Signal VLSI Circuits
- Author
-
M. Bushnell, Vishwani Agrawal, M. Bushnell, and Vishwani Agrawal
- Subjects
- Electronics, Computer science, Electronic circuits, Electrical engineering, Computer-aided engineering
- Abstract
Modern electronic testing has a forty-year history. Test professionals hold some fairly large conferences and numerous workshops, have a journal, and there are over one hundred books on testing. Still, a full course on testing is offered only at a few universities, mostly by professors who have a research interest in this area. Apparently, most professors would not have taken a course on electronic testing when they were students. Other than the computer engineering curriculum being too crowded, the major reason cited for the absence of a course on electronic testing is the lack of a suitable textbook. For VLSI the foundation was provided by semiconductor device technology, circuit design, and electronic testing. In a computer engineering curriculum, therefore, it is necessary that foundations be taught before applications. The field of VLSI has expanded to systems-on-a-chip, which include digital, memory, and mixed-signal subsystems. To our knowledge this is the first textbook to cover all three types of electronic circuits. We have written this textbook for an undergraduate “foundations” course on electronic testing. Obviously, it is too voluminous for a one-semester course and a teacher will have to select from the topics. We did not restrict such freedom because the selection may depend upon individual expertise and interests. Besides, there is merit in having a larger book that will retain its usefulness for the owner even after the completion of the course. With equal tenacity, we address the needs of three other groups of readers.
- Published
- 2000
38. Handbook of Logic in Computer Science: Volume 5. Algebraic and Logical Structures
- Author
-
S. Abramsky, Dov M. Gabbay, T. S. E. Maibaum, S. Abramsky, Dov M. Gabbay, and T. S. E. Maibaum
- Subjects
- Computer science, Logic, Symbolic and mathematical
- Abstract
This handbook volume covers fundamental topics of semantics in logic and computation. The chapters (some monographic in length) were written following years of co-ordination and follow a thematic point of view. The volume brings the reader up to front-line research, and is indispensable to any serious worker in the areas.
- Published
- 2000
39. Theoretical Computer Science: Exploring New Frontiers of Theoretical Informatics : International Conference IFIP TCS 2000 Sendai, Japan, August 17-19, 2000 Proceedings
- Author
-
Jan van Leeuwen, Osamu Watanabe, Masami Hagiya, Peter D. Mosses, Takayasu Ito, Jan van Leeuwen, Osamu Watanabe, Masami Hagiya, Peter D. Mosses, and Takayasu Ito
- Subjects
- Data structures (Computer science), Information theory, Compilers (Computer programs), Computer science, Artificial intelligence—Data processing, Computer graphics, Computer networks
- Abstract
In 1996 the International Federation for Information Processing (IFIP) established its first Technical Committee on foundations of computer science, TC1. The aim of IFIP TC1 is to support the development of theoretical computer science as a fundamental science and to promote the exploration of fundamental concepts, models, theories, and formal systems in order to understand laws, limits, and possibilities of information processing. This volume constitutes the proceedings of the first IFIP International Conference on Theoretical Computer Science (IFIP TCS 2000), Exploring New Frontiers of Theoretical Informatics, organized by IFIP TC1, held at Tohoku University, Sendai, Japan in August 2000. The IFIP TCS 2000 technical program consists of invited talks, contributed talks, and a panel discussion. In conjunction with this program there are two special open lectures by Professors Jan van Leeuwen and Peter D. Mosses. The decision to hold this conference was made by IFIP TC1 in August 1998, and since then IFIP TCS 2000 has benefited from the efforts of many people; in particular, the TC1 members and the members of the Steering Committee, the Program Committee, and the Organizing Committee of the conference. Our special thanks go to the Program Committee Co-chairs: Track (1): Jan van Leeuwen (U. Utrecht), Osamu Watanabe (Tokyo Inst. Tech.); Track (2): Masami Hagiya (U. Tokyo), Peter D. Mosses (U. Aarhus).
- Published
- 2000
40. Advances in Computers
- Author
-
Zelkowitz, Marvin V. and Zelkowitz, Marvin V.
- Subjects
- Computers, Electronic data processing, Computer science
- Abstract
Volume 55 covers some particularly hot topics. Linda Harasim writes about education and the Web in 'The Virtual University: A State of the Art.' She discusses the issues that will need to be addressed if online education is to live up to expectations. Neville Holmes covers a related subject in his chapter 'The Net, the Web, and the Children.' He argues that the Web is an evolutionary, rather than revolutionary, development and highlights the division between the rich and the poor within and across nations. Continuing the WWW theme, George Mihaila, Louiqa Raschid, and Maria-Esther Vidal look at the problems of using the Web and finding the information you want. Naren Ramakrishnan and Ananth Grama discuss another aspect of finding relevant information in large databases in their contribution. They discuss the algorithms, techniques, and methodologies for effective application of scientific data mining. Returning to the Web theme, Ross Anderson, Frank Stajano, and Jong-Hyeon Lee address the issue of security policies. Their survey of the most significant security policy models in the literature shows how security may mean different things in different contexts. John Savage, Alan Selman, and Carl Smith take a step back from the applications and address how theoretical computer science has had an impact on practical computing concepts. Finally, Yuan Taur takes a step even further back and discusses the development of the computer chip. Thus, Volume 55 takes us from the very fundamentals of computer science (the chip) right to the applications and user interface with the Web.
- Published
- 2001
41. Automata Theory and Its Applications
- Author
-
Bakhadyr Khoussainov, Anil Nerode, Bakhadyr Khoussainov, and Anil Nerode
- Subjects
- Computer science, Machine theory, Electronic data processing
- Abstract
The theory of finite automata on finite strings, infinite strings, and trees has had a distinguished history. First, automata were introduced to represent idealized switching circuits augmented by unit delays. This was the period of Shannon, McCulloch and Pitts, and Howard Aiken, ending about 1950. Then in the 1950s there was the work of Kleene on representable events, of Myhill and Nerode on finite coset congruence relations on strings, of Rabin and Scott on power set automata. In the 1960s, there was the work of Büchi on automata on infinite strings and the second order theory of one successor, then Rabin's 1968 result on automata on infinite trees and the second order theory of two successors. The latter was a mystery until the introduction of forgetful determinacy games by Gurevich and Harrington in 1982. Each of these developments has successful and prospective applications in computer science. They should all be part of every computer scientist's toolbox. Suppose that we take a computer scientist's point of view. One can think of finite automata as the mathematical representation of programs that run using fixed finite resources. Then Büchi's S1S can be thought of as a theory of programs which run forever (like operating systems or banking systems) and are deterministic. Finally, Rabin's S2S is a theory of programs which run forever and are nondeterministic. Indeed many questions of verification can be decided in the decidable theories of these automata.
- Published
- 2001
42. Current Trends In Theoretical Computer Science - Entering The 21st Century
- Author
-
Gheorghe Paun, Grzegorz Rozenberg, Arto Salomaa, Gheorghe Paun, Grzegorz Rozenberg, and Arto Salomaa
- Subjects
- Electronic data processing, Computer science
- Abstract
The scientific developments at the end of the past millennium were dominated by the huge increase and diversity of disciplines with the common label “computer science”. The theoretical foundations of such disciplines have become known as theoretical computer science. This book highlights some key issues of theoretical computer science as they seem to us now, at the beginning of the new millennium. The text is based on columns and tutorials published in the Bulletin of the European Association for Theoretical Computer Science in the period 1995-2000. The columnists themselves selected the material they wanted for the book, and the editors had a chance to update their work. Indeed, much of the material presented here appears in a form quite different from the original. Since the presentation of most of the articles is reader-friendly and does not presuppose much knowledge of the area, the book constitutes suitable supplementary reading material for various courses in computer science.
- Published
- 2001
43. Graph-Theoretic Concepts in Computer Science : 27th International Workshop, WG 2001 Boltenhagen, Germany, June 14-16, 2001 Proceedings
- Author
-
Andreas Brandstädt, Van Bang Le, Andreas Brandstädt, and Van Bang Le
- Subjects
- Computer programming, Computer science, Data structures (Computer science), Information theory, Algorithms, Computer science—Mathematics, Discrete mathematics, Artificial intelligence—Data processing
- Abstract
This book constitutes the thoroughly refereed post-workshop proceedings of the 27th International Workshop on Graph-Theoretic Concepts in Computer Science, WG 2001, held in Boltenhagen, Germany, in June 2001. The 27 revised full papers presented together with two invited contributions were carefully reviewed and selected from numerous submissions. The papers provide a wealth of new results for various classes of graphs, graph computations, graph algorithms, and graph-theoretical applications in various fields.
- Published
- 2001
44. High Performance Computing - HiPC 2001 : 8th International Conference, Hyderabad, India, December, 17-20, 2001. Proceedings
- Author
-
Burkhard Monien, Viktor K. Prasanna, Sriram Vajapeyam, Burkhard Monien, Viktor K. Prasanna, and Sriram Vajapeyam
- Subjects
- Microprocessors, Computer architecture, Software engineering, Computer engineering, Computer networks, Algorithms, Computer science, Computer science—Mathematics
- Published
- 2001
45. Higher National Computing
- Author
-
Hellingsworth, Bruce, Anderson, Howard, Hall, Patrick A. V., Hellingsworth, Bruce, Anderson, Howard, and Hall, Patrick A. V.
- Subjects
- Electronic data processing, Computer science--Problems, exercises, etc, Computer science
- Abstract
Full coverage of the core units of the new Higher National Certificate / Higher National Diploma in Computing from Edexcel. Written specifically to cover the latest syllabus requirements. Encourages independent study. Clear and straightforward text. Knowledge-check questions and activities throughout. Answers to numerical problems included. Higher National Computing is the only course book written specifically to cover the compulsory core units of the new BTEC Higher National scheme in Computing, including the four core units for HNC and the two additional core units required at HND. Students following.
- Published
- 2001
46. How to Design Programs : An Introduction to Programming and Computing
- Author
-
Felleisen, Matthias and Felleisen, Matthias
- Subjects
- Computer science, Computer programming, Electronic data processing
- Abstract
This introduction to programming places computer science in the core of a liberal arts education. Unlike other introductory books, it focuses on the program design process. This approach fosters a variety of skills—critical reading, analytical thinking, creative synthesis, and attention to detail—that are important for everyone, not just future computer programmers. The book exposes readers to two fundamentally new ideas. First, it presents program design guidelines that show the reader how to analyze a problem statement; how to formulate concise goals; how to make up examples; how to develop an outline of the solution, based on the analysis; how to finish the program; and how to test. Each step produces a well-defined intermediate product. Second, the book comes with a novel programming environment, the first one explicitly designed for beginners. The environment grows with the readers as they master the material in the book until it supports a full-fledged language for the whole spectrum of programming tasks. All the book's support materials are available for free on the Web. The Web site includes the environment, teacher guides, exercises for all levels, solutions, and additional projects. A second edition is now available.
- Published
- 2001
47. Multiset Processing : Mathematical, Computer Science, and Molecular Computing Points of View
- Author
-
Christian S. Calude, Gheorghe Paun, Grzegorz Rozenberg, Arto Salomaa, Christian S. Calude, Gheorghe Paun, Grzegorz Rozenberg, and Arto Salomaa
- Subjects
- Artificial intelligence—Data processing, Data structures (Computer science), Information theory, Mathematical logic, Computer science, Machine theory, Bioinformatics
- Published
- 2001
48. Self-Stabilizing Systems : 5th International Workshop, WSS 2001, Lisbon, Portugal, October 1-2, 2001 Proceedings
- Author
-
Ajoy K. Datta, Ted Herman, Ajoy K. Datta, and Ted Herman
- Subjects
- Computer networks, Software engineering, Telecommunication, Computers, Special purpose, Computer science, Algorithms
- Abstract
Physical systems which right themselves after being disturbed evoke our curiosity because we want to understand how such systems are able to react to unexpected stimuli. The mechanisms are all the more fascinating when systems are composed of small, simple units, and the ability of the system to self-stabilize emerges out of its components. Faithful computer simulations of such physical systems exhibit the self-stabilizing property, but in the realm of computing, particularly for distributed systems, we have greater ambition. We imagine that all manner of software, ranging from basic communication protocols to high-level applications, could enjoy self-corrective properties. Self-stabilizing software offers a unique, non-traditional approach to the crucial problem of transient fault tolerance. Many successful instances of modern fault-tolerant networks are based on principles of self-stabilization. Surprisingly, the most widely accepted technical definition of a self-stabilizing system does not refer to faults: it is the property that the system can be started in any initial state, possibly an “illegal state,” and yet the system guarantees to behave properly in finite time. This, and similar definitions, break many traditional approaches to program design, in which the programmer by habit makes assumptions about initial conditions. The composition of self-stabilizing systems, initially seen as a daunting challenge, has been transformed into a manageable task, thanks to an accumulation of discoveries by many investigators. Research on various topics in self-stabilization continues to supply new methods for constructing self-stabilizing systems, determines limits and applicability of the paradigm of self-stabilization, and connects self-stabilization to related areas of fault tolerance and distributed computing.
- Published
- 2001
49. Systems and Software Verification : Model-Checking Techniques and Tools
- Author
-
B. Berard, M. Bidoit, A. Finkel, F. Laroussinie, A. Petit, L. Petrucci, P. Schnoebelen, B. Berard, M. Bidoit, A. Finkel, F. Laroussinie, A. Petit, L. Petrucci, and P. Schnoebelen
- Subjects
- Information Systems, Electronic data processing, Computer science, Software engineering, Artificial intelligence
- Abstract
Model checking is a powerful approach for the formal verification of software. When applicable, it automatically provides complete proofs of correctness, or explains, via counter-examples, why a system is not correct. This book provides a basic introduction to this new technique. The first part describes in simple terms the theoretical basis of model checking: transition systems as a formal model of systems, temporal logic as a formal language for behavioral properties, and model-checking algorithms. The second part explains how to write rich and structured temporal logic specifications in practice, while the third part surveys some of the major model checkers available.
- Published
- 2001
50. Advances in Computers
- Author
-
Marvin Zelkowitz and Marvin Zelkowitz
- Subjects
- Computer science
- Abstract
Advances in Computers remains at the forefront in presenting the new developments in the ever-changing field of information technology. Since 1960, Advances in Computers has chronicled the constantly shifting theories and methods of this technology that greatly shape our lives today. Volume 56 presents eight chapters that describe how the software, hardware and applications of computers are changing the use of computers during the early part of the 21st century: Software Evolution and the Staged Model of the Software Lifecycle; Embedded Software; Empirical Studies of Quality Models in Object-Oriented Systems; Software Fault Prevention by Language Choice; Quantum Computing and Communication; Exception Handling; Breaking the Robustness Barrier: Recent Progress on the Design of Robust Multimodal Systems; Using Data Mining to Discover the Preferences of Computer Criminals. As the longest-running continuous serial on computers, Advances in Computers presents technologies that will affect the industry in the years to come, covering hot topics from fundamentals to applications. Additionally, readers benefit from contributions of both academic and industry professionals of the highest caliber.
- Published
- 2002