327 results for "COMPUTER system design & construction"
Search Results
2. Tracking and Controlling Microservice Dependencies.
- Author
-
GHIROTTI, SILVIA ESPARRACHIARI, REILLY, TANYA, and RENTZ, ASHLEIGH
- Subjects
- *
FUNCTIONAL dependencies , *COMPUTER system design & construction , *SOFTWARE architecture , *DIRECTED acyclic graphs - Abstract
The article reports on the problem of cyclic dependencies in system and software design. It mentions the possible lack of full integration that arises from microservice software, the use of a directed acyclic graph (DAG) to represent the dependencies, and ways to handle dependencies and prevent bootstrapping problems.
- Published
- 2018
- Full Text
- View/download PDF
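The dependency-management idea described in this abstract, representing services as nodes in a directed acyclic graph and rejecting dependency edges that would close a cycle, can be sketched as a standard graph-coloring cycle check. The service names and the `find_cycle` helper below are illustrative, not taken from the article:

```python
from collections import defaultdict

def find_cycle(dependencies):
    """Return one cyclic dependency chain as a list of services, or None.

    `dependencies` maps each service to the list of services it depends on.
    Uses iterative depth-first search with white/gray/black marking: a gray
    neighbor is an ancestor on the current DFS path, i.e. a back edge (cycle).
    """
    WHITE, GRAY, BLACK = 0, 1, 2
    color = defaultdict(lambda: WHITE)
    parent = {}

    for start in dependencies:
        if color[start] != WHITE:
            continue
        color[start] = GRAY
        stack = [(start, iter(dependencies.get(start, ())))]
        while stack:
            node, deps = stack[-1]
            for dep in deps:
                if color[dep] == GRAY:
                    # Back edge found: walk parents from `node` up to `dep`.
                    cycle = [node]
                    while cycle[-1] != dep:
                        cycle.append(parent[cycle[-1]])
                    cycle.reverse()
                    return cycle
                if color[dep] == WHITE:
                    parent[dep] = node
                    color[dep] = GRAY
                    stack.append((dep, iter(dependencies.get(dep, ()))))
                    break
            else:
                color[node] = BLACK   # all dependencies explored, no cycle here
                stack.pop()
    return None
```

A service graph that passes this check is a DAG and can be bootstrapped bottom-up, which is exactly the property the article's dependency-control scheme aims to preserve.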
3. The Natural Science of Computing: As unconventional computing comes of age, we believe a revolution is needed in our view of computer science.
- Author
-
Horsman, Dominic, Kendon, Vivien, and Stepney, Susan
- Subjects
- *
COMPUTER science research , *QUANTUM computing , *COMPUTER engineering , *NATURAL history , *COMPUTER system design & construction , *TWENTY-first century , *HISTORY - Abstract
The authors present their thoughts on the field of computer science following the advent of more unconventional computing such as quantum computing, DNA processors, and human social machines. They argue more emphasis should be placed on the physical computers, noting that one computational model no longer encompasses all the field's necessities. Comparing computer science to natural science, they suggest that new computers could inform new computational theories.
- Published
- 2017
- Full Text
- View/download PDF
4. Exponential Laws of Computing Growth.
- Author
-
DENNING, PETER J. and LEWIS, TED G.
- Subjects
- *
FORECASTING technological innovation , *TECHNOLOGICAL innovations & economics , *INTEGRATED circuit design , *COMPUTER system design & construction , *COMPUTERS & society , *ECONOMICS - Abstract
The article discusses the use of exponential laws in relation to computing growth, noting the significance of Moore's Law. Topics include the use of mathematical models to forecast and explain the growth of the computing ecosystem; the differences between chip, system, and community computing growth; and the effects of computing growth on economic growth.
- Published
- 2017
- Full Text
- View/download PDF
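The doubling laws the article discusses all share one functional form, N(t) = N0 · 2^(t/T), differing only in what is counted and in the doubling time T. A minimal sketch; the 24-month doubling period is the commonly quoted Moore's Law figure, and the starting transistor count is illustrative:

```python
def doublings(years, doubling_time_years):
    """Number of doublings that occur over `years`."""
    return years / doubling_time_years

def exponential_growth(initial, years, doubling_time_years):
    """Project a quantity that doubles every `doubling_time_years`."""
    return initial * 2 ** doublings(years, doubling_time_years)

# Illustrative: a chip starting at 2,300 transistors, doubling every
# 2 years, reaches 2,300 * 2**5 = 73,600 transistors after a decade.
```

Chip-, system-, and community-level growth would then differ only in the `doubling_time_years` argument, which is the distinction the article draws between the three levels.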
5. Teaching Undergraduates to Build Real Computer Systems.
- Author
-
CHUNFENG YUAN, XIAOPENG GAO, YU CHEN, and YUNGANG BAO
- Subjects
- *
COMPUTER system design & construction , *CHINESE students , *UNDERGRADUATES , *COMPUTER science education - Abstract
The article discusses efforts to teach undergraduate students in Chinese universities how to construct what the article refers to as real computer systems.
- Published
- 2021
- Full Text
- View/download PDF
6. Fail at Scale.
- Author
-
MAURER, BEN
- Subjects
- *
COMPUTER system design & construction , *COMPUTER system failure prevention , *HUMAN error , *EMPLOYEES' workload - Abstract
The article discusses failure in computer system engineering and examines Internet services company Facebook's approach to failure. Noted causes of failures include individual machine failures, workload changes, and human error. The author goes on to comment on the importance of learning from failure.
- Published
- 2015
- Full Text
- View/download PDF
7. Tom Kilburn: A Tale of Five Computers.
- Author
-
Anderson, David
- Subjects
- *
COMPUTER engineers , *COMPUTER engineering , *COMPUTER system design & construction , *COMPUTER science , *COMPUTER science education , *TWENTIETH century , *HISTORY - Abstract
The article profiles British computer engineer Tom Kilburn, focusing on his influence on historically significant computer systems throughout his life. Topics include his family background and service in the British Royal Air Force (RAF) Telecommunications Research Establishment (TRE), his work on the cathode ray tube (CRT) memory computers known as the Manchester Baby, Mark I, and Mercury and the Muse and Atlas transistor computers, and his role in establishing computer science in British higher education.
- Published
- 2014
- Full Text
- View/download PDF
8. The Balancing Act of Choosing Nonblocking Features.
- Author
-
Michael, Maged M.
- Subjects
- *
COMPUTER system design & construction , *BLOCKING oscillators , *BLOCKING sets , *LOOP tiling (Computer science) , *DATA structures , *DATA recovery - Abstract
This article discusses the design requirements of nonblocking systems. Topics covered include the levels of nonblocking progress, uses of nonblocking operations for systems or interthread interactions, and the trade-offs and compromises to be considered in selecting nonblocking operations' features. The key issues in selecting these features include the level of progress guarantee, the choice of data structures, safe memory reclamation issues, and the portability of the atomic operations required for nonblocking algorithms and methods.
- Published
- 2013
- Full Text
- View/download PDF
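The trade-offs the article discusses can be made concrete with the classic lock-free (Treiber) stack: every operation is a compare-and-swap retry loop, so some thread always makes progress even if others stall. This Python sketch simulates the hardware CAS with a lock purely for illustration; note that in a garbage-collected setting the safe-memory-reclamation issue the article mentions (the ABA problem) is largely sidestepped:

```python
import threading

class AtomicRef:
    """Stand-in for a hardware compare-and-swap word (illustrative only)."""
    def __init__(self, value=None):
        self._value = value
        self._lock = threading.Lock()

    def get(self):
        return self._value

    def compare_and_set(self, expected, new):
        """Atomically set to `new` iff the current value is `expected`."""
        with self._lock:
            if self._value is expected:
                self._value = new
                return True
            return False

class _Node:
    __slots__ = ("value", "next")
    def __init__(self, value, next=None):
        self.value = value
        self.next = next

class TreiberStack:
    """Lock-free stack: push/pop retry their CAS until it succeeds."""
    def __init__(self):
        self._top = AtomicRef(None)

    def push(self, value):
        node = _Node(value)
        while True:                      # retry loop: lock-free progress
            node.next = self._top.get()
            if self._top.compare_and_set(node.next, node):
                return

    def pop(self):
        while True:
            top = self._top.get()
            if top is None:
                return None              # empty stack
            if self._top.compare_and_set(top, top.next):
                return top.value
```

The retry loop illustrates the "lock-free" level of progress guarantee; wait-free and obstruction-free designs, also covered by the article, strengthen or weaken that guarantee respectively.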
9. EVALUATING SYSTEM ACCESSIBILITY USING AN EXPERIMENTAL PROTOCOL BASED ON USABILITY.
- Author
-
Oliveira Lima, Ana Carolina, de Fátima Queiroz Vieira, Maria, da Silva Ferreira, Ronaldo, Aguiar, Yuska P. C., Pereira Bastos, Moisés, and Maciel Lopes Junior, Sandro Laerth
- Subjects
USER-centered system design ,PRODUCT quality ,HUMAN-computer interaction ,COMPUTER system design & construction ,USER interfaces ,ASSISTIVE computer technology - Abstract
This article proposes a systematic approach to assessing product accessibility, adapting an experimental protocol originally designed to evaluate product usability. The adapted protocol focuses on products and systems for visually impaired users. The study investigates the adequacy of assistive technology for its target users, regardless of their gender, age, or previous experience with the technology. The tasks performed by a community of 30 users were categorized as activities of entertainment, learning, and social inclusion. The data obtained from the experiment carried out with the protocol enabled the testing of a set of assumptions about the protocol's usage. [ABSTRACT FROM AUTHOR]
- Published
- 2018
10. The Machine That Would Predict the Future.
- Author
-
Weinberger, David
- Subjects
- *
PREDICTION models , *COMPUTER system design & construction , *FORECASTING , *SIMULATION methods & models , *COMPUTER algorithms , *DATA packeting - Abstract
The article discusses the development of a computing system by Dirk Helbing and colleagues at the Swiss Federal Institute of Technology in Zurich, called the Living Earth Simulator, intended to predict the future. The Simulator project, formerly known as the FuturICT Knowledge Accelerator and Crisis-Relief System, would model global-scale systems, including economies, governments, and cultural trends, using torrential data streams and algorithms to solve some of the world's most challenging and complex problems. Topics include an overview of how the Simulator could provide answers to difficult questions, such as the global impact of a decision by Greece's government to drop the euro as its currency, and the challenges that such a comprehensive system would have to overcome. INSET: Disease Follows the Money.
- Published
- 2011
- Full Text
- View/download PDF
11. THINKING MACHINE.
- Author
-
Fox, Douglas
- Subjects
- *
COMPUTER system design & construction , *BRAIN research , *COMPUTER research , *CHAOS theory , *SUPERCOMPUTER design & construction - Abstract
The article examines how technological innovations within the field of computing may depend upon understanding the chemistry and physiology of the human brain. Stanford University scientist Kwabena Boahen is attempting to develop a computer that is based on organized chaos representative of the brain. He has developed complex silicon wafers for a supercomputer known as Neurogrid. Research conducted by Australian National University neuroscientist Simon Laughlin is discussed.
- Published
- 2009
12. Reflecting Human Values in the Digital Age.
- Author
-
SELLEN, ABIGAIL, ROGERS, YVONNE, HARPER, RICHARD, and RODDEN, TOM
- Subjects
- *
HUMAN-computer interaction , *COMPUTER system design & construction , *INTERACTIVE computer systems , *USER-centered system design , *USER interfaces , *COMPUTER science , *CONFERENCES & conventions - Abstract
The article discusses the field of human-computer interaction (HCI), exploring how the HCI field will manage to keep human values at the core of HCI. The author questions the nature of HCI's goals, how those in the HCI field should do their work, and whether HCI methods remain relevant. Topics include large changes in the sociotechnical landscape, computer systems that intrude into human lives, and the multidisciplinary nature of HCI. Also discussed are HCI techniques derived from cognitive psychology and human-factors engineering. INSET: Questions of Broader Impact.
- Published
- 2009
- Full Text
- View/download PDF
13. The computer that behaves like the weather.
- Author
-
Graham-Rowe, Duncan
- Subjects
- *
ELECTRONIC circuits , *CHAOS theory , *COMPUTER system design & construction , *COMPUTERS - Abstract
The article focuses on the development of the so-called chaotic computer processor. It recounts the studies and developments of chaotic computers by William Ditto, a physicist at the University of Florida in Gainesville, Florida, and his colleagues. It explores the mechanism behind the computer's complex circuits and chaotic logic gates, as well as the possibility of its use in space.
- Published
- 2008
14. The Complexity Cross—Implications for Practice.
- Author
-
Schneberger, Scott L. and McLean, Ephraim R.
- Subjects
- *
DESIGN information storage & retrieval systems , *SYSTEMS design , *COMPLEXITY (Philosophy) , *SIMPLICITY (Philosophy) , *COMPUTER system design & construction , *COMPUTER software - Abstract
The article explores the concepts of simplicity and complexity in terms of information systems. Emphasis is given to ways systems designers and managers moderate system complexity and the role of the complexity cross in information systems design. Other topics include software maintenance, the variety of components, and distributed computer systems.
- Published
- 2003
- Full Text
- View/download PDF
15. Design of DNA-based innovative computing system of digital comparison.
- Author
-
Zhou, Chunyang, Geng, Hongmei, and Guo, Chunlei
- Subjects
COMPUTER system design & construction ,MOLECULAR computers ,LOGIC circuits ,COMPARATOR circuits ,GRAPHENE oxide ,DNA probes ,SINGLE-stranded DNA ,BIOMOLECULAR electronics - Abstract
Despite great potential and extensive interest in developing biomolecule-based computing, the development of even basic molecular logic gates is still in its infancy. The digital comparator (DC) is a basic unit in traditional electronic computers, but it is difficult to construct a DC system that achieves large-scale integration. Here, we construct, for the first time, a novel logic computing system of DCs that can compare whether two or more numbers are equal. Our approach takes advantage of the facile preparation and unique properties of graphene oxide (GO) and DNA. The DC system reported in this work is developed through DNA hybridization and the effective combination of GO and single-stranded DNA, which serves as the reacting platform. On the basis of this platform and reaction principle, we have developed 2-input, 3-input, and 4-input DCs to realize the comparison of two or more binary numbers. We predict that such a state-of-the-art logic system can function with large-scale input signals, providing a new direction toward prototypical DNA-based logic operations and promoting the development of advanced logic computing. Statement of Significance: The overarching objective of this paper is to explore the construction of a novel DNA computing system of digital comparators driven by the interaction of DNA and graphene oxide (GO). GO efficiently binds the dye-labeled, single-stranded DNA probe and quenches its fluorescence. When the target appears, specific binding between the single-stranded probe and its target occurs, changing the probe's conformation and its relationship with GO and restoring the fluorescence of the dye. We have developed 2-input, 3-input, and 4-input digital comparator circuits, which are expected to realize the comparison of large-scale input signals and can avoid the design complexity and manufacturing cost of integrated circuits in traditional computing. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
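Chemistry aside, the n-input digital comparators described in this abstract compute the same Boolean function as a conventional equality comparator: bitwise XNOR of each operand pair, ANDed across bit positions and operands. A hypothetical software analogue (the bit width and example values are illustrative):

```python
def xnor(a, b):
    """XNOR of two bits: 1 iff they are equal."""
    return 1 - (a ^ b)

def equal_bits(x, y, width):
    """Equality comparator: AND of per-position XNORs over `width` bits."""
    return all(xnor((x >> i) & 1, (y >> i) & 1) for i in range(width))

def multi_input_comparator(inputs, width=4):
    """n-input DC: output 1 iff all binary inputs are equal."""
    first = inputs[0]
    return int(all(equal_bits(first, x, width) for x in inputs[1:]))
```

In the paper's system the same truth table is realized by fluorescence restoration rather than logic gates, but the input/output behavior of the 2-, 3-, and 4-input DCs matches this function.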
16. Discrete ripplet-II transform and modified PSO based improved evolutionary extreme learning machine for pathological brain detection.
- Author
-
Nayak, Deepak Ranjan, Dash, Ratnakar, and Majhi, Banshidhar
- Subjects
- *
COMPUTER-aided design , *COMPUTER system design & construction , *MAGNETIC resonance imaging , *ARTIFICIAL neural networks , *SUPPORT vector machines - Abstract
Recently there have been remarkable advances in computer-aided diagnosis (CAD) system development for detection of the pathological brain through MR images. Feature extractors like the wavelet and its variants, and classifiers like the feed-forward neural network (FNN) and support vector machine (SVM), are very often used in these systems despite the fact that they suffer from many limitations. This paper presents an efficient and improved pathological brain detection system (PBDS) that overcomes the problems faced by other PBDSs in the recent literature. First, we support the use of contrast-limited adaptive histogram equalization (CLAHE) to enhance the quality of the input MR images. Second, we use the discrete ripplet-II transform (DR2T) with degree 2 as the feature extractor. Third, in order to reduce the huge number of coefficients obtained from DR2T, we employ a PCA+LDA approach. Finally, an improved hybrid learning algorithm called MPSO-ELM has been proposed that combines modified particle swarm optimization (MPSO) and the extreme learning machine (ELM) for classifying MR images as pathological or healthy. In MPSO-ELM, MPSO is utilized to optimize the hidden node parameters (input weights and hidden biases) of single-hidden-layer feedforward neural networks (SLFN), and the output weights are determined analytically. The proposed method is contrasted with current state-of-the-art methods on three benchmark datasets. Experimental results indicate that our proposed scheme brings potential improvements in terms of classification accuracy and number of features. Additionally, the proposed MPSO-ELM algorithm achieves higher accuracy and a more compact network architecture compared to the conventional ELM and BPNN classifiers. [ABSTRACT FROM AUTHOR]
- Published
- 2018
- Full Text
- View/download PDF
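For context, the extreme learning machine that this abstract builds on is a single-hidden-layer network whose input weights are drawn randomly and whose output weights are solved analytically by least squares. A minimal sketch with illustrative data; this is the standard ELM, not the paper's full MPSO-ELM pipeline:

```python
import numpy as np

def elm_train(X, y, n_hidden, seed=0):
    """Fit an ELM: random hidden layer, analytic output weights."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input weights
    b = rng.normal(size=n_hidden)                 # random hidden biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta = np.linalg.pinv(H) @ y                  # least-squares output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta
```

MPSO-ELM, as described in the abstract, replaces the purely random draw of `W` and `b` with a particle-swarm search while keeping the analytic solve for `beta`.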
17. LACross: Learning-Based Analytical Cross-Platform Performance and Power Prediction.
- Author
-
Zheng, Xinnian, John, Lizy, and Gerstlauer, Andreas
- Subjects
- *
MACHINE learning , *COMPUTER system design & construction , *PERFORMANCE evaluation , *COMPUTER software development , *LOGICAL prediction - Abstract
Fast and accurate performance and power prediction is a key challenge in pre-silicon design evaluations during the early phases of hardware and software co-development. Performance evaluation using full-system simulation is prohibitively slow, especially with real world applications. By contrast, analytical models are not sufficiently accurate or still require target-specific execution statistics that may be slow or difficult to obtain. In this paper, we present LACross, a learning-based cross-platform prediction technique aimed at predicting the time-varying performance and power of a benchmark on a target platform using hardware counter statistics obtained while running natively on a host platform. We employ a fine-grained phase-based approach, where the learning algorithm synthesizes analytical proxy models that predict the performance and power of the workload in each program phase from performance statistics obtained on the host. Our learning approach relies on a one-time training phase using a target reference model or real hardware. We train our models on less than 160 programs from the ACM ICPC database, and demonstrate prediction accuracy and speed on 35 programs from SPEC CPU2006, MiBench and SD-VBS benchmark suites. Results show that with careful choice of phase granularity, we can achieve on average over 97% performance and power prediction accuracy at simulation speeds of over 500 MIPS. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
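The phase-level proxy models described in this abstract can be pictured as small regressions from host hardware-counter statistics to a target metric, one model per program phase. A toy sketch with synthetic linear data; the actual LACross feature set and learning algorithm are not reproduced here:

```python
import numpy as np

def fit_proxy(host_counters, target_metric):
    """Least-squares linear proxy: target ~ [counters, 1] @ w."""
    X = np.column_stack([host_counters, np.ones(len(host_counters))])
    w, *_ = np.linalg.lstsq(X, target_metric, rcond=None)
    return w

def predict_proxy(w, counters):
    """Apply a fitted proxy model to one phase's counter vector."""
    return np.append(counters, 1.0) @ w
```

A per-phase dictionary of such models, trained once against a reference target, would then let native host runs stand in for slow full-system simulation, which is the workflow the abstract describes.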
18. A New Mobile Agent-based Middleware System Design for Wireless Sensor Network.
- Author
-
Yuechun Wang, Ka Lok Man, Guan, Steven, and Danny Hughes
- Subjects
MOBILE agent systems ,MIDDLEWARE ,WIRELESS sensor networks ,COMPUTER system design & construction ,PROGRAMMING languages - Abstract
Wireless sensor networks (WSNs) play a crucial role in daily life because of their distributed sensing ability, combined with wireless communication techniques and self-organising deployment approaches. To match diversified, dynamic sensing applications and highly heterogeneous sensor platforms, there is a pressing need for a low-budget but highly efficient WSN middleware; such a middleware also increases efficacy and return on investment. This paper presents the design of a novel mobile agent-based middleware that addresses the demanding requirements of WSN middleware. The proposed middleware accounts for the resource limitations common on ordinary sensor nodes, while offering both platform independence and programming-language independence. In addition, a mobile agent-based system can reduce network payload and dynamically adapt to environmental changes; the middleware can therefore perceive changes in its operating environment and respond to them automatically. [ABSTRACT FROM AUTHOR]
- Published
- 2017
19. The Case for Explicit Ethical Agents.
- Author
-
Scheutz, Matthias
- Subjects
ARTIFICIAL intelligence & ethics ,ARTIFICIAL intelligence & society ,TECHNOLOGY & society ,COMPUTER system design & construction ,ROBOTS & society ,DRIVERLESS cars - Abstract
Morality is a fundamentally human trait that permeates all levels of human society, from basic etiquette and the normative expectations of social groups to formalized legal principles upheld by societies. Hence, future interactive AI systems, in particular cognitive systems on robots deployed in human settings, will have to meet human normative expectations, for otherwise these systems risk causing harm. While interest in machine ethics has increased rapidly in recent years, there are only very few current efforts in the cognitive systems community to investigate moral and ethical reasoning. And there is currently no cognitive architecture that has even rudimentary moral or ethical competence, that is, the ability to judge situations based on moral principles such as norms and values and to make morally and ethically sound decisions. We hence argue for the urgent need to instill moral and ethical competence in all cognitive systems intended to be employed in human social contexts. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
20. Analogy and Qualitative Representations in the Companion Cognitive Architecture.
- Author
-
Forbus, Kenneth D. and Hinrichs, Thomas
- Subjects
COGNITION research ,ARTIFICIAL intelligence ,COMPUTER system design & construction ,QUALITATIVE research ,META-analysis - Abstract
The Companion cognitive architecture is aimed at reaching human-level AI by creating software social organisms -- systems that interact with people using natural modalities, working and learning over extended periods of time as collaborators rather than tools. Our two central hypotheses about how to achieve this are (1) analogical reasoning and learning are central to cognition, and (2) qualitative representations provide a level of description that facilitates reasoning, learning, and communication. This article discusses the evidence we have gathered supporting these hypotheses from our experiments with the Companion architecture. Although we are far from our ultimate goals, these experiments provide strong evidence for the utility of analogy and qualitative representation across a range of tasks. We also discuss three lessons learned and highlight three important open problems for cognitive systems research more broadly. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
21. Statement of William W. Beach Commissioner Bureau of Labor Statistics Friday, August 2, 2019.
- Subjects
STATISTICS on the working class ,UNEMPLOYMENT statistics ,PAYROLLS ,COMPUTER system design & construction ,MEDICAL care - Published
- 2019
22. Use of the Concept of Transparency in the Design of Hierarchically Structured Systems.
- Author
-
Parnas, D.L. and Siewiorek, D.P.
- Subjects
- *
STRUCTURED programming , *COMPUTER system design & construction , *TRANSPARENCY (Optics) , *VIRTUAL machine systems , *MARKOV processes , *SYNCHRONIZATION , *SOFTWARE engineering , *COMPUTER programming , *COMPUTER software - Abstract
Reports on the design of hierarchically structured programming systems based on the concept of transparency. Use of the 'outside in' approach in determining solutions to software design problems; Development of a method for evaluating the cost of requiring programmers to work with an abstraction of a real machine; Illustration of the method through examples from hardware and software.
- Published
- 1975
- Full Text
- View/download PDF
23. Fault Injection Acceleration by Simultaneous Injection of Non-interacting Faults.
- Author
-
Ebrahimi, Mojtaba, Moshrefpour, Mohammad Hadi, Golanbari, Mohammad Saber, and Tahoori, Mehdi B.
- Subjects
COMPUTER system design & construction ,ERROR detection (Information theory) ,COMPUTER system failure prevention ,MICROPROCESSORS ,SOFTWARE reliability - Abstract
Fault injection is the de facto standard for evaluating the sensitivity of digital systems to transient errors. Due to various masking effects, only a very small portion of injected faults lead to system-level failures, and hence very many faults have to be injected to achieve statistically meaningful results. At the same time, since the majority of injected faults will be masked, many simulation cycles are wasted tracking each injected fault separately. In this paper, we propose an opportunistic acceleration technique that evaluates the impact of multiple non-interacting faults in one workload execution. When no failure is observed, this technique skips the evaluation of the individual faults, which leads to a significant speedup. Experimental results on the Leon3 processor show that the proposed technique shortens fault injection runtime by two orders of magnitude. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
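The acceleration idea in this abstract, injecting several faults that cannot interact into a single run and falling back to per-fault runs only when that run fails, can be sketched greedily. The `interacts` predicate below stands in for the paper's non-interaction analysis, which is not specified here:

```python
def group_non_interacting(faults, interacts):
    """Greedily pack faults into groups with no pairwise interaction."""
    groups = []
    for fault in faults:
        for group in groups:
            if not any(interacts(fault, other) for other in group):
                group.append(fault)
                break
        else:
            groups.append([fault])
    return groups

def accelerated_campaign(faults, interacts, run):
    """Run a fault-injection campaign; `run(faults)` is True iff it fails.

    A group that finishes cleanly clears every member in one execution;
    only failing groups are re-run fault by fault.
    """
    outcome = {}
    for group in group_non_interacting(faults, interacts):
        if run(group):
            for fault in group:          # fall back to individual runs
                outcome[fault] = run([fault])
        else:
            for fault in group:          # one clean run clears the group
                outcome[fault] = False
    return outcome
```

Since most injected faults are masked, most groups finish cleanly and the number of executions shrinks toward the number of groups, which is where the reported speedup comes from.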
24. Explaining Engineered Computing Systems' Behaviour: the Role of Abstraction and Idealization.
- Author
-
Angius, Nicola and Tamburrini, Guglielmo
- Subjects
- *
COMPUTER system design & construction , *ABSTRACTION (Computer science) , *LOGIC circuits - Abstract
This paper addresses the methodological problem of analysing what it is to explain observed behaviours of engineered computing systems (BECS), focusing on the crucial role that abstraction and idealization play in explanations of both correct and incorrect BECS. First, it is argued that an understanding of explanatory requests about observed miscomputations crucially involves reference to the rich background afforded by hierarchies of functional specifications. Second, many explanations concerning incorrect BECS are found to abstract away (and profitably so on account of both relevance and intelligibility of the explanans) from descriptions of physical components and processes of computing systems that one finds below the logic circuit and gate layer of functional specification hierarchies. Third, model-based explanations of both correct and incorrect BECS that are provided in the framework of formal verification methods often involve idealizations. Moreover, a distinction between restrictive and permissive idealizations is introduced and their roles in BECS explanations are analysed. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
25. Futures engineering in complex systems.
- Author
-
Bashiri, Hassan, Nazemi, Amir, and Mobinidehkordi, Ali
- Subjects
- *
LOGICAL prediction , *COMPUTATIONAL complexity , *COMPUTER system design & construction , *COMPUTER engineering , *FUTURES studies - Abstract
Purpose: This paper applies complexity theory to futures studies and addresses the challenges of prediction when the system under study is complex. The purpose of the research is to design a framework for engineering futures in complex systems whose components are diverse and interrelated, and whose relations cannot be interpreted in cause-and-effect terms. Design/methodology/approach: First, the authors shaped a conceptual framework based on engineering, complexity theory, and uncertainty. To extract the tacit knowledge of experts, an online questionnaire was developed. To validate the proposed framework, a workshop method was adopted with NetLogo simulation. Findings: Participants' opinions, collected through a quantitative questionnaire at the workshop, show that the framework helps in understanding and shaping scenarios. Harnessing complexity in developing futures was the main objective of this paper, realized through the proposed framework and the experience gained from the workshop. Originality/value: Iterative processes are very important for harnessing complexity in systems with uncertainty. The novelty of the research lies in combining engineering achievements in computation, simulation, and tool application with futures studies methods. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
26. Making Transmission Models Accessible to End-Users: The Example of TRANSFIL.
- Author
-
Irvine, Michael A. and Hollingsworth, T. Deirdre
- Subjects
- *
USER interfaces , *FILARIASIS , *COMMUNICABLE diseases , *MATHEMATICAL models , *END users (Information technology) , *COMPUTER system design & construction , *JAVASCRIPT programming language , *HTML (Document markup language) , *CASCADING style sheets , *INFECTIOUS disease transmission - Abstract
The article discusses the newly developed online web interface to make lymphatic filariasis transmission model TRANSFIL accessible to end-users. Topics include the challenges facing the development of user-friendly interfaces to complex transmission models, the aims of the interface, and the technical advancements used in the model and interface such as JavaScript, HTML5 and CSS3.
- Published
- 2017
- Full Text
- View/download PDF
27. Leaky Buffer: A Novel Abstraction for Relieving Memory Pressure from Cluster Data Processing Frameworks.
- Author
-
Liu, Zhaolei and Ng, T. S. Eugene
- Subjects
- *
BUFFER storage (Computer science) , *COMPUTER storage devices , *INFORMATION storage & retrieval systems , *COMPUTER system design & construction , *HASHING , *EXPERIMENTS - Abstract
The shift to the in-memory data processing paradigm has had a major influence on the development of cluster data processing frameworks. Numerous frameworks from industry, the open source community, and academia are adopting the in-memory paradigm to achieve functionality and performance breakthroughs. However, despite the advantages of these in-memory frameworks, in practice they are susceptible to memory-pressure-related performance collapse and failures. The contributions of this paper are two-fold. First, we conduct a detailed diagnosis of the memory pressure problem and identify three preconditions for the performance collapse. These preconditions not only explain the problem but also shed light on possible solution strategies. Second, we propose a novel programming abstraction called the leaky buffer that eliminates one of the preconditions, thereby addressing the underlying problem. We have implemented a leaky buffer enabled hashtable in Spark, and we believe it can also substitute for hashtables that perform similar hash aggregation operations in other programs or data processing frameworks. Experiments on a range of memory-intensive aggregation operations show that the leaky buffer abstraction can drastically reduce the occurrence of memory-related failures, improve performance by up to 507 percent, and reduce memory usage by up to 87.5 percent. [ABSTRACT FROM PUBLISHER]
- Published
- 2017
- Full Text
- View/download PDF
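The abstraction's core idea, buffering records only up to a fixed capacity and then "leaking" them to an aggregation step so in-memory state stays bounded, can be sketched as follows. This is a hypothetical illustration of the concept, not Spark's actual implementation:

```python
class LeakyBuffer:
    """Bounded staging buffer that drains itself when full."""
    def __init__(self, capacity, drain):
        self.capacity = capacity
        self.drain = drain          # callback that consumes one batch
        self.items = []

    def add(self, item):
        self.items.append(item)
        if len(self.items) >= self.capacity:
            self.flush()            # leak before memory pressure builds

    def flush(self):
        if self.items:
            self.drain(self.items)
            self.items = []

# Hash-aggregation example: the drain folds each small batch into
# running counts, so at most `capacity` raw records are held at once.
counts = {}

def drain(batch):
    for key in batch:
        counts[key] = counts.get(key, 0) + 1

buffer = LeakyBuffer(capacity=3, drain=drain)
for word in "a b a c a b".split():
    buffer.add(word)
buffer.flush()                      # drain the final partial batch
```

The capacity bound is what removes the precondition the paper targets: raw input never accumulates without limit before aggregation.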
28. Hefestos: an intelligent system applied to ubiquitous accessibility.
- Author
-
Tavares, João, Barbosa, Jorge, Cardoso, Ismael, Costa, Cristiano, Yamin, Adenauer, and Real, Rodrigo
- Subjects
COMPUTER system design & construction ,UBIQUITOUS computing ,CARE of people with disabilities ,EDUCATION of people with disabilities ,SERVICES for people with disabilities - Abstract
This article proposes Hefestos, an intelligent system applied to ubiquitous accessibility. This model uses ubiquitous computing concepts to manage accessibility resources for people with disabilities. Among the concepts employed, context awareness, user profiles, and trails management can be highlighted. The paper proposes an ontology for accessibility and delineates scenarios of its application in the everyday life of people with disabilities. Moreover, the implementation of a smart wheelchair prototype and its application in a practical experiment is described. Ten users with a range of disability degrees tried the system and filled out a survey based on the technology acceptance model. This experiment demonstrated the main functionalities and the acceptance of the system. The results showed 96% acceptance regarding perceived ease of use and 98% in perceived usefulness. These results were encouraging and show the potential for implementing Hefestos in real-life situations. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
29. A review on objective measurement of usage in technology acceptance studies.
- Author
-
Walldén, Sari, Mäkinen, Erkki, and Raisamo, Roope
- Subjects
TECHNOLOGY Acceptance Model ,COMPUTER user attitudes ,COMPUTER system design & construction ,THEORY of reasoned action ,PERCEIVED benefit - Abstract
This paper reviews objective measurement of usage in user acceptance studies using the technology acceptance model (TAM). The use of objective measurement is quite uncommon in TAM. In addition to its low frequency, another striking phenomenon is the way objective measurement is used. Namely, only a minor potential of the information available is typically used, and, for example, the temporal aspect (changes in time) is almost always neglected. The paper describes the TAM studies where objective measurement of usage has been utilized and ponders the way objective measurements are taken. The ultimate goal of the paper is to improve objective measures used in TAM studies. To this end, several suggestions are given. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
30. Simulating Reconfigurable Multiprocessor Systems-on-Chip with MPSoCSim.
- Author
-
WEHNER, PHILIPP, RETTKOWSKI, JENS, KALB, TOBIAS, and GÖHRINGER, DIANA
- Subjects
ADAPTIVE computing systems ,COMPUTER simulation ,MULTIPROCESSORS ,SYSTEMS on a chip ,COMPUTER system design & construction ,ROUTING algorithms - Abstract
Upcoming reconfigurable Multiprocessor Systems-on-Chip (MPSoCs) present new challenges for the design and early estimation of technology requirements due to their runtime-adaptive hardware architecture. Simulators offer capabilities to overcome these issues. In this article, MPSoCSim, a SystemC simulator for Network-on-Chip (NoC) based MPSoCs, is extended to support the simulation of reconfigurable MPSoCs. Processors, such as ARM and MicroBlaze, and peripheral models used within the virtual platform are provided by Imperas/OVP and attached to the NoC. Moreover, traffic generators are available to analyze the system. The virtual platform currently supports a mesh topology with wormhole switching and several routing algorithms, such as XY routing, a minimal West-First algorithm, and an adaptive West-First algorithm. Besides the impact of routing algorithms on performance, reconfiguration processes can be examined using the presented simulator. A mechanism for dynamic partial reconfiguration is implemented that is oriented towards the reconfiguration scheme on real FPGA platforms. It includes the simulation of the undefined behavior of the hardware region during reconfiguration and allows the adjustment of parameters. During runtime, dynamic partial reconfiguration interfaces are used to connect the Network-on-Chip infrastructure with reconfigurable regions. The configuration access ports can be modeled by the controller for dynamic partial reconfiguration in the form of an application programming interface. An additional SystemC component enables the readout of simulation time from within the application. To evaluate the simulator, timing and power consumption of the simulated hardware are estimated and compared with a real hardware implementation on a Xilinx Zynq FPGA. The comparison shows that the simulator improves the development of reconfigurable MPSoCs by early estimation of system requirements. The power estimations show a maximum deviation of 9 mW at 1.9 W total power consumption. [ABSTRACT FROM AUTHOR]
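The XY routing the abstract names is the classic deterministic dimension-ordered scheme for 2D meshes: a packet first travels along the X dimension until it reaches the destination column, then along Y. The sketch below is a generic illustration of that idea in Python, not code from MPSoCSim; the router coordinates are hypothetical.

```python
def xy_route(src, dst):
    """Deterministic XY routing on a 2D mesh NoC.

    Packets first travel along the X dimension until the destination
    column is reached, then along the Y dimension. Returns the list of
    (x, y) router coordinates from src to dst, inclusive.
    """
    x, y = src
    dx, dy = dst
    path = [(x, y)]
    while x != dx:                      # X dimension first
        x += 1 if dx > x else -1
        path.append((x, y))
    while y != dy:                      # then Y dimension
        y += 1 if dy > y else -1
        path.append((x, y))
    return path

# Example: routing from router (0, 0) to (2, 1) on a 3x3 mesh
print(xy_route((0, 0), (2, 1)))
# → [(0, 0), (1, 0), (2, 0), (2, 1)]
```

Because the dimension order is fixed, XY routing is deadlock-free on a mesh but cannot route around congestion; that is the trade-off the West-First and adaptive variants mentioned above address.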
- Published
- 2016
- Full Text
- View/download PDF
31. VirtualSoC: A Research Tool for Modern MPSoCs.
- Author
-
BORTOLOTTI, DANIELE, MARONGIU, ANDREA, and BENINI, LUCA
- Subjects
COMPUTER system design & construction ,HIGH performance computing ,EMBEDDED computer systems ,COMPUTER simulation ,COMPUTER operating systems ,CENTRAL processing units - Abstract
Architectural heterogeneity has proven to be an effective design paradigm to cope with an ever-increasing demand for computational power within tight energy budgets, in virtually every computing domain. Programmable manycore accelerators are currently widely used not only in high-performance computing systems, but also in embedded devices, in which they operate as coprocessors under the control of a general-purpose CPU (the host processor). Clearly, such powerful hardware architectures are paired with sophisticated and complex software ecosystems, composed of operating systems, programming models plus associated runtime engines, and increasingly complex user applications with related libraries. System modeling has always played a key role in early architectural exploration or software development when the real hardware is not available. The necessity of efficiently coping with the huge HW/SW design space provided by the described heterogeneous Systems on Chip (SoCs) calls for advanced full-system simulation methodologies and tools, capable of assessing various metrics for the functional and nonfunctional properties of the target system. In this article, we describe VirtualSoC, a simulation tool targeting the full-system simulation of massively parallel heterogeneous SoCs. We also describe how VirtualSoC has been successfully adopted in several research projects. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
32. Quantitative verification and strategy synthesis for stochastic games.
- Author
-
Svoreňová, Mária and Kwiatkowska, Marta
- Subjects
VIDEO game software ,COMPUTER system design & construction ,QUANTITATIVE research ,ALGORITHMIC randomness ,STRATEGY games - Abstract
Design and control of computer systems that operate in uncertain, competitive, or adversarial environments can be facilitated by formal modelling and analysis. In this paper, we focus on analysis of complex computer systems modelled as turn-based 2½-player games, or stochastic games for short, that are able to express both stochastic and non-stochastic uncertainties. We offer a systematic overview of the body of knowledge and algorithmic techniques for verification and strategy synthesis for stochastic games with respect to a broad class of quantitative properties expressible in temporal logic. These include probabilistic linear-time properties; expected total, discounted and average reward properties; and their branching-time extensions and multi-objective combinations. To demonstrate the applicability of the framework, as well as its practical implementation in a tool called PRISM-games, we describe several case studies that rely on analysis of stochastic games, from areas such as robotics and networked and distributed systems. [ABSTRACT FROM AUTHOR]
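A core quantitative property in this setting is probabilistic reachability: the maximal probability with which one player can force the game into a target set, against an adversary and against chance. The sketch below applies plain value iteration to a tiny turn-based stochastic game; it is a minimal illustration of the idea, not the PRISM-games implementation, and the state names are hypothetical.

```python
def reachability_values(succ, owner, target, n_iter=100):
    """Value iteration for reachability in a turn-based stochastic game.

    succ[s]  : for player states, a list of successor states;
               for stochastic states, a list of (probability, successor).
    owner[s] : 'max', 'min', or 'rand' (stochastic).
    Iterates the Bellman operator to approximate the value
    sup_max inf_min Pr(reach target) in every state.
    """
    v = {s: 1.0 if s in target else 0.0 for s in succ}
    for _ in range(n_iter):
        for s in succ:
            if s in target or not succ[s]:
                continue                # absorbing states keep their value
            if owner[s] == 'max':
                v[s] = max(v[t] for t in succ[s])
            elif owner[s] == 'min':
                v[s] = min(v[t] for t in succ[s])
            else:                       # stochastic state: expectation
                v[s] = sum(p * v[t] for p, t in succ[s])
    return v

# The max player picks between two lotteries over goal/sink.
succ = {
    's0':   ['c1', 'c2'],
    'c1':   [(0.5, 'goal'), (0.5, 'sink')],
    'c2':   [(0.9, 'goal'), (0.1, 'sink')],
    'goal': [],
    'sink': [],
}
owner = {'s0': 'max', 'c1': 'rand', 'c2': 'rand', 'goal': 'max', 'sink': 'max'}
v = reachability_values(succ, owner, {'goal'})
print(v['s0'])  # ≈ 0.9: the max player prefers the 0.9-lottery
```

For reward and multi-objective properties the same fixed-point skeleton applies with a different Bellman operator; the overview in the paper covers those generalisations.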
- Published
- 2016
- Full Text
- View/download PDF
33. Control charting methods for autocorrelated cyber vulnerability data.
- Author
-
Afful-Dadzie, Anthony and Allen, Theodore T.
- Subjects
QUALITY control charts ,QUALITY control ,AUTOCORRELATION (Statistics) ,COMPUTER engineering ,COMPUTER performance ,COMPUTER system design & construction ,MATHEMATICAL models - Abstract
Control charting cyber vulnerabilities is challenging because the same vulnerabilities can remain from period to period. Also, hosts (personal computers, servers, printers, etc.) are often scanned infrequently and can be unavailable during scanning. To address these challenges, control charting of the period-to-period demerits per host using a hybrid moving centerline residual-based and adjusted demerit (MCRAD) chart is proposed. The intent is to direct limited administrator resources to unusual cases when automatic patching is insufficient. The proposed chart is shown to offer superior average run length performance compared with three alternative methods from the literature. The methods are illustrated using three datasets. [ABSTRACT FROM PUBLISHER]
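The residual-based idea behind charts of this kind is to fit a time-series model to the autocorrelated counts and monitor the one-step-ahead forecast errors instead of the raw values. The sketch below shows a generic AR(1) residual chart in Python; it is an illustration of that general approach under simplifying assumptions, not the MCRAD chart proposed in the article, and the demerit numbers are made up.

```python
import statistics

def residual_chart(series, phi=None, k=3.0):
    """Flag unusual periods in autocorrelated data via AR(1) residuals.

    Fits the forecast x_hat[t] = mean + phi * (x[t-1] - mean), where phi
    defaults to the lag-1 autocorrelation estimate, and flags periods
    whose residual exceeds k standard deviations.
    """
    mean = statistics.fmean(series)
    if phi is None:
        # lag-1 autocorrelation estimate
        num = sum((series[t] - mean) * (series[t - 1] - mean)
                  for t in range(1, len(series)))
        den = sum((x - mean) ** 2 for x in series)
        phi = num / den
    residuals = [series[t] - (mean + phi * (series[t - 1] - mean))
                 for t in range(1, len(series))]
    sigma = statistics.pstdev(residuals)
    # 0-indexed periods in `series` whose residual breaches +/- k*sigma
    return [t + 1 for t, r in enumerate(residuals) if abs(r) > k * sigma]

# Period-to-period demerits per host; period 6 (0-indexed) spikes.
demerits = [5, 5, 6, 5, 5, 5, 20, 5, 5, 5]
print(residual_chart(demerits, k=2.0))  # → [6]
```

Charting residuals rather than raw demerits is what keeps persistent vulnerabilities (which repeat from period to period) from triggering alarms every scan; only departures from the predicted level stand out.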
- Published
- 2016
- Full Text
- View/download PDF
34. ABT and SBT revisited: Efficient memory management techniques for object oriented and web-based applications.
- Author
-
Rezaei, M. and Kavi, K. M.
- Subjects
COMPUTER memory management ,WEB-based user interfaces ,COMPUTER system design & construction ,GARBAGE collection (Computer science) ,PROGRAMMING languages - Abstract
Dynamic memory management is an important and essential part of computer systems design. Efficient memory allocation, garbage collection, and compaction are becoming critical in parallel and distributed applications using object-oriented languages like C++ and Java. In addition to achieving fast allocation and de-allocation of memory objects with low fragmentation, memory management techniques should strive to improve the overall execution performance of object-oriented applications. In this paper, we introduce Address-ordered Binary Trees (ABT) and Segregated Binary Trees (SBT), two memory management techniques particularly efficient for object-oriented applications. Our empirical results show that both ABT and SBT, when accompanied by coalescing, outperform existing allocators such as segregated free lists in terms of storage utilization and execution performance. We also show that these new allocators perform well in terms of storage utilization even without coalescing, which makes them particularly suitable for web applications. [ABSTRACT FROM AUTHOR]
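The benefit of address ordering is that the neighbours of a freed block sit next to it in the free structure, so coalescing adjacent free blocks is a local operation. The sketch below illustrates this with a first-fit allocator over an address-ordered free list in Python; it is a teaching-sized stand-in, not the paper's ABT/SBT, which use binary trees for logarithmic search.

```python
import bisect

class AddressOrderedAllocator:
    """First-fit allocator over an address-ordered free list.

    Free blocks are kept as (start, length) pairs sorted by start
    address, so a freed block can be merged with its predecessor and
    successor by inspecting only its list neighbours.
    """

    def __init__(self, size):
        self.free = [(0, size)]         # one big free block initially

    def malloc(self, size):
        for i, (start, length) in enumerate(self.free):
            if length >= size:          # first fit
                if length == size:
                    del self.free[i]
                else:                   # split: keep the tail free
                    self.free[i] = (start + size, length - size)
                return start
        return None                     # out of memory

    def free_block(self, start, size):
        i = bisect.bisect(self.free, (start, size))
        self.free.insert(i, (start, size))
        # coalesce with successor, then with predecessor
        if i + 1 < len(self.free) and start + size == self.free[i + 1][0]:
            self.free[i] = (start, size + self.free[i + 1][1])
            del self.free[i + 1]
        if i > 0 and self.free[i - 1][0] + self.free[i - 1][1] == self.free[i][0]:
            self.free[i - 1] = (self.free[i - 1][0],
                                self.free[i - 1][1] + self.free[i][1])
            del self.free[i]
```

`free_block` takes the block size explicitly because this sketch keeps no per-allocation header; a real allocator would record sizes in block headers. Replacing the sorted list with a balanced tree keyed by address gives the ABT idea its efficiency.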
- Published
- 2016
- Full Text
- View/download PDF
35. The CADE-25 Automated Theorem Proving system competition - CASC-25.
- Author
-
Sutcliffe, Geoff and Urban, Josef
- Subjects
- *
AUTOMATIC theorem proving , *COMPUTER system design & construction , *COMPUTER science , *COMPUTER networks , *COMPUTATIONAL intelligence - Abstract
The CADE ATP System Competition (CASC) is an annual evaluation of fully automatic, classical logic Automated Theorem Proving (ATP) systems. CASC-25 was the twentieth competition in the CASC series. Twenty-seven ATP systems and system variants competed in the various competition divisions. An outline of the competition design, and a commentated summary of the results, are presented. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
36. Lifetime Achievement Award Translating Today into Tomorrow.
- Author
-
Sheng Li
- Subjects
- *
MACHINE translating , *TRANSLATIONS , *COMPUTER system design & construction , *NATURAL language processing , *OPEN source software , *COMPUTER software - Abstract
The article presents a speech by Sheng Li of the Harbin Institute of Technology delivered at the Association for Computational Linguistics (ACL) conference held in Beijing, China in 2015. Topics of the speech included the history and emergence of machine translation (MT) in China, the development of the DEAR computer-aided translation system, and the release of the Language Technology Platform (LTP) open-source natural language processing (NLP) system.
- Published
- 2015
- Full Text
- View/download PDF
37. Modeling and evaluation of highly complex computer systems architectures.
- Author
-
Iacono, Mauro, Gribaudo, Marco, Kołodziej, Joanna, and Pop, Florin
- Subjects
COMPUTER system design & construction ,SIMULATION methods & models ,COMPUTATIONAL complexity ,SYSTEMS software ,PROTOTYPES - Abstract
Modern computer-based systems are characterized by several complexity dimensions: a non-exhaustive list includes scale, architecture, distribution, variability, flexibility, dynamics, workloads, time constraints, dependability, availability, security, and performance. The design, implementation, operation, maintenance, and evolution of such systems require informed decisions, which must be founded on techniques and tools enabling anticipated knowledge about the behavior of every subsystem (including hardware, software, and their interactions), the system as a whole, and the relationship between the system and the external world, considering workloads, communication, and the sensing of physical interactions. Performance prediction, and behavior prediction in general, may exploit simulation-based approaches or analytical techniques to evaluate in advance the effects of design choices, variability under different workloads, or emerging behavior of systems, and can provide valuable support in all phases of a system's lifecycle by means of proper modeling approaches. In this Special Issue we present contributions that offer a glance at modeling and evaluation of complex computer-based systems, chosen to provide a view on different domains and different approaches, mainly focusing on simulation techniques and related applications. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF
38. An extension of the taxonomy of persistent and nonviolent steps.
- Author
-
Koutny, Maciej, Mikulski, Łukasz, and Pietkiewicz-Koutny, Marta
- Subjects
- *
COMPUTER system design & construction , *SEMANTICS , *PROBLEM solving , *PETRI nets , *NONVIOLENCE - Abstract
The design and analysis of concurrent computing systems is often concerned with fundamental behavioural properties involving system activities, e.g., boundedness, liveness, and persistence. This paper is about the latter property and a complementary property of nonviolence. Persistence means that an enabled activity cannot be disabled, whereas nonviolence means that executing an activity does not disable any other enabled activity. Since its introduction in the 1970s, persistence has been investigated assuming that each system activity is a single atomic action, but in the design of Globally Asynchronous Locally Synchronous (GALS) systems one also needs to allow activities represented by steps, each step being a set of simultaneously executed atomic actions. Dealing with step based execution semantics creates a wealth of new fundamental problems and questions. In particular, there are different ways in which the standard notion of persistence (and nonviolence) could be lifted to the level of steps. We provide a rich classification of different types of step based persistence and nonviolence. We first do this for a general model of (step) transition systems. After that, we focus on Petri nets, and introduce a taxonomy of persistent and nonviolent steps and markings. We also characterise key structural properties of persistence and nonviolence, linking these behavioural notions with the presence of self-loops in Petri nets. [ABSTRACT FROM AUTHOR]
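The classical single-action notion of persistence described above can be checked directly on a finite transition system: whenever two distinct actions are enabled in a state, executing one must leave the other enabled. The sketch below is a minimal Python check of that classical notion (the state and action names are illustrative); the paper's contribution is to generalise it to steps, i.e., sets of simultaneously executed actions.

```python
def is_persistent(trans):
    """Check classical (single-action) persistence of a transition system.

    `trans` maps state -> {action: next_state}. The system is persistent
    if, whenever actions a and b are both enabled in a state, executing
    a leaves b enabled in the resulting state.
    """
    for state, moves in trans.items():
        for a, s_next in moves.items():
            for b in moves:
                if b != a and b not in trans.get(s_next, {}):
                    return False        # executing a disabled b
    return True

# Two independent actions commute via a diamond: persistent.
independent = {
    "s0": {"a": "s1", "b": "s2"},
    "s1": {"b": "s3"},
    "s2": {"a": "s3"},
    "s3": {},
}
# Conflict: a and b are both enabled in s0, but firing a disables b.
conflict = {
    "s0": {"a": "s1", "b": "s2"},
    "s1": {},
    "s2": {},
}
print(is_persistent(independent), is_persistent(conflict))  # True False
```

Nonviolence is the complementary check from the executing action's point of view; lifting either notion to steps requires deciding how a step may overlap, contain, or be contained in the step it might disable, which is what the paper's taxonomy classifies.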
- Published
- 2017
- Full Text
- View/download PDF
39. Technical correspondence.
- Author
-
Kleijnen, Jack P. C., Davidson, E. S., Kumar, B., Koenigsberg, Ernest, Sauter, John, Sendrow, Marvin, Merkle, Ralph C., and Hellman, Martin E.
- Subjects
- *
LETTERS to the editor , *DATA encryption , *ALGORITHMS , *COMPUTER system design & construction , *COMPUTER network resources - Abstract
Several letters to the editor are presented in response to articles including "On the Security of Multiple Encryption," by Ralph C. Merkle and Martin E. Hellman in the July 1981 issue, "Computational Algorithms for Product Form Queueing Networks," by K.M. Chandy and C. H. Sauer in the October 1980 issue, and "Computer System Design Using a Hierarchical Approach to Performance Evaluation," by B. Kumar, and E.S. Davidson. Responses of the authors are provided.
- Published
- 1981
- Full Text
- View/download PDF
40. Not only the big need computers.
- Subjects
COMPUTER system design & construction ,ELECTRONIC data processing ,PRODUCTION control ,INVENTORY control - Abstract
The article reports on the essential use of a computer-equipped data-processing system for the business operation of Warner Electric Brake & Clutch Co. of Beloit, Wisconsin. It mentions that the computer system is intended for the company's efficient production and inventory control. It also touches on the effort of finance vice president William Keefer and his systems analyst to come up with a plan for making their electronic computer system work, as well as the status of their programming.
- Published
- 1962
41. Development of an open source agricultural mobile data collector system.
- Author
-
Szilágyi, Róbert and Tóth, Mihály
- Subjects
- *
BIG data , *ACQUISITION of data , *ELECTRONIC data processing , *COMPUTER system design & construction , *DETECTORS - Abstract
Information is important in every decision area. The Big Data philosophy leads to collecting every possible piece of data. Nowadays these applications are more and more successful in the following agricultural areas: different parts of the food industry, extension services, and precision agriculture. While studying the use of these new ICT technologies, it can be concluded that different types of services offer different possibilities. Firstly, we compared the possible mainboards and sensors, starting from general information about existing mobile mainboards: we compared the Atmel AVR, the Raspberry Pi, and the LEGO Mindstorms NXT, and chose the Arduino system board. We described the main system architecture and connection possibilities, and found the temperature sensor widely usable. The software was also briefly mentioned. The Arduino has several advantages: the whole system is upgradeable, and there are several Arduino-based mainboards and sensors available. Nowadays support for block programming is increasing (e.g., MIT App Inventor), but there are disadvantages too. The system has several limitations: the number of connected sensors, the connection type, the system's energy supply, data loss, the creation of a user-friendly interface, and the system's failure tolerance. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
42. A dynamic modelling framework for control-based computing system design.
- Author
-
Papadopoulos, Alessandro Vittorio, Maggio, Martina, Terraneo, Federico, and Leva, Alberto
- Subjects
- *
DYNAMIC models , *COMPUTER system design & construction , *HEURISTIC algorithms , *FEEDBACK control systems , *RESOURCE allocation , *COMPUTER scheduling - Abstract
This manuscript proposes a novel viewpoint on the modelling of computing systems. The classical approach is to consider fully functional systems and model them, aiming at closing some external loops to optimize their behaviour. On the contrary, we model only strictly physical phenomena, and realize the rest of the system as a set of controllers. Such an approach permits rigorous assessment of the obtained behaviour in mathematical terms, which is hardly possible with the heuristic design techniques that have mainly been adopted to date. The proposed approach is shown at work with three relevant case studies, so that a significant generality can be inferred from it. [ABSTRACT FROM PUBLISHER]
- Published
- 2015
- Full Text
- View/download PDF
43. Strategy based semantics for mobility with time and access permissions.
- Author
-
Ciobanu, Gabriel, Koutny, Maciej, and Steggles, Jason
- Subjects
- *
COMPUTER engineering , *COMPUTER system design & construction , *SEMANTICS , *COMPUTER simulation , *MOBILE communication systems - Abstract
The process algebras Timed Mobility (TiMo) and its extension Permissions, Timers and Mobility (PerTiMo) were recently proposed to support engineering applications in distributed system design. TiMo provides a formal framework in which process migration between distinct locations and timing constraints linked to local clocks can be modelled and analysed. This is extended in PerTiMo by associating access permissions to communication to model security aspects of a distributed system. In this paper we develop a new semantic model for TiMo using Rewriting Logic (RL) and strategies, with the aim of providing a foundation for tool support; in particular, strategies are used to capture the locally maximal concurrent step of a TiMo specification which previously required the use of action rules based on negative premises. This RL model is then extended with access permissions in order to develop a new semantic model for PerTiMo. These RL semantical models are formally proved to be sound and complete with respect to the original operational semantics on which they were based. We present examples of how the developed RL models for TiMo and PerTiMo can be implemented within the strategy-based rewriting system Elan and illustrate the range of (behavioural) properties that can be analysed using such a tool. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
44. Designing High Quality ICT for Altered Environmental Conditions.
- Author
-
Vaziri, Daryoush Daniel, Schreiber, Dirk, and Gadatsch, Andreas
- Subjects
INFORMATION & communication technologies ,QUALITY of service ,COMPUTER system design & construction ,DEMOGRAPHIC transition ,UNIVERSAL design - Abstract
This article concerns the design and development of information and communication technology, in particular computer systems, with regard to the demographic transition that will influence user capabilities. It is questionable whether currently applied computer systems are able to meet the requirements of altered user groups with diversified capabilities. Such an enquiry is necessary given forecasts suggesting that the average age of employees in enterprises will increase significantly within the next 50-60 years, while the percentage of computer-aided business tasks operated by human individuals rises from year to year. This development will have specific consequences for enterprises regarding the design and application of computer systems. If computer systems are not adapted to altered user requirements, efficient and productive utilisation could be negatively influenced. These consequences motivate extending traditional design methodologies and thereby ensuring the application of computer systems that are usable independent of user capabilities. In theory as well as in practice, several design and development concepts have been described and applied, but in most cases these concepts are considered as solitary, independent solutions. Theories generally contrast usability and accessibility as two different concepts. While the first provides possibilities for specific user groups to accomplish tasks efficiently, effectively and satisfactorily, the latter provides solutions that take into consideration people with a wide range of capabilities, such as disabled people or people with an enduring health problem. Both concepts are quite extensive. Therefore developers tend to decide between these concepts, which always leads to failures. This article seeks to provide a universal design and development approach for computer systems by combining these individually considered concepts into one common approach.
This approach will not distinguish between user groups, but instead will provide procedures and solutions to design computer systems that consider all relevant user capabilities. The results of this article provide a theoretical approach for design and development cycles. Enterprises will be sensitised to the identification of relevant user requirements and the design of human-centred computer systems. [ABSTRACT FROM AUTHOR]
- Published
- 2012
45. An Efficient Approach for System-Level Timing Simulation of Compiler-Optimized Embedded Software.
- Author
-
Zhonglei Wang and Herkersdorf, Andreas
- Subjects
EMBEDDED computer systems ,TIMING circuits ,SYSTEMS design ,ELECTRONIC systems ,INTEGRATED circuit design ,COMPUTER system design & construction - Abstract
Software accounts for more than 80% of embedded system development efforts, so software performance estimation is a very important issue in system design. Recently, source level simulation (SLS) has become a state-of-the-art approach for software simulation in system level design. However, the simulation accuracy relies on the mapping between source code and binary code, which can be destroyed by compiler optimizations. This drawback strongly limits the usability of this technique in practical system design. We introduce an approach to overcome this limitation by converting source code to a low level representation, called intermediate source code (ISC). ISC has accounted for most compiler optimizations and has a structure close to binary code, so it allows for accurate back-annotation of timing information from the binary level. To show the benefits of our approach, we present a quantitative comparison of the related techniques with the proposed one, using a set of benchmarks. [ABSTRACT FROM AUTHOR]
- Published
- 2009
46. DESIGNING OBJECT ORIENTED SYSTEMS USING STEREOTYPES AND PATTERNS.
- Author
-
Bezerra, Vinicius Miana and Hirama, Kechi
- Subjects
OBJECT-oriented databases ,UNIFIED modeling language ,ELECTRONIC commerce ,COMPUTER system conversion ,COMPUTER system design & construction ,COMPUTER security ,COMPUTER systems integration services - Abstract
Design patterns are a great aid in the design process of object-oriented systems; however, the number of design patterns being published is growing rapidly, making their use and management harder. In e-business, such patterns involve distributed objects and are even more complex. This paper presents a technique that can be used to catalog design patterns and associate them with UML stereotypes. As a result, the design models of a typical system will involve fewer elements and therefore will be simpler to understand and use. Using this technique with the aid of code generation tools, productivity and quality during design can be dramatically increased. [ABSTRACT FROM AUTHOR]
- Published
- 2006
47. IMPLEMENTING SHIBBOLETH AT A UK NATIONAL ACADEMIC DATA CENTRE.
- Author
-
MacIntyre, Ross and Chaplin, David
- Subjects
INTERNET in education ,FEDERATED database systems ,AUTHENTICATION (Law) ,COMPUTER system design & construction - Abstract
The UK education sector is embarking upon the adoption of Internet2's Shibboleth software for federated access management. This paper recounts the early experiences of a large academic data centre in implementing support for Shibboleth across its range of services. It covers the practical approach adopted, a worked example and the significant issues raised. Familiarity with federated access and identity management is assumed. [ABSTRACT FROM AUTHOR]
- Published
- 2005
48. DESIGN AND IMPLEMENTATION OF A GRID COMPUTING FRAMEWORK BASED ON SPACES.
- Author
-
XIE Jingming and QI Deyu
- Subjects
GRID computing ,COMPUTER system design & construction ,3G networks ,COMPUTER simulation ,PRODUCTION scheduling ,DISTRIBUTED computing - Published
- 2005
49. RADIOSCAPE - SYSTEM DESIGN TOOL FOR INDOOR WIRELESS COMMUNICATIONS VIA THE INTERNET -.
- Author
-
Yoshinori Watanabe, Hiroshi Furukawa, Kazuhiro Okanoue, and Shuntaro Yamazaki
- Subjects
WIRELESS communications software ,INDOOR positioning systems ,COMPUTER system design & construction ,CLIENT/SERVER computing software ,RADIO wave propagation ,GRAPHICAL user interfaces ,MATHEMATICAL models - Published
- 2001
50. Design flaws of 'an anonymous two-factor authenticated key agreement scheme for session initiation protocol using elliptic curve cryptography'.
- Author
-
Kumari, Saru
- Subjects
SESSION Initiation Protocol (Computer network protocol) ,COMPUTER access control ,KEY agreement protocols (Computer network protocols) ,DATA security ,COMPUTER system design & construction - Abstract
Recently, a two-factor authenticated key agreement scheme for session initiation protocol was published by Lu et al. in Multimedia Tools and Applications [doi:10.1007/s11042-015-3166-4]. I have examined this scheme and found some design flaws in it. A flaw in the registration phase makes the scheme vulnerable to guessing attacks, while flaws in the key agreement phase hinder the functionality of the scheme in such a way that mutual authentication between the user and the server is not viable. [ABSTRACT FROM AUTHOR]
- Published
- 2017
- Full Text
- View/download PDF