1,015 results for "COMPUTER software research"
Search Results
2. Extensions of MTest.search for test code generation.
- Author
-
Güemes-Esperón, Alejandro Miguel, Delgado-Dapena, Martha Dunia, Fernández-Oliva, Perla Beatriz, and Henry-Chibas, Heydi Margarita
- Subjects
- *
COMPUTER software testing , *COMPUTER software quality control , *COMPUTER software industry , *EXECUTION traces (Computer program testing) , *COMPUTER software development , *BEST practices , *PROGRAMMING languages , *AUTOMATION , *AUTOMATION software , *COMPUTER software research , *SOURCE code - Abstract
Software testing focuses on detecting defects or failures during code execution. Testing is a challenging, creative task that calls for automation. The adoption of good practices and testing strategies contributes to increasing the efficiency of software development companies. The MTest.search model for automatic unit test generation defines mechanisms for domain-model extension, testing, and execution. In this work, mechanisms to extend the search-based reduction model are presented. The proposed extensions take into account the objects and sets involved in the source code and enhance the detection of defects or failures based on the significance of the values and paths/scenarios involved in the test. To validate the proposal, three case studies were defined using classical methods and real projects. [ABSTRACT FROM AUTHOR]
- Published
- 2022
3. A Software Institute for Data-Intensive Sciences, Joining Computer Science Academia and Natural Science Research.
- Author
-
Doglioni, C., Kim, D., Stewart, G.A., Silvestris, L., Jackson, P., Kamleh, W., Bird, Ian, Campana, Simone, Mato Vila, Pere, Roiser, Stefan, Schulz, Markus, Stewart, Graeme A., and Valassi, Andrea
- Subjects
- *
DATA analysis , *COMPUTER software development , *COMPUTER software research , *ELECTRONIC data processing , *PHYSICS experiments - Abstract
With the ever-increasing size of scientific collaborations and complexity of scientific instruments, the software needed to acquire, process and analyze the gathered data is increasing in both complexity and size. Unfortunately the role and career path of scientists and engineers working on software R&D and developing scientific software are neither clearly established nor defined in many fields of natural science. In addition, the exchange of information between scientific software development and computer science departments at universities or computing schools is scattered and fragmented into individual initiatives. To address the above issues we propose a new effort on a European level, which concentrates on strengthening the role of software developers in natural sciences, acts as a hub for the exchange of ideas among different stakeholders in computer science and scientific software and forms a lobbying forum for software engineering in natural sciences on an international level. This contribution discusses in detail the motivation, role and interplay with other initiatives of a "Software Institute for Data-Intensive Sciences", which is currently being discussed between research institutes, universities and funding agencies in Europe. In addition to the current status, an outlook on future prospects of this initiative will be given. [ABSTRACT FROM AUTHOR]
- Published
- 2020
4. Revista de Historia de América.
- Subjects
- *
COMPUTER software research , *UNITED States history - Abstract
The article presents the introduction to a section of the periodical, which discusses topics including reflections on the use of software in research and the teaching of the History of America, through the publication of peer-reviewed contributions from national and foreign academics.
- Published
- 2021
5. The research of endless loop detection method based on the basic path.
- Author
-
Gao, Xuexin, Mu, Yongmin, and Shen, Meie
- Subjects
COMPUTER software research ,SOURCE code ,COMPUTER engineering ,COMPUTER software development ,SOFTWARE engineering ,COMPUTER software - Abstract
The detection of endless loops has been an important issue in software research. In order to ensure the quality of software and improve the accuracy of endless loop detection, a method based on the basic paths of the program is proposed. First, the source code is preprocessed: comments are removed and absolute endless loops are detected. Second, intermediate code is generated by the GCC compiler, the structural features of the program are abstracted, and the control flow graph is generated. The set of basic paths and cycles is then obtained by a depth-first method. Finally, we backtrack from the end of the control flow graph to find the exits of loop structures and determine whether the exit conditions can be satisfied. Experiments demonstrate that the method improves the efficiency and accuracy of endless loop detection. [ABSTRACT FROM AUTHOR]
- Published
- 2020
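The basic-path idea in this abstract can be illustrated with a toy sketch (the data structures are invented for illustration; the paper itself works on GCC intermediate code): enumerate cycles of a control flow graph depth-first and flag any cycle that has no exit edge.

```python
# Toy sketch: flag loops with no exit edge in a control flow graph.
# The CFG is a dict mapping node -> list of successor nodes (an assumption;
# the paper derives its graph from GCC intermediate code).

def find_cycles(cfg):
    """Collect cycles via depth-first search."""
    cycles = []

    def dfs(node, path):
        if node in path:
            cycles.append(path[path.index(node):])
            return
        for succ in cfg.get(node, []):
            dfs(succ, path + [node])

    for start in cfg:
        dfs(start, [])
    return cycles

def endless_loops(cfg):
    """A cycle with no edge leaving it can never terminate."""
    suspicious = []
    for cycle in find_cycles(cfg):
        members = set(cycle)
        exits = [s for n in cycle for s in cfg.get(n, []) if s not in members]
        if not exits:
            suspicious.append(cycle)
    return suspicious

# 'A -> B -> A' has an exit via B -> C; 'C -> D -> C' has none.
cfg = {"A": ["B"], "B": ["A", "C"], "C": ["D"], "D": ["C"]}
```

A real detector must of course also decide whether a reachable exit condition can ever become true, which is the backtracking step the abstract describes.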
6. The Impact of Network Externalities on the Competition Between Open Source and Proprietary Software.
- Author
-
CHENG, HSING KENNETH, LIU, YIPENG, and TANG, QIAN (CANDY)
- Subjects
EXTERNALITIES ,COMPUTER networks ,ECONOMIC competition ,OPEN source software ,COMPUTER software research ,SOFTWARE compatibility - Abstract
In this paper, we build analytical models to examine the impact of network externalities on the competition between open source software (OSS) and proprietary software. We investigate the competing OSS and proprietary software products with comparable functionalities in four different scenarios depending on whether they are compatible with each other and whether the underlying market is fully covered (i.e., all consumers adopt one of the two products). Furthermore, we study which party has the most incentive to make its product compatible with its counterpart. When the market is fully covered, the installed base and the profit of proprietary software increase at the expense of a decreasing user base for OSS in the presence of network externalities. This competitive imbalance becomes more pronounced when OSS and proprietary software are incompatible and the market is partially covered. Finally, we find that in the presence of network externalities, being compatible with its rival is not desirable for the proprietary software, but highly beneficial to the OSS community. [ABSTRACT FROM AUTHOR]
- Published
- 2011
7. A New Objective-C Runtime: From Research to Production.
- Author
-
Chisnall, David
- Subjects
- *
OBJECTIVE-C (Computer program language) , *RUN time systems (Computer science) , *OPEN source software , *SOFTWARE compatibility , *COMPUTER software development , *COMPUTER software research - Abstract
The article describes the development and production of an Objective-C computer programming language runtime system for use with the Étoilé open source software project. Particular focus is given to the emphasis on backward compatibility with other programming languages. Topics include the GNU Compiler Collection (GCC) software released by the Free Software Foundation, the application binary interface (ABI) between software and operating systems, and the creation and modification of method lookup functions.
- Published
- 2012
8. Study of a New Digital Text Watermarking Algorithm.
- Author
-
Han, Xiaofeng, Li, Yan, and Liu, Guodong
- Subjects
- *
DIGITAL watermarking , *PUBLISHING , *COMPUTER software research , *ALGORITHMS - Abstract
As information technology develops rapidly and smart devices become popular, the traditional publishing industry is changing form. It has become very common for newspapers, books, magazines and pictures to be transmitted over the network. Because text resources on the network are easy to copy and spread, the lawful interests of their owners are liable to be violated. To solve this problem, copyright protection technology represented by the text watermark has been invented. In this paper, a novel digital text watermarking algorithm utilizing word processing software is researched, which can tolerate some faults in the documents to some extent. By means of this new algorithm, users can keep the copyright information in the reserved words of files and properly use the documents. The embedded watermark can be extracted and the copyright can be verified. [ABSTRACT FROM AUTHOR]
- Published
- 2019
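The reserved-word embedding described in this abstract can be approximated with a short sketch (field and function names are illustrative, not the paper's algorithm): store the copyright string, plus a keyed digest, in a metadata property that word processors preserve but do not display.

```python
# Minimal sketch of the reserved-field idea (names are illustrative):
# keep the owner string and a keyed digest in a hidden document property,
# then extract and verify it to prove copyright.
import hashlib

def embed_watermark(doc_props, owner, secret):
    """Store owner plus a keyed digest in a reserved property."""
    digest = hashlib.sha256((owner + secret).encode()).hexdigest()[:16]
    doc_props["_reserved_watermark"] = f"{owner}:{digest}"
    return doc_props

def verify_watermark(doc_props, owner, secret):
    """Recompute the digest and compare with the stored watermark."""
    expected = hashlib.sha256((owner + secret).encode()).hexdigest()[:16]
    return doc_props.get("_reserved_watermark") == f"{owner}:{expected}"

props = embed_watermark({"title": "report"}, "Alice", "k3y")
```

The digest ties the visible owner name to a secret key, so a copier cannot simply rewrite the owner field without invalidating the watermark.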
9. Acycle: Time-series analysis software for paleoclimate research and education.
- Author
-
Li, Mingsong, Hinnov, Linda, and Kump, Lee
- Subjects
- *
TIME series analysis , *COMPUTER software research , *GRAPHICAL user interfaces , *EDUCATION research , *EDUCATION software , *CYCLOSTRATIGRAPHY - Abstract
Recognition and interpretation of paleoclimate signals in sedimentary proxy datasets are time consuming and subjective. Acycle is a comprehensive and easy-to-use software package for time series analysis in paleoclimate research and education. It is designed to speed paleoclimate time series analysis, especially cyclostratigraphy, and to provide objective methods for estimating astrochronology. Acycle provides for detrending with multiple options to track and remove secular trends. A selection of power spectral analysis methodologies is offered for the detection of periodic signals. Many of the functions are specific to cyclostratigraphy and astrochronology and are not found in standard statistical packages. A specialized function is provided to assess the astronomical (Milankovitch) forcing of paleoclimate series and search for the most likely sedimentation rate by evaluating the correlation coefficient between power spectra of an astronomical solution and sedimentary proxy data. Sedimentary noise modeling (for past sea-level changes) is also provided in Acycle. As an example, Acycle is applied to a sedimentary proxy series from the cyclostratigraphy of the Paleocene-Eocene thermal maximum (PETM) in Core BH9/05 from the Paleogene Central Basin, Svalbard. Acycle detects significant astronomical forcing in the proxy series and relatively stable sedimentation rates during and after the PETM. Acycle runs in the MATLAB environment or as stand-alone software on Windows and Macintosh OS X, and is open-source software.
Highlights:
• Acycle is signal processing software for paleoclimate research and education.
• Many of the functions are specific to cyclostratigraphy and astrochronology.
• Acycle includes models for sedimentary noise and sedimentation rate.
• A fully implemented graphical user interface facilitates operator use.
[ABSTRACT FROM AUTHOR]
- Published
- 2019
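The spectral-detection step this abstract mentions, picking out a periodic signal from a power spectrum, can be sketched in pure Python (Acycle itself runs in MATLAB and offers several estimators; the periodogram below is a minimal stand-in):

```python
# Minimal periodogram sketch for detecting a dominant cycle in a proxy
# series (a stand-in for Acycle's spectral analysis tools).
import math

def periodogram(x):
    """Power at each Fourier frequency k/N for a real series x."""
    n = len(x)
    power = []
    for k in range(n // 2 + 1):
        re = sum(x[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = -sum(x[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        power.append((re * re + im * im) / n)
    return power

def dominant_cycle(x):
    """Return the period (in samples) of the strongest nonzero frequency."""
    p = periodogram(x)
    k = max(range(1, len(p)), key=p.__getitem__)
    return len(x) / k

# Synthetic proxy series with a 20-sample cycle, standing in for an
# astronomically forced sedimentary signal.
series = [math.sin(2 * math.pi * t / 20) for t in range(100)]
```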
10. DRfit: a Java tool for the analysis of discrete data from multi-well plate assays.
- Author
-
Hofmann, Andreas, Preston, Sarah, Cross, Megan, Herath, H. M. P. Dilrukshi, Simon, Anne, and Gasser, Robin B.
- Subjects
- *
BIOCHEMICAL research , *DATA analysis , *DOSE-response relationship in biochemistry , *PHARMACEUTICAL research , *ENZYME analysis , *ENZYMES , *LIFE sciences research , *COMPUTER software research - Abstract
Background: Analysis of replicates in sets of discrete data, typically acquired in multi-well plate formats, is a recurring task in many contemporary areas of the Life Sciences. The availability of accessible cross-platform data analysis tools for such fundamental tasks in varied projects and environments is an important prerequisite to ensuring a reliable and timely turnaround, as well as to providing practical analytical tools for student training. Results: We have developed an easy-to-use, interactive software tool for the analysis of multiple data sets comprising replicates of discrete bivariate data points. For each dataset, the software identifies the replicate data points from a defined matrix layout and calculates their means and standard errors. The averaged values are then automatically fitted using either a linear or a logistic dose-response function. Conclusions: DRfit is a practical and convenient tool for the analysis of one or multiple sets of discrete data points acquired as replicates from multi-well plate assays. The design of the graphical user interface and the built-in analysis features make it a flexible and useful tool for a wide range of different assays. [ABSTRACT FROM AUTHOR]
- Published
- 2019
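The replicate-averaging step that DRfit performs before curve fitting can be sketched as follows (DRfit itself is a Java GUI tool; the plate layout here is reduced to concentration/response pairs, and the subsequent linear or logistic fit is omitted for brevity):

```python
# Group replicate wells by concentration and compute mean and standard
# error, mirroring the averaging step described in the abstract.
import math
from collections import defaultdict

def average_replicates(wells):
    """wells: iterable of (concentration, response) pairs; replicates
    share a concentration. Returns {conc: (mean, standard_error)}."""
    groups = defaultdict(list)
    for conc, resp in wells:
        groups[conc].append(resp)
    out = {}
    for conc, vals in groups.items():
        n = len(vals)
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / (n - 1) if n > 1 else 0.0
        out[conc] = (mean, math.sqrt(var / n))
    return out

# Two replicates at each of two concentrations (invented numbers).
plate = [(1.0, 0.9), (1.0, 1.1), (10.0, 0.5), (10.0, 0.7)]
```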
11. iMOPSE: a library for bicriteria optimization in Multi-Skill Resource-Constrained Project Scheduling Problem.
- Author
-
Myszkowski, Paweł B., Laszczyk, Maciej, Nikulin, Ivan, and Skowroński, Marek
- Subjects
- *
BILEVEL programming , *GREEDY algorithms , *GENETIC algorithms , *LIBRARY software , *RESEARCH libraries , *COMPUTER software research - Abstract
This paper presents a software library as a research and educational tool for the Multi-Skill Resource-Constrained Project Scheduling Problem. The following useful tools have been implemented in Java: an instance generator, a solution validator, a solution visualizer and example solvers (a Greedy algorithm and a Genetic Algorithm). All tools are supported by the iMOPSE dataset, which consists of 36 instances plus 6 additional 'small' instances for educational purposes. In the paper, three test studies are described: (1) educational use of the 6 'small' instances, (2) optimization of the cost or duration of a schedule, and (3) simple bicriteria optimization of the cost/duration of a final schedule. All described tools/examples are freely published on the iMOPSE homepage. [ABSTRACT FROM AUTHOR]
- Published
- 2019
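A greedy solver in the spirit of the library's example solver can be sketched as follows (task and resource fields are simplified stand-ins for the iMOPSE instance format): assign each task to a capable resource that frees up earliest, breaking ties by hourly cost, and report both schedule cost and duration, the two criteria the library optimizes.

```python
# Greedy multi-skill scheduling sketch (illustrative fields, not the
# actual iMOPSE file format).

def greedy_schedule(tasks, resources):
    """tasks: list of (name, skill, duration); resources: dict
    name -> (skills, hourly_cost). Returns (schedule, cost, makespan)."""
    finish = {r: 0 for r in resources}
    schedule, total_cost = [], 0.0
    for name, skill, duration in tasks:
        capable = [r for r, (skills, _) in resources.items() if skill in skills]
        # earliest-free resource wins; cheaper one breaks ties
        r = min(capable, key=lambda c: (finish[c], resources[c][1]))
        start = finish[r]
        finish[r] = start + duration
        total_cost += duration * resources[r][1]
        schedule.append((name, r, start))
    return schedule, total_cost, max(finish.values())

tasks = [("t1", "db", 4), ("t2", "db", 2), ("t3", "ui", 3)]
resources = {"r1": ({"db"}, 10.0), "r2": ({"db", "ui"}, 12.0)}
```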
12. Optimal ice routing of a ship with icebreaker assistance.
- Author
-
Topaj, A.G., Tarovik, O.V., Bakharev, A.A., and Kondratenko, A.A.
- Subjects
- *
ICEBREAKERS (Ships) , *ICE , *RAILROAD routing , *ICE navigation , *ENERGY consumption , *COMPUTER software research - Abstract
Offshore development and the growing prospects of commercial shipping in the Arctic pose the challenge of optimal ship routing in ice. Route selection in spatially distributed ice conditions significantly affects the voyage time and determines the efficiency of shipping. Most applied methods of ice routing solve the problem of single-vessel route selection without considering icebreaker support. At the same time, the real practice of ice navigation is closely connected with icebreaker assistance, which reduces voyage time and fuel consumption while incurring additional costs for icebreaker services. These opposing trends set an optimization task that has not been studied in detail before. In this article, we present the formulation of a Single Vessel and Icebreaker Assisted Ice Routing optimization problem in non-stationary ice conditions. We consider icebreaker assistance as an integral part of the overall route optimization problem and use an economic criterion to optimize both the ship route and the amount of icebreaker involvement. The article contains adapted mathematical formulations of the classical graph-based and wave-based routing problems that take icebreaker assistance into account. To prove the practical applicability of these formulations, we developed special subject-oriented research software implementing both graph-based (Dijkstra, A*) and wave-based ice routing methods. Using these developments, we conducted several case studies and analyzed the strengths and weaknesses of the alternative routing methods in the case of ice operation. The results of the study may serve as an additional step toward the practical implementation of ice routing technologies and the planning of icebreaker resources. [ABSTRACT FROM AUTHOR]
- Published
- 2019
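The graph-based formulation with optional escort can be sketched with Dijkstra's algorithm (the edge costs below are invented numbers, and the escort cost is assumed to already include the icebreaker fee): each leg can be sailed independently or under icebreaker escort, and the cheaper mode is chosen per leg.

```python
# Dijkstra over a route graph where every leg offers two modes:
# solo sailing or icebreaker escort (fee included in escort cost).
import heapq

def cheapest_route(graph, start, goal):
    """graph: node -> list of (next, solo_cost, escort_cost)."""
    dist, prev = {start: 0.0}, {}
    pq = [(0.0, start)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue
        for nxt, solo, escort in graph[node]:
            nd = d + min(solo, escort)  # pick the cheaper mode per leg
            if nd < dist.get(nxt, float("inf")):
                dist[nxt], prev[nxt] = nd, node
                heapq.heappush(pq, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Heavy ice on B->D makes escort (cost 4) cheaper than solo sailing (9).
graph = {
    "A": [("B", 2.0, 5.0), ("C", 4.0, 6.0)],
    "B": [("D", 9.0, 4.0)],
    "C": [("D", 3.0, 7.0)],
    "D": [],
}
```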
13. A dynamic control technique to enhance the flexibility of software artifact reuse in large-scale repository.
- Author
-
Kim, Doohwan, Nam, Seungwoo, and Hong, Jang-Eui
- Subjects
- *
COMPUTER software reusability , *COMPUTER software research , *COMPUTER software development , *END users (Information technology) , *COMPUTER users - Abstract
Reuse is the activity of developing new software systems using software components (or artifacts) that are already proven and reliable. However, traditional reuse-based software development has difficulty finding components whose information (features) matches the developers' needs, or reusing a component without modification, because a component carries various and mixed information (features). To solve these problems, this paper proposes a dynamic control technique to enhance the reusability of software components. In particular, the technique focuses on the reuse of software documents created during the software research and development processes. We define a new unit of document reuse, the microComponent: a basic unit of reuse defined as a section of a software document. Based on the microComponent, it is possible to find more suitable components quickly in a large-scale document repository, to control the reuse granularity from a single section to an entire document, and finally to improve the reusability of existing reusable assets. [ABSTRACT FROM AUTHOR]
- Published
- 2019
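The microComponent idea can be sketched as follows (the section-splitting rule and field names are assumptions made for illustration): split a document at its numbered headings so each section becomes an independently retrievable reuse unit.

```python
# Illustrative microComponent extraction: each 'N. Title' heading starts
# a new section-level reuse unit.
import re

def split_into_microcomponents(doc_id, text):
    """Return one microComponent record per numbered section."""
    parts = re.split(r"(?m)^(?=\d+\.\s)", text)
    sections = (p for p in parts if p.strip())
    return [
        {"doc": doc_id, "section": i, "body": p.strip()}
        for i, p in enumerate(sections, start=1)
    ]

doc = """1. Overview
The system parses input files.
2. Design
Parsing uses a recursive descent approach.
"""
```

Reuse granularity can then range from a single record in this list up to the concatenation of all of them, which is the whole document.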
14. Community Organizations: Changing the Culture in Which Research Software Is Developed and Sustained.
- Author
-
Katz, Daniel S., McInnes, Lois Curfman, Bernholdt, David E., Mayes, Abigail Cabunoc, Hong, Neil P. Chue, Duckles, Jonah, Gesing, Sandra, Heroux, Michael A., Hettrick, Simon, Jimenez, Rafael C., Pierce, Marlon, Weaver, Belinda, and Wilkins-Diehr, Nancy
- Subjects
COMPUTER engineering ,COMMUNITY organization ,COMPUTER software research - Abstract
Software is the key crosscutting technology that enables advances in mathematics, computer science, and domain-specific science and engineering to achieve robust simulations and analysis for science, engineering, and other research fields. However, software itself has not traditionally received focused attention from research communities; rather, software has evolved organically and inconsistently, with its development occurring largely as a by-product of other initiatives. Moreover, challenges in scientific software are expanding due to disruptive changes in computer hardware, the increasing scale and complexity of data, and demands for more complex simulations involving multiphysics, multiscale modeling and outer-loop analysis. In recent years, community members have established a range of grass-roots organizations and projects to address these growing technical and social challenges in software productivity, quality, reproducibility, and sustainability. This article provides an overview of such groups and discusses opportunities to leverage their synergistic activities while nurturing work toward emerging software ecosystems. [ABSTRACT FROM AUTHOR]
- Published
- 2019
15. Research on a multibeam bathymetry reduction model considering attitude and sound-ray bending.
- Author
-
金绍华, 刘国庆, 孙文川, 边刚, and 崔杨
- Subjects
PUNCHED card systems ,RESEARCH & development ,COMPUTER software research ,EXAMPLE ,COMPUTER software - Abstract
Copyright of Hydrographic Surveying & Charting / Haiyang Cehui is the property of Hydrographic Surveying & Charting Editorial Board and its content may not be copied or emailed to multiple sites or posted to a listserv without the copyright holder's express written permission. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)
- Published
- 2019
16. Compiler Research: the next 50 Years.
- Author
-
HALL, MARY, PADUA, DAVID, and PINGALI, KESHAV
- Subjects
- *
COMPILERS (Computer programs) , *COMPUTER software research , *PROGRAMMING software , *SYSTEMS software , *FORECASTING , *COMPUTER science , *COMPUTER software - Abstract
The article discusses research done in the area of computer compiler technology, a compiler being defined as a computer program that translates a program written in a high-level language into another language, usually machine language. The authors note that 2007 was the 50th anniversary of IBM's (International Business Machines') release of the first optimizing compiler. Topics include directions compiler technology may take from 2009 onward, a review of past contributions to compiler technology, and the status of compiler technology in 2009.
- Published
- 2009
17. Fast and Flexible Large-Scale Clone Detection with CloneWorks.
- Author
-
Svajlenko, Jeffrey and Roy, Chanchal K.
- Subjects
INSTITUTIONAL repositories ,COMPUTER software research ,COMPUTER software development ,PLUG-ins (Computer programs) ,COMPUTER workstation clusters - Abstract
Clone detection in very large inter-project repositories has numerous applications in software research and development. However, existing tools do not provide the flexibility researchers need to explore this emerging domain. We introduce CloneWorks, a fast and flexible clone detector for large-scale clone detection experiments. CloneWorks gives the user full control over the representation of the source code before clone detection, including easy plug-in of custom source transformation, normalization and filtering logic. The user can then perform targeted clone detection for any type or kind of clone of interest. CloneWorks uses our fast and scalable partitioned partial indexes approach, which can handle any input size on an average workstation using input partitioning. CloneWorks can detect Type-3 clones in an input as large as 250 million lines of code in just four hours on an average workstation, with good recall and precision as measured by our BigCloneBench. [ABSTRACT FROM AUTHOR]
- Published
- 2017
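The normalize-then-index approach that makes this kind of tool scale can be sketched in miniature (CloneWorks' real pipeline is configurable and far more elaborate): normalize identifiers and literals, then hash each fragment's token stream so equal hashes signal Type-1/Type-2 clone pairs.

```python
# Toy clone detector: Type-2 normalization (identifiers -> ID,
# numbers -> NUM) followed by hash-based indexing of fragments.
import hashlib
import re

TOKEN = re.compile(r"[A-Za-z_]\w*|\d+|\S")
KEYWORDS = {"def", "return", "if", "else", "for", "while"}

def normalize(code):
    """Replace identifiers with ID and numbers with NUM."""
    out = []
    for tok in TOKEN.findall(code):
        if tok in KEYWORDS:
            out.append(tok)
        elif tok[0].isdigit():
            out.append("NUM")
        elif tok[0].isalpha() or tok[0] == "_":
            out.append("ID")
        else:
            out.append(tok)
    return out

def clone_pairs(fragments):
    """Index fragments by the hash of their normalized token stream."""
    index, pairs = {}, []
    for name, code in fragments.items():
        h = hashlib.sha1(" ".join(normalize(code)).encode()).hexdigest()
        if h in index:
            pairs.append((index[h], name))
        index[h] = name
    return pairs

frags = {
    "f1": "def add(a, b): return a + b",
    "f2": "def sum2(x, y): return x + y",   # Type-2 clone of f1
    "f3": "def neg(a): return -a",
}
```

Partitioning this index over chunks of the input is, roughly, what lets the full-scale approach handle inputs larger than memory.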
18. Systematic Review of Software Behavioral Model Consistency Checking.
- Author
-
UL MURAM, FAIZ, TRAN, HUY, and ZDUN, UWE
- Subjects
- *
UNIFIED modeling language , *COMPUTER software development , *CONSISTENCY models (Computers) , *COMPUTER software research , *COMPUTER software development -- Management - Abstract
In software development, models are often used to represent multiple views of the same system. Such models need to be properly related to each other in order to provide a consistent description of the developed system. Models may contain contradictory system specifications, for instance, when they evolve independently. Therefore, it is crucial to ensure that models conform to each other. In this context, we focus on consistency checking of behavior models. Several techniques and approaches have been proposed in the existing literature to support behavioral model consistency checking. This article presents a Systematic Literature Review (SLR) that was carried out to obtain an overview of the various consistency concepts, problems, and solutions proposed regarding behavior models. In our study, the identification and selection of the primary studies was based on a well-planned search strategy. The search process identified a total of 1770 studies, out of which 96 have been thoroughly analyzed according to our predefined SLR protocol. The SLR aims to highlight the state of the art of software behavior model consistency checking and identify potential gaps for future research. Based on the research topics in the selected studies, we have identified seven main categories: targeted software models, types of consistency checking, consistency checking techniques, inconsistency handling, type of study and evaluation, automation support, and practical impact. The findings of the systematic review also reveal suggestions for future research, such as improving the quality of study design, conducting evaluations, and applying research outcomes in industrial settings. For this purpose, appropriate strategies for inconsistency handling, better tool support for consistency checking and/or development tool integration should be considered in future studies. [ABSTRACT FROM AUTHOR]
- Published
- 2018
19. FLEXIBLE SOFTWARE RELIABILITY GROWTH MODEL UNDER IMPERFECT DEBUGGING USING LEARNING FUNCTION.
- Author
-
Sharma, Dinesh K., Kumar, Deepak, and Sharma, Shubhra Gautam
- Subjects
COMPUTER software research ,COMPUTER files ,COMPUTER systems ,RELIABILITY in engineering ,ENGINEERING - Abstract
Software reliability is the probability that a system works failure-free for a given period under given conditions. In this paper, we propose a new Software Reliability Growth Model (SRGM) with imperfect debugging using a learning function. The model is validated on real software data sets and compared with existing SRGMs in the literature. [ABSTRACT FROM AUTHOR]
- Published
- 2016
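The abstract does not give the model's equations; the general shape of an SRGM with a learning effect can be illustrated with the classic inflection S-shaped model, in which the parameter beta acts as a learning factor (this is a stand-in, not the paper's formulation): m(t) = a(1 - e^(-bt)) / (1 + beta * e^(-bt)).

```python
# Inflection S-shaped SRGM sketch: a = total expected faults,
# b = detection rate, beta = learning (inflection) factor.
import math

def expected_failures(t, a, b, beta):
    """Mean number of failures observed by time t (NHPP mean value)."""
    return a * (1 - math.exp(-b * t)) / (1 + beta * math.exp(-b * t))

def reliability(t, dt, a, b, beta):
    """P(no failure in (t, t+dt]) for an NHPP model:
    R = exp(-(m(t+dt) - m(t)))."""
    return math.exp(-(expected_failures(t + dt, a, b, beta)
                      - expected_failures(t, a, b, beta)))
```

Fitting a, b and beta to observed failure counts, and adding imperfect-debugging terms, is where models of the kind the paper proposes diverge from this baseline.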
20. Legally Speaking Why Do Software Startups Patent (or Not)?
- Author
-
Samuelson, Pamela
- Subjects
- *
COMPUTER software research , *PATENTS , *COMPUTER software industry , *SURVEYS , *NEW business enterprises , *INTELLECTUAL property - Abstract
The article examines results stemming from the 2008 Berkeley Patent Survey, which sought to establish how many high technology entrepreneurs have sought patents for innovations embodied in their services and products. The author examines the economic benefits and hindrances associated with software patents and discusses why startup companies in this sector may choose to patent their products. It was discovered, among intellectual property rights, that copyrights, trademarks, secrecy, and the difficulties of reverse engineering outranked patents as a means of gaining a competitive advantage.
- Published
- 2010
21. LIGHTWEIGHT FISHEYE CAMERAS IN PHOTOGRAMMETRY.
- Author
-
Ostrowski, Wojciech
- Subjects
- *
PHOTOGRAMMETRY , *COMPUTER software research , *CAMERAS , *THREE-dimensional display systems , *TECHNOLOGY - Abstract
The development of photogrammetric methods and their integration with structure-from-motion technologies has significantly increased the ability to conduct photogrammetric studies with the use of non-professional cameras. In the previous year (2014) the option to use fisheye cameras was introduced in some commercial software programs, which allowed a broader use of popular, light sport cameras in 3D modelling. The goal of the study was to compare the results obtained with two most popular programs of this type together with a solution developed by the author before such functionalities first appeared in commercial software. [ABSTRACT FROM AUTHOR]
- Published
- 2015
22. New evaluation model by means of Mobile Technology.
- Author
-
Moldovan, Liviu
- Subjects
MOBILE learning ,VOCATIONAL education research ,OCCUPATIONAL training ,PEER review of students ,COMPUTER software research - Abstract
This paper presents the employment of mobile technology to turn the assessment process into a simpler and more learning-friendly one. A new learning and evaluation model is developed that comprises four evaluation-learning activities: pre-tests, learning, post-tests and comparison, allowing the instructor to organise teaching in many different ways. The four steps for employing the evaluation by means of the new Peer Learning Assessment Software ONE2ACT are described. Conclusions from the evaluation model and from testing the software summarize the impressions of trainees and instructors regarding technological and methodological aspects of the evaluation. Success indicators in terms of effectiveness, quality and completion for the software employment and assessment methodology are exemplified. [ABSTRACT FROM AUTHOR]
- Published
- 2015
23. Software solutions for the analysis of the innovation capability of the company.
- Author
-
Alexe, Cătălin-George and Alexe, Cătălina-Monica
- Subjects
INNOVATIONS in business ,COMPUTER software research ,INNOVATION management ,INFORMATION technology research ,BENCHMARKING (Management) - Abstract
This paper aims to provide a brief overview of IT solutions that support the analysis of the innovation capability of a firm, which at first sight are not so many and are generally grouped in the German-speaking area, where there are serious and visible concerns in this direction. In this regard, the solutions offered by IMP³rove, InnoScore, and TCW Innovationsaudit are mentioned. Software solutions dedicated to the analysis of innovation capabilities that have appeared in recent years come to eliminate these shortcomings by providing various management reports particularly useful for managing innovation in a company. Thus, a firm can perform internal and external benchmarking and, not least, data safety is improved. Many of the IT solutions identified are free online tools that can be useful for entrepreneurs and managers, with the possibility of comparing the analyzed company's results to other companies' performances in the European space. [ABSTRACT FROM AUTHOR]
- Published
- 2015
24. Software component clustering and classification using novel similarity measure.
- Author
-
Srinivas, Chintakindi, Radhakrishna, Vangipuram, and Rao, C. V.Guru
- Subjects
COMPUTER software research ,CLUSTER analysis (Statistics) ,SIMILARITY transformations ,GAUSSIAN function ,EUCLIDEAN algorithm ,COSINE function - Abstract
Similarity measures present in the literature, such as Euclidean, Jaccard, Cosine, and Manhattan, consider only the count of features but not the feature distribution and the degree of commonality. Significant research has been carried out on designing new similarity measures that can accurately find the similarity between any two software components. The distribution of component features in the software components contributes importantly to evaluating their degree of similarity; this is the key idea behind the design of the proposed measure. The main objective of this research is first to design an efficient similarity measure that essentially considers the distribution of features over the entire input. We then carry out the analysis for worst-case, average-case and best-case situations. The proposed measure is Gaussian-based, preserves the properties of the Gaussian function, and can be used for clustering and classification of software components. [ABSTRACT FROM AUTHOR]
- Published
- 2015
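One way to realize a Gaussian, distribution-aware similarity can be sketched as follows (an illustrative variant, not the paper's exact formula): compare feature-frequency vectors with a Gaussian kernel so components with close feature distributions score near 1.

```python
# Gaussian-kernel similarity over feature-frequency vectors, as a stand-in
# for the distribution-aware measure the abstract describes.
import math

def gaussian_similarity(f1, f2, sigma=1.0):
    """f1, f2: dicts mapping feature -> frequency in a component."""
    keys = set(f1) | set(f2)
    sq = sum((f1.get(k, 0) - f2.get(k, 0)) ** 2 for k in keys)
    return math.exp(-sq / (2 * sigma ** 2))

# Components described by how often each feature appears (invented data).
comp_a = {"parse": 3, "log": 1}
comp_b = {"parse": 3, "log": 2}
comp_c = {"render": 5}
```

Because the kernel value lies in (0, 1] and decays smoothly with distributional distance, it plugs directly into standard clustering and nearest-neighbor classification.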
25. The ECCOMAT program for the selection of ecological materials in order to ensure a healthy built environment.
- Author
-
Aciu, Claudiu, Manea, Daniela Lucia, and Striletchi, Cosmin
- Subjects
COMPUTER software research ,BUILT environment ,CASE studies ,HEALTH ,CONSTRUCTION industry - Abstract
The quality of the indoor environment is a determining factor for health, due to the fact that people spend most of their lives inside buildings. In the current context, the entire construction industry is confronted with particular priorities regarding the execution of sustainable buildings. Designers and constructors have become increasingly aware of the wide spectrum of issues that affect the environment and health, but they face a confusing number of possible actions and solutions, which makes the selection of materials difficult. The paper presents the calculation program developed on the basis of the ECCOMAT analysis method, a tool designed to offer users multiple possibilities for the management and analysis of building materials, helping them obtain the optimal solution in an easy and rapid way. The application field is the design and construction of buildings with ecological materials and the extension of research in order to obtain new ecological materials. [ABSTRACT FROM AUTHOR]
- Published
- 2015
26. Automated concept of potato sizing equipment.
- Author
-
Edu, Filip Vladimir, Cazangiu, Diana, and Csatlos, Carol
- Subjects
MACHINERY ,AUTOMATION ,COMPUTER software research ,RIGID body mechanics ,MACHINING - Abstract
The paper presents an automated concept for potato sizing equipment. The technological flow is represented by dimension charts created in the software CATIA v5 r18. Classical technological flows for potato sizing are rigid systems, but modern systems are flexible and have a modular construction. Modular construction has the advantage that when a component of the equipment breaks down, it can be removed without affecting the entire equipment. The role of automation is illustrated by the design of an original potato sizing machine, based on real geometrical dimensioning. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
27. Software implementation for evaluating human resources in the legal department within companies in technical field.
- Author
-
Adrian, Mocanu, Mihai, Birsan, and Angela, Repanovici
- Subjects
CORPORATE legal departments ,PERSONNEL management ,COMPUTER software research ,ORGANIZATIONAL behavior research ,ENGINEERING management - Abstract
The paper deals with the very particular issue of legal departments functioning within companies working in technical fields. Besides engineering and management knowledge, these companies must incorporate strong juridical knowledge into their activity. The authors propose a systemic perspective based on the concept of evaluating human resources as the foundation of the company's development. The originality of the work described here consists of the implementation of a computer program called MPS J (Managementul Personalului din Sistemul Juridic - Legal Personnel Management System) that can be applied to the legal departments of companies in order to determine the role of legal and managerial skills in continuous improvement. [ABSTRACT FROM AUTHOR]
- Published
- 2015
28. SCADA simulation of a distributed generation system with storage technologies.
- Author
-
Dulău, Lucian loan, Abrudean, Mihail, and Bică, Dorin
- Subjects
SIMULATION methods & models ,ELECTRIC power production research ,ELECTRIC power distribution ,ENERGY storage ,COMPUTER software research - Abstract
This paper describes the simulation of a distributed generation system with storage technologies. The simulation is performed using SCADA software for a distributed generation system, considering the generation cost, the load, and the availability of the system's generating units. A storage unit is also added to the system, and an overview of storage technologies is presented. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
29. Towards a Practical Security Analysis Methodology.
- Author
-
van den Berghe, Alexander
- Subjects
COMPUTER software research ,INVESTMENT analysis ,HETEROGENEOUS computing ,MODEL-integrated computing ,SOFTWARE frameworks - Abstract
The research community has proposed numerous techniques to perform security-oriented analyses based on a software design model. Such a formal analysis can provide precise security guarantees to the software designer, and facilitate the discovery of subtle flaws. Nevertheless, using such techniques in practice poses a big challenge for the average software designer, due to the narrow scope of each technique, the heterogeneous set of modelling languages that are required, and the analysis results that are often hard to interpret. Within the course of our research, we intend to provide practitioners with an integrated, easy-to-use modelling and analysis environment that enables them to work on a broad range of common security concerns without leaving the software design's level of abstraction. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
30. Towards Generation of Software Development Tasks.
- Author
-
Thompson, C. Albert
- Subjects
COMPUTER software research ,DEVELOPMENT of application software ,COMPUTER software developers ,ARTIFICIAL intelligence ,DATA mining - Abstract
The presence of well-defined, fine-grained sub-tasks is important to the development process: having a fine-grained task context has been shown to allow developers to resume work more efficiently. However, determining how to break a high-level task down into sub-tasks is not always straightforward. Sometimes developers lack experience, and at other times the task definition is not clear enough to afford confident decomposition. In my research I intend to show that, by using syntactic mining of past task descriptions and their decomposition, I can provide automatically derived sub-task suggestions that afford more confident task decomposition by developers. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
31. Analysis of Android Inter-App Security Vulnerabilities Using COVERT.
- Author
-
Sadeghi, Alireza, Bagheri, Hamid, and Malek, Sam
- Subjects
COMPUTER software research ,AUTOMATION ,SOFTWARE engineering ,SOFTWARE engineers - Abstract
The state of the art in securing mobile software systems is substantially intended to detect and mitigate vulnerabilities in a single app, but fails to identify vulnerabilities that arise from the interaction of multiple apps, such as collusion attacks and privilege escalation chaining, shown to be quite common in apps on the market. This paper demonstrates COVERT, a novel approach and accompanying tool-suite that relies on a hybrid static analysis and lightweight formal analysis technique to enable compositional security assessment of complex software. Through static analysis of Android application packages, it extracts relevant security specifications in an analyzable formal specification language and checks them as a whole for inter-app vulnerabilities. To our knowledge, COVERT is the first formally-precise analysis tool for automated compositional analysis of Android apps. Our study of hundreds of Android apps revealed dozens of inter-app vulnerabilities, many of which were previously unknown. A video highlighting the main features of the tool can be found at: http://youtu.be/bMKk7OW7dGg. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
32. StressCloud: A Tool for Analysing Performance and Energy Consumption of Cloud Applications.
- Author
-
Feifei Chen, John Grundy, Schneider, Jean-Guy, Yun Yang, and Qiang He
- Subjects
COMPUTER software research ,SOFTWARE engineering ,ENERGY consumption ,CLOUD computing ,AUTOMATION ,SOFTWARE engineers - Abstract
Finding the best deployment configuration that maximises energy efficiency while guaranteeing system performance of cloud applications is an extremely challenging task. It requires the evaluation of system performance and energy consumption under a wide variety of realistic workloads and deployment configurations. This paper demonstrates StressCloud, an automatic performance and energy consumption analysis tool for cloud applications in real-world cloud environments. StressCloud supports 1) the modelling of realistic cloud application workloads, 2) the automatic generation and running of load tests, and 3) the profiling of system performance and energy consumption. A demonstration video can be accessed at: https://www.youtube.com/watch?v=0l4_a_CNtVQ [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
33. StriSynth: Synthesis for Live Programming.
- Author
-
Gulwani, Sumit, Mayer, Mikaël, Niksic, Filip, and Piskac, Ruzica
- Subjects
COMPUTER software research ,COMPUTER files ,APPLICATION software ,END users (Information technology) ,AUTOMATION ,SOFTWARE engineering - Abstract
Motivated by applications in automating repetitive file manipulations, we present a tool called StriSynth, which allows end-users to perform transformations over data using examples. Based on the provided examples, our tool automatically generates scripts for non-trivial file manipulations. Although the current focus of StriSynth is file manipulations, it implements a more general string transformation framework. This framework builds on and further extends the functionality of Flash Fill, a Microsoft Excel extension for string transformations. An accompanying video for this paper is available at the following website: http://youtu.be/kkDZphqIdFM. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
34. MU-MMINT: an IDE for Model Uncertainty.
- Author
-
Famelis, Michalis, Ben-David, Naama, Di Sandro, Alessio, Salay, Rick, and Chechik, Marsha
- Subjects
UNCERTAINTY ,SOFTWARE engineering ,COMPUTER software research ,COMPUTER systems ,SOFTWARE engineers - Abstract
Developers have to work with ever-present design-time uncertainty, i.e., uncertainty about selecting among alternative design decisions. However, existing tools do not support working in the presence of uncertainty, forcing developers to either make provisional, premature decisions, or to avoid using the tools altogether until uncertainty is resolved. In this paper, we present a tool, called MU-MMINT, that allows developers to express their uncertainty within software artifacts and perform a variety of model management tasks such as reasoning, transformation and refinement in an interactive environment. In turn, this allows developers to defer the resolution of uncertainty, thus avoiding having to undo provisional decisions. See the companion video: http://youtu.be/kAWUm-iFatM [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
35. New Initiative: The Naturalness of Software.
- Author
-
Devanbu, Premkumar
- Subjects
COMPUTER software research ,GRANTS (Money) - Abstract
This paper describes a new research consortium, studying the Naturalness of Software. This initiative is supported by a pair of grants by the US National Science Foundation, totaling $2,600,000: the first, exploratory ("EAGER") grant of $600,000 helped kickstart an inter-disciplinary effort, and demonstrate feasibility; a follow-on full grant of $2,000,000 was recently awarded. The initiative is led by the author, who is at UC Davis, and includes investigators from Iowa State University and Carnegie-Mellon University (Language Technologies Institute). [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
36. Automatic and Continuous Software Architecture Validation.
- Author
-
Goldstein, Maayan and Segall, Itai
- Subjects
COMPUTER software research ,COMPUTER software development ,COMPUTER architecture ,SOFTWARE architecture ,COMPUTER engineering - Abstract
Software systems tend to suffer from architectural problems as they are being developed. While modern software development methodologies such as Agile and Dev-Ops suggest different ways of assuring code quality, very little attention is paid to maintaining high quality of the architecture of the evolving systems. By detecting and alerting about violations of the intended software architecture, one can often avoid code-level bad smells such as spaghetti code. Typically, if one wants to reason about the software architecture, the burden of first defining the intended architecture falls on the developer's shoulders. This includes definition of valid and invalid dependencies between software components. However, the developers are seldom familiar with the entire software system, which makes this task difficult, time consuming and error-prone. We propose and implement a solution for automatic detection of architectural violations in software artifacts. The solution, which utilizes a number of predefined and user-defined patterns, does not require prior knowledge of the system or its intended architecture. We propose to leverage this solution as part of the nightly build process used by development teams, thus achieving continuous automatic validation of the system's software architecture. As we show in multiple open-source and proprietary cases, a small set of predefined patterns can detect architectural violations as they are introduced over the course of development, and also capture deterioration in existing architectural problems. By evaluating the tool on relatively large open-source projects, we also validate its scalability and practical applicability to large software systems. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
37. Measuring Dependency Freshness in Software Systems.
- Author
-
Cox, Jöel, Bouwers, Eric, van Eekelen, Marko, and Visser, Joost
- Subjects
COMPUTER software research ,SYSTEM analysis software ,SOFTWARE measurement ,COMPUTER files ,COMPUTER systems - Abstract
Modern software systems often make use of third-party components to speed up development and reduce maintenance costs. In return, developers need to update to new releases of these dependencies to avoid, for example, security and compatibility risks. In practice, prioritizing these updates is difficult because the use of outdated dependencies is often opaque. In this paper we aim to make this concept more transparent by introducing metrics to quantify the use of recent versions of dependencies, i.e. the system's "dependency freshness". We propose and investigate a system-level metric based on an industry benchmark. We validate the usefulness of the metric using interviews, analyze the variance of the metric through time, and investigate the relationship between outdated dependencies and security vulnerabilities. The results show that the measurements are considered useful, and that systems using outdated dependencies are four times as likely to have security vulnerabilities. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
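The "dependency freshness" idea in entry 37 can be illustrated with a toy per-dependency measure. The function below counts how many releases the used version lags behind the newest one; the paper's actual metric aggregates version-distance values system-wide against an industry benchmark, and `release_history` here is a hypothetical input:

```python
def releases_behind(used_version, release_history):
    # release_history is ordered oldest -> newest; the number of entries
    # after the version in use is a crude per-dependency freshness proxy.
    idx = release_history.index(used_version)
    return len(release_history) - 1 - idx

history = ["1.0", "1.1", "2.0", "2.1"]
print(releases_behind("1.1", history))  # -> 2 (two newer releases exist)
```

A system-level score would then combine such values over all dependencies, which is where the benchmark-based normalization described in the abstract comes in.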
38. Striving for Failure: An Industrial Case Study About Test Failure Prediction.
- Author
-
Anderson, Jeff, Salem, Saeed, and Hyunsook Do
- Subjects
COMPUTER software research ,REGRESSION analysis ,MICROSOFT software ,COMPUTER files ,COMPUTER systems - Abstract
Software regression testing is an important, yet very costly, part of most major software projects. When regression tests run, any failures that are found help catch bugs early and smooth future development work. The act of executing large numbers of tests takes significant resources that could otherwise be applied elsewhere. If tests could be accurately classified as likely to pass or fail prior to a run, it could save significant time while maintaining the benefits of early bug detection. In this paper, we present a case study on building a classifier for regression tests based on industrial software, Microsoft Dynamics AX. In this study, we examine the effectiveness of this classification as well as which aspects of the software are most important in predicting regression test failures. [ABSTRACT FROM AUTHOR]
- Published
- 2015
- Full Text
- View/download PDF
39. Interference calculation in asynchronous random access protocols using diversity.
- Author
-
Meloni, Alessio and Murroni, Maurizio
- Subjects
MULTIPLE access protocols (Computer network protocols) ,MULTIPLEXING ,COMPUTER software research ,INTERFERENCE (Telecommunication) ,CONTENTION resolution protocols (Computer network protocols) - Abstract
The use of Aloha-based random access protocols is of interest when channel sensing is either not possible or not convenient and the traffic from terminals is unpredictable and sporadic. In this paper an analytic model for packet interference calculation in asynchronous random access protocols using diversity is presented. The aim is to provide a tool that avoids time-consuming simulations when evaluating packet loss and throughput in cases where decodability is still possible as long as a certain interference threshold is not exceeded. Moreover, the same model serves as the foundation for further studies in which iterative interference cancellation is applied to received frames. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
40. Control Charts with Variable Dimension for Linear Combination of Poisson Variables.
- Author
-
García‐Bustos, Sandra, Mite, Mónica, and Vera, Francisco
- Subjects
- *
QUALITY control charts , *POISSON processes , *MARKOV processes , *GENETIC algorithms , *COMPUTER software research - Abstract
This article analyzes the simultaneous control of several correlated Poisson variables by using the Variable Dimension Linear Combination of Poisson Variables (VDLCP) control chart, a variable-dimension version of the LCP chart. This control chart uses as its test statistic the linear combination of correlated Poisson variables in an adaptive way, i.e. it monitors either p1 or p variables (p1 < p) depending on the last value of the statistic. To analyze the performance of this chart, we have developed software that finds the best parameters, optimizing the out-of-control average run length (ARL) for a shift that the practitioner wishes to detect as quickly as possible, subject to a fixed value of the in-control ARL. Markov chains and genetic algorithms were used in developing this software. The results show improved performance compared to the LCP chart. Copyright © 2015 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
41. Protecting Software through Obfuscation: Can It Keep Pace with Progress in Code Analysis?
- Author
-
SCHRITTWIESER, SEBASTIAN, KATZENBEISSER, STEFAN, KINDER, JOHANNES, MERZDOVNIK, GEORG, and WEIPPL, EDGAR
- Subjects
- *
COMPUTER software research , *COMPUTER programming , *DATA protection , *MALWARE , *COMPUTER software development - Abstract
Software obfuscation has always been a controversially discussed research area. While theoretical results indicate that provably secure obfuscation in general is impossible, its widespread application in malware and commercial software shows that it is nevertheless popular in practice. Still, it remains largely unexplored to what extent today's software obfuscations keep up with state-of-the-art code analysis and where we stand in the arms race between software developers and code analysts. The main goal of this survey is to analyze the effectiveness of different classes of software obfuscation against the continuously improving deobfuscation techniques and off-the-shelf code analysis tools. The answer very much depends on the goals of the analyst and the available resources. On the one hand, many forms of lightweight static analysis have difficulties with even basic obfuscation schemes, which explains the unbroken popularity of obfuscation among malware writers. On the other hand, more expensive analysis techniques, in particular when used interactively by a human analyst, can easily defeat many obfuscations. As a result, software obfuscation for the purpose of intellectual property protection remains highly challenging. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
42. A Scalable, Non-Parametric Method for Detecting Performance Anomaly in Large Scale Computing.
- Author
-
Yu, Li and Lan, Zhiling
- Subjects
- *
LARGE scale systems , *COMPUTER systems , *ERRORS , *SYSTEM analysis , *COMPUTER software research - Abstract
As computer systems continue to grow in scale and complexity, performance problems become common and a major concern for large-scale computing. Performance anomalies caused by application bugs, hardware or software faults, or resource contention can have great impact on system-wide performance and could lead to significant economic losses for service providers. While many detection methods have been presented in the past, the newly emerging challenges are detection scalability and practical use. In this paper, we propose a scalable, non-parametric method for effectively detecting performance anomalies in large-scale systems. The design is generic for anomaly detection in a variety of parallel and distributed systems exhibiting peer-comparable property. It adopts a divide-and-conquer approach to address the scalability challenge and explores the use of non-parametric clustering and two-phase majority voting to improve detection flexibility and accuracy. We derive probabilistic models to quantitatively evaluate our decentralized design. Experiments with a suite of applications on production systems demonstrate that this method outperforms existing methods in terms of detection accuracy with a negligible runtime overhead. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
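The peer-comparison idea in entry 42 (nodes doing similar work should behave similarly) can be sketched with a simple non-parametric outlier test. The MAD-based check below is a brief stand-in chosen for illustration, not the paper's clustering and two-phase majority-voting scheme:

```python
import statistics

def flag_anomalies(values, threshold=3.0):
    # Flag peers whose metric deviates from the group median by more than
    # `threshold` median absolute deviations (MAD): a non-parametric test
    # that needs no assumption about the metric's distribution.
    med = statistics.median(values)
    mad = statistics.median(abs(v - med) for v in values)
    if mad == 0:
        return [False] * len(values)  # all peers identical: nothing to flag
    return [abs(v - med) / mad > threshold for v in values]

# Five peer nodes reporting a performance metric; the last one misbehaves.
readings = [1.01, 0.99, 1.02, 0.98, 5.7]
print(flag_anomalies(readings))  # -> [False, False, False, False, True]
```

The divide-and-conquer aspect of the paper would apply such a peer test within subgroups of the system rather than globally, which is what makes the approach scalable.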
43. Automated brain volumetrics in multiple sclerosis: a step closer to clinical application.
- Author
-
Wang, C., Beadnall, H. N., Hatton, S. N., Bader, G., Tomic, D., Silva, D. G., and Barnett, M. H.
- Subjects
MULTIPLE sclerosis research ,BRAIN imaging ,ALGORITHM research ,REGRESSION analysis ,COMPUTER software research - Abstract
Background: Whole brain volume (WBV) estimates in patients with multiple sclerosis (MS) correlate more robustly with clinical disability than traditional, lesion-based metrics. Numerous algorithms to measure WBV have been developed over the past two decades. We compare Structural Image Evaluation using Normalisation of Atrophy Cross-sectional (SIENAX) to NeuroQuant and MSmetrix for assessment of cross-sectional WBV in patients with MS. Methods: MRIs from 61 patients with relapsing-remitting MS and 2 patients with clinically isolated syndrome were analysed. WBV measurements were calculated using SIENAX, NeuroQuant and MSmetrix. Statistical agreement between the methods was evaluated using linear regression and Bland-Altman plots. Precision and accuracy of WBV measurement were calculated for (1) NeuroQuant versus SIENAX and (2) MSmetrix versus SIENAX. Results: Precision (Pearson's r) of WBV estimation for NeuroQuant and MSmetrix versus SIENAX was 0.983 and 0.992, respectively. Accuracy (Cb) was 0.871 and 0.994, respectively. NeuroQuant and MSmetrix showed a 5.5% and 1.0% volume difference compared with SIENAX, respectively, that was consistent across low and high values. Conclusions: In the analysed population, NeuroQuant and MSmetrix both quantified cross-sectional WBV with comparable statistical agreement to SIENAX, a well-validated cross-sectional tool that has been used extensively in MS clinical studies. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
44. Collaboration in the consulting industry.
- Author
-
Martensen, Malte, Ryschka, Stephanie, Blesik, Till, and Bick, Markus
- Subjects
COMPUTER software research ,TECHNOLOGICAL innovations ,JOB performance ,WORK environment ,STAKEHOLDERS - Abstract
Purpose – By studying the drivers of social collaboration, this paper describes how, and for what job-related purposes, social software is employed in the digital workplace. Focussing on consultants, who are considered to be part of a knowledge-intensive and innovative industry, factors that may influence the adoption of professional social software are explored. In addition, insights about socio-demographic differences as well as distinct consulting segments and use categories are provided. Design/methodology/approach – The Unified Theory of Acceptance and Use of Technologies (UTAUT) is the theoretical backbone of this research. The UTAUT model is expanded to fit the research goals, and the results from a quantitative study (n=341) are used to test the model. Findings – The results suggest that the adoption of social software is associated with the expectation that one’s work performance will improve. There are significant differences regarding age and gender in the use of social software for job-related purposes. Practical implications – Using the results of the study, social software suites can be tailored to users’ needs and preferences, which, in turn, may lead to higher levels of acceptance and intensity of use. Originality/value – Social software is already widely adopted for private purposes, and it is being used more and more within the digital workplace, too. However, little research has been conducted into how, and for what job-related purposes, social software is employed, or into the potential drivers for its adoption. The stakeholders in the research include scholars and practitioners alike. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
45. Performance of e-ASPECTS software in comparison to that of stroke physicians on assessing CT scans of acute ischemic stroke patients.
- Author
-
Herweh, Christian, Ringleb, Peter A., Rauch, Geraldine, Gerry, Steven, Behrens, Lars, Möhlenbruch, Markus, Gottorf, Rebecca, Richter, Daniel, Schieber, Simon, and Nagel, Simon
- Subjects
- *
COMPUTER software research , *STROKE , *CEREBROVASCULAR disease , *PHYSICIANS , *MEDICAL personnel - Abstract
Background: The Alberta Stroke Program Early CT Score (ASPECTS) is an established 10-point quantitative topographic computed tomography scan score to assess early ischemic changes. We compared the performance of the e-ASPECTS software with those of stroke physicians at different professional levels. Methods: The baseline computed tomography scans of acute stroke patients, in whom computed tomography and diffusion-weighted imaging scans were obtained less than two hours apart, were retrospectively scored by e-ASPECTS as well as by three stroke experts and three neurology trainees blinded to any clinical information. The ground truth was defined as the ASPECTS on diffusion-weighted imaging scored on a consensus basis by another two non-blinded independent experts. Sensitivity and specificity in an ASPECTS region-based and an ASPECTS score-based analysis, as well as receiver-operating characteristic curves, Bland-Altman plots with mean score error, and Matthews correlation coefficients were calculated. Comparisons were made between the human scorers and e-ASPECTS, with diffusion-weighted imaging being the ground truth. Two methods for clustered data were used to estimate sensitivity and specificity in the region-based analysis. Results: In total, 34 patients were included and 680 (34 × 20) ASPECTS regions were scored. Mean time from onset to computed tomography was 172 ± 135 min and the mean time difference between computed tomography and magnetic resonance imaging was 41 ± 31 min. The region-based sensitivity (46.46% [CI: 30.8;62.1]) of e-ASPECTS was better than that of three trainees and one expert (p < 0.01) and not statistically different from another two experts. Specificity (94.15% [CI: 91.7;96.6]) was lower than that of one expert and one trainee (p<0.01) and not statistically different from the other four physicians. e-ASPECTS had the best Matthews correlation coefficient of 0.44 (experts: 0.38 ± 0.08 and trainees: 0.19 ± 0.05) and the lowest mean score error of 0.56 (experts: 1.44 ± 1.79 and trainees: 1.97 ± 2.12). Conclusion: e-ASPECTS showed a performance similar to that of stroke experts in the assessment of brain computed tomography scans of acute ischemic stroke patients with the Alberta Stroke Program Early CT Score method. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
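Entry 45 reports scorer performance as a Matthews correlation coefficient (MCC). For readers unfamiliar with the measure, the standard definition from a 2×2 confusion matrix is sketched below; the counts are made-up illustrative numbers, not the study's data:

```python
import math

def mcc(tp, tn, fp, fn):
    # Matthews correlation coefficient over true/false positives/negatives:
    # +1 is perfect agreement, 0 is chance level, -1 is total disagreement.
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0  # convention: 0 when any margin is empty

print(round(mcc(tp=6, tn=3, fp=1, fn=2), 2))  # -> 0.48
```

Unlike raw accuracy, MCC stays informative when the classes are imbalanced, which matters here because far more ASPECTS regions are normal than ischemic.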
46. C-strider: type-aware heap traversal for C.
- Author
-
Saur, Karla, Hicks, Michael, and Foster, Jeffrey S.
- Subjects
COMPUTER software research ,COMPUTER files ,COMPUTER systems ,COMPUTER programming ,COMPUTER algorithms - Abstract
Researchers have proposed many tools and techniques that work by traversing the heap, including checkpointing systems, heap profilers, heap assertion checkers, and dynamic software updating systems. Yet building a heap traversal for C remains difficult, and to our knowledge, extant services have used their own application-specific traversals. This paper presents C-strider, a framework for writing C heap traversals and transformations. Writing a basic C-strider service requires implementing only four callbacks; C-strider then generates a program-specific traversal that invokes the callbacks as each heap location is visited. Critically, C-strider is type aware - it tracks types as it walks the heap, so every callback is supplied with the exact type of the associated location. We used C-strider to implement heap serialization, dynamic software updating, heap checking, and profiling, and then applied the resulting traversals to several programs. We found that C-strider requires little programmer effort, and the resulting services are efficient and effective. Copyright © 2015 John Wiley & Sons, Ltd. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
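C-strider itself generates C traversals from four user-supplied callbacks. As a loose, language-shifted analogy only, the toy Python walker below invokes a single callback with the runtime type at every location of a nested value; none of this reflects C-strider's actual API or its C type machinery:

```python
def traverse(obj, visit, path="root"):
    # Visit every location of a nested value, reporting its path and the
    # runtime type name, then recurse into containers: a miniature,
    # type-aware heap walk in the spirit (only) of C-strider.
    visit(path, type(obj).__name__, obj)
    if isinstance(obj, dict):
        for k, v in obj.items():
            traverse(v, visit, f"{path}.{k}")
    elif isinstance(obj, (list, tuple)):
        for i, v in enumerate(obj):
            traverse(v, visit, f"{path}[{i}]")

seen = []
traverse({"a": [1, 2]}, lambda p, t, v: seen.append((p, t)))
print(seen)
```

Services such as serialization or heap checking then differ only in what the callback does at each visited location, which is the framework design the abstract describes.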
47. Parallel image computation in clusters with task-distributor.
- Author
-
Baun, Christian
- Subjects
- *
COMPUTER workstation clusters , *PERFORMANCE research , *RAY tracing , *MESSAGE passing (Computer science) , *COMPUTER software research - Abstract
Distributed systems, especially clusters, can be used to execute ray tracing tasks in parallel to speed up image computation. Because ray tracing is a computationally expensive and memory-consuming task, it can also be used to benchmark clusters. This paper introduces task-distributor, a free software solution for the parallel execution of ray tracing tasks in distributed systems. The ray tracing solution used for this work is the Persistence Of Vision Raytracer (POV-Ray). In contrast to various other projects, task-distributor does not require any modification of the POV-Ray source code or the installation of an additional message passing library such as the Message Passing Interface or Parallel Virtual Machine to allow parallel image computation. By analyzing the runtime of the sequential and parallel program parts of task-distributor, it becomes clear how the problem size and the available hardware resources influence the scaling of the parallel application. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
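Entry 47's observation that the sequential program parts limit scaling is the classic Amdahl's-law effect. The sketch below is the textbook formula, not the paper's measured model, and `serial_fraction` is an assumed input:

```python
def amdahl_speedup(serial_fraction, workers):
    # Textbook Amdahl's law: overall speedup is capped by the serial
    # fraction of the work, no matter how many workers render in parallel.
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / workers)

# With 10% serial work, 16 workers yield well under a 16x speedup:
print(round(amdahl_speedup(0.10, 16), 2))  # -> 6.4
```

Even as `workers` grows without bound, the speedup here approaches only 1/0.10 = 10x, which is why measuring the sequential part of task-distributor is the key to understanding its scaling.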
48. Modeling of a Cylindrical Shell and Helical Tube Condenser of HFC-134a.
- Author
-
Dabas, J. K., Kumar, Sudhir, Dodeja, A. K., and Kasana, K. S.
- Subjects
- *
CYLINDRICAL shells , *CONDENSERS (Vapors & gases) , *HEAT transfer , *COMPUTER software research , *PRESSURE drop (Fluid dynamics) - Abstract
A computer program was developed for the performance analysis and design optimization of a cylindrical shell and helical tube type HFC134a condenser and its predicted results were verified against the experimentally determined data. The computer model is based on a numerical method of cell discretization. The local values of variables like heat transfer rate, pressure drop, and the properties of refrigerant are calculated on the basis of appropriate theoretical and empirical correlations available in the literature and the mass, momentum, and energy balance is applied to each cell. The whole sequential and iterative procedure to satisfy the boundary conditions of each cell and of the whole condenser has been transformed into a computer program written in C++. This computer model was used in a parametric study to analyze the effects of varying the input parameters of both fluids on the performance of the condenser. It gives the optimal values of refrigerant mass velocity and of the tube diameter against the available conditions of external cooling fluid, mass flow rate of refrigerant, and degree of subcooling of the refrigerant at the condenser outlet. Its utility was found in the performance optimization of an existing condenser as well as in the design optimization of a new condenser. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
49. Software requirements for the control systems according to the level of functional safety.
- Author
-
Gabriška, D.
- Subjects
- *
COMPUTER software research , *AUTOMATIC control systems , *SECURITY systems , *SOFTWARE architecture , *COMPUTER software sales & prices - Abstract
The article describes the main requirements for the development of control-system software subsystems. Standard IEC 61508-3 covers all stages of the life cycle of safety systems, including E/E/PE safety-related systems, from initial concept, design, and implementation to operation and maintenance. In this paper we analyze the requirements set out for drafting a software architecture that is consistent with the hardware architecture while meeting the specified requirements for software safety. [ABSTRACT FROM AUTHOR]
- Published
- 2016
- Full Text
- View/download PDF
50. Achieving One Billion Key-Value Requests per Second on a Single Server.
- Author
-
Li, Sheng, Lim, Hyeontaek, Lee, Victor W., Ahn, Jung Ho, Kalia, Anuj, Kaminsky, Michael, Andersen, David G., O, Seongil, Lee, Sukhan, and Dubey, Pradeep
- Subjects
- *
CLOUD computing , *BIG data , *COMPUTER software research , *COMPUTER simulation , *COMPUTER input-output equipment - Abstract
Distributed in-memory key-value stores (KVSs) have become a critical data-serving layer in cloud computing and big data infrastructure. Unfortunately, KVSs have demonstrated a gap between achieved and available performance, QoS, and energy efficiency on commodity platforms. Two research thrusts have focused on improving key-value performance: hardware-centric research has started to explore specialized platforms for KVSs, and software-centric research revisited the KVS application to address fundamental software bottlenecks. Unlike prior research focusing on hardware or software in isolation, the authors aimed to architect high-performance and efficient KVS platforms across the full stack, from software through hardware. Their full-system characterization identifies the critical hardware/software ingredients for high-performance KVS systems and suggests optimizations to achieve record-setting performance and energy efficiency: 120~167 million requests per second (RPS) on a single commodity server. They propose a future many-core platform and, via detailed simulations, demonstrate the capability of achieving a billion RPS with a single server platform. [ABSTRACT FROM PUBLISHER]
- Published
- 2016
- Full Text
- View/download PDF