1,327 results
Search Results
2. Transforming Medical Imaging: The First SCAR TRIP™ Conference: A Position Paper from the SCAR TRIP™ Subcommittee of the SCAR Research and Development Committee
- Author
-
Katherine P. Andriole and Richard L. Morin
- Subjects
Breakout ,Imaging informatics ,Radiological and Ultrasound Technology ,business.industry ,Computer science ,Computer Applications ,Usability ,Translational research ,Data science ,Article ,Computer Science Applications ,Medical imaging ,System integration ,Position paper ,Radiology, Nuclear Medicine and imaging ,Engineering ethics ,business - Abstract
The First Society for Computer Applications in Radiology (SCAR) Transforming the Radiological Interpretation Process (TRIP™) Conference and Workshop, “Transforming Medical Imaging” was held on January 31–February 1, 2005 in Bethesda, MD. Representatives from all areas of medical and scientific imaging—academia, research, industry, and government agencies—joined together to discuss the future of medical imaging and potential new ways to manage the explosion in numbers, size, and complexity of images generated by today's continually advancing imaging technologies. The two-day conference included plenary, scientific poster, and breakout sessions covering six major research areas related to TRIP™. These topic areas included human perception, image processing and computer-aided detection, data visualization, image set navigation and usability, databases and systems integration, and methodology evaluation and performance validation. The plenary presentations provided a general status review of each broad research field to use as a starting point for discussion in the breakout sessions, with emphasis on specific topics requiring further study. The goals for the breakout sessions were to define specific research questions in each topic area, to list the impediments to carrying out research in these fields, to suggest possible solutions and near- and distant-future directions for each general topic, and to report back to the general session. The scientific poster session provided another mechanism for presenting and discussing TRIP™-related research. This report summarizes each plenary and breakout session, and describes the group recommendations as to the issues facing the field, major impediments to progress, and the outlook for radiology in the short and long term. 
The conference helped refine the definition of the SCAR TRIP™ Initiative and the problems facing radiology with respect to the dramatic growth in medical imaging data, and it underscored a present and future need for the support of interdisciplinary translational research in radiology bridging bench-to-bedside. SCAR will continue to fund research grants exploring TRIP™ solutions. In addition, the organization proposes providing an infrastructure to foster collaborative research partnerships between SCAR corporate and academic members in the form of a TRIP™ Imaging Informatics Network (TRIPI2N).
- Published
- 2006
3. Introduction to the Paper by Seong K. Mun, Ph.D., et al, 'Experience with Image Management Networks at Three Universities: Is the Cup Half-empty or Half-full?'
- Author
-
Seong Ki Mun
- Subjects
Telemedicine ,Radiological and Ultrasound Technology ,Workstation ,Process (engineering) ,Computer science ,business.industry ,Network security ,Mature technology ,Information technology ,Network topology ,Data science ,Article ,Computer Science Applications ,law.invention ,Picture archiving and communication system ,law ,Radiology, Nuclear Medicine and imaging ,business - Abstract
OUR RESEARCH APPROACH to the development of a picture archiving and communication system (PACS) has been at the system level. While component technology was developed by various investigators, we felt that our unique role was to look at the PACS from the perspective of network and clinical operations as well as the management of inserting technology into a complex environment. One of the first questions we tried to address was, “How should one describe PACS?” The paper reprinted here was an early attempt to describe PACS at an operational level without being limited to specific component technology, as we knew the technology would eventually change. This approach has laid the foundation for the performance specifications of the Medical Diagnostic Imaging System that the Department of Defense adopted in the 1990s. During that decade many new efforts were directed toward developing quantitative requirements for PACS performance, especially to meet the needs of radiologists. Whereas workstation performance and network topologies were topics of intense discussion and development in various quarters, our focus remained on system-level performance and technology deployment. Today PACS is a mature technology. It is difficult to know if any aspects of PACS development influenced the advances in generic computer and information technology. It is reasonable to assume, however, that the PACS initiative has had a significant impact on advancing imaging and image-processing technologies. Certainly the PACS efforts in the radiology community have laid the foundation for filmless electronic hospitals and telemedicine. Many major technical issues have been resolved, but as applications expand and the landscape of usage changes, new issues arise. The questions of network topology, workstation performance, image quality on electronic displays, digital radiography, and clinical acceptance are no longer challenges for PACS today.
But the integration of PACS with the radiology information system and other enterprise-wide information and imaging systems continues to pose implementation difficulties. Network security, patient privacy, and health information assurance are a few of the new requirements that the PACS community must address. Through participation in the evolution of PACS, we have experienced firsthand the old lessons associated with the adoption of new technology. This process must begin with proof of the merit of the technology; in addition, it requires overcoming entrenched habits and self-interest associated with preserving old technology and old work rules. For PACS, the process of maturation has taken more than 10 years. It was successful because the technology solved difficult problems of managing large amounts of complex data for many different stakeholders. The next challenges may be the development of new research programs and patient care capabilities by accessing the vast image databases that are accumulating at PACS hospitals.
- Published
- 2003
4. Introduction to Paper by Sridhar B. Seshadri, MSEE, MBA, et al, 'Prototype Medical Image Management System (MIMS) at the University of Pennsylvania: Software Design Considerations'
- Author
-
Satjeet Khalsa, Inna Brikman, Sridhar B. Seshadri, Frans van der Voorde, and Ronald Arenson
- Subjects
Radiological and Ultrasound Technology ,Standardization ,business.industry ,Computer science ,Information technology ,Article ,Computer Science Applications ,DICOM ,Engineering management ,Software ,Workflow ,Management system ,Component-based software engineering ,Software design ,Radiology, Nuclear Medicine and imaging ,business - Abstract
IT IS ILLUMINATING and perhaps a little humbling to review a paper written over 15 years ago and compare it to the state-of-the-art in PACS today. Clearly, there have been astounding improvements in the core technology: increased processing power, higher communications bandwidth, cost-effective storage capacity, and superb display technologies. However, the authors’ view (in 1987) that the industry (vendors and customers) needs to focus more on software and systems issues still rings true. Standardization of hardware and software components has come a long way with the digital imaging and communications in medicine (DICOM) and Integrating the Healthcare Enterprise (IHE) standards; without these yeoman efforts, PACS would still be in the dark ages. Also, PACS appears to have “graduated” from being a departmental solution to becoming more and more integrated into the mainstream clinical information systems from information technology providers. Despite these great achievements, I think the industry needs to invest more thought and effort into unleashing the power of PACS with revolutionary workflow and process-change around the technology that will help users realize greater benefits. Interestingly, Louis Gerstner Jr., in an interview about his new book, Who Says Elephants Can’t Dance? says (about the computer industry) “. . . the process of integrating this technology and achieving the benefit is unbelievably painful for companies. The industry has been all about faster, faster, more function, more function. . . .” It appears, at least in this regard, that PACS shares the challenges of the rest of the computer industry!
- Published
- 2003
5. Introduction to Paper by G. James Blaine, D.Sc. et al, 'PACS Workbench at Mallinckrodt Institute of Radiology—1983'
- Author
-
G. James Blaine
- Subjects
medicine.medical_specialty ,Radiological and Ultrasound Technology ,business.operation ,Computer science ,business.industry ,Mallinckrodt ,Modular design ,Communications system ,Article ,Computer Science Applications ,DICOM ,Workflow ,Management system ,medicine ,Workbench ,Radiology, Nuclear Medicine and imaging ,Radiology ,business ,Implementation - Abstract
IT HAS BEEN ASTOUNDING to witness the two-decade computing evolution, which increased processing power, communications bandwidth, and storage capacity while reducing costs. Most of the technology limitations that challenged our early development of picture archiving and communication systems (PACS) have since been resolved. Our quest for modular components has been partially facilitated by the development of DICOM v3 and, more recently, by the Radiological Society of North America (RSNA) and the Healthcare Information and Management Systems Society (HIMSS) initiative, Integrating the Healthcare Enterprise (IHE). While many of the commercial systems are still limited in their ability to be tailored to support the radiologist’s need for specialized workflow, adoption of the IHE technical framework holds promise. As in the PACS Workbench experiments, we still find it essential to have tools to measure image flow, queue arrival times, and queue departure times in order to understand the bounds and performance of our installed commercial systems. The dynamic display of performance metrics, missing in many systems today, continues to be required to enable the “measured, scientific approach” that we sought in our original implementations.
- Published
- 2003
6. Network immunization and virus propagation in email networks: experimental evaluation and analysis
- Author
-
Jiming Liu, Chao Gao, and Ning Zhong
- Subjects
Operations research ,Computer science ,Network topology ,computer.software_genre ,Immunization strategies ,Virus ,Electronic mail ,Computer virus ,Enron ,Betweenness centrality ,Human dynamics ,Artificial Intelligence ,Virus propagation ,Regular Paper ,Email networks ,business.industry ,Immunization (finance) ,Human-Computer Interaction ,Key factors ,Hardware and Architecture ,business ,computer ,Software ,Information Systems ,Computer network - Abstract
Network immunization strategies have emerged as possible solutions to the challenges of virus propagation. In this paper, an existing interactive model is introduced and then improved in order to better characterize the way a virus spreads in email networks with different topologies. The model is used to demonstrate the effects of a number of key factors, notably nodes’ degree and betweenness. Experiments are then performed to examine how the structure of a network and human dynamics affect virus propagation. The experimental results reveal that a virus spreads in two distinct phases and show that the most efficient immunization strategy is the node-betweenness strategy. Moreover, the results also explain, from the perspective of human dynamics, why old viruses can still survive in today's networks.
- Published
- 2010
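The immunization idea surveyed in this abstract can be sketched with a toy simulation (illustrative only, not the authors' interactive model): remove a few central nodes before an outbreak and measure how far a worst-case deterministic infection spreads. Degree is used here as a simple stand-in for the betweenness ranking the paper studies; the graph and all names are invented.

```python
# A minimal sketch (not the paper's model): compare degree-based
# immunization against no immunization for a deterministic spread
# on a small email-style contact graph.
from collections import deque

def spread(adj, seed, immune):
    """Breadth-first infection: every non-immune neighbor of an
    infected node becomes infected (worst-case deterministic spread)."""
    if seed in immune:
        return set()
    infected = {seed}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        for nb in adj[node]:
            if nb not in infected and nb not in immune:
                infected.add(nb)
                queue.append(nb)
    return infected

def top_degree(adj, k):
    """Immunize the k highest-degree nodes (a simple proxy for the
    betweenness-based strategy discussed in the abstract)."""
    return set(sorted(adj, key=lambda n: len(adj[n]), reverse=True)[:k])

# A hub-and-spoke "email" graph: node 0 bridges two small communities.
adj = {
    0: [1, 2, 3, 4],
    1: [0, 2], 2: [0, 1],
    3: [0, 4], 4: [0, 3],
}
no_immunization = spread(adj, seed=1, immune=set())
with_hub_removed = spread(adj, seed=1, immune=top_degree(adj, 1))
print(len(no_immunization), len(with_hub_removed))  # 5 2
```

Immunizing the single bridging hub confines the outbreak to one community, which is the intuition behind targeting high-betweenness nodes.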
7. Software engineering in a BS in computer science
- Author
-
Richard Louis Weis
- Subjects
Social software engineering ,AP Computer Science ,business.industry ,Computer science ,Informatics engineering ,Software construction ,Personal software process ,Position paper ,Computer science curriculum ,Software requirements ,Software engineering ,business ,GeneralLiterature_REFERENCE(e.g.,dictionaries,encyclopedias,glossaries) - Abstract
This position paper outlines the rationale for and the approach used at the University of Hawaii at Hilo to further augment the ACM/IEEE computer science curriculum for software engineering considerations.
- Published
- 2006
8. Invariant Processing and Occlusion Resistant Recognition of Planar Shapes
- Author
-
Alfred M. Bruckstein
- Subjects
Planar ,Computer science ,business.industry ,Short paper ,Occlusion ,Computer vision ,Artificial intelligence ,Invariant (mathematics) ,Fixed point ,business ,Partial occlusion ,Smoothing - Abstract
This short paper surveys methods for planar shape smoothing and processing and planar shape recognition invariant under viewing distortions and even partial occlusions. It is argued that all the results available in the literature on these problems implicitly follow from successfully addressing two basic problems: invariant location of points with respect to a given shape (a given set of points in the plane) and invariant displacement of points with regard to the given shape.
- Published
- 2005
9. The KCM system: Speeding-up logic programming through hardware support
- Author
-
Jacques Noyé
- Subjects
Computer science ,business.industry ,Programming language ,Short paper ,computer.software_genre ,Prolog ,Logic synthesis ,Software ,Computer architecture ,business ,Logic Control ,computer ,Logic programming ,Computer hardware ,Logic optimization ,Register-transfer level ,computer.programming_language - Abstract
The aim of the KCM (Knowledge Crunching Machine) project was to study how to speed-up Prolog, and more generally Logic Programming, through hardware support at the processor level. An experimental approach was taken, which resulted in the design and implementation of a real system, hardware and software. This short paper outlines the key features of the system as well as the main conclusions which can be drawn from the project.
- Published
- 2005
10. The Office of the Past
- Author
-
Steven M. Seitz, Maneesh Agrawala, and Jiwon Kim
- Subjects
Paper document ,Computer science ,business.industry ,Scale-invariant feature transform ,Scene graph ,Computer vision ,Artificial intelligence ,business - Published
- 2005
11. Short term production scheduling of the pulp mill — A decentralized optimization approach
- Author
-
Kauko Leiviska
- Subjects
Pulp mill ,Waste management ,Decentralized optimization ,business.industry ,Computer science ,Storage tank ,Production control ,Production schedule ,Paper mill ,business ,Term (time) - Published
- 2005
12. Improving SIEM for critical SCADA water infrastructures using machine learning
- Author
-
David Brosset, Hanan Hindy, Amar Seeam, Ethan Bayne, Xavier Bellekens, Katsikas, Sokratis K., Cuppens, Frédéric, Cuppens, Nora, Lambrinoudakis, Costas, Antón, Annie, Gritzalis, Stefanos, Mylopoulos, John, Kalloniatis, Christos, Institut de Recherche de l'Ecole Navale (IRENAV), Université de Bordeaux (UB)-Institut Polytechnique de Bordeaux-Centre National de la Recherche Scientifique (CNRS)-Institut National de Recherche pour l’Agriculture, l’Alimentation et l’Environnement (INRAE)-Arts et Métiers Sciences et Technologies, HESAM Université (HESAM)-HESAM Université (HESAM), University of Mauritius, and Middlesex University
- Subjects
QA75 ,021110 strategic, defence & security studies ,Spoofing attack ,Computer science ,Process (engineering) ,Event (computing) ,business.industry ,Anomaly (natural sciences) ,0211 other engineering and technologies ,02 engineering and technology ,Machine learning ,computer.software_genre ,SCADA ,020204 information systems ,0202 electrical engineering, electronic engineering, information engineering ,Anomaly detection ,[INFO]Computer Science [cs] ,Artificial intelligence ,business ,computer ,Implementation - Abstract
Networked Control Systems (NCS) have been used in many industrial processes. They aim to reduce the human-factor burden and efficiently handle the complex processes and communication of those systems. Supervisory control and data acquisition (SCADA) systems are used in industrial, infrastructure, and facility processes (e.g. manufacturing, fabrication, oil and water pipelines, building ventilation, etc.). Like other Internet of Things (IoT) implementations, SCADA systems are vulnerable to cyber-attacks; therefore, robust anomaly detection is a major requirement. However, building an accurate anomaly detection system is not an easy task, owing to the difficulty of differentiating between cyber-attacks and internal system failures (e.g. hardware failures). In this paper, we present a model that detects anomalous events in a water system controlled by SCADA. Six machine learning techniques have been used in building and evaluating the model. The model classifies different anomalous events, including hardware failures (e.g. sensor failures), sabotage, and cyber-attacks (e.g. DoS and spoofing). Unlike other detection systems, our proposed work helps to accelerate the mitigation process by notifying the operator with additional information when an anomaly occurs. This additional information includes the probability and confidence level of the event(s) occurring. The model is trained and tested using a real-world dataset.
- Published
- 2019
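The "predicted class plus confidence" output described in this abstract can be illustrated with a tiny nearest-centroid classifier (a sketch only: the paper's six ML techniques are not specified here, and the feature values and class names below are invented).

```python
# Illustrative sketch: classify a (flow_rate, pressure) reading into
# an anomaly class and report a confidence value, mimicking the kind
# of operator feedback described in the abstract. All data is synthetic.
import math

def centroid(rows):
    """Mean point of a list of equal-length feature vectors."""
    return [sum(col) / len(rows) for col in zip(*rows)]

def classify(x, centroids):
    """Return (label, confidence), where confidence is the normalized
    inverse distance to each class centroid."""
    inv = {label: 1.0 / (math.dist(x, c) + 1e-9)
           for label, c in centroids.items()}
    total = sum(inv.values())
    label = max(inv, key=inv.get)
    return label, inv[label] / total

# Toy training data: a few readings per event class.
training = {
    "normal":         [(5.0, 2.0), (5.2, 2.1), (4.8, 1.9)],
    "sensor_failure": [(0.0, 2.0), (0.1, 2.2)],
    "dos_attack":     [(9.5, 0.5), (9.8, 0.4)],
}
centroids = {label: centroid(rows) for label, rows in training.items()}
label, conf = classify((9.6, 0.6), centroids)
print(label, round(conf, 2))
```

A reading close to the "dos_attack" centroid is labeled as such with high confidence, while ambiguous readings spread their confidence mass across classes.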
13. Randomized neighbor discovery protocols with collision detection for static multi-hop wireless ad hoc networks
- Author
-
Carlos T. Calafate, Jaime Lloret, Jose Vicente Sorribes, and Lourdes Peñalver
- Subjects
Computer science ,Wireless ad hoc network ,computer.internet_protocol ,Neighbor discovery ,Throughput ,02 engineering and technology ,Neighbor Discovery Protocol ,0203 mechanical engineering ,Computer Science::Networking and Internet Architecture ,0202 electrical engineering, electronic engineering, information engineering ,Collision detection ,Electrical and Electronic Engineering ,Multihop ,Protocol (science) ,One-hop ,business.industry ,Network packet ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,020302 automobile design & engineering ,020206 networking & telecommunications ,Energy consumption ,Castalia ,Randomized protocols ,Wireless ad hoc networks ,business ,computer ,Computer network - Abstract
Neighbor discovery represents a first step after the deployment of wireless ad hoc networks, since the nodes that form them are equipped with limited-range radio transceivers and typically do not know their neighbors. In this paper, two randomized neighbor discovery approaches, called CDH and CDPRR, based on collision detection for static multi-hop wireless ad hoc networks, are presented. The Castalia 3.2 simulator has been used to compare our proposed protocols against two protocols chosen from the literature and used as references: the PRR and the Hello protocols. For the experiments, we chose five metrics: the neighbor discovery time, the number of discovered neighbors, the energy consumption, the throughput, and the ratio of discovered neighbors to packets sent. According to the results obtained through simulation, we can conclude that our randomized proposals outperform both the Hello and PRR protocols in the presence of collisions with regard to all five metrics, for both one-hop and multi-hop scenarios. As a novelty compared to the reference protocols, both proposals allow nodes to discover all their neighbors with probability 1, are based on collision detection, and know when to terminate the neighbor discovery process. Furthermore, qualitative comparisons of the existing protocols and the proposals are available in this paper. Moreover, CDPRR presents better results in terms of time, energy consumption, and the ratio of discovered neighbors to packets sent. We found that both proposals are able to operate under more realistic assumptions. Furthermore, CDH does not need to know the number of nodes in the network. This work has been partially supported by the "Ministerio de Economia y Competitividad" in the "Programa Estatal de Fomento de la Investigacion Cientifica y Tecnica de Excelencia, Subprograma Estatal de Generacion de Conocimiento" within the project under Grant TIN2017-84802-C2-1-P.
This work has also been partially supported by European Union through the ERANETMED (Euromediterranean Cooperation through ERANET joint activities and beyond) project ERANETMED3-227 SMARTWATIR.
- Published
- 2021
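The randomized, collision-detecting discovery idea behind protocols such as CDH and CDPRR can be sketched with a slotted simulation (a hedged illustration only; the real protocols differ): in each slot every undiscovered neighbor transmits with probability p, a slot with exactly one transmitter yields a discovery, more than one is a detected collision, and the loop terminates once all neighbors are found. All parameters below are invented.

```python
# Illustrative slotted neighbor-discovery simulation with collision
# detection; not an implementation of CDH or CDPRR.
import random

def discover(n_neighbors, p=0.3, seed=42, max_slots=10_000):
    rng = random.Random(seed)
    undiscovered = set(range(n_neighbors))
    slots = collisions = 0
    while undiscovered and slots < max_slots:
        slots += 1
        transmitters = [v for v in undiscovered if rng.random() < p]
        if len(transmitters) == 1:
            undiscovered.discard(transmitters[0])  # successful discovery
        elif len(transmitters) > 1:
            collisions += 1  # collision detected; nobody is discovered
    return slots, collisions, len(undiscovered)

slots, collisions, remaining = discover(8)
print(remaining)  # 0: all neighbors discovered, and termination is known
```

Because the loop exits as soon as the undiscovered set is empty, the node knows when discovery is complete, mirroring the termination property the abstract highlights.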
14. Comparative study of AR versus video tutorials for minor maintenance operations
- Author
-
Juan M. Orduña, M. Carmen Juan, Pedro Morillo, Marcos Fernández, and Inmaculada García-García
- Subjects
Augmented Reality ,Multimedia ,Computer Networks and Communications ,business.industry ,Computer science ,Equipment maintenance ,020207 software engineering ,Usability ,02 engineering and technology ,Minor (academic) ,computer.software_genre ,Multimedia-based learning ,Hardware and Architecture ,Real user study ,0202 electrical engineering, electronic engineering, information engineering ,Media Technology ,Augmented reality ,Comparative study ,business ,computer ,LENGUAJES Y SISTEMAS INFORMATICOS ,Software - Abstract
Augmented Reality (AR) has become a mainstream technology in the development of solutions for repair and maintenance operations. Although most AR solutions are still limited to specific contexts in industry, some consumer electronics companies have started to offer pre-packaged AR solutions as an alternative to video-based tutorials (VT) for minor maintenance operations. In this paper, we present a comparative study of the acquired knowledge and user perception achieved with AR and VT solutions in some maintenance tasks of IT equipment. The results indicate that both systems help users to acquire knowledge in various aspects of equipment maintenance. Although no statistically significant differences were found between the AR and VT solutions, users scored higher on the AR version in all cases. Moreover, the users explicitly preferred the AR version when evaluating three different usability and satisfaction criteria. For the AR version, a strong and significant correlation was found between satisfaction and the achieved knowledge. Since the AR solution achieved similar learning results with higher usability scores than the video-based tutorials, these results suggest that AR solutions are the most effective approach for replacing the typical paper-based instructions in consumer electronics. This work has been supported by Spanish MINECO and EU ERDF programs under grant RTI2018-098156-B-C55.
- Published
- 2020
15. Benders decomposition for the mixed no-idle permutation flowshop scheduling problem
- Author
-
Alper Hamzadayi, Tolga Bektaş, and Rubén Ruiz
- Subjects
Mathematical optimization ,Speedup ,Computer science ,Benders decomposition ,ESTADISTICA E INVESTIGACION OPERATIVA ,0211 other engineering and technologies ,02 engineering and technology ,Management Science and Operations Research ,Flowshop scheduling ,Permutation ,Idle ,Artificial Intelligence ,Referenced local search ,Convergence (routing) ,0202 electrical engineering, electronic engineering, information engineering ,Local search (optimization) ,Metaheuristic ,021103 operations research ,Job shop scheduling ,business.industry ,General Engineering ,Mixed no-idle ,Exact algorithm ,020201 artificial intelligence & image processing ,business ,Software - Abstract
The mixed no-idle flowshop scheduling problem arises in modern industries, including integrated circuits, ceramic frit, and steel production, among others, where some machines are not allowed to remain idle between jobs. This paper describes an exact algorithm that uses Benders decomposition with a simple yet effective enhancement mechanism: the generation of additional cuts by means of a referenced local search to help speed up convergence. Using only a single additional optimality cut at each iteration, combined with combinatorial cuts, the algorithm can optimally solve instances with up to 500 jobs and 15 machines that are otherwise not within the reach of off-the-shelf optimization software, and can easily surpass existing ad hoc metaheuristics. To the best of the authors' knowledge, the algorithm described here is the only exact method for solving the mixed no-idle permutation flowshop scheduling problem. This research project was partially supported by the Scientific and Technological Research Council of Turkey (TUBITAK) under Grant 1059B191600107. While writing this paper, Dr Hamzadayi was a visiting researcher at the Southampton Business School at the University of Southampton. Ruben Ruiz is supported by the Spanish Ministry of Science, Innovation and Universities, under the Project 'OPTEP-Port Terminal Operations Optimization' (No. RTI2018-094940-B-I00) financed with FEDER funds. Thanks are due to two anonymous reviewers for their careful reading of the paper and helpful suggestions.
- Published
- 2020
16. Utilizing geospatial information to implement SDGs and monitor their Progress
- Author
-
Ali Kharrazi, Ram Avtar, Tonni Agustiono Kurniawan, Ridhika Aggarwal, and Pankaj Kumar
- Subjects
Earth observation ,Geospatial analysis ,Geographic information system ,010504 meteorology & atmospheric sciences ,United Nations ,Computer science ,Geospatial data and techniques ,And indicators ,Big data ,Sustainable development goals ,Continuous planning ,010501 environmental sciences ,Management, Monitoring, Policy and Law ,computer.software_genre ,01 natural sciences ,Citizen science ,Adaptation ,0105 earth and related environmental sciences ,General Environmental Science ,Sustainable development ,business.industry ,Member states ,General Medicine ,Sustainable Development ,Remote sensing ,Pollution ,Data science ,business ,computer ,Goals ,Environmental Monitoring - Abstract
More than four years have passed since the 2030 Agenda for Sustainable Development was adopted by the United Nations and its member states in September 2015. Several efforts are being made by member countries to contribute towards achieving the 17 Sustainable Development Goals (SDGs). Progress toward the SDGs can be monitored by measuring a set of quantifiable indicators for each of the goals. Geospatial information plays a significant role in measuring some of the targets; hence, it is relevant to the implementation of the SDGs and the monitoring of their progress. The synoptic view and repetitive coverage of the Earth's features and phenomena by different satellites is a powerful and propitious technological advancement. The paper reviews the robustness of Earth observation data for continuous planning, monitoring, and evaluation of the SDGs. The scientific world has made commendable progress by providing geospatial data at various spatial, spectral, radiometric, and temporal resolutions, enabling its use for various applications. This paper also reviews the application of big data from Earth observation and citizen science to implement the SDGs with a multi-disciplinary approach. It covers literature from various academic landscapes that utilizes geospatial data for mapping, monitoring, and evaluating the Earth's features and phenomena, as this establishes the basis of its utilization for the achievement of the SDGs.
- Published
- 2020
17. A neural network filtering approach for similarity-based remaining useful life estimation
- Author
-
Kai Goebel, Jeffrey Alun Jones, Oguz Bektas, Indranil Roychoudhury, and Shankar Sankararaman
- Subjects
Annan samhällsbyggnadsteknik ,0209 industrial biotechnology ,Computer science ,02 engineering and technology ,Machine learning ,computer.software_genre ,Similarity-based RUL calculation ,Data-driven prognostics ,Industrial and Manufacturing Engineering ,020901 industrial engineering & automation ,Similarity (psychology) ,C-MAPPS datasets ,Estimation ,Artificial neural network ,business.industry ,Mechanical Engineering ,Other Civil Engineering ,Computer Science Applications ,TA ,Control and Systems Engineering ,Prognostics ,Artificial intelligence ,Raw data ,business ,computer ,Software ,Neural networks - Abstract
The role of prognostics and health management is ever more prevalent as estimation techniques advance. However, data processing and remaining useful life prediction algorithms are often very different. Some difficulties in accurate prediction can be tackled by redefining raw data parameters into more meaningful and comprehensive health-level indicators that then provide performance information. Proper data processing is of significant importance for remaining useful life predictions, for example to deal with data limitations and/or multi-regime operating conditions. The framework proposed in this paper considers a similarity-based prognostic algorithm that is fed by data normalisation and filtering methods applied to the operational trajectories of complex systems. This is combined with a data-driven prognostic technique based on feed-forward neural networks with multi-regime normalisation. In particular, the paper takes a close look at how pre-processing methods affect algorithm performance. The work presented herein shows a conceptual prognostic framework that overcomes the challenges presented by short-term test datasets and increases prediction performance with regard to prognostic metrics.
- Published
- 2019
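The core similarity-based idea in this abstract can be sketched without the paper's neural-network pipeline: slide a test degradation trajectory over a library of run-to-failure histories and read off the remaining life at the best-matching alignment. The health-indicator values below are synthetic and purely illustrative.

```python
# Minimal similarity-based RUL sketch (not the paper's method): find
# the best alignment of the test trajectory within each run-to-failure
# history and return the cycles remaining after that alignment.
def distance(a, b):
    """Sum of squared differences between two equal-length windows."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def estimate_rul(test_traj, library):
    best = (float("inf"), None)  # (distance, remaining useful life)
    w = len(test_traj)
    for hist in library:
        for start in range(len(hist) - w + 1):
            d = distance(test_traj, hist[start:start + w])
            rul = len(hist) - (start + w)  # cycles left after the match
            if d < best[0]:
                best = (d, rul)
    return best[1]

# Synthetic run-to-failure health indicators (declining to failure).
library = [
    [1.0, 0.9, 0.8, 0.7, 0.6, 0.5, 0.4, 0.3, 0.2, 0.1],
    [1.0, 0.85, 0.7, 0.55, 0.4, 0.25, 0.1],
]
print(estimate_rul([0.8, 0.7, 0.6], library))  # 5
```

The test window matches the first history exactly at cycles 3–5, leaving five recorded cycles before failure; the normalisation and filtering the paper emphasises would precede this matching step.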
18. Energy Efficiency in Cooperative Wireless Sensor Networks
- Author
-
Jaime Lloret, Jose M. Jimenez, Raquel Lacuesta, and Sandra Sendra
- Subjects
Computer Networks and Communications ,Computer science ,Transport network ,02 engineering and technology ,Fresh products ,0202 electrical engineering, electronic engineering, information engineering ,Cooperative monitoring ,Wireless sensor networks (WSN) ,business.industry ,Node (networking) ,020206 networking & telecommunications ,Energy consumption ,INGENIERIA TELEMATICA ,Energy efficiency ,Hardware and Architecture ,Path (graph theory) ,Shortest path problem ,020201 artificial intelligence & image processing ,business ,Wireless sensor network ,Delivery ,Software ,Constrained Shortest Path First ,Symmetric routing ,Information Systems ,Efficient energy use ,Computer network - Abstract
The transport of sensitive products is very important because their deterioration may cause a loss of value and even rejection of the product by the buyer. It is therefore important to choose the optimal way to deliver them. In a data network, the task of calculating the best routes is performed by routers, and the optimal path can be considered the one that provides the shortest route. In a real transport network, however, the shortest path can sometimes be affected by incidents and traffic jams that make it inadvisable. On the other hand, when a return trip is needed, the symmetry of the road network makes it attractive to follow the same path in the reverse direction. For this reason, in this paper we present a symmetric routing mechanism for a cooperative monitoring system for the delivery of fresh products. The system is based on a combination of fixed nodes and a mobile node that stores the path followed so that it can return along the same route in the reverse direction. If this path is no longer available, the system tries to maintain the symmetry principle by searching for the route whose travel time is closest to that of the initial trip. The paper describes the algorithm used by the system to calculate the symmetric routes. Finally, the system is tested in a real scenario that combines different kinds of roads. As the results show, the energy consumption of this kind of node is highly influenced by the activity of the sensors. This work has been supported by the "Ministerio de Economia y Competitividad", through the "Convocatoria 2014. Proyectos I+D -Programa Estatal de Investigacion Cientifica y Tecnica de Excelencia" in the "Subprograma Estatal de Generacion de Conocimiento" (project TIN2014-57991-C3-1-P) and the "programa para la Formacion de Personal Investigador - (FPI-2015-S2-884)" by the "Universitat Politecnica de Valencia".
- Published
- 2019
- Full Text
- View/download PDF
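The symmetric-routing fallback described in the abstract above can be sketched roughly as follows (the graph representation, names and API are illustrative assumptions, not the authors' algorithm): reverse the stored outbound path when every reverse edge is still available, and otherwise fall back to recomputing a shortest-time route for the return trip.

```python
import heapq

def dijkstra(graph, src, dst):
    """Shortest-time path in a directed weighted graph {node: {neighbor: time}}."""
    dist, prev = {src: 0.0}, {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == dst:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale queue entry
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path, node = [dst], dst
    while node != src:
        node = prev[node]  # raises KeyError if dst is unreachable
        path.append(node)
    return list(reversed(path))

def return_route(graph, outbound_path):
    """Reverse the stored outbound path if every reverse edge is still
    available; otherwise recompute a shortest-time return route."""
    reverse = list(reversed(outbound_path))
    intact = all(b in graph.get(a, {}) for a, b in zip(reverse, reverse[1:]))
    return reverse if intact else dijkstra(graph, reverse[0], reverse[-1])
```

With an intact network the mobile node simply retraces its steps; if a road on the outbound path becomes unavailable, the fallback search takes over.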
19. Publishing accessible proceedings: the DSAI 2016 case study
- Author
-
Sergio Sayago, Ricardo Pozzobon, and Mireia Ribera
- Subjects
PDF/UA ,Conversion procedures ,Computer Networks and Communications ,business.industry ,Computer science ,05 social sciences ,Software development ,050301 education ,Human-Computer Interaction ,World Wide Web ,Document accessibility ,EPUB3 ,Publishing ,0501 psychology and cognitive sciences ,Accessible proceedings ,business ,0503 education ,Computer communication networks ,050107 human factors ,Software ,Information Systems - Abstract
Access to research papers has changed in recent decades: from printed to digital sources, from closed to open access. Despite these changes and broader access to research results, accessibility barriers remain. Very few conferences and journals state an accessibility policy for their publications on their websites, and the production of accessible documents is not common practice even at conferences devoted to accessibility. Purpose: The purpose of this paper is to present the case study of the DSAI 2016 (Software Development and Technologies for Enhancing Accessibility and Fighting Infoexclusion) conference, whose proceedings we were in charge of making accessible. Methods: We discuss the methods and technical procedures we carried out to turn the original articles, in MS Word and LaTeX formats, into accessible PDFs, and the steps necessary for authoring, conversion and validation. Results: The papers of DSAI 2016 were published in an accessible format after considerable effort; the best tools and procedure were MS Word plus PDF Axes plus the PDF Accessibility Checker. Conclusion: We argue for the need for a new role in conference organizing committees dedicated to accessible publishing.
- Published
- 2019
20. Sparse analytic hierarchy process: an experimental analysis
- Author
-
Roberto Setola, Paolo Dell'Olmo, Gabriele Oliva, and Antonio Scala
- Subjects
0209 industrial biotechnology ,Process (engineering) ,Computer science ,Analytic hierarchy process ,Computational intelligence ,02 engineering and technology ,Machine learning ,computer.software_genre ,Theoretical Computer Science ,Task (project management) ,Body of knowledge ,020901 industrial engineering & automation ,0202 electrical engineering, electronic engineering, information engineering ,sparse information ,analytic hierarchy process ,decision-making ,Set (psychology) ,business.industry ,Aggregate (data warehouse) ,Rank (computer programming) ,020201 artificial intelligence & image processing ,Geometry and Topology ,Artificial intelligence ,business ,computer ,Software - Abstract
The aim of the sparse analytic hierarchy process (SAHP) problem is to rank a set of alternatives based on their utility/importance; this task is accomplished by asking human decision-makers to compare selected pairs of alternatives and to specify relative preference information in the form of ratios of utilities. However, such information is often affected by subjective biases or inconsistencies. Moreover, there is no general consensus on the best approach to this task, and several techniques have been proposed in the literature. Finally, when more than one decision-maker is involved in the process, adequate methodologies are needed to aggregate the available information. In this view, the contribution of this paper to the SAHP body of knowledge is twofold. On one side, it develops a novel methodology to aggregate sparse data provided by multiple sources of information. On the other side, the paper undertakes an experimental validation of the most popular techniques for solving the SAHP problem, discussing the strengths and shortcomings of the different methodologies with respect to a real case study.
- Published
- 2019
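The abstract above does not spell out the aggregation methodology; as a hedged illustration of the general idea, a standard baseline merges the ratio judgments that several decision-makers give for the same pair via a geometric mean (all names below are ours, not the paper's):

```python
from math import prod

def aggregate_ratios(judgments):
    """Combine sparse pairwise utility ratios from several decision-makers.

    `judgments` is a list of dicts mapping an (i, j) pair to the ratio
    u_i / u_j reported by one decision-maker; pairs may be missing, since
    the elicitation is sparse. Ratios reported for the same pair are merged
    with a geometric mean, a common choice because it behaves consistently
    under inversion: aggregating r and 1/r yields 1.
    """
    merged = {}
    for person in judgments:
        for pair, ratio in person.items():
            merged.setdefault(pair, []).append(ratio)
    return {pair: prod(rs) ** (1.0 / len(rs)) for pair, rs in merged.items()}
```

The paper's actual contribution is a novel aggregation scheme; this sketch only shows the shape of the input (sparse pairwise ratios) and a conventional way of merging them.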
21. Proxy-based near real-time TV content transmission in mobility over 4G with MPEG-DASH transcoding on the cloud
- Author
-
Salvador Ferrairó, Román Belda, Ismael de Fez, Juan Carlos Guerri, and Pau Arce
- Subjects
Computer Networks and Communications ,Computer science ,Real-time computing ,ITU-T P.1203 ,Cloud computing ,Buffering ,02 engineering and technology ,Transcoding ,computer.software_genre ,Quality of experience ,TV ,Dynamic Adaptive Streaming over HTTP ,Dynamic adaptive streaming over HTTP (DASH) ,Digital Video Broadcasting ,TEORIA DE LA SEÑAL Y COMUNICACIONES ,0202 electrical engineering, electronic engineering, information engineering ,Media Technology ,4G ,Proxy (statistics) ,Video streaming ,business.industry ,020207 software engineering ,INGENIERIA TELEMATICA ,Proxy ,Transmission (telecommunications) ,Handover ,Hardware and Architecture ,business ,computer ,Software - Abstract
[EN] This paper presents and evaluates a system that provides TV and radio services in mobility using 4G communications. The system comprises two main blocks, one on the cloud and one on the mobile vehicle. On the cloud, a DVB (Digital Video Broadcasting) receiver obtains the TV/radio signal and prepares the contents to be sent over 4G; specifically, contents are transcoded and packetized using the DASH (Dynamic Adaptive Streaming over HTTP) standard. Vehicles in mobility use their 4G connectivity to receive the flows transmitted by the cloud. The key element of the system is an on-board proxy that manages the received flows and offers them to the final users in the vehicle. The proxy contains a buffer that helps reduce the number of interruptions caused by handover effects and lack of coverage. The paper presents a comparison between a live transmission over 4G connecting the clients directly to the cloud server and a near real-time transmission based on the on-board proxy. Results show that the use of the proxy considerably reduces the number of interruptions and thus improves the Quality of Experience of users, at the expense of slightly increasing the delay., This work is supported by the Centro para el Desarrollo Tecnologico Industrial (CDTI) from the Government of Spain under the project "Plataforma avanzada de conectividad en movilidad" (CDTI IDI-20150126) and the project "Desarrollo de nueva plataforma de entretenimiento multimedia para entornos nauticos" (CDTI TIC-20170102).
- Published
- 2019
- Full Text
- View/download PDF
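The role of the on-board proxy buffer described above can be illustrated with a small sketch (class and method names are our assumptions, not the authors' implementation): DASH segments are queued as they arrive over 4G and drained at playback rate, so short handover or coverage gaps do not interrupt playback.

```python
from collections import deque

class SegmentBuffer:
    """Toy model of the on-board proxy buffer: received DASH segments are
    queued and playback starts only once a target depth is reached, which
    absorbs short interruptions at the cost of some added delay."""

    def __init__(self, target_depth):
        self.queue = deque()
        self.target_depth = target_depth  # segments held before serving

    def on_segment_received(self, segment):
        self.queue.append(segment)

    def ready(self):
        return len(self.queue) >= self.target_depth

    def next_segment(self):
        # Returns None during an interruption longer than the buffered depth.
        return self.queue.popleft() if self.queue else None
```

The trade-off the abstract reports (fewer interruptions versus slightly higher delay) corresponds directly to the choice of `target_depth`.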
22. Project Failures: Continuing Challenges for Sustainable Information Systems
- Author
-
Thomas Gilb, Peri Loucopoulos, Leszek A. Maciaszek, Kalle Lyytinen, and Kecheng Liu
- Subjects
Government ,Computer science ,business.industry ,media_common.quotation_subject ,Public debate ,Information technology ,Public relations ,Information system ,Introspection ,Product (category theory) ,Enterprise information system ,Project management ,business ,media_common - Abstract
Much has been written and many discussions have taken place about the causes of, and cures for, failing IT projects. This introspection is not a new phenomenon; it has been going on ever since industrial-size IT systems came into being. The continuing reliance of business, government and society on such systems, coupled with the realisation that only limited progress has been made in the last 20–30 years in delivering effective and efficient systems, is sufficient motivation for continuing this debate. This paper is the product of such a public debate among the authors during the 2004 International Conference on Enterprise Information Systems. The paper focuses on four topics: ecological complexity, product complexity, project management and education.
- Published
- 2006
23. Limits to anonymity when using credentials
- Author
-
Chris J. Mitchell and Andreas Pashalidis
- Subjects
Focus (computing) ,business.industry ,Computer science ,Faculty of Science\Mathematics ,Research Groups and Centres\Information Security\ Information Security Group ,Internet privacy ,Computer security ,computer.software_genre ,Credential ,Timing attack ,business ,Heuristics ,computer ,Anonymity - Abstract
This paper identifies certain privacy threats that apply to anonymous credential systems. The focus is on timing attacks that apply even if the system is cryptographically secure. The paper provides some simple heuristics that aim to mitigate the exposure to the threats and identifies directions for further research.
- Published
- 2006
24. On the use of models for high-performance scientific computing applications: an experience report
- Author
-
Jean-Michel Bruel, David Lugato, Marc Palyart, Ileana Ober, Commissariat à l'Energie Atomique et aux énergies alternatives - CEA (FRANCE), Centre National de la Recherche Scientifique - CNRS (FRANCE), Institut National Polytechnique de Toulouse - Toulouse INP (FRANCE), Université Toulouse III - Paul Sabatier - UT3 (FRANCE), Université Toulouse - Jean Jaurès - UT2J (FRANCE), Université Toulouse 1 Capitole - UT1 (FRANCE), University of British Columbia (CANADA), Institut de Recherche en Informatique de Toulouse - IRIT (Toulouse, France), Advancing Rigorous Software and System Engineering (IRIT-ARGOS), Institut de recherche en informatique de Toulouse (IRIT), Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées-Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse - Jean Jaurès (UT2J)-Université Toulouse III - Paul Sabatier (UT3), Université Fédérale Toulouse Midi-Pyrénées-Centre National de la Recherche Scientifique (CNRS)-Institut National Polytechnique (Toulouse) (Toulouse INP), Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées, University of British Columbia (UBC), Smart Modeling for softw@re Research and Technology (IRIT-SM@RT), Université Toulouse - Jean Jaurès (UT2J), Commissariat à l'énergie atomique et aux énergies alternatives (CEA), Université Toulouse Capitole (UT Capitole), Université de Toulouse (UT)-Université de Toulouse (UT)-Université Toulouse - Jean Jaurès (UT2J), Université de Toulouse (UT)-Université Toulouse III - Paul Sabatier (UT3), Université de Toulouse (UT)-Centre National de la Recherche Scientifique (CNRS)-Institut National Polytechnique (Toulouse) (Toulouse INP), Université de Toulouse (UT)-Toulouse Mind & Brain Institut (TMBI), Université de Toulouse (UT)-Université de Toulouse (UT)-Université Toulouse III - Paul Sabatier (UT3), Université de Toulouse (UT)-Université Toulouse Capitole (UT Capitole), Université de Toulouse (UT), 
and Institut National Polytechnique de Toulouse - INPT (FRANCE)
- Subjects
[INFO.INFO-AR]Computer Science [cs]/Hardware Architecture [cs.AR] ,Source code ,Modeling language ,Computer science ,media_common.quotation_subject ,Fortran ,02 engineering and technology ,[INFO.INFO-SE]Computer Science [cs]/Software Engineering [cs.SE] ,Interface homme-machine ,Domain (software engineering) ,Abstraction layer ,Computational science ,[INFO.INFO-CR]Computer Science [cs]/Cryptography and Security [cs.CR] ,Software ,High-performance calculus ,Architectures Matérielles ,020204 information systems ,Architecture ,0202 electrical engineering, electronic engineering, information engineering ,Génie logiciel ,[INFO.INFO-HC]Computer Science [cs]/Human-Computer Interaction [cs.HC] ,Implementation ,media_common ,computer.programming_language ,business.industry ,020207 software engineering ,computer.file_format ,Modélisation et simulation ,[INFO.INFO-MO]Computer Science [cs]/Modeling and Simulation ,Systèmes embarqués ,Modeling and Simulation ,HPC ,Cryptographie et sécurité ,[INFO.INFO-ES]Computer Science [cs]/Embedded Systems ,MDE Model-driven engineering ,Executable ,Model-driven architecture ,Software engineering ,business ,computer - Abstract
International audience; This paper reports on a four-year project that aims to raise the abstraction level, through the use of model-driven engineering (MDE) techniques, in the development of scientific applications relying on high-performance computing. The development and maintenance of high-performance scientific computing software is reputedly a complex task, a complexity that results from the frequent evolution of supercomputers and the tight coupling between software and hardware aspects. Moreover, current parallel programming approaches mix concerns within the source code. Our approach relies on MDE and consists of defining domain-specific modeling languages targeting the various domain experts involved in the development of HPC applications, allowing each of them to handle their dedicated model in a way that is both user-friendly and hardware-independent. The different concerns are separated through the use of several models, as well as several modeling viewpoints on these models. Depending on the targeted execution platforms, these abstract models are translated into executable implementations by means of model transformations. To make all of this effective, we have developed a tool chain that is also presented in this paper. The approach is assessed through a multi-dimensional validation that focuses on its applicability, its expressiveness and its efficiency. To capitalize on the gained experience, we analyze some lessons learned during this project.
- Published
- 2018
25. A Multilevel Approach to Sentiment Analysis of Figurative Language in Twitter
- Author
-
Paolo Rosso, Sivaji Bandyopadhyay, Soumadeep Mazumdar, Dipankar Das, and Braja Gopal Patra
- Subjects
Irony ,Metaphor ,Computer science ,media_common.quotation_subject ,02 engineering and technology ,Figurative text ,computer.software_genre ,01 natural sciences ,Literal and figurative language ,Sentiment analysis ,Sentiment abruptness measure ,0202 electrical engineering, electronic engineering, information engineering ,0101 mathematics ,media_common ,Sarcasm ,business.industry ,010102 general mathematics ,Cosine similarity ,Variation (linguistics) ,020201 artificial intelligence & image processing ,Artificial intelligence ,InformationSystems_MISCELLANEOUS ,business ,computer ,LENGUAJES Y SISTEMAS INFORMATICOS ,Natural language processing ,Natural language ,Meaning (linguistics) - Abstract
[EN] A commendable amount of work has been carried out in the field of sentiment analysis, or opinion mining, on natural language texts and Twitter texts. One of the main goals of such tasks is to assign polarities (positive or negative) to a piece of text, but an equally important and more difficult issue is how to assign the degree of positivity or negativity to a text. The answer becomes more complex when a similar task is performed on figurative language texts collected from Twitter. Figurative language devices such as irony and sarcasm carry an intentional secondary or extended meaning hidden within the expression. In this paper we present a novel approach to identify the degree of sentiment (fine-grained, on an 11-point scale) of figurative language texts. We used several semantic features such as sentiment and intensifiers, and we introduced sentiment abruptness, which measures the variation of sentiment from positive to negative or vice versa. We trained our systems at multiple levels to achieve a maximum cosine similarity of 0.823 and a minimum mean square error of 2.170., The work reported in this paper is supported by a grant from the project “CLIA System Phase II” funded by Department of Electronics and Information Technology (DeitY), Ministry of Communications and Information Technology (MCIT), Government of India. The work of the fourth author is also supported by the SomEMBED TIN2015-71147-C2-1-P MINECO research project and by the Generalitat Valenciana under the grant ALMAPATER (PrometeoII/2014/030).
- Published
- 2018
- Full Text
- View/download PDF
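The abstract above does not give the formula for sentiment abruptness; purely as a toy illustration of the intuition (ours, not the authors' measure), one can score the largest jump between consecutive polarity values in a text:

```python
def sentiment_abruptness(scores):
    """Toy illustration, not the paper's formula: the largest absolute
    change between consecutive polarity scores in a text, which is high
    when sentiment flips sharply from positive to negative or vice versa,
    as often happens in ironic or sarcastic tweets."""
    if len(scores) < 2:
        return 0.0
    return max(abs(b - a) for a, b in zip(scores, scores[1:]))
```

A sarcastic tweet that opens positively and ends with a negative twist would yield a large value, while a uniformly positive or negative text would yield a small one.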
26. Cross-language transfer of semantic annotation via targeted crowdsourcing: task design and evaluation
- Author
-
Marcos Calvo, Ioannis Klasinas, Arindam Ghosh, Evgeny A. Stepanov, Ali Orkan Bayer, Emilio Sanchis, Shammur Absar Chowdhury, and Giuseppe Riccardi
- Subjects
Linguistics and Language ,Computer science ,02 engineering and technology ,Library and Information Sciences ,Ontology (information science) ,Temporal annotation ,computer.software_genre ,Crowdsourcing ,Language and Linguistics ,Education ,Annotation ,0202 electrical engineering, electronic engineering, information engineering ,Evaluation ,Parsing ,Information retrieval ,Semantic annotation ,business.industry ,020206 networking & telecommunications ,Ontology ,Cross-language transfer ,020201 artificial intelligence & image processing ,Artificial intelligence ,Computational linguistics ,business ,computer ,LENGUAJES Y SISTEMAS INFORMATICOS ,Natural language processing ,Spoken language - Abstract
[EN] Modern data-driven spoken language systems (SLS) require manual semantic annotation for training spoken language understanding parsers. Multilingual porting of an SLS demands significant manual effort and language resources, as this manual annotation has to be replicated. Crowdsourcing is an accessible and cost-effective alternative to traditional methods of collecting and annotating data. The application of crowdsourcing to simple tasks has been well investigated; however, complex tasks, like cross-language semantic annotation transfer, may generate low judgment agreement and/or poor performance. The most serious issue in cross-language porting is the absence of reference annotations in the target language, which makes crowd quality control and the evaluation of the collected annotations difficult. In this paper we investigate targeted crowdsourcing for semantic annotation transfer, which delegates to the crowd a complex task, the segmentation and labeling of concepts taken from a domain ontology, with evaluation against source-language annotation. To test the applicability and effectiveness of crowdsourced annotation transfer we considered a close and a distant language pair: Italian-Spanish and Italian-Greek. The corpora annotated via crowdsourcing are evaluated against source- and target-language expert annotations. We demonstrate that the two evaluation references (source and target) correlate highly with each other, thus drastically reducing the need for target-language reference annotations., This research is partially funded by the EU FP7 PortDial Project No. 296170, FP7 SpeDial Project No. 611396, and Spanish contract TIN2014-54288-C4-3-R. The work presented in this paper was carried out while the author was affiliated with Universitat Politecnica de Valencia.
- Published
- 2018
27. Structural Feature Selection for Event Logs
- Author
-
Teemu Lehto, Markku Hinkka, Keijo Heljanko, Alexander Jung, Teniente, E, Weidlich, M, and Helsinki Institute for Information Technology
- Subjects
0301 basic medicine ,FOS: Computer and information sciences ,Business process ,Computer science ,Process mining ,Context (language use) ,Feature selection ,Machine Learning (stat.ML) ,02 engineering and technology ,Machine learning ,computer.software_genre ,Machine Learning (cs.LG) ,Computer Science - Software Engineering ,03 medical and health sciences ,Computer Science - Databases ,Statistics - Machine Learning ,0202 electrical engineering, electronic engineering, information engineering ,Automatic business process discovery ,Cluster analysis ,business.industry ,Event (computing) ,Process mining Prediction ,Databases (cs.DB) ,Classification ,113 Computer and information sciences ,Software Engineering (cs.SE) ,Computer Science - Learning ,Statistical classification ,030104 developmental biology ,Clustering Feature selection ,020201 artificial intelligence & image processing ,Artificial intelligence ,Root cause analysis ,business ,computer - Abstract
We consider the problem of classifying business process instances based on structural features derived from event logs. The main motivation is to provide machine learning based techniques with quick response times for interactive computer-assisted root cause analysis. In particular, we create structural features from process mining, such as activity and transition occurrence counts and the ordering of activities, to be evaluated as potential features for classification. We show that adding such structural features increases the amount of information available, potentially increasing classification accuracy. However, there is an inherent trade-off, as using too many features leads to excessively long run-times for machine learning classification models. One way to improve the run-time of the machine learning algorithms is to select only a small number of features with a feature selection algorithm; however, the run-time required by the feature selection algorithm must also be taken into account, and classification accuracy should not suffer too much from the feature selection. The main contributions of this paper are as follows. First, we propose and compare six different feature selection algorithms by means of an experimental setup comparing their classification accuracy and achievable response times. Second, we discuss the potential use of feature selection results for computer-assisted root cause analysis, as well as the properties of different types of structural features in the context of feature selection., Comment: Extended version of a paper published in the proceedings of the BPM 2017 workshops
- Published
- 2018
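The structural features named in the abstract above (activity occurrence counts and transition, i.e. directly-follows, counts) can be sketched for a single trace as follows; the feature-name scheme is ours, and the paper's exact feature set may differ:

```python
from collections import Counter

def structural_features(trace):
    """Structural features of one process instance, in the spirit of the
    paper: activity occurrence counts plus directly-follows transition
    counts, returned as a sparse name-to-count dictionary suitable as
    classifier input."""
    activities = Counter(trace)
    transitions = Counter(zip(trace, trace[1:]))
    features = {f"act:{a}": n for a, n in activities.items()}
    features.update({f"trans:{a}>{b}": n for (a, b), n in transitions.items()})
    return features
```

The trade-off the paper studies arises here: every distinct activity and activity pair adds a feature dimension, which is why feature selection becomes necessary on large logs.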
28. Recent Advances on Telematics Engineering
- Author
-
Ramón Agüero Calvo, Maria Magdalena Payeras Capellà, Jaime Lloret, and Guillem Femenias Nadal
- Subjects
Operations research ,Computer Networks and Communications ,Computer science ,business.industry ,INGENIERIA TELEMATICA ,Networking ,Engineering management ,Hardware and Architecture ,Systems engineering ,Review process ,Telematics ,JITEL 2015 ,business ,Computer communication networks ,Software ,Information Systems - Abstract
This Special Issue includes extended versions of selected papers from the XII Jornadas de Ingeniería Telemática (JITEL 2015), that took place in Palma, Spain, from October 14th to 16th, 2015. These papers underwent a rigorous review process, ensuring that they present enough new material so as to be considered original contributions while avoiding self-plagiarism.
- Published
- 2017
29. Forming classes in an e-Learning social network scenario
- Author
-
Pasquale De Meo, Giuseppe M. L. Sarné, Lidia Fotia, Fabrizio Messina, Domenico Rosaci, BADICA C EL FALLAH S SEGHROUCHNI A BEYNIER A CAMACHO D HERPSON C HINDRIKS K NOVAIS P, De Meo, P, Fotia, L, Messina, F, Rosaci, D, and Sarne', G
- Subjects
Class (computer programming) ,MULTI-AGENT SYSTEMS ,Knowledge management ,Social network ,Exploit ,business.industry ,Computer science ,E-learning (theory) ,05 social sciences ,E-LEARNING ,050301 education ,050801 communication & media studies ,Artificial Intelligence, Online Social Networks ,Data science ,Trust relationship ,0508 media and communications ,ONLINE SOCIAL NETWORK ,Distributed algorithm ,Software agent ,Convergence (relationship) ,business ,0503 education - Abstract
The use of network technology to provide online courses is the latest trend in the training and development industry and has been called the "e-Learning revolution". Online Social Networks (OSNs), in turn, today offer an effective, common and easy-to-use platform for supporting e-Learning activities. However, as previous studies have underlined, many of the proposed e-Learning systems can cause confusion and decrease the learner's interest. In this paper, we introduce the possibility of forming e-Learning classes in the context of OSNs. To the best of our knowledge, none of the approaches proposed in the past considers the evolution of online classes as a problem of matching individual users' profiles with the profiles of the classes. In this paper, we propose an algorithm that exploits a multi-agent system to suitably distribute this matching computation across all the user devices. The effectiveness and promising efficiency of our approach are shown by experimental results obtained on simulated Online Social Network data.
- Published
- 2017
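The profile-matching step at the heart of the abstract above can be sketched as follows (the profile representation and cosine-based matching are our assumptions for illustration; the paper distributes this computation via software agents on user devices):

```python
from math import sqrt

def cosine(u, v):
    """Cosine similarity between two sparse keyword-weight profiles."""
    keys = set(u) | set(v)
    dot = sum(u.get(k, 0.0) * v.get(k, 0.0) for k in keys)
    nu = sqrt(sum(x * x for x in u.values()))
    nv = sqrt(sum(x * x for x in v.values()))
    return dot / (nu * nv) if nu and nv else 0.0

def best_class(user_profile, class_profiles):
    """Assign a learner to the class whose profile matches best; a toy
    version of the matching each agent would compute locally for its user."""
    return max(class_profiles,
               key=lambda c: cosine(user_profile, class_profiles[c]))
```

In the multi-agent setting each device evaluates only its own user's match, so the overall work is naturally spread across the network.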
30. Analysis of Network Management and Monitoring using Cloud Computing
- Author
-
Razvan Gheorghe, Florin Pop, Aniello Castiglione, Ciprian Dobre, Victor Suciu, and George Suciu
- Subjects
business.industry ,Computer science ,Network security ,Distributed computing ,020207 software engineering ,Cloud computing ,02 engineering and technology ,Network monitoring ,Networking hardware ,Network management ,Utility computing ,Cloud testing ,0202 electrical engineering, electronic engineering, information engineering ,020201 artificial intelligence & image processing ,The Internet ,business - Abstract
In the near future the number of devices connected to the Internet will greatly increase, so further development of applications meant to verify their operation will be required. Monitoring is an important factor in improving the quality of the services provided in cloud computing, given that it allows resource utilization to be scaled adaptively. This paper provides a solution for monitoring network devices and services, allowing administrators to verify the connectivity of the equipment, its performance and network security. The main contribution of the paper is an integrated solution, deployed in the cloud, for monitoring all the network components. Finally, the paper discusses the main findings and advantages of a reference implementation of the monitoring system using a simulated network.
- Published
- 2016
31. Action boundaries detection in a video
- Author
-
Bassem Haidar, Hassan Wehbe, Philippe Joly, Centre National de la Recherche Scientifique - CNRS (FRANCE), Institut National Polytechnique de Toulouse - Toulouse INP (FRANCE), Lebanese University - LU (LEBANON), Université Toulouse III - Paul Sabatier - UT3 (FRANCE), Université Toulouse - Jean Jaurès - UT2J (FRANCE), Université Toulouse 1 Capitole - UT1 (FRANCE), Équipe Structuration, Analyse et MOdélisation de documents Vidéo et Audio (IRIT-SAMoVA), Institut de recherche en informatique de Toulouse (IRIT), Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées-Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse - Jean Jaurès (UT2J)-Université Toulouse III - Paul Sabatier (UT3), Université Fédérale Toulouse Midi-Pyrénées-Centre National de la Recherche Scientifique (CNRS)-Institut National Polytechnique (Toulouse) (Toulouse INP), Université Fédérale Toulouse Midi-Pyrénées-Université Toulouse 1 Capitole (UT1), Université Fédérale Toulouse Midi-Pyrénées, Lebanese International University (LIU), and Institut National Polytechnique de Toulouse - INPT (FRANCE)
- Subjects
0209 industrial biotechnology ,Computer Networks and Communications ,Computer science ,02 engineering and technology ,Video analysis ,[INFO.INFO-AI]Computer Science [cs]/Artificial Intelligence [cs.AI] ,Codebook quantization ,Traitement des images ,020901 industrial engineering & automation ,Segmentation ,[INFO.INFO-TS]Computer Science [cs]/Signal and Image Processing ,0202 electrical engineering, electronic engineering, information engineering ,Media Technology ,Traitement du signal et de l'image ,Computer vision ,Action detection ,Synthèse d'image et réalité virtuelle ,Pixel ,business.industry ,Quantization (signal processing) ,Codebook ,[INFO.INFO-CV]Computer Science [cs]/Computer Vision and Pattern Recognition [cs.CV] ,Vision par ordinateur et reconnaissance de formes ,Intelligence artificielle ,[INFO.INFO-GR]Computer Science [cs]/Graphics [cs.GR] ,Hardware and Architecture ,[INFO.INFO-TI]Computer Science [cs]/Image Processing [eess.IV] ,020201 artificial intelligence & image processing ,Artificial intelligence ,business ,Software - Abstract
International audience; In the video analysis domain, automatic detection of the actions performed in a recorded video is an important scientific and industrial challenge. This paper presents a new method to approximate the boundaries of actions performed by a person while interacting with their environment (such as moving objects). The method relies on a codebook quantization method to analyze the rough evolution of each pixel and then decide, by an automated system, whether this evolution corresponds to an action. Statistics are then produced, at the scale of the whole frame, to estimate the start and the end of an action. According to our proposed evaluation protocol, this method produces interesting results on both real and simulated videos. This statistics-based protocol is discussed at the end of the paper. The interpretation of the evaluation protocol suggests that this method is a solid basis for localizing the exact boundaries of actions or, in the framework of this research activity, for associating prescriptive text with visual content.
- Published
- 2015
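The per-pixel codebook idea mentioned in the abstract above can be sketched in a much-simplified form (this is a generic codebook background model, not the paper's specific variant): each pixel keeps a small set of representative intensity codewords, and an observation matching none of them flags a deviation that may belong to an action.

```python
class PixelCodebook:
    """Toy per-pixel codebook quantizer. Intensities within `tol` of an
    existing codeword are absorbed (with a slow running update); values
    matching no codeword spawn a new one and flag a change. This is a
    simplification of codebook background models for illustration only."""

    def __init__(self, tol=10.0):
        self.tol = tol
        self.codewords = []  # representative intensity values

    def observe(self, value):
        """Return True if the value deviates from every stored codeword."""
        for i, cw in enumerate(self.codewords):
            if abs(value - cw) <= self.tol:
                self.codewords[i] = 0.9 * cw + 0.1 * value  # slow adaptation
                return False
        self.codewords.append(float(value))
        return True
```

Aggregating these per-pixel flags over the whole frame gives the kind of statistics the paper uses to estimate where an action starts and ends.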
32. Evidence analysis method using Bloom filter for MANET forensics
- Author
-
Takashi Mishina, Yoh Shiraishi, and Osamu Takahashi
- Subjects
Network forensics ,Computer science ,business.industry ,ComputerSystemsOrganization_COMPUTER-COMMUNICATIONNETWORKS ,Evidence analysis ,Mobile ad hoc network ,Bloom filter ,Computer security ,computer.software_genre ,Forensic science ,ComputingMilieux_COMPUTERSANDEDUCATION ,business ,computer ,Computer network - Abstract
Lecture Notes in Computer Science. Various security weaknesses have been identified in mobile ad-hoc networks (MANET). The paper focuses on MANET forensics, whereby a third party can prove that an attack took place by collecting and analyzing evidence about it. The paper describes such a MANET forensic analysis method using a Bloom filter.
- Published
- 2010
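The Bloom filter underlying the evidence-analysis method above works as follows (this is the standard data structure, not the paper's specific forensic pipeline): k hash functions set bits in an m-bit array on insertion, and a membership query can yield false positives but never false negatives.

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter of the kind applied to evidence records:
    compact set membership with one-sided error (possible false positives,
    no false negatives)."""

    def __init__(self, m=1024, k=4):
        self.m, self.k = m, k
        self.bits = bytearray(m)

    def _positions(self, item):
        # Derive k bit positions by salting a SHA-256 hash with the index.
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item):
        for p in self._positions(item):
            self.bits[p] = 1

    def might_contain(self, item):
        return all(self.bits[p] for p in self._positions(item))
```

The no-false-negative property is what makes the structure useful as forensic evidence: if a record's query fails, the record was definitely never inserted.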
33. Research on an online self-organizing radial basis function neural network
- Author
-
Honggui Han, Qili Chen, and Junfei Qiao
- Subjects
Artificial neural network ,business.industry ,Time delay neural network ,Computer science ,Growing and pruning approach ,System identification ,MathematicsofComputing_NUMERICALANALYSIS ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,BOD soft measurement ,Probabilistic neural network ,Models of neural computation ,Function approximation ,ComputingMethodologies_PATTERNRECOGNITION ,Artificial Intelligence ,Radial basis function ,Original Article ,Artificial intelligence ,Pruning (decision trees) ,business ,Software ,Hierarchical RBF ,Self-organizing RBF neural network (SORBF) - Abstract
A new growing and pruning algorithm is proposed in this paper for radial basis function (RBF) neural network structure design, named self-organizing RBF (SORBF). The structure of the RBF neural network is introduced first, and the growing and pruning algorithm is then used to design the network structure automatically. The growing and pruning approach is based on the radius of the receptive field of the RBF nodes. Meanwhile, parameter-adjusting algorithms are proposed for the whole RBF neural network. The performance of the proposed method is evaluated through function approximation and dynamic system identification, and the method is then used to capture the biochemical oxygen demand (BOD) concentration in a wastewater treatment system. Experimental results show that the proposed method is efficient for network structure optimization and achieves better performance than some existing algorithms.
- Published
- 2010
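The receptive-field-based growing criterion mentioned in the abstract above can be illustrated with a toy sketch (a generic resource-allocating-style rule of our own; the paper's SORBF also prunes nodes and adjusts all network parameters):

```python
from math import exp, dist

class GrowingRBF:
    """Toy growing step of an RBF network: a new hidden node is added
    only when an input lies outside every existing node's receptive-field
    radius, so the hidden layer grows to cover the input space."""

    def __init__(self, radius=1.0):
        self.radius = radius
        self.centers = []

    def maybe_grow(self, x):
        if all(dist(x, c) > self.radius for c in self.centers):
            self.centers.append(tuple(x))
            return True  # a node was added
        return False

    def activations(self, x):
        # Gaussian activation of each hidden node for input x.
        return [exp(-dist(x, c) ** 2 / self.radius ** 2) for c in self.centers]
```

A complementary pruning rule would remove nodes whose activation stays negligible over time; together the two rules let the structure self-organize, as in the paper.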
34. Feature-space transformation improves supervised segmentation across scanners
- Author
-
van Opbroek, Annegreet, Achterberg, Hakim C., de Bruijne, Marleen, Bhatia, Kanwal K., Lombaert, Herve, Radiology & Nuclear Medicine, and Medical Informatics
- Subjects
Computer science ,business.industry ,Feature vector ,Pattern recognition ,computer.software_genre ,Transformation (function) ,Voxel ,Feature (computer vision) ,Segmentation ,Computer vision ,Artificial intelligence ,Transfer of learning ,business ,computer - Abstract
Image-segmentation techniques based on supervised classification generally perform well provided that training and test samples have the same feature distribution. However, if training and test images are acquired with different scanners or scanning parameters, their feature distributions can differ greatly, which can severely degrade the performance of such techniques. We propose a feature-space-transformation method to overcome these differences in feature distribution. Our method learns a mapping of the feature values of training voxels to the values observed in images from the test scanner. This transformation is learned from unlabeled images of subjects scanned on both the training scanner and the test scanner. We evaluated our method on hippocampus segmentation using 27 images from the Harmonized Hippocampal Protocol (HarP), a heterogeneous dataset consisting of 1.5T and 3T MR images. The results showed that our feature-space transformation improved the Dice overlap of segmentations obtained with an SVM classifier from 0.36 to 0.85 when only 10 atlases were used, and from 0.79 to 0.85 when around 100 atlases were used.
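The core idea, learning a feature map from paired unlabeled scans, can be sketched with a per-feature linear fit. The synthetic intensities, the 1.8x/+25 "scanner shift", and the purely linear map are assumptions for illustration; the paper's actual transformation need not be linear:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired, unlabeled voxel intensities: the same subjects
# scanned on both the training scanner (src) and the test scanner (dst)
src = rng.normal(100.0, 10.0, size=(500, 1))
dst = 1.8 * src + 25.0 + rng.normal(0.0, 1.0, size=src.shape)

def learn_feature_map(src, dst):
    # One least-squares linear map (slope, intercept) per feature dimension
    maps = []
    for j in range(src.shape[1]):
        A = np.column_stack([src[:, j], np.ones(len(src))])
        coef, *_ = np.linalg.lstsq(A, dst[:, j], rcond=None)
        maps.append(coef)
    return np.array(maps)

maps = learn_feature_map(src, dst)
# Transform labelled training features into the test scanner's feature space
transformed = src * maps[:, 0] + maps[:, 1]
mean_residual = np.mean(np.abs(transformed - dst))
```

A classifier trained on the transformed features then sees the test scanner's feature distribution rather than the training scanner's.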
- Published
- 2015
35. Lessons Learned in Deploying Independent Living Technologies to Older Adults’ Homes
- Author
-
Julie Doyle, Cliodhna Ni Scanaill, Flip van den Berg, and Cathy Bailey
- Subjects
education.field_of_study ,Computer Networks and Communications ,business.industry ,Population ,Public relations ,Human-Computer Interaction ,Work (electrical) ,Software deployment ,Computer Science ,Psychology ,Sociology ,Older people ,education ,business ,Computer communication networks ,Software ,Independent living ,Simulation ,Information Systems - Abstract
Independent living technologies are fast gaining interest in both academia and industry, amid the realization that the world's population is ageing. Technology can increase the quality of life of older people, allowing them to age in place and helping them to remain physically, cognitively and socially engaged with their environment. However, little research in this area is applied. This paper argues for the necessity of moving such technology out of the research laboratory and into the home, where its real impact on the lives of older adults can be assessed. Moreover, a series of recommendations is outlined, encompassing the life cycle of independent living technologies, from ethnographic assessment through design, deployment and evaluation. This work is based on lessons learned in deploying such technologies to older people in over 200 homes. The paper can act as a guide for other researchers interested in developing technologies with older people.
- Published
- 2014
36. Security Analysis of CRT-Based Cryptosystems
- Author
-
Katsuyuki Okeya and Tsuyoshi Takagi
- Subjects
Computer Networks and Communications ,business.industry ,Computer science ,Cryptography ,Computer security ,computer.software_genre ,Timing attack ,Power analysis ,Rabin cryptosystem ,Cryptosystem ,Hybrid cryptosystem ,Chosen-ciphertext attack ,Side channel attack ,Arithmetic ,Safety, Risk, Reliability and Quality ,business ,computer ,Software ,Information Systems - Abstract
A side-channel attack (SCA) is a serious attack on implementations of cryptosystems: it can recover the secret key using side-channel information such as timing, power consumption, etc. Recently, Boneh et al. showed that SSL is vulnerable to SCA if the attacker gains access to the local network of the server. Public-key infrastructure is therefore eventually a target of SCA. In this paper, we investigate the security of RSA cryptosystems using the Chinese remainder theorem (CRT) against SCA. Novak first proposed a simple power analysis (SPA) against the CRT step using the difference between the message modulo p and modulo q. In this paper, we apply Novak’s attack to other CRT-based cryptosystems, namely Multi-Prime RSA, Multi-Exponent RSA, the Rabin cryptosystem, and the HIME(R) cryptosystem. A Novak-type attack depends strongly on how the CRT is implemented. We examine the CRT-related operations of these cryptosystems and show that an extended Novak-type attack is effective against them. Moreover, we present a novel attack called the zero-multiplication attack. The attacker tries to guess the secret prime by producing ciphertexts that cause a multiplication with zero during decryption, which is easily detected by power analysis. Our experimental results show that the timing of a decryption involving the zero multiplication is about 10% shorter than a standard one. Finally, we propose countermeasures against these attacks. The proposed countermeasures are based on ciphertext blinding, but they require no inversion operation. The overhead of the proposed scheme is only about 1–5% of the whole decryption when the modulus is 1,024 bits long.
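The CRT structure the attacks exploit is easy to see in textbook CRT-RSA decryption. The parameters below are toy values, far too small for real use, chosen only to show the two half-size exponentiations and the recombination step that Novak-type and zero-multiplication attacks target:

```python
# Textbook CRT-RSA decryption with toy parameters (no countermeasures)
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))
dp, dq = d % (p - 1), d % (q - 1)
q_inv = pow(q, -1, p)              # precomputed CRT coefficient

def crt_decrypt(c):
    mp = pow(c % p, dp, p)         # exponentiation modulo p
    mq = pow(c % q, dq, q)         # exponentiation modulo q
    h = (q_inv * (mp - mq)) % p    # Garner recombination
    return mq + h * q

m = 42
assert crt_decrypt(pow(m, e, n)) == m

# A chosen ciphertext that is a multiple of p forces mp = 0, so the
# recombination multiplies with a quantity derived from zero; that is the
# power-trace signature the zero-multiplication attack looks for.
c_zero = (3 * p) % n
mp_zero = pow(c_zero % p, dp, p)
```

Ciphertext blinding, as the countermeasures propose, randomizes c before the reductions modulo p and q, so the attacker can no longer choose which residues occur inside the CRT.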
- Published
- 2006
37. Further Security Analysis of XTR
- Author
-
Dong-Guk Han, Tsuyoshi Takagi, and Jongin Lim
- Subjects
Exponentiation ,Computer science ,business.industry ,CEILIDH ,Computer security ,computer.software_genre ,Public-key cryptography ,Power analysis ,Collision attack ,XTR ,Side channel attack ,Arithmetic ,business ,computer ,Key size - Abstract
In Crypto 2000 and 2003, Lenstra-Verheul and Rubin-Silverberg proposed the XTR public-key system and the torus-based public-key cryptosystem CEILIDH, respectively. The common main idea of XTR and CEILIDH is to shorten the bandwidth of transmitted data. Thanks to the work of Granger et al., who compared the performance of CEILIDH and XTR, XTR is known to be an excellent alternative to either RSA or ECC in applications where computational power and memory capacity are both very limited, such as smart cards. Among the family of XTR algorithms, Improved XTR Single Exponentiation (XTR-ISE) is the most efficient one for computing a single exponentiation. However, few papers have investigated side-channel attacks on XTR-ISE, even though memory-constrained devices suffer most from vulnerability to side-channel attacks. Chung-Hasan and Page-Stam tried to analyze XTR-ISE with known simple power analysis, but unfortunately their approaches were not practically feasible. Recently, Han et al. proposed a new collision attack on it with analysis complexity O(2^40) for a 160-bit key. In this paper we analyze XTR-ISE from another point of view, namely differential power analysis (DPA). One straightforward result is that XTR-ISE is secure against the original DPA. However, a non-trivial result is that an enhanced DPA proposed in this paper threatens XTR-ISE. Furthermore, we show several weak points in the structure of XTR-ISE. Our simulation results show that the proposed attack requires about 584 queries to the DPA oracle to recover the whole 160-bit secret value. This result shows that XTR-ISE is vulnerable to the proposed enhanced DPA.
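The principle behind DPA-style key recovery, ranking key hypotheses by how well a predicted leakage matches measured traces, can be shown on a simulated toy target. This is a generic correlation-based sketch, not the XTR-ISE attack itself; the random S-box, Hamming-weight leakage model, noise level, and trace count are all assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
SBOX = rng.permutation(256)        # toy random 8-bit S-box, not XTR or AES
SECRET = 0x3C

def hamming_weight(x):
    return bin(int(x)).count("1")

# Simulated traces: leakage = Hamming weight of SBOX[pt ^ key] plus noise
pts = rng.integers(0, 256, size=2000)
traces = (np.array([hamming_weight(SBOX[p ^ SECRET]) for p in pts])
          + rng.normal(0.0, 0.5, size=2000))

def dpa_guess(pts, traces):
    # Rank key hypotheses by correlation between predicted and measured leakage
    best_k, best_corr = None, -1.0
    for k in range(256):
        hyp = np.array([hamming_weight(SBOX[p ^ k]) for p in pts], float)
        corr = abs(np.corrcoef(hyp, traces)[0, 1])
        if corr > best_corr:
            best_k, best_corr = k, corr
    return best_k
```

Only the correct hypothesis makes the predicted leakage line up with every trace, which is why the number of required oracle queries, 584 in the paper's simulation, is the natural cost measure for such an attack.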
- Published
- 2006
38. Artificial intelligent system for multimedia services in smart home environments
- Author
-
Jose M. Jimenez, Albert Rego, Pedro Luis Gonzalez Ramirez, and Jaime Lloret
- Subjects
Service (systems architecture) ,Computer Networks and Communications ,Computer science ,020209 energy ,02 engineering and technology ,computer.software_genre ,Field (computer science) ,User experience design ,Smart home ,Home automation ,Reinforcement learning ,0202 electrical engineering, electronic engineering, information engineering ,Computer communication networks ,Multimedia ,business.industry ,Deep learning ,020206 networking & telecommunications ,INGENIERIA TELEMATICA ,Classification ,Artificial intelligence ,business ,Internet of Things ,computer ,Software - Abstract
The Internet of Things (IoT) has introduced new applications and environments. The Smart Home provides new ways of communication and service consumption. In addition, Artificial Intelligence (AI) and deep learning have improved different services and tasks by automating them. In this field, reinforcement learning (RL) provides an unsupervised way to learn from the environment. In this paper, a new intelligent system based on RL and deep learning is proposed for Smart Home environments to guarantee good levels of QoE, focused on multimedia services. The system aims to reduce the impact on user experience when the classifying system achieves low accuracy. The experiments performed show that the proposed deep-learning model achieves better accuracy than the KNN algorithm and that the RL system increases the QoE of the user up to 3.8 on a scale of 10. This work has been partially supported by the "Ministerio de Economía y Competitividad" in the "Programa Estatal de Fomento de la Investigación Científica y Técnica de Excelencia, Subprograma Estatal de Generación de Conocimiento" within the project under Grant TIN2017-84802-C2-1-P. This work has also been partially funded by the Universitat Politècnica de València through the postdoctoral PAID-10-20 program.
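How RL can learn a QoE-maximizing policy from the environment can be sketched with tabular Q-learning. The states, actions, and QoE rewards below are invented for illustration; the paper's actual system uses deep learning over real multimedia traffic:

```python
import numpy as np

rng = np.random.default_rng(0)

# Invented toy model: state = network condition (0 poor, 1 good),
# action = streaming bitrate (0 low, 1 high), reward = resulting QoE
QOE = {(0, 0): 2.0, (0, 1): -1.0, (1, 0): 1.0, (1, 1): 3.0}

Q = np.zeros((2, 2))
alpha, gamma, eps = 0.1, 0.9, 0.1
state = 0
for _ in range(5000):
    # epsilon-greedy action selection
    if rng.random() < eps:
        action = int(rng.integers(2))
    else:
        action = int(np.argmax(Q[state]))
    reward = QOE[(state, action)]
    next_state = int(rng.integers(2))   # network condition changes randomly
    Q[state, action] += alpha * (
        reward + gamma * Q[next_state].max() - Q[state, action])
    state = next_state
```

After training, the greedy policy picks the low bitrate in poor network conditions and the high bitrate in good conditions, i.e. the QoE-maximizing choice per state.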
- Published
- 2022
39. A Case Study to Enable and Monitor Real IT Companies Migrating from Waterfall to Agile
- Author
-
Luca Mainetti, Luigi Manco, Antonio Capodieci, B. Murgante et al. (Eds.), Capodieci, Antonio, Mainetti, Luca, and Manco, Luigi
- Subjects
Waterfall ,geography ,Agile ,Agile usability engineering ,geography.geographical_feature_category ,Migration towards Agile ,Computer science ,business.industry ,Control (management) ,Agile Unified Process ,Software Engineering ,Software metric ,Software Metric ,Software engineering ,business ,Simulation ,Agile software development - Abstract
Agile development methods are becoming increasingly important for facing continuously changing requirements. Nevertheless, the adoption of such methods in industrial environments still needs to be fostered. Companies call for tools to keep both the agility and the coordination of IT teams under (quantitative) control. In this paper, we report on an empirical case study aimed at enabling real companies to migrate from a waterfall to an Agile software development process. Our research effort has been spent in introducing 11 different small and medium-sized IT enterprises to Agile methods and observing them executing real business projects. To have a common evaluation framework, we selected a set of 61 metrics, with the purpose of monitoring and measuring the evolution towards Agile methods of each company involved in the experiment. In the paper, we provide readers with empirical data on companies’ feedback. We report on two different categories of real insights into what the companies are doing: (i) the metrics they consider useful and/or directly exploitable by product lines beyond the theoretical definitions; (ii) the tools and strategies they are adopting and connecting to existing development environments to easily collect data on the metrics and evaluate quantitative improvements in the Agile methods.
- Published
- 2014
40. The nutrition researcher cohort: toward a new generation of nutrition research and health optimization
- Author
-
Ben van Ommen
- Subjects
Gerontology ,business.industry ,Computer science ,Endocrinology, Diabetes and Metabolism ,media_common.quotation_subject ,Big data ,Flexibility (personality) ,Public relations ,Nutrigenomics ,MSB - Microbiology and Systems Biology ,Editorial ,Life ,Health care ,Genetics ,Citizen science ,eHealth ,Food and Nutrition ,EELS - Earth, Environmental and Life Sciences ,business ,Empowerment ,mHealth ,Healthy Living ,media_common ,Nutrition - Abstract
It is now possible and affordable to sequence my own genome and multiple aspects of my phenotype. In a recent paper (Chen et al. 2012), Mike Snyder demonstrated the beauty of this by “self-quantifying” his “integrative Personal Omics Profile” over one year. Interestingly, he did not “quantify” his lifestyle (dietary intake, physical exercise). When, halfway through his experiment, he curiously developed type 2 diabetes, the cure was eating less (which was likewise not specified in the paper). Apparently, “big science” does not realize the need for quantification, nor the impact of the environment. Nutrition research is developing into a genotype × phenotype × environment interaction science. Actually, the capacity to constantly adapt to a changing environment is now being coined as a new definition of health (Huber et al. 2011; van Ommen et al. 2009). On a molecular regulatory level, a multitude of processes is constantly fine-tuning aspects of our phenotype to maintain and regain homeostasis after dietary, metabolic, oxidative, inflammatory and other challenges. We have coined this “phenotypic flexibility.” Quantification of our health status thus needs to take into account our ability to cope with these challenges. If my health is determined by my “genotype × phenotype × environment,” we need to properly “quantify” each of these components. Modern nutrition science has seen a number of major developments over the last 10 years: the inclusion of omics technologies (nutrigenomics), the modern version of “back to physiology” (nutritional systems biology), the personalized-nutrition hype, new metabolomics-based methods of food-intake quantification, etc. The big question now remains: Is nutrition science ready to “quantify” my personal health status and provide related personal dietary and lifestyle advice based on quantification of my own genotype × phenotype × environment interaction?
A careful look at changes in healthcare points at the urgent need for both prevention and personal empowerment (Fani Marvasti and Stafford 2012). Each individual, in order to properly take control of one’s own health, needs access to, or even better needs to own, all relevant information regarding personal health status. Apart from the above-mentioned integrative personal omics profile, other activities point in this direction. There is a push from the “medical records” front, and more interestingly, the Quantified Self crowd source movement (http://quantifiedself.com) launches all kinds of initiatives. Developments in personal sensors are exploding and the European Commission takes this very seriously with action plans on eHealth and mHealth (http://ec.europa.eu/digital-agenda/en/eHealth). An interesting feature of most of the above-mentioned initiatives is that the generated data usually become open access. This is not just an enforced action from the journals or funders, but actually is a new trend in science on top of the “big data” wave, where new science and business models develop on the awareness that it is more advantageous to share than not to share (Friend and Norman 2013). Now, imagine that we have access to thousands of “Mike Snyder” datasets, extended with proper dietary intake data and all other relevant parameters, and that all these data sources were standardized, open access and covering all relevant aspects of genotyping, phenotyping and “exotyping” over a number of years. This would become a treasure for nutrition and health research! NuGO, the Nutrigenomics Organisation, has taken up the challenge to organize this: an open access cohort where each individual provides and owns her/his own health data, which both empower individual health optimization and, brought together, form a powerful open access cohort.
As a first step, a 2-year project is being launched to establish all analytical methods, standards and operating procedures, data infrastructure, ethical and privacy aspects, governance, etc. We have decided to stay “close to home” in this experimental phase and enroll ourselves, (nutrition) scientists, as participants. Who better than us can evaluate how to quantify food intake, assess the real use of genetic variation for our nutritional phenotype, determine the best “do it yourself” challenge test for phenotypic flexibility, etc.? Thus, the “Nutrition Researcher Cohort” (NRC) is born. Details are provided at www.nugo.org/nrc. The NRC is thus a “crowd science” project, where we, as experts/subjects, both participate and build. If you are interested, please join, as a participant, as a scientist, or both. Once the initial two-year phase has passed, we can implement the lessons learned in a genuinely new mix between a nutrition and health cohort and a personal healthcare setting.
- Published
- 2013
41. Designing user interfaces for 'ordinary users in extraordinary circumstances': a keyboard-only web-based application for use in airports
- Author
-
Simeon Keates
- Subjects
T1 ,Computer Networks and Communications ,business.industry ,Interface (Java) ,Computer science ,Universal design ,Usability ,Context (language use) ,Throughput ,Human-Computer Interaction ,World Wide Web ,Tab key ,Human–computer interaction ,Web application ,User interface ,business ,Software ,Information Systems - Abstract
Universal Access is commonly interpreted as focusing on designing for users with atypical requirements--specifically users with disabilities or older adults. However, Universal Access is also about providing access to users in all situations and circumstances, including those that place extraordinary or unusual demands on the users who might otherwise not need assistance. This paper examines the design of a user interface (UI) for use in an airport environment and explains how the lessons learned from research into designing for users with disabilities, in particular, have been applied in this new context. The paper further describes a series of experiments that were performed to demonstrate the usability of the new interface and also compares the efficiency and effectiveness of three different input strategies developed for the new UI. The most efficient method of input was a strategy of combined keyboard shortcuts offering access to the full functionality of the UI. The case study also highlights that new Web 2.0 technologies support the implementation of accessibility solutions more typically only associated with non-Web applications. Further, it demonstrates that relying on only the TAB key for supporting keyboard-only access is comparatively inefficient, and that Web developers should be actively encouraged to use all of the available functionality from Web 2.0 technologies to produce more flexible and efficient keyboard-only support.
- Published
- 2013
42. The Race for Visibility and Value
- Author
-
Vipin Chandra Kalia
- Subjects
Value (ethics) ,Impact factor ,Operations research ,business.industry ,Computer science ,media_common.quotation_subject ,Visibility (geometry) ,Public relations ,Microbiology ,Race (biology) ,Editorial ,Work (electrical) ,Rest (finance) ,Quality (business) ,business ,Publication ,media_common - Abstract
We the members of the Editorial team of Indian Journal of Microbiology (INJM) are extremely thankful to all those who believe in the values of this journal. This is evident from the fact that quite a few of you have published your quality work with us. In the present scenario, the value of the work is judged from the citations it receives, which happens only with time. However, at the time of publication, people judge the quality of the work from the impact factor (IF) of the journal where your precious work has been published. Now, this race to get recognized through IF is getting fierce. We are left with hardly any option but to participate in it and run, that too very fast, lest we are left behind among the muck. It is important to move forward and get recognized. It’s time to have a few wins among your precious possessions. IF of INJM for 2012 will be declared very soon in June/July 2013. Since it will be based on the papers published in 2010 and 2011 but cited in 2012, there is little which we can do now. We will be facing the reality and hoping for the best to achieve a respectable IF. Our hopes are quite high in view of the fact that there are quite a few papers which have been cited in journals of high IF, such as Nature Reviews Microbiology, Biotechnology Advances, Current Opinion in Biotechnology, Critical Reviews in Environmental Science, FEMS Microbiology, Infection and Immunity, ISME Journal, Applied and Environmental Microbiology, Bioresource Technology, PLoS One, International Journal of Hydrogen Energy, etc. We must acknowledge the authors of these contributions, which have taken INJM to a higher platform and improved its visibility among researchers worldwide. It also gives confidence to those who hesitate to publish their prestigious and important works in INJM, fearing that these may go unnoticed. With around 6,500 subscribers and 3,400 Life members of AMI going through these articles, INJM is as good as an “Open Access” journal.
Since we have already seen Vol. 53(1) of INJM in March 2013, we need, in our own interest, to act in a focused manner. To enhance the visibility of our work we need to cite our present work (published in INJM) in our subsequent publications (in journals other than INJM), but to get it recognized (an enhanced IF of INJM) we need to cite papers (of other authors; self-citation will not prove effective) preferably from INJM published in 2011, 2012, 2013 and from Online First papers as well. INJM Vol. 53(2) is in your hands and you can expect the rest to follow soon. We very strongly believe that you will be submitting your innovative and novel works in future as well, and the citations of your work will help us improve the status and image of INJM.
- Published
- 2013
43. Multi-objective optimal control of dynamic bioprocesses using ACADO Toolkit
- Author
-
Filip Logist, Boris Houska, Jan Van Impe, Dries Telen, and Moritz Diehl
- Subjects
0106 biological sciences ,Optimal design ,Intersection (set theory) ,Computer science ,business.industry ,Pareto principle ,Boundary (topology) ,Bioengineering ,Control engineering ,02 engineering and technology ,General Medicine ,Optimal control ,01 natural sciences ,Models, Biological ,Set (abstract data type) ,Software ,020401 chemical engineering ,010608 biotechnology ,Computer Simulation ,0204 chemical engineering ,Bioprocess ,business ,Biotechnology - Abstract
The optimal design and operation of dynamic bioprocesses often gives rise in practice to optimisation problems with multiple and conflicting objectives. As a result, there is typically not a single optimal solution but a set of Pareto-optimal solutions. From this set, one solution has to be chosen by the decision maker. Hence, efficient approaches are required for a fast and accurate generation of the Pareto set, such that the decision maker can easily and systematically evaluate optimal alternatives. In the current paper the multi-objective optimisation of several dynamic bioprocess examples is performed using the freely available ACADO Multi-Objective Toolkit (http://www.acadotoolkit.org). This toolkit integrates efficient multiple-objective scalarisation strategies (e.g., Normal Boundary Intersection and (Enhanced) Normalised Normal Constraint) with fast deterministic approaches for dynamic optimisation (e.g., single and multiple shooting). The toolkit was found to produce the Pareto sets for all bioprocess examples efficiently and accurately. The resulting Pareto sets are added as supplementary material to this paper. (Bioprocess and Biosystems Engineering, vol. 36, issue 2, pp. 151-164; published.)
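The scalarisation principle behind Pareto-set generation can be shown on a static toy problem: each choice of weight turns the multi-objective problem into a single-objective one whose minimiser is Pareto-optimal. This simple weighted-sum sketch does not use ACADO (which handles full dynamic optimal control), and Normal Boundary Intersection produces better-spread fronts, but the mechanism is the same; the objectives below are invented:

```python
import numpy as np

# Toy bi-objective problem: minimise f1(x) = x^2 and f2(x) = (x - 2)^2
def f1(x): return x ** 2
def f2(x): return (x - 2.0) ** 2

xs = np.linspace(-1.0, 3.0, 4001)          # dense grid over the decision space
pareto = []
for w in np.linspace(0.0, 1.0, 11):
    # One single-objective problem per weight; its minimiser is Pareto-optimal
    scalar = w * f1(xs) + (1.0 - w) * f2(xs)
    x_star = xs[np.argmin(scalar)]
    pareto.append((float(f1(x_star)), float(f2(x_star))))
```

Sweeping the weight traces out the trade-off curve between the two objectives, which is exactly the set of "optimal alternatives" presented to the decision maker.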
- Published
- 2013
44. Towards a framework for the development of adaptable service-based applications
- Author
-
Stephen Lane, Qing Gu, Patricia Lago, Ita Richardson, SFI, Software and Sustainability (S2), Network Institute, Software & Services, and Information Management & Software Engineering
- Subjects
software process ,Social software engineering ,Software Engineering Process Group ,Database ,Computer science ,business.industry ,Software as a service ,Service design ,Software development ,maintenance process ,computer.software_genre ,service-based application life-cycle ,Service virtualization ,Management Information Systems ,SDG 17 - Partnerships for the Goals ,Hardware and Architecture ,Personal software process ,service-based application adaptation ,Data as a service ,business ,Software engineering ,computer ,Software ,Information Systems - Abstract
Service-oriented computing is a promising computing paradigm which facilitates the composition of loosely coupled and adaptable applications. Unfortunately, this new paradigm does not lend itself easily to traditional software engineering methods and principles due to the decentralised nature of software services. The goal of this paper is to identify a set of engineering activities that can be used to develop adaptable service-based applications. Rather than focusing on the entire service-based application development life-cycle, this paper will focus on adaptation-specific processes and activities and map them to an existing high-level service-based application development life-cycle. Existing software engineering literature as well as research results from service engineering research is reviewed for relevant activities. The result is an adaptation framework that can guide software engineers in developing adaptable service-based applications. © 2013 Springer-Verlag London.
- Published
- 2013
45. Masking vs. Multiparty Computation: How Large Is the Gap for AES?
- Author
-
Grosso, Vincent, Standaert, François-Xavier, Faust, Sebastian, Cryptographic Hardware and Embedded Systems - CHES 2013 - 15th International Workshop, and UCL - SST/ICTM/ELEN - Pôle en ingénierie électrique
- Subjects
Multiplication algorithm ,Speedup ,Computer Networks and Communications ,business.industry ,Computer science ,Computation ,Cryptography ,0102 computer and information sciences ,02 engineering and technology ,01 natural sciences ,Secret sharing ,Masking (Electronic Health Record) ,Data encryptation ,Systems and data security ,MPC ,010201 computation theory & mathematics ,020204 information systems ,Algorithm analysis problem complexity ,MultiParty Computation ,0202 electrical engineering, electronic engineering, information engineering ,business ,Implementation ,Algorithm ,Software ,Randomness - Abstract
In this paper, we evaluate the performance of state-of-the-art higher-order masking schemes for the AES. Doing so, we pay particular attention to the comparison between specialized solutions introduced exclusively as countermeasures against side-channel analysis, and a recent proposal by Roche and Prouff exploiting multiparty computation (MPC) techniques. We show that the additional security features this latter scheme provides (e.g., its glitch-freeness) come at the cost of large performance overheads. We then study how standard optimization techniques from the MPC literature can be used to reduce this gap. In particular, we show that “packed secret sharing” based on a modified multiplication algorithm can speed up MPC-based masking when the order of the masking scheme increases. Finally, we discuss the randomness requirements of masked implementations. For this purpose, we first show with information-theoretic arguments that the security guarantees of masking are only preserved if this randomness is uniform, and analyze the consequences of a deviation from this requirement. We then conclude the paper by including the cost of randomness generation in our performance evaluations. These results should help actual designers choose a masking scheme based on security and performance constraints.
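The basic masking idea being compared can be illustrated with first-order Boolean masking of a table lookup. The affine "S-box" below is a placeholder (a real cipher would use the AES S-box), and the table-recomputation countermeasure shown is a classic generic technique, not the specific higher-order schemes the paper benchmarks. Note how fresh uniform masks per execution are exactly the randomness cost the paper accounts for:

```python
import secrets

# Placeholder bijective 8-bit S-box (gcd(7, 256) = 1, so x -> 7x+3 is invertible)
SBOX = [(x * 7 + 3) % 256 for x in range(256)]

def recompute_table(sbox, m_in, m_out):
    # Shift the table so it operates on masked values directly:
    # T[x ^ m_in] = sbox[x] ^ m_out, so the unmasked x never appears
    table = [0] * 256
    for x in range(256):
        table[x ^ m_in] = sbox[x] ^ m_out
    return table

x = 0xA5
m_in = secrets.randbelow(256)      # masks must be fresh and uniform
m_out = secrets.randbelow(256)
T = recompute_table(SBOX, m_in, m_out)
y_masked = T[x ^ m_in]             # the device only handles masked values
y = y_masked ^ m_out               # unmask only at the very end
```

Every intermediate value the device touches is uniformly distributed on its own, which is precisely the property the paper shows breaks down when the masking randomness deviates from uniform.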
- Published
- 2013
46. Structured multi-class feature selection for effective face recognition
- Author
-
Luca Zini, Giovanni Fusco, Nicoletta Noceti, and Francesca Odone
- Subjects
Lasso (statistics) ,Local binary patterns ,Computer science ,business.industry ,Pipeline (computing) ,Benchmark (computing) ,Point (geometry) ,Feature selection ,Pattern recognition ,Artificial intelligence ,Representation (mathematics) ,business ,Facial recognition system - Abstract
This paper addresses the problem of real-time face recognition in unconstrained environments from the analysis of low-quality video frames. It focuses in particular on finding an effective and fast-to-compute (that is, sparse) representation of faces, starting from classical Local Binary Patterns (LBPs). The two contributions of the paper are a new formulation of Group LASSO for structured feature selection (MC-Group LASSO) that copes directly with multi-class settings, and a face recognition pipeline based on a representation derived from MC-Group LASSO. We present an extensive experimental analysis on two benchmark datasets, MOBO and Choke Point, and on a more complex dataset acquired in-house over a large temporal span. We compare our results with state-of-the-art approaches and show the superiority of our method in terms of both performance and sparseness of the obtained solution.
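The structured-sparsity mechanism at the heart of Group LASSO is its proximal operator, which shrinks each feature group's weight vector as a block, zeroing whole groups at once; that is what makes entire LBP feature groups drop out of the representation. The weights, grouping, and regularization strength below are illustrative, not the paper's MC formulation:

```python
import numpy as np

def group_soft_threshold(w, groups, lam):
    # Proximal operator of the Group LASSO penalty: shrink each group's
    # weight sub-vector toward zero, zeroing entire groups at once
    out = np.zeros_like(w)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > lam:
            out[g] = (1.0 - lam / norm) * w[g]
    return out

# Two feature groups; the second carries little weight and is dropped whole
w = np.array([3.0, 4.0, 0.1, 0.1])
groups = [[0, 1], [2, 3]]
shrunk = group_soft_threshold(w, groups, lam=1.0)
```

Iterating this operator inside a proximal-gradient loop on a classification loss is the standard way Group LASSO problems of this kind are solved.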
- Published
- 2013
47. Breast Ultrasound Image Classification Based on Multiple-Instance Learning
- Author
-
Yingtao Zhang, Jianrui Ding, Jiafeng Liu, Heng-Da Cheng, and Jianhua Huang
- Subjects
Support Vector Machine ,Databases, Factual ,Computer science ,Image quality ,Feature vector ,ComputingMethodologies_IMAGEPROCESSINGANDCOMPUTERVISION ,Breast Neoplasms ,Sensitivity and Specificity ,Article ,Pattern Recognition, Automated ,Image Interpretation, Computer-Assisted ,medicine ,Cluster Analysis ,Humans ,Radiology, Nuclear Medicine and imaging ,Computer vision ,Breast ultrasound ,Radiological and Ultrasound Technology ,Contextual image classification ,medicine.diagnostic_test ,business.industry ,Pattern recognition ,Speckle noise ,Image segmentation ,Models, Theoretical ,Computer Science Applications ,Support vector machine ,ComputingMethodologies_PATTERNRECOGNITION ,ROC Curve ,Pattern recognition (psychology) ,Female ,Artificial intelligence ,Ultrasonography, Mammary ,business ,Algorithms - Abstract
Breast ultrasound (BUS) image segmentation is a very difficult task due to poor image quality and speckle noise. In this paper, local features extracted from roughly segmented regions of interest (ROIs) are used to describe breast tumors. The roughly segmented ROI is viewed as a bag, and subregions of the ROI are considered the instances of the bag. A multiple-instance learning (MIL) method is therefore well suited to classifying breast tumors in BUS images. However, due to the complexity of BUS images, traditional MIL methods are not directly applicable. In this paper, a novel MIL method is proposed for this task. First, a self-organizing map is used to map the instance space to a concept space. Then, the distribution of each bag’s instances in the concept space is used to construct the bag feature vector. Finally, a support vector machine is employed to classify the tumors. The experimental results show that the proposed method achieves good performance: the accuracy is 0.9107 and the area under the receiver operating characteristic curve is 0.96 (p
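The instance-space-to-concept-space step can be sketched with any vector quantiser. For brevity the sketch below substitutes k-means for the paper's self-organizing map, and the 2-D "instances" are synthetic; the resulting bag histogram is the kind of fixed-length feature vector that would be passed to the SVM:

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=20):
    # Stand-in quantiser for the paper's self-organizing map: anything that
    # maps instance space onto a small "concept space" serves the sketch
    centers = X[:: max(1, len(X) // k)][:k].copy()   # spread-out initialisation
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    return centers

def bag_feature(bag, centers):
    # Bag-level feature: histogram of the bag's instances over concept nodes
    labels = np.argmin(((bag[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    hist = np.bincount(labels, minlength=len(centers)).astype(float)
    return hist / hist.sum()

# Toy instances: two well-separated "tissue" regions
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)), rng.normal(5.0, 0.3, (50, 2))])
centers = kmeans(X, k=2)
bag = rng.normal(0.0, 0.3, (20, 2))        # a bag drawn from one region
feat = bag_feature(bag, centers)           # feed this vector to an SVM
```

Because every bag yields a histogram of the same length regardless of how many instances it contains, a standard SVM can be trained directly on the bag features.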
- Published
- 2012
48. Building the IJSO: An International Editorial Perspective
- Author
-
David A. Rew
- Subjects
Impact factor ,business.industry ,Computer science ,Citation index ,Scopus ,Bibliometrics ,Public relations ,World literature ,Editorial ,Oncology ,Publishing ,Surgery ,Sphere of influence ,business ,Citation - Abstract
The IJSO is a new peer reviewed journal for a specialist professional audience. All new journals face challenges in developing a vision and strategy, and establishing a style and a “Unique Selling Proposition” (USP). The USP establishes the place of the journal in the world literature and in the minds of potential authors and readers, and it is sensible for the Editorial Board to have a clear business plan and appropriate targets by which to measure progress. A Society journal such as the IJSO exists to serve the interests of the society members and subscribers, and to provide a platform for the exchange of information in that professional community through the peer reviewed publication process. As the journal grows, so its influence and academic weight may increase, and it can take on wider challenges with greater ambition. As Medical Subject Chair of the Elsevier SCOPUS Content Selection and Advisory Board (CSAB), I am privileged to assess several hundred specialist journal titles which are seeking accession to this major citation index each year, from a wide variety of regional and national institutions, and with a wide range in quality. In the process, I have learned much about the variation in standards of publication, and about what it takes to build a successful journal. So far as the future of the IJSO is concerned, there are a number of key points to make that will help the journal succeed. Firstly, the international professional surgical oncology community does not need another “me too” surgical oncology journal. Surgical Oncology already has an abundance of such journals, including the World Journal (WJSO), the European Journal (EJSO), the Journal of Surgical Oncology (JSO) and the Annals of Surgical Oncology (ASO). These journals already struggle to find new and original material, and many published papers are never cited and probably rarely read. Any new journal must find an original strategy. 
Secondly, many journals suffer from poor, unimaginative editing; their content is bloated with words and data and is difficult to read. The general reader will always prefer the article with a clear, meaningful title, an informative abstract, and concise content with a clean, simple, direct style and well-argued, thoughtful, self-critical conclusions. Authors and editors should always write with the reader’s time and attention in mind. Thirdly, modern publishing must strike a balance between the printed journal, whose production process dictates the quality standards, and the demands of the Internet, which secures the widest distribution of content and which determines the citations of papers and authors. It is important that a journal has an excellent Internet distribution platform, preferably provided by a major publisher such as Springer, and that particular editorial attention is given to clear, informative titles and structured abstracts: after all, these are often the most that browsers and potential readers will ever see of the journal. Citations are taken to be an indirect hallmark of quality. Quality is a measure of trustworthiness and respect for a piece of published work, for its authors, and for the journal in which the work is published. Each reference or citation thus represents, in effect, a positive vote for that work. Citation measures are the summation of references to published articles in other articles, and the science of bibliometrics exists for the statistical analysis of citations. The Impact Factor of a journal and the h-index of an author are two of many such bibliometric indices. Data systems such as Elsevier’s SCOPUS and Thomson Reuters’ Web of Science reflect huge financial and intellectual investments to understand and exploit the knowledge derived from bibliometrics. 
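The two indices named above are simple to state. An author's h-index is the largest h such that the author has h papers each cited at least h times; a journal's two-year Impact Factor is the number of citations in a given year to items the journal published in the previous two years, divided by the number of citable items it published in those two years. A minimal sketch of both calculations (the function names and the sample numbers are illustrative, not drawn from the editorial):

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, cited in enumerate(counts, start=1):
        if cited >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

def impact_factor(citations_to_prev_two_years, citable_items_prev_two_years):
    """Two-year Impact Factor: citations this year to the previous two
    years' items, divided by the number of citable items in those years."""
    return citations_to_prev_two_years / citable_items_prev_two_years

# Illustrative figures only:
print(h_index([10, 8, 5, 4, 3]))  # 4 (four papers with >= 4 citations)
print(impact_factor(250, 100))    # 2.5
```

The rank-versus-citations comparison makes explicit why a handful of highly cited papers cannot raise an h-index on their own: the index grows only when the whole sorted tail keeps pace.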
Local and regional subspeciality journals such as the IJSO are thus most likely to make an impact, and to be read and cited by other authors, if they publish material of high quality which is readable, accessible, trustworthy, and of unique interest. They should focus on local and regional professional issues and challenges, and develop their own identity. An effort to compete in nature and style of content with the best-established journals in the field is unlikely to succeed: the journal will not be widely read, authors will be disappointed, and the Impact Factor will languish. Conversely, a subspeciality journal in a developing country which takes courageous steps in identifying challenges and problems of health and service provision in its own region, and which commissions discussion and argument as to possible solutions, will be read with considerable interest in the wider world, and will secure credibility and influence. Herein, in my view, lies a marvellous opportunity for the IJSO and for the surgical oncology community of India. The IJSO represents a community of educated and ambitious professionals for whom English is the common language, just as it is the world’s common language of scientific communication. The problems of providing and delivering an acceptable standard of care to a growing and aging population of some 1,200 million Indians, 900 million of whom are currently able to access and afford only the most basic health care, are enormous, and provide fertile ground for imaginative writing in the surgical oncology specialities, as in all other health disciplines. How can the Indian surgical profession expand to meet those needs, and provide training and career opportunities for surgeons that will make working among rural communities as attractive as working among metro elites? 
How can well-established Western practices such as multidisciplinary cancer team working, governance, and professional oversight be adapted to local conditions, and how can information flow around the mixed private-public sector surgical health economy, which is very different from that in many Western countries? How can the skills of the Indian IT sector be applied to remedying the huge information gaps that still exist in our understanding of the consequences of our inputs and outcomes in cancer treatment, and how can we use the new technologies of television, the Internet, and mobile telephony to educate and engage with the cancer needs of increasingly interconnected rural communities? How can we address huge problems such as provision for palliative and terminal care, which are often held to be the hallmark of a caring society? Moreover, if “I” stands for “Indian Subcontinent” rather than “Indian”, then the journal's sphere of influence and academic opportunity is substantially increased to include Pakistan, Bangladesh, Sri Lanka, and the Himalayan states. Further afield, China and many countries in Africa, Asia, and South America face health care delivery problems similar to those faced in India. Indeed, of the six billion or so people who inhabit the planet, only one billion or so enjoy the living standards of the most advanced economies and metro elites. The moral and practical obligations to meet the needs and rising expectations of the remainder create fertile ground for intelligent authorship and debate in a new professional journal, which now has the opportunity to develop a substantial sphere of influence. Thus, in summary, I believe that the IJSO has a prosperous future ahead if it develops a unique identity, and if it focuses on the search for regional solutions to regional challenges rather than on slavish replication of the work of surgical oncologists in the advanced economies. 
Over the next few years, it will need to build solid foundations, with which to secure the credible track record necessary to register on the major citation indices. If it succeeds in these challenges, then the lives of countless people around the world will be enhanced. I wish the Editorial Board of the IJSO well as the seeds of its ambition take root.
- Published
- 2012
49. Strategic Planning, Environmental Dynamicity and Their Impact on Business Model Design: The Case of a Mobile Middleware Technology Provider
- Author
-
Andrea Rangone, Antonio Ghezzi, and Raffaello Balocco
- Subjects
Strategic planning ,Knowledge management ,Process management ,business.industry ,Process (engineering) ,Computer science ,Digital content ,Middleware ,Provisioning ,Strategic management ,Mobile telephony ,Business model ,business - Abstract
The study addresses how the approach to strategy definition, strategic planning, and external dynamicity can affect the process of business model design within a firm. Analyzing the case of a newcomer to the market for the provisioning of middleware platforms enabling the delivery of mobile digital content, the paper first identifies a set of critical decisions to be made at the business model design level for a Mobile Middleware Technology Provider, and aggregates these variables within an overall reference framework; then, through a longitudinal case study, it assesses how and why the initial business model configuration changed after two years of business activity. The study allows us to argue that the business model design changes that occurred in the timeframe considered depended largely on a defective approach to business strategy definition, while environmental dynamicity mostly exerted an “amplification effect” on the mistakes made in the underlying strategic planning process.
- Published
- 2011
50. Challenges and trends in wireless ubiquitous computing systems
- Author
-
Anis Koubaa, Elhadi M. Shakshuki, Abdelfettah Belghith, and Repositório Científico do Instituto Politécnico do Porto
- Subjects
Ubiquitous robot ,Context-aware pervasive systems ,Ubiquitous computing ,Computer science ,Wireless ad hoc network ,Mobile computing ,Management Science and Operations Research ,computer.software_genre ,Autonomic computing ,Wireless ,Group key ,Multimedia ,business.industry ,Quality of service ,User Interfaces and Human Computer Interaction ,Energy consumption ,Computer Science Applications ,Hardware and Architecture ,Computer Science ,The Internet ,Personal Computing ,business ,computer ,Wireless sensor network ,Computer network - Abstract
In the last decade, the Internet paradigm has been evolving toward a new frontier with the emergence of ubiquitous and pervasive systems, including wireless sensor networks, ad hoc networks, RFID systems, and wireless embedded systems. In fact, while the initial purpose of the Internet was to interconnect computers to share digital data at large scale, the current tendency is to enable ubiquitous and pervasive computing to control everything anytime and at a large scale. This new paradigm has given rise to a new generation of networked systems, commonly known as the Internet of Things or Cyber-Physical Systems. The research community has actively investigated the underlying challenges pertaining to these systems, as they fundamentally differ from the classical problems due to their inherent constraints. This special issue presents six papers covering various topics in personal and ubiquitous computing. The first paper, entitled ‘‘RiSeG: A Ring Based Secure Group Communication Protocol for Wireless Sensor Networks’’, deals with security in ubiquitous systems. The paper presents a new approach for secure group management in wireless sensor networks. The proposed approach is based on a logical ring architecture, which alleviates the group controller’s task of updating the group key. The proposed scheme also provides backward and forward secrecy, addresses the node compromise attack, and gives a solution to detect and eliminate compromised nodes. The authors evaluated their scheme in terms of storage, computation, and communication costs and compared its behavior against the Logical Key Hierarchy (LKH) scheme. RiSeG was shown to require less storage and to reduce computation and communication costs at the group controller compared with LKH. 
The paper goes beyond theoretical work and proposes a real-world implementation, which proved that RiSeG is applicable to WSNs and showed that performance results in terms of execution time, energy consumption, and memory consumption are satisfactory. The second paper, entitled ‘‘Stability routing with constrained path length for improved routability in dynamic MANETs’’, addresses the problem of enhancing routing validity in mobile wireless multi-hop ad hoc networks. In proactive routing, routes are established and updated periodically, and consequently lose their pertinence as time ticks away from the start of their updating instants. The authors argue that routability, defined as the validity of established routes, stands among the most central metrics impacting network efficiency. Indeed, in dynamic networks, traffic routed through invalid routes not only amounts to a waste of valuable resources but will never be delivered to its destination. In the quest to improve routability in dynamic networks, the authors considered a two-constrained QoS routing problem with one superlative constraint and one comparative constraint. In its general form, this is an NP-hard problem. The authors proposed a novel exact algorithm and then instantiated the problem to solve for the optimal integer weight-constrained path length.
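The paper's exact algorithm is not reproduced in this abstract. As a simplified illustration of the kind of constrained routing problem described (not the authors' method), a single hop-count constraint on a minimum-weight path can be handled exactly by running Dijkstra's algorithm over states augmented with the number of hops used; all graph data and names below are hypothetical:

```python
import heapq

def constrained_shortest_path(graph, src, dst, max_hops):
    """Minimum-weight path cost from src to dst using at most max_hops edges.

    graph: dict mapping node -> list of (neighbor, weight) pairs.
    Returns the best cost, or None if dst is unreachable within the budget.
    Each search state is (cost, node, hops), so the hop constraint is
    enforced alongside the weight objective. Assumes nonnegative weights.
    """
    best = {}                      # (node, hops) -> cheapest cost seen
    heap = [(0, src, 0)]
    while heap:
        cost, node, hops = heapq.heappop(heap)
        if node == dst:
            return cost            # first pop of dst is the cheapest feasible cost
        if hops == max_hops:
            continue               # hop budget exhausted on this state
        for nxt, w in graph.get(node, []):
            state = (nxt, hops + 1)
            if cost + w < best.get(state, float("inf")):
                best[state] = cost + w
                heapq.heappush(heap, (cost + w, nxt, hops + 1))
    return None

# Hypothetical topology: the cheap a->b->c route needs two hops.
g = {"a": [("b", 1), ("c", 10)], "b": [("c", 1)]}
print(constrained_shortest_path(g, "a", "c", 2))  # 2
print(constrained_shortest_path(g, "a", "c", 1))  # 10
```

Augmenting the state space this way keeps the search exact at the cost of up to max_hops copies of each node, which is why the general multi-constrained version the paper tackles becomes NP-hard.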
- Published
- 2011