1,361 results
Search Results
52. Mining Intelligence and Knowledge Exploration : Third International Conference, MIKE 2015, Hyderabad, India, December 9-11, 2015, Proceedings.
- Author
Prasath, Rajendra, Vuppala, Anil Kumar, and Kathirvalavakumar, T.
- Subjects
Artificial intelligence, Data mining, Information storage and retrieval, Application software, Optical data processing, Algorithms, Artificial Intelligence, Data Mining and Knowledge Discovery, Information Storage and Retrieval, Information Systems Applications (incl. Internet), Computer Imaging, Vision, Pattern Recognition and Graphics, Algorithm Analysis and Problem Complexity
- Abstract
Summary: This book constitutes the refereed proceedings of the Third International Conference on Mining Intelligence and Knowledge Exploration, MIKE 2015, held in Hyderabad, India, in December 2015. The 48 full papers and 8 short papers presented together with 4 doctoral consortium papers were carefully reviewed and selected from 185 submissions. The papers cover a wide range of topics including information retrieval, machine learning, pattern recognition, knowledge discovery, classification, clustering, image processing, network security, speech processing, natural language processing, language, cognition and computation, fuzzy sets, and business intelligence.
- Published
- 2015
53. Identifying Risk Factors in Implementing ERP Systems in Small Companies
- Abstract
Some risk factors exist within the implementation process of an ERP system in small companies. However, researchers hold differing views on the impacts that ERP system implementations have. In fact, there are relatively few empirically based studies of ERP implementation in small companies and its impact, as most such studies focus on larger companies. This paper is based on a case study at a small company. The aim of the paper is to explore the risks a small company faces when planning to implement an ERP system. The analysis shows that an ERP system is a good solution for avoiding systems that are not integrated. An ERP system can integrate all information in a single system, where it can easily be accessed. The implementation therefore leads to decreasing costs in the daily work, as activities and processes can be performed more effectively and efficiently.
- Published
- 2020
- Full Text
- View/download PDF
54. The Dark Sides of Technology : Barriers to Work-Integrated Learning
- Abstract
Digitalization and technology are interventions seen as a solution to the increasing demand for healthcare services, but the associated changes to the services are characterized by multiple challenges. A work-integrated learning approach implies that the learning outcome is related to the learning environment and the learning affordances available at the actual workplace. To shape workplace affordances, it is of great importance to gain a deeper understanding of the social practices. This paper explores a wide range of managers' and professionals' emotions, moods and feelings related to digitalization and new ways of providing healthcare services, as well as the professionals' knowledge and experiences. Zhang's affective response model (ARM) is used as a systematic approach and framework to gain knowledge of how professionals and managers experience the digitalization of municipal health services. The research question is: How can knowledge about the dark sides of technology reduce barriers to work-integrated learning? This paper is based on a longitudinal study with a qualitative approach, with focus group discussions used as the method for collecting data. The findings and themes crystallized through the content analysis were then applied to the affective response model as a systematic approach to gain more knowledge about professionals' and managers' experiences and how that knowledge can reduce the barriers to work-integrated learning. Understanding of, and consciousness about, the dark sides of technology and the professionals' affective responses may support the digitalization of the sector and the development of new ways of providing healthcare services.
- Published
- 2020
- Full Text
- View/download PDF
55. Leveraging Patient Preference Information in Medical Device Clinical Trial Design
- Abstract
Use of robust, quantitative tools to measure patient perspectives within product development and regulatory review processes offers the opportunity for medical device researchers, regulators, and other stakeholders to evaluate what matters most to patients and support the development of products that can best meet patient needs. The medical device innovation consortium (MDIC) undertook a series of projects, including multiple case studies and expert consultations, to identify approaches for utilizing patient preference information (PPI) to inform clinical trial design in the US regulatory context. Based on these activities, this paper offers a cogent review of considerations and opportunities for researchers seeking to leverage PPI within their clinical trial development programs and highlights future directions to enhance this field. This paper also discusses various approaches for maximizing stakeholder engagement in the process of incorporating PPI into the study design, including identifying novel endpoints and statistical considerations, crosswalking between attributes and endpoints, and applying findings to the population under study. These strategies can help researchers ensure that clinical trials are designed to generate evidence that is useful to decision makers and captures what matters most to patients.
- Published
- 2022
56. Aggregating Fuzzy Sentiments with Customized QoS Parameters for Cloud Provider Selection Using Fuzzy Best Worst and Fuzzy TOPSIS
- Abstract
Consumers often struggle to select the best cloud provider from a huge marketplace. Their hesitancy escalates further when multiple service providers offer the same type and quality of services. To deal with such uncertainty, decision-makers combine multiple factors to make an informed choice. Sentiment mining is one of the key parameters for determining service quality and gaining insight into the business. It assists service providers in precisely deducing consumers' emotions regarding a product. The analysis helps providers fine-tune the product based on consumer sentiment and accommodates the consumer's request to find an optimal service provider. Most existing literature on cloud service selection focuses mainly on the Quality of Service (QoS) of the offered services. However, very few studies have considered the user experience of a consumer in the decision-making process, and there is minimal literature that amalgamates Quality of Experience (QoE) with customized Quality of Service (QoS) requirements in a single decision framework. This paper addresses the issue by aggregating consumers' sentiments with customized QoS parameters to choose an optimal service provider. It uses the fuzzy Best Worst Method (BWM) to determine the weights of the selection criteria and fuzzy TOPSIS to handle uncertain linguistic preferences. Analysis results demonstrate the applicability and effectiveness of the framework.
- Published
- 2022
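The ranking step the abstract describes, criterion weights feeding a TOPSIS ranking of providers, can be illustrated with a crisp (non-fuzzy) sketch. The provider scores, criteria, and weights below are invented for illustration; a faithful implementation would use fuzzy numbers for both the BWM weighting and the TOPSIS ranking.

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """Rank alternatives with classical (crisp) TOPSIS.

    matrix:  alternatives x criteria scores
    weights: criterion weights summing to 1
    benefit: True where higher is better, False for cost criteria
    """
    # Vector-normalize each criterion column, then apply the weights.
    norm = matrix / np.linalg.norm(matrix, axis=0)
    v = norm * weights
    # Ideal best/worst per criterion depend on benefit vs. cost direction.
    best = np.where(benefit, v.max(axis=0), v.min(axis=0))
    worst = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - best, axis=1)
    d_worst = np.linalg.norm(v - worst, axis=1)
    # Relative closeness to the ideal; higher means a better provider.
    return d_worst / (d_best + d_worst)

# Hypothetical providers scored on availability, sentiment score, price.
providers = np.array([
    [0.99, 0.80, 30.0],
    [0.95, 0.90, 20.0],
    [0.97, 0.70, 25.0],
])
weights = np.array([0.5, 0.3, 0.2])        # e.g. produced by BWM
benefit = np.array([True, True, False])    # price is a cost criterion
scores = topsis(providers, weights, benefit)
print(scores.argmax())  # index of the preferred provider
```

Here the sentiment column stands in for the aggregated QoE signal; the paper's fuzzy variants replace each crisp score with a fuzzy number and defuzzify at the end.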
57. Analytic Langlands correspondence for $$PGL_2$$ on $${\mathbb {P}}^1$$ with parabolic structures over local fields
- Abstract
We continue to develop the analytic Langlands program for curves over local fields initiated in our earlier papers, following a suggestion of Langlands and a work of Teschner. Namely, we study the Hecke operators which we introduced in those papers in the case of a projective line with parabolic structures at finitely many points for the group $$PGL_2$$. We establish most of our conjectures in this case.
- Published
- 2022
58. Understanding an Integrated Management System in a Government Agency – Focusing Institutional Carriers
- Abstract
Working with an integrated management system (IMS) is a challenging task. In public organizations, the formalization of an IMS, including the communication of control mechanisms, rules, goals and culture, is crucial. Several types of carriers, both human actors and artefacts, are used to communicate the content of an IMS. The artefact studied in this paper is an intranet, as one carrier of the IMS. The purpose of this paper is to explore how institutional theory, focusing on institutional carriers, can help us understand how an IMS is represented through human actors and technology in a government agency. The conclusion is that applying an institutional carrier perspective to an IMS can help us understand the past and present, the role, and the relative success of such a system. An IMS can be aligned or misaligned along three dimensions: structure, process and people. Achieving an aligned and legitimate IMS is crucial for achieving an organization's goals. The implication of this study is that further research and practice should give more attention to institutional carriers when studying and improving IMSs. This study is partially financially supported by an anonymized government agency.
- Published
- 2018
- Full Text
- View/download PDF
63. Finding Code-Clone Snippets in Large Source-Code Collection by ccgrep
- Abstract
Finding the same or similar code snippets in source code for a query code snippet is one of the fundamental activities in software maintenance. Code clone detectors detect the same or similar code snippets, but they report all code clone pairs in the target, which is generally excessive for users. In this paper, we propose ccgrep, a token-based pattern matching tool built on the notion of code clone pairs. The user simply inputs a code snippet as a query, specifies the target source code, and gets the matched code snippets as the result. The query and the result snippets form clone pairs. The use of special tokens (named meta-tokens) in the query allows the user to have precise control over the matching. It works for source code in C, C++, Java, and Python on Windows or Unix with practical scalability and performance. The evaluation results show that ccgrep is effective in finding intended code snippets in large Open Source Software.
- Published
- 2021
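The core idea, grep over token sequences rather than characters, with wildcard tokens standing in for varying identifiers, can be sketched in a few lines. This is a toy: the `?` wildcard, the tokenizer, and the example code are invented here, and the real ccgrep offers a much richer meta-token language.

```python
import re

# Toy token-level grep: "?" in the query matches any single token,
# loosely mimicking ccgrep's meta-tokens (the real tool is far richer).
TOKEN = re.compile(r"[A-Za-z_]\w*|\d+|\S")

def tokens(code):
    """Split code into identifiers, numbers, and single-character symbols."""
    return TOKEN.findall(code)

def token_grep(query, source):
    """Yield token offsets where the query tokens match the source tokens."""
    q, s = tokens(query), tokens(source)
    for i in range(len(s) - len(q) + 1):
        if all(qt == "?" or qt == st for qt, st in zip(q, s[i:])):
            yield i

# Two clone instances of the same pattern with different identifiers.
source = "if (a < 0) return -a; if (b < 0) return -b;"
hits = list(token_grep("if ( ? < 0 ) return - ? ;", source))
print(hits)  # [0, 10]: both clone instances match
```

Because matching happens on normalized tokens, whitespace and identifier renaming no longer break the match, which is exactly what separates this from plain textual grep.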
64. An Improved Algorithm for Fast K-Word Proximity Search Based on Multi-Component Key Indexes
- Abstract
A search query consists of several words. In a proximity full-text search, we want to find documents that contain these words near each other. This task requires considerable time when the query consists of frequently occurring words. If we cannot avoid this task by excluding frequently occurring words from consideration, declaring them stop words, then we can optimize our solution by introducing additional indexes for faster execution. In a previous work, we discussed how to decrease the search time with multi-component key indexes, and showed that additional indexes can improve the average query execution time by up to 130 times when queries consist of frequently occurring words. In this paper, we present another search algorithm that overcomes some limitations of our previous algorithm and provides an even greater performance gain.
- Published
- 2021
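The baseline that such multi-component key indexes accelerate can be sketched as a positional inverted index with a brute-force proximity check. The documents, terms, and window size below are invented for illustration; the paper's contribution is precisely avoiding the inner position scan for high-frequency terms.

```python
from collections import defaultdict

def build_index(docs):
    """Positional inverted index: term -> doc id -> list of positions."""
    index = defaultdict(lambda: defaultdict(list))
    for doc_id, text in enumerate(docs):
        for pos, term in enumerate(text.lower().split()):
            index[term][doc_id].append(pos)
    return index

def proximity_search(index, terms, window):
    """Doc ids in which every term occurs within `window` positions
    of some occurrence of the first term."""
    hits = []
    doc_sets = [set(index[t]) for t in terms]
    for doc_id in set.intersection(*doc_sets):
        positions = [index[t][doc_id] for t in terms]
        # Brute force over the first term's occurrences; this inner scan
        # is the expensive part for frequently occurring words.
        for p in positions[0]:
            if all(any(abs(q - p) <= window for q in ps)
                   for ps in positions[1:]):
                hits.append(doc_id)
                break
    return sorted(hits)

docs = ["the quick brown fox jumps",
        "fox is quick to report",
        "slow brown bear"]
print(proximity_search(build_index(docs), ["quick", "fox"], 2))  # [0, 1]
```

For stop-word-like terms the position lists become huge, which is why precomputed multi-word (multi-component) keys pay off.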
65. Automated Discovery of Process Models with True Concurrency and Inclusive Choices
- Abstract
Enterprise information systems allow companies to maintain detailed records of their business process executions. These records can be extracted in the form of event logs, which capture the execution of activities across multiple instances of a business process. Event logs may be used to analyze business processes at a fine level of detail using process mining techniques. Among other things, process mining techniques allow us to discover a process model from an event log – an operation known as automated process discovery. Despite a rich body of research in the field, existing automated process discovery techniques do not fully capture the concurrency inherent in a business process. Specifically, the bulk of these techniques treat two activities A and B as concurrent if sometimes A completes before B and other times B completes before A. Typically though, activities in a business process are executed in a true concurrency setting, meaning that two or more activity executions overlap temporally. This paper addresses this gap by presenting a refined version of an automated process discovery technique, namely Split Miner, that discovers true concurrency relations from event logs containing start and end timestamps for each activity. The proposed technique is also able to differentiate between exclusive and inclusive choices. We evaluate the proposed technique relative to existing baselines using 11 real-life logs drawn from different industries.
- Published
- 2021
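The distinction the abstract draws, temporal overlap versus mere variation in completion order, reduces to an interval-overlap test once each activity carries start and end timestamps. The sketch below is a minimal illustration with an invented trace, not the Split Miner algorithm itself.

```python
def concurrency_oracle(log):
    """Classify activity pairs as truly concurrent when their
    executions overlap in time in at least one trace.

    log: list of traces; each trace is a list of
         (activity, start, end) tuples.
    """
    concurrent = set()
    for trace in log:
        for i, (a, sa, ea) in enumerate(trace):
            for b, sb, eb in trace[i + 1:]:
                # Temporal overlap of the two execution intervals,
                # not just differing completion order across traces.
                if a != b and sa < eb and sb < ea:
                    concurrent.add(frozenset((a, b)))
    return concurrent

# Hypothetical trace: A and B overlap in time; C starts after both end.
trace = [("A", 0, 5), ("B", 3, 8), ("C", 9, 12)]
print(concurrency_oracle([trace]))  # only the A-B pair is concurrent
```

Order-based discovery would also call two activities concurrent if A sometimes finishes before B and sometimes after; the interval test above only fires when their executions genuinely coexist.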
66. Innovative Blockchain-Based Applications - State of the Art and Future Directions
- Abstract
Recently, blockchain technology has increasingly been used to provide a secure environment that is immutable, consensus-based and transparent in the financial technology world. Significant efforts have also been made to use blockchain in other fields where trust and transparency are required, since the distributed power and embedded security of blockchain can make operations in other domains immutable, transparent, and trustworthy. However, most of the published literature on blockchain technology centers on crypto-currencies. This paper addresses that gap and presents several applications in many fields, including education, health, carbon credits, robotics, energy, pharmaceutical supply chains, identity management, and crypto-currency wallets. The paper gives an overview of blockchain technology, discusses the innovation of blockchain technology based on the number of applications that have been introduced, describes the challenges associated with blockchain technology, and makes suggestions for future work.
- Published
- 2021
67. Combining Heterogeneous Indicators by Adopting Adaptive MCDA: Dealing with Uncertainty
- Abstract
Adaptive MCDA systematically supports the dynamic combination of heterogeneous indicators to assess overall performance. The method is completely generic and is currently adopted in a number of studies in the area of sustainability. The intrinsic heterogeneity characterizing this kind of analysis leads to a number of biases, which need to be properly considered and understood to correctly interpret computational results in context. While on one side the method provides a comprehensive data-driven analysis framework, on the other side it introduces a number of uncertainties that are the object of discussion in this paper. Uncertainty is approached holistically, meaning that we address all uncertainty aspects introduced by the computational method to deal with the different biases. As discussed extensively in the paper, by identifying the uncertainty associated with the different phases of the process and by providing metrics to measure it, the interpretation of results becomes more consistent, transparent and, therefore, reliable.
- Published
- 2021
68. Learning from Selembao: An Alternative Approach to Kinshasa’s Urbanization, Using the Concept of Mboka Bilanga
- Abstract
According to most technicians, the uninterrupted urbanized areas in the outskirts of the city of Kinshasa are systematically categorized as peri-urban areas. An on-site exploration of Selembao, a south-western municipality of Kinshasa, pushes us to go beyond this categorization towards the recognition of a specific kind of urbanization through a specific descriptive concept: Mboka Bilanga. The first section of this paper aims at framing the peri-urban development terminology as it has been applied to the African continent. A selective and thematic literature review reveals a shared vision of peri-urban areas understood as satellite, incomplete and unstable territories, which are expected to leverage their relationship of dependence with the city centre. The following section proposes a paradigm shift outlining an alternative analytical frame for Kinshasan urbanization: the Mboka Bilanga. Here, we assess the necessity of going beyond the urban/rural opposition in a descriptive effort to depict a general social transformation rather than a mere physiological dynamic of expansion. In the third section of this paper, examining the case of Selembao, we seek to underline very specific urbanization dynamics. The observation of the urban phenomenon of Selembao follows an interpretative framework seeking to identify the traces of urbanity through place-based solidarities. What emerges from the case study is a set of survival tactics, based on informal economic relations, pointing out a few features of what we propose to call Mboka Bilanga urbanization.
- Published
- 2021
69. Iterated Local Search with Neighbourhood Reduction for the Pickups and Deliveries Problem Arising in Retail Industry
- Abstract
The paper studies a vehicle routing problem with simultaneous pickups and deliveries that arises in the retail sector, considering a heterogeneous fleet of vehicles, time windows on the demands, practical restrictions on the drivers, and a roster specifying the order of vehicle loading at the depot. The high competition in this industry requires that a viable optimisation approach achieve a good balance of solution time, quality and robustness. In this paper, a novel iterated local search algorithm is proposed which dynamically reduces the neighbourhood so that only the most promising moves are considered. The results of computational experiments on real-world data demonstrate the high efficiency of the presented optimisation procedure in terms of computation time, stability and solution quality.
- Published
- 2021
71. Building Smart Healthy Inclusive Environments for All Ages with Citizens
- Abstract
The paper provides an introduction to the public discourse around the notion of smart healthy inclusive environments. First, the basic ideas are explained and related to citizen participation in the context of implementation of a "society for all ages" concept disseminated by the United Nations. Next, the text discusses selected initiatives of the European Commission in the field of intergenerational programming and policies as well as features of the COST Action NET4Age-Friendly: Smart Healthy Age-Friendly Environments (SHAFE). The following sections are focused on studying and discussing examples of projects and methodologies that have been aimed at: empowering facilitators of smart healthy inclusive environments, empowering citizens to deal with health emergencies, and supporting older people's voices. The conclusion covers selected recommendations for entities of public policy on ageing (ageing policy) as well as potential directions for further research.
- Published
- 2021
72. Nanotechnology
- Abstract
My group got interested in designing nanoscale systems that could self-organize in the 1980s. This led to papers on molecular machines, where we carried out MD for systems designed by others. Usually they did not work so well because the atom sizes are discrete. One of my graduate students (Ching-Hwa Kiang) discovered the first single wall nanotube, which led to papers characterizing them and the mechanism by which they formed. We also had some collaborations with the Atwater group.
- Published
- 2021
73. Smart Lamp or Security Camera? Automatic Identification of IoT Devices
- Abstract
The tsunami of connectivity brought by the Internet of Things is rapidly revolutionising several sectors, ranging from industry and manufacturing to home automation, healthcare and many more. When it comes to enforcing security within an IoT network such as a smart home, there is a need to automatically recognise the type of each joining device in order to apply the right security policy. In this paper, we propose a method for identifying IoT device types based on natural language processing (NLP), text classification, and web search engines. We implement a proof of concept and test it against 33 different IoT devices. With a success rate of 88.9% for BACnet and 87.5% for MUD devices, our experiments show that we can efficiently and effectively identify different IoT devices.
- Published
- 2021
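The classification step can be illustrated with a deliberately simple keyword-overlap sketch. The device-type profiles, keywords, and device description below are all invented for illustration; the paper's actual pipeline uses NLP, trained text classifiers, and web search results rather than hand-written keyword sets.

```python
# Toy device-type classifier: match tokens from a device's self-reported
# description against hand-written keyword profiles. Purely illustrative;
# a real system would learn these associations from data.
PROFILES = {
    "camera": {"camera", "video", "surveillance", "lens"},
    "lamp": {"lamp", "light", "bulb", "dimmer"},
    "thermostat": {"thermostat", "temperature", "hvac"},
}

def identify(description):
    """Return the device type whose keyword profile best overlaps the
    description, or None if no profile matches at all."""
    words = set(description.lower().split())
    best = max(PROFILES, key=lambda t: len(PROFILES[t] & words))
    return best if PROFILES[best] & words else None

print(identify("smart wifi security camera with night video"))  # camera
```

The interesting engineering in the paper lies in obtaining good description text in the first place (e.g. via web searches on device identifiers), since joining devices rarely announce their type directly.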
74. Trigger Selection Strategies to Stabilize Program Verifiers
- Abstract
SMT-based program verifiers often suffer from the so-called butterfly effect, in which minor modifications to the program source cause significant instabilities in verification times, which in turn may lead to spurious verification failures and a degraded user experience. This paper identifies matching loops (ill-behaved quantifiers causing an SMT solver to repeatedly instantiate a small set of quantified formulas) as a significant contributor to these instabilities, and describes some techniques to detect and prevent them. At their core, the contributed techniques move the trigger selection logic away from the SMT solver and into the high-level verifier: this move allows authors of verifiers to annotate, rewrite, and analyze user-written quantifiers to improve the solver’s performance, using information that is easily available at the source level but would be hard to extract from the heavily encoded terms that the solver works with. The paper demonstrates three core techniques (quantifier splitting, trigger sharing, and matching loop detection) by extending the Dafny verifier with its own trigger selection routine, and demonstrates significant predictability and performance gains on both Dafny’s test suite and large verification efforts using Dafny.
- Published
- 2021
75. Analyzing a socially responsible closed-loop distribution channel with recycling facility
- Abstract
This paper deals with a closed-loop distribution channel consisting of a socially responsible manufacturer, multiple retailers and a third-party collector. In reality, collection of used products (plastic, glass, metal) by a third-party collector is more common than collection through retailers, because retailers generally face difficulties such as lack of space and manpower. Aligned with many closed-loop supply chains, this paper assumes that the third party operates the reverse channel by collecting the used products. The third party collects used products, segregates recyclable items and sends them to the manufacturer for further use. The manufacturer not only shows social responsibility to stakeholders and shareholders, but also collects the used products from the third party and recycles them into new products. Considering the profit-maximizing motives of the channel members, the paper examines the effect of the manufacturer's degree of social responsibility on the collection activity of the third party. Under a manufacturer-Stackelberg game setting, it is found that product recycling is directly proportional to the manufacturer's corporate social responsibility (CSR) concerns and that there is a threshold level of recycling for the optimal benefit that can be acquired through CSR practice. The proposed model is illustrated by a numerical example, and a sensitivity analysis reveals the nature of the parameters.
- Published
- 2021
76. Reactive transport modeling in heterogeneous porous media with dynamic mesh optimization
- Abstract
This paper presents a numerical simulator for solving compositional multiphase flow and reactive transport. The simulator was developed by effectively linking IC-FERST (Imperial College Finite Element Reservoir SimulaTor) with PHREEQCRM. IC-FERST is a next-generation three-dimensional reservoir simulator based on the double control volume finite element method and dynamic unstructured mesh optimization, developed by Imperial College London. PHREEQCRM is a state-of-the-art geochemical reaction package developed by the United States Geological Survey. We present a step-by-step framework for how the coupling is performed. The coupled code, called IC-FERST-REACT, is capable of simulating complex hydrogeological, biological, chemical, and mechanical processes, including those occurring during CO2 geological sequestration, CO2 enhanced oil recovery, and geothermal systems, among others. In this paper, we present our preliminary work as well as examples related to CO2 geological sequestration. We performed the model coupling by developing an efficient application programming interface (API). IC-FERST-REACT inherits high-order methods and unstructured meshes with dynamic mesh optimization from IC-FERST. This reduces the computational cost by placing mesh resolution where and when necessary, and it can better capture flow instabilities if they occur. This can have a strong impact on reactive transport simulations, which usually suffer from high computational cost. From PHREEQCRM the code inherits the ability to efficiently model geochemical reactions. Benchmark examples are used to show the capability of IC-FERST-REACT in solving multiphase flow and reactive transport.
- Published
- 2021
77. Shaped beams: unlocking new geometry for efficient structures
- Abstract
Society is building at an unprecedented rate: to house the more than 200,000 people moving to cities each day, building stock will need to double by 2060. Importantly, the embodied carbon of construction due to material extraction, manufacturing, transportation, and demolition accounts for 11% of global carbon emissions, and this number is only expected to rise. With no end to construction in sight, it is essential that we develop better building practices. The research presented in this paper begins with the critical issue of embodied energy in horizontal structural systems. In high-rise buildings, between 60 and 80% of the mass and embodied energy of the structure can be found in the floors, suggesting a compelling starting point for materially efficient design. A reduction in a floor system’s mass can lead to a similar reduction in the mass of vertical (columns, walls) and lateral systems. This paper focuses on the design of horizontal spanning elements, such as floor beams and slabs, and has two parts. The first part evaluates and compares historic methods of shaped beam design and classical methods of structural optimization. The second part presents a new, flexible method of beam shape optimization. These design methods for structural efficiency allow us to build far more with far less, reducing the environmental and economic costs of construction while meeting the demands of a growing population.
- Published
- 2021
78. Predicting the future success of scientific publications through social network and semantic analysis
- Abstract
Citations acknowledge the impact a scientific publication has on subsequent work. At the same time, deciding how and when to cite a paper is also heavily influenced by social factors. In this work, we conduct an empirical analysis based on a dataset of 2010–2012 global publications in chemical engineering. We use social network analysis and text mining to measure publication attributes and understand which variables can best help predict their future success. Controlling for the intrinsic quality of a publication and for the number of authors in the byline, we are able to predict the scholarly impact of a paper, in terms of citations received 6 years after publication, with almost 80% accuracy. Results suggest that, all other things being equal, it is better to co-publish with rotating co-authors and to write a paper’s abstract using more positive words and a more complex, thus more informative, language. Publications that result from the collaboration of different social groups also attract more citations.
- Published
- 2021
79. Big Data Analytics and Artificial Intelligence Against COVID-19: Innovation Vision and Approach.
- Author
-
Hassanien, Aboul Ella, Dey, Nilanjan, and Elghamrawy, Sally
- Subjects
Engineering-Data processing ,Artificial intelligence ,Computational intelligence ,Biomedical engineering - Abstract
Summary: This book includes research articles and expository papers on applications of artificial intelligence and big data analytics to battle the pandemic, focusing on how these technologies help fight COVID-19. The book is divided into four parts. The first part discusses the forecasting and visualization of COVID-19 data. The second part describes applications of artificial intelligence to COVID-19 diagnosis from chest X-ray imaging. The third part discusses insights from artificial intelligence to stop the spread of COVID-19, while the last part presents deep learning and big data analytics approaches that help fight COVID-19.
- Published
- 2020
80. Families of the Granules for Association Rules and Their Properties
- Abstract
type: Conference Paper. We employed the granule (or equivalence class) defined by a descriptor in tables and investigated rough set-based rule generation. In this paper, we consider new granules defined by an implication and propose a family of granules defined by an implication in a table with exact data. Each family consists of four granules, and we show that three criterion values, support, accuracy, and coverage, can easily be obtained by using the four granules. Then, we extend this framework to tables with non-deterministic data. In this case, each family consists of nine granules, and the minimum and maximum values of the three criteria are also obtained by using the nine granules. We prove that there is a table in which support and accuracy attain their minimum values, but that in general there is no table in which support, accuracy, and coverage all attain their minimum. Finally, we consider the application of these properties to Apriori-based rule generation from uncertain data; these properties will make Apriori-based rule generation more effective., 10th International Conference, RSKT 2015, Held as Part of the International Joint Conference on Rough Sets, IJCRS 2015, November 20-23, 2015, Tianjin, China, source:https://doi.org/10.1007/978-3-319-25754-9_16
- Published
- 2017
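The support/accuracy/coverage computation from four granules described in the abstract above can be sketched as follows. This is a minimal illustration assuming the standard rough-set reading of a rule p → q over a table with exact data; the granule naming and the toy table are illustrative assumptions, not the paper's exact notation.

```python
# Sketch: computing support, accuracy, and coverage of a rule p -> q
# from the four granules that partition a table's objects by whether
# they satisfy the antecedent p and/or the consequent q.

def rule_criteria(table, p, q):
    """table: list of objects; p, q: predicates (descriptors)."""
    g_pq   = [x for x in table if p(x) and q(x)]         # satisfies both
    g_pnq  = [x for x in table if p(x) and not q(x)]     # p only
    g_npq  = [x for x in table if not p(x) and q(x)]     # q only
    g_npnq = [x for x in table if not p(x) and not q(x)] # neither
    n = len(table)
    support  = len(g_pq) / n                            # |[p and q]| / |U|
    accuracy = len(g_pq) / (len(g_pq) + len(g_pnq))     # |[p and q]| / |[p]|
    coverage = len(g_pq) / (len(g_pq) + len(g_npq))     # |[p and q]| / |[q]|
    return support, accuracy, coverage

# Toy table: objects with attributes 'color' and 'size'
rows = [{"color": "red", "size": "big"},
        {"color": "red", "size": "big"},
        {"color": "red", "size": "small"},
        {"color": "blue", "size": "big"}]
s, a, c = rule_criteria(rows,
                        p=lambda x: x["color"] == "red",
                        q=lambda x: x["size"] == "big")
```

All three criteria fall out of the granule sizes alone, which is the point the abstract makes: once the four granules are known, no further pass over the table is needed.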
81. Development of new composites made of waste materials for wood pallet element
- Abstract
The recycling of waste products and their further use in new products is of the utmost importance nowadays. The quantities of waste originating from industries that handle plastics, paper, wood, textiles and metal foils, such as the automotive, paper, wood and food industries, are extremely large, strongly indicating the need for efficient waste management. At the same time, companies are always looking for ways to lower material costs. Combinations of different waste materials can be used for the production of new composite materials. This paper presents a brief overview of existing possibilities in the development of new composites made completely of waste materials, as well as further research directions. A preliminary study of material combinations that can provide a composite aimed at load-bearing applications is given, for the purpose of replacing elements such as wood blocks in transport pallets. Several combinations of waste materials from different industries were studied in a composite structure: paper, cardboard boxes, Tetra Pak containers, expanded polystyrene (styrofoam), polyurethane (PU) foam, artificial leather, textile, wood chips and dust. Preliminary compressive tests were performed. The results indicated unsuitable combinations, but also some that provided a stable, compact composite which endured high compressive loads. An important result is that such a composite can be made without adding any adhesives. Waste materials from different industries can be efficiently used for new composites, and further study of this is clearly needed.
- Published
- 2017
83. Thin shell foundations: Quantification of embodied carbon reduction through materially efficient geometry
- Abstract
Building foundation systems are a significant but understudied contributor to embodied carbon emissions of the built environment, and typically use excess material in prismatic, bending-dominated typologies. This paper identifies and characterizes a promising pathway for reducing the embodied carbon associated with reinforced concrete shallow foundations through an alternative typology, thin shell foundations. The main focus is a quantification and comparison of the environmental impact of typical spread footings and materially efficient shell foundations. Validated analytical engineering equations are applied in a parametric design workflow for the same design load and soil bearing capacity. By iterating through this workflow systematically, insights are gained regarding the applicability of shell foundations to various building typologies and site conditions. Results show that for small column loads and weak soils, shells reduce embodied carbon by about half compared to spread footings. For high applied loads, shells significantly outperform their prismatic counterparts, reducing the environmental impact by almost two-thirds. Foundations are then considered within the context of a whole building structural frame to determine the potential downstream savings when multiple systems are optimized to reduce material use and mass. When floor slabs are shape-optimized in addition to using shell foundations, a building structural system can be constructed for nearly one-quarter of the embodied carbon of a typical system. To take advantage of these potential savings, a method for fabricating thin shell foundations, where earth is compacted and milled to create the formwork, is presented following a review of digital fabrication methods.
- Published
- 2023
84. The EU Commission’s Proposal for Improving the Electricity Market Design: Treading Water, but not Drowning
- Abstract
Purpose of Review On March 14, 2023, the European Commission (EC) published the much awaited “Proposal for a regulation (…) to improve the Union’s electricity market design.” The proposed regulation reflects the verdict of the EC after several months of fervent debate triggered by the energy crisis that has affected the European region. In this paper, we discuss several crucial elements that are part of the proposed regulation. Recent Findings In a nutshell, we deem the EC has done a great job managing a highly complicated situation. The proposal preserves the crucial role of short-term electricity markets and puts the focus on the key flaw: the perennial incompleteness of long-term power markets. The EC has put forward a large battery of measures, covering different dimensions and with very different potential impacts on the market design. Summary Here we focus on what we consider to be the four key elements of the proposal: (i) the promotion of long-term contracting, (ii) interventions during electricity price crises, (iii) the strategy for an efficient supplier risk management, and (iv) flexibility support schemes and capacity remuneration mechanisms.
- Published
- 2023
85. Elliptic stable envelopes and hypertoric loop spaces
- Abstract
This paper describes a relation between the elliptic stable envelopes of a hypertoric variety $$X$$ and a distinguished K-theory class on the product of the loop hypertoric space $$\widetilde{\mathscr {L}}X$$ and its symplectic dual $$\mathscr {P}X^!$$. This class intertwines the K-theoretic stable envelopes in a certain limit. Our results are suggestive of a possible categorification of elliptic stable envelopes.
- Published
- 2023
86. Multi-objective genetic programming with partial sampling and its extension to many-objective
- Abstract
This paper describes a technique for optimizing tree-structured data by a multi-objective evolutionary algorithm, i.e., multi-objective genetic programming (GP). Bloat of the tree structure is one of the major problems in GP: the tree structures obtained by the crossover operator grow bigger and bigger while their evaluation does not improve. To avoid the risk of bloat, a partial sampling operator is proposed as a mating operator. The size of the tree and a structural distance (SD) are introduced as objective functions, in addition to the index of the goodness of the tree structure, so GP is defined as a three-objective optimization problem. SD is also applied for the ranking of parent individuals instead of the crowding distance of the conventional NSGA-II. When there are two or more indices of the goodness of tree-structured data, the number of objective functions becomes four or more. We also propose an effective many-objective EA applicable to such many-objective GP. We focus on NSGA-II based on Pareto partial dominance (NSGA-II-PPD). NSGA-II-PPD requires beforehand a combination list of the numbers of objective functions to be used for Pareto partial dominance (PPD), and the contents of this list greatly influence the optimization result. We propose to schedule a parameter r, meaning the subset size of objective functions for PPD, and to eliminate individuals created by mating that have the same contents as an individual in the archive set., source:Ohki, M. Multi-objective genetic programming with partial sampling and its extension to many-objective. SN Appl. Sci. (2019) 1: 207. https://doi.org/10.1007/s42452-019-0208-y. This is a post-peer-review, pre-copyedit version of an article published in SN, source:https://link.springer.com/article/10.1007/s42452-019-0208-y
- Published
- 2023
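Pareto partial dominance, as used by the NSGA-II-PPD algorithm in the abstract above, compares individuals on a subset of the objectives rather than all of them. A minimal sketch of such a dominance check (minimization is assumed, and the function name and toy objective vectors are illustrative, not the paper's code):

```python
def partially_dominates(a, b, subset):
    """True if objective vector a Pareto-dominates b when only the
    objective indices in `subset` are considered (minimization)."""
    at_least_one_better = False
    for i in subset:
        if a[i] > b[i]:          # worse in one considered objective
            return False
        if a[i] < b[i]:
            at_least_one_better = True
    return at_least_one_better

# Four objectives, but dominance is checked on subsets of size r = 2,
# echoing the scheduling of the subset size r described above.
f1 = (1.0, 2.0, 9.0, 9.0)
f2 = (2.0, 3.0, 0.0, 0.0)
print(partially_dominates(f1, f2, subset=(0, 1)))  # f1 wins on {0, 1}
print(partially_dominates(f1, f2, subset=(2, 3)))  # but not on {2, 3}
```

The example shows why the choice of subsets matters so much: the same pair of individuals can be ranked oppositely depending on which r objectives are sampled, which is the sensitivity the paper addresses by scheduling r.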
87. Adaptation of AI-Accelerated CFD Simulations to the IPU Platform
- Abstract
Intelligence Processing Units (IPU) have proven useful for many AI applications. In this paper, we evaluate them within the emerging field of AI for simulation, where traditional numerical simulations are supported by artificial intelligence approaches. We focus specifically on a program for training machine learning models supporting a computational fluid dynamics application. We use custom TensorFlow provided by the Poplar SDK to adapt the program for the IPU-POD16 platform and investigate its ease of use and performance scalability. Training a model on data from OpenFOAM simulations allows us to get accurate simulation state predictions in test time. We show how to utilize the popdist library to overcome a performance bottleneck in feeding training data to the IPU on the host side, achieving up to 34% speedup. Due to communication overheads, using data parallelism to utilize two IPUs instead of one does not improve the throughput. However, once the intra-IPU costs have been paid, the hardware capabilities for inter-IPU communication allow for good scalability. Increasing the number of IPUs from 2 to 16 improves the throughput from 560.8 to 2805.8 samples/s., The authors would like to thank Grzegorz Andrejczuk for his ideas and help with investigating data loading overheads. Big thanks to Charis Fisher for her support and valuable comments. Researcher Sergio Iserte was supported by the postdoctoral fellowship APOSTD/2020/026 from Valencian Region Government (GVA) and European Social Funds (ESF). CFD Simulations were executed on Tirant III cluster of the Servei d’Informàtica of the University of Valencia (UV)., Peer Reviewed, Postprint (author's final draft)
- Published
- 2023
88. Semiclassical Measures for Higher-Dimensional Quantum Cat Maps
- Abstract
Consider a quantum cat map M associated with a matrix $$A\in {{\,\textrm{Sp}\,}}(2n,{\mathbb {Z}})$$, which is a common toy model in quantum chaos. We show that the mass of eigenfunctions of M on any nonempty open set in the position–frequency space satisfies a lower bound which is uniform in the semiclassical limit, under two assumptions: (1) there is a unique simple eigenvalue of A of largest absolute value and (2) the characteristic polynomial of A is irreducible over the rationals. This is similar to previous work (Dyatlov and Jin in Acta Math 220(2):297–339, 2018; Dyatlov et al. in J Am Math Soc 35(2):361–465, 2022) on negatively curved surfaces and (Schwartz in The full delocalization of eigenstates for the quantized cat map, 2021) on quantum cat maps with $$n=1$$, but this paper gives the first results of this type which apply in any dimension. When condition (2) fails we provide a weaker version of the result and discuss relations to existing counterexamples. We also obtain corresponding statements regarding semiclassical measures and damped quantum cat maps.
- Published
- 2023
89. Properties, applications, and prospects of carbon nanotubes in the construction industry
- Abstract
Nanotechnology and nanomaterials have offered sustainable design options for the built environment and enabled architects to design more flexible architectural forms. Carbon nanotubes have excellent mechanical, electrical, thermal, and chemical properties and are useful in a wide range of engineering applications. However, the role of carbon nanotube composites as a functional construction material has large potential and awaits further investigation and exploration. This paper gives an overview of the synthesis and fabrication methods of carbon nanotubes, carbon nanotube properties, different forms of carbon nanotube composites, and application of carbon nanotubes in the construction industry. To explore the prospects for construction use, the aesthetic, structural, and functional characteristics of several futuristic building projects are discussed. This overview proposes a promising material approach for the application of carbon nanotubes in construction and explains the related opportunities and challenges.
- Published
- 2023
90. Simulated Evolution and Learning : 11th International Conference, SEAL 2017, Shenzhen, China, November 10-13, 2017, Proceedings.
- Author
-
Jin, Yaochu, Li, Xiaodong, Middendorf, Martin, Shi, Yuhui, Tan, Kay Chen, Tan, Ying, Tang, Ke, Zhang, Mengjie, and Zhang, Qingfu
- Subjects
Algorithms ,Artificial intelligence ,Computer communication systems ,Computer simulation ,Computers ,Computation by Abstract Devices ,Algorithm Analysis and Problem Complexity ,Artificial Intelligence ,Computer Communication Networks ,Models and Principles ,Simulation and Modeling - Abstract
Summary: This book constitutes the refereed proceedings of the 11th International Conference on Simulated Evolution and Learning, SEAL 2017, held in Shenzhen, China, in November 2017. The 85 papers presented in this volume were carefully reviewed and selected from 145 submissions. They were organized in topical sections named: evolutionary optimisation; evolutionary multiobjective optimisation; evolutionary machine learning; theoretical developments; feature selection and dimensionality reduction; dynamic and uncertain environments; real-world applications; adaptive systems; and swarm intelligence.
- Published
- 2017
91. Bayesian Inference Federated Learning for Heart Rate Prediction
- Abstract
The advances of sensing and computing technologies pave the way for novel applications and services for wearable devices. For example, wearable devices measure heart rate, which accurately reflects the intensity of physical exercise, so heart rate prediction from wearable devices benefits users by optimizing the training process. Conventionally, the cloud collects user data from wearable devices and conducts inference; however, this paradigm introduces significant privacy concerns. Federated learning is an emerging paradigm that enhances user privacy by keeping the majority of personal data on users’ devices. In this paper, we propose a statistically sound Bayesian inference federated learning for heart rate prediction using an autoregressive model with exogenous variables (ARX). The proposed privacy-preserving method achieves accurate and robust heart rate prediction. To validate our method, we conduct extensive experiments with real-world outdoor running exercise data collected from wearable devices.
- Published
- 2020
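The ARX formulation mentioned in the abstract above predicts heart rate from its own past values plus an exogenous input. A minimal first-order least-squares sketch follows; the model order, the choice of exogenous input, and the synthetic data standing in for wearable measurements are all illustrative assumptions, not the paper's Bayesian federated method.

```python
# Sketch: first-order ARX model  y[t] = a*y[t-1] + b*u[t-1] + c,
# fitted by ordinary least squares via the normal equations.
# Here y is heart rate and u an exogenous input such as running speed.

def fit_arx1(y, u):
    # Build regressor rows [y[t-1], u[t-1], 1] and targets y[t]
    X = [[y[t-1], u[t-1], 1.0] for t in range(1, len(y))]
    z = [y[t] for t in range(1, len(y))]
    n = 3
    # Normal equations (X^T X) w = X^T z
    A = [[sum(X[k][i]*X[k][j] for k in range(len(X))) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i]*z[k] for k in range(len(X))) for i in range(n)]
    # Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back substitution
    w = [0.0] * n
    for r in reversed(range(n)):
        w[r] = (b[r] - sum(A[r][c]*w[c] for c in range(r + 1, n))) / A[r][r]
    return w  # [a, b, c]

# Synthetic noise-free data generated from a known ARX process,
# so the fit should recover the true coefficients.
true_a, true_b, true_c = 0.8, 0.5, 12.0
u = [float(i % 5) for i in range(50)]
y = [70.0]
for t in range(1, 50):
    y.append(true_a*y[t-1] + true_b*u[t-1] + true_c)
a_hat, b_hat, c_hat = fit_arx1(y, u)
```

In the federated setting the abstract describes, each device would fit (or update a posterior over) coefficients like these locally, so that raw heart rate traces never leave the device.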
92. Minimal kernels and compact analytic objects in complex surfaces
- Abstract
In this paper, we want to study the link between the presence of compact objects with some analytic structure and the global geometry of a weakly complete surface. We begin with a brief survey of some now classic results on the local geometry around a (complex) curve, which depends on the sign of its self-intersection and, in the flat case, on some more refined invariants (see the works of Grauert, Suzuki, Ueda). Then, we recall some results about the propagation of compact curves and the existence of holomorphic functions (from the works of Nishino and Ohsawa). With such considerations in mind, we give an overview of the classification results for weakly complete surfaces that we obtained in two joint papers with Slodkowski (see Mongodi et al. (Indiana Univ. Math. J., 67(2), 899-935 (2018); Int. J. Math., 28(8), 1750063, 16 (2017))) and we present some new results which stem from this somehow more local (or less global) viewpoint (see Sections 4.2, 4.3, and 5).
- Published
- 2020
93. Evaluation of SLA Negotiation for Personalized SDN Service Delivery
- Abstract
Ensuring quality of service (QoS) is crucial in a service-oriented business model. A service level agreement (SLA) is an important agreement between a consumer and a provider and a key element in ensuring QoS. Service negotiation occurs in the initial stage of the SLA, where service requirements are agreed upon to avoid conflict situations. Guaranteeing QoS is one of the key challenges in software-defined networking (SDN). Several intelligent solutions have been proposed; however, most of them are application-focused and unable to provide personalized and reliable QoS delivery in SDN. This paper presents a reputation data-driven SLA negotiation framework that provides personalized and reliable service delivery in SDN and assists QoS management in informed decision making. In addition, a fuzzy inference system (FIS) is used to implement the framework, and the results are discussed in this paper.
- Published
- 2020
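A fuzzy inference system like the one used to implement the framework above can be sketched minimally: triangular membership functions, a few rules fired by the minimum of their antecedent memberships, and a Sugeno-style weighted-average defuzzification. The input variables, memberships, and rule base here are illustrative assumptions, not the paper's actual FIS.

```python
# Sketch: tiny Sugeno-style fuzzy inference for a provider reputation
# score. Inputs: provider reliability and past QoS violation rate (0..1).

def tri(x, a, b, c):
    """Triangular membership with peak at b and feet at a and c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def reputation(reliability, violation_rate):
    # Fuzzify both inputs into low/high memberships
    rel_low  = tri(reliability, -0.5, 0.0, 1.0)
    rel_high = tri(reliability, 0.0, 1.0, 1.5)
    vio_low  = tri(violation_rate, -0.5, 0.0, 1.0)
    vio_high = tri(violation_rate, 0.0, 1.0, 1.5)
    # Rules: firing strength = min of antecedent memberships;
    # each rule proposes a crisp reputation level (Sugeno consequent).
    rules = [
        (min(rel_high, vio_low), 1.0),   # reliable, few violations -> high
        (min(rel_low,  vio_high), 0.0),  # unreliable, many violations -> low
        (min(rel_high, vio_high), 0.5),  # mixed evidence -> medium
        (min(rel_low,  vio_low),  0.5),
    ]
    num = sum(w * out for w, out in rules)
    den = sum(w for w, _ in rules)
    return num / den if den else 0.5

print(reputation(0.9, 0.1))  # relatively high: trustworthy provider
print(reputation(0.2, 0.8))  # relatively low: poor provider
```

A score like this could feed the negotiation step, letting the framework weigh a provider's offer against its reputation rather than treating all offers equally.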
94. Detecting Alzheimer’s Disease by Exploiting Linguistic Information from Nepali Transcript
- Abstract
Alzheimer’s disease (AD) is the most common form of neurodegenerative disorder, accounting for 60–80% of all dementia cases. The lack of effective clinical treatment options to completely cure or even slow the progression of the disease makes it even more serious. Treatment options are available for the milder stage of the disease to provide symptomatic short-term relief and improve quality of life. Early diagnosis is key in the treatment and management of AD, as advanced stages of the disease cause severe cognitive decline and permanent brain damage. This has prompted researchers to explore innovative ways to detect AD early on. Changes in speech are one of the main signs in AD patients: as the brain deteriorates, the language-processing ability of the patient deteriorates too. Previous research on early detection of AD has been done in the English language using Natural Language Processing (NLP) techniques. However, research using local and low-resource languages like Nepali still lags behind. NLP is an important tool in artificial intelligence for deciphering human language and performing various tasks. In this paper, various classifiers are discussed for the early detection of Alzheimer’s in the Nepali language. The proposed study makes a convincing conclusion that the difficulty in processing information in AD patients is reflected in their speech while describing a picture. The study uses the speech decline of AD patients to classify them as control subjects or AD patients using various classifiers and NLP techniques. Furthermore, this experiment introduces a new dataset consisting of transcripts of AD patients and control normal (CN) subjects in the Nepali language. In addition, this paper sets a baseline for the early detection of AD using NLP in the Nepali language.
- Published
- 2020
95. BlockMeds: A Blockchain-Based Online Prescription System with Privacy Protection
- Abstract
Since the authentication of digital prescriptions by pharmacy employees is a lengthy and error-prone process, in many countries around the world the paper-based prescription is still the only valid document for patients to purchase their prescribed medication from a pharmacy. Moreover, as a prescription can contain a lot of private information about patients and their illness, the security and privacy issues in using digital prescriptions also raise serious concerns. Recently, Blockchain has been widely regarded as a promising technology to secure online business data and transactions. In this paper, we present BlockMeds, a Blockchain-based online prescription system which enables the authentication of digital prescriptions. Meanwhile, to address the privacy issue during the authentication and transaction for buying the medication, a privacy protection strategy is also implemented in the system. BlockMeds provides a proof of concept for a Blockchain-based online prescription system. It also demonstrates the need for privacy protection, which is often overlooked in Blockchain-based systems. BlockMeds can be used as a prototype system by both researchers and industrial practitioners who are interested in Blockchain-based medical service systems.
- Published
- 2020
96. PredatorHP Revamped (Not Only) for Interval-Sized Memory Regions and Memory Reallocation (Competition Contribution)
- Abstract
This paper concentrates on improvements to the PredatorHP shape analyzer in the past two years, including, e.g., improved handling of interval-sized memory regions and new support for memory reallocation. The paper characterizes PredatorHP's participation in SV-COMP 2020, pointing out its strengths and weaknesses and the way they were influenced by the latest changes in the tool.
- Published
- 2020
97. Surveying Empires: Archaeologies of Colonial Cartography and the Great Trigonometrical Survey of India
- Abstract
This paper demonstrates the important contribution material culture makes to histories of cartography. Focusing on the tangible and intangible heritage of the Great Trigonometrical Survey (GTS) of India, and West Bengal in particular, the material landscape legacies of the GTS are analysed and interpreted. This reveals new insights into how surveys of the GTS were undertaken in the nineteenth century, under George Everest, and the infrastructure that was created to underpin the British mapping of India. The trigonometrical stations built by the GTS were of different designs and construction, adapted in response to local conditions and circumstances. Today, this ‘survey heritage’ is at risk, yet provides a basis for understanding more deeply the materiality of mapping and survey practices used in mapping empires. The paper connects the three-dimensional ‘spaces of survey’ with the two-dimensional ‘space of the map’, and concludes by arguing for greater consideration of the verticality of mapping.
- Published
- 2020
98. Management for stakeholder approach for a socially sustainable governance of megaprojects
- Abstract
The purpose of this paper is to provide a brief overview, from a multidisciplinary literature (organizational, political, sociological), of how to manage megaprojects in contemporary societies, focusing on the role of stakeholder engagement. Starting from the main points of weakness stated in the literature on megaproject failure, the paper analyzes these points from a sociological and political perspective in order to understand how to overcome these limits and increase the probability that megaprojects succeed. Among these points, the multi-stakeholder nature of megaprojects, the kinds of actors involved and the difficulty of engaging all of them to ensure inclusive participation, along with a socially responsible model of governance, are emphasized. This conceptual paper refers to both social network and stakeholder theories to integrate the current theoretical body of literature in the field of project management. In particular, the approach of management-for-stakeholders is presented to complement the more traditional management-of-stakeholders. It emphasizes megaproject social responsibility and promotes a more comprehensive understanding of megaproject governance in a context of sustainable development.
- Published
- 2020
99. Beyond Artificial Intelligence : The Disappearing Human-Machine Divide.
- Author
-
Kelemen, Jozef, Romportl, Jan, and Zackova, Eva
- Subjects
Artificial intelligence ,Computational intelligence - Abstract
Summary: This book is an edited collection of chapters based on the papers presented at the conference "Beyond AI: Artificial Dreams" held in Pilsen in November 2012. The aim of the conference was to question deep-rooted ideas of artificial intelligence and cast critical reflection on the methods standing at its foundations. Artificial Dreams epitomize our controversial quest for non-biological intelligence, and therefore the contributors of this book tried to fully exploit this controversy in their respective chapters, which resulted in an interdisciplinary dialogue between experts from engineering, natural sciences and humanities. While pursuing the Artificial Dreams, it has become clear that it is increasingly difficult to draw a clear divide between human and machine. This book therefore tries to portray what lies beyond artificial intelligence: the disappearing human-machine divide, a very important phenomenon of today's technological society, which is often uncritically praised or hypocritically condemned. This phenomenon thus found its place in the subtitle of the whole volume as well as in the title of the chapter by Kevin Warwick, one of the keynote speakers at "Beyond AI: Artificial Dreams".
- Published
- 2015
100. How the Swedish Rheumatism Association uses the design for all tests to approve easy to handle packages and products
- Abstract
The Swedish Rheumatism Association has for many years fought for the accessibility of products and services for its members. One tool in that struggle is a method to certify products and packaging as “easy to use”. This paper describes the development of the latest version of that test. It relies on people's experiences and puts value on user satisfaction. The result is a powerful tool in inclusive design. Calibrated product testing by test groups of persons with reduced hand function is now used as a product development tool and, if the product is approved, as a marketing tool., Conference Paper
- Published
- 2016