903 results
Search Results
2. Editorial: Special Issue for Selected Papers of VLDB 2021
- Author
-
Felix Naumann and Xin Luna Dong
- Subjects
Hardware and Architecture, Information Systems - Published
- 2023
3. Four Limit Cycles for a Rock-Scissor-Paper Game Between Bacteriocin Producing Bacteria
- Author
-
Sibi Ma, Mingzhi Hu, and Zhengyi Lu
- Subjects
Computer Science (miscellaneous), Information Systems - Published
- 2023
4. Call for Papers, Issue 5/2024
- Author
-
Thomas Grisold, Christian Janiesch, Maximilian Röglinger, and Moe Thandar Wynn
- Subjects
Information Systems - Published
- 2022
5. Abstract papers from the Energy Informatics.Academy Conference 2022 (EI.A 2022)
- Author
-
Filipe Caldeira, Hamid Reza Shaker, Filipe Gonçalves Cardoso, Christian Veje, and Lasse Kappel Mortensen
- Subjects
Computer Networks and Communications, Energy Engineering and Power Technology, Information Systems - Published
- 2022
6. Correction to: Call for Papers, Issue 3/2024
- Author
-
Ali Sunyaev, Daniel Fürstenau, and Elizabeth Davidson
- Subjects
Information Systems - Published
- 2022
7. A Commentary on Process Improvements to Reduce Manual Tasks and Paper at Covid-19 Mass Vaccination Points of Dispensing in California
- Author
-
Eric G. Yan and Noam H. Arzt
- Subjects
Health Information Management, Vaccination, COVID-19, Humans, Medicine (miscellaneous), Health Informatics, Child, Mass Vaccination, California, Information Systems - Abstract
My Turn is software used to manage several Covid-19 mass vaccination campaigns in California. The objective of this article is to describe the use of My Turn at two points of dispensing in California and to comment on process improvements to reduce paper and the manual tasks of six identified vaccination processes: registration, scheduling, administration, documentation, follow-up, and digital vaccine records. We reviewed publicly available documents about My Turn and records of patients vaccinated at the George R. Moscone Convention Center in San Francisco and the Oakland Coliseum Community Vaccination Clinic. For the documents, we examined videos of My Turn on YouTube and documentation from EZIZ, the website of the California Vaccines for Children Program. For patients, we examined publicly available vaccination record cards on Instagram and Google. At the George R. Moscone Convention Center, 329,608 vaccine doses were given. At the Oakland Coliseum Community Vaccination Clinic, more than 500,000 vaccine doses were administered. My Turn can be used to reduce manual tasks and paper when mass vaccinating patients against Covid-19.
- Published
- 2022
8. Call for Papers, Issue 1/2024
- Author
-
Jerry Chun-Wei Lin, Gautam Srivastava, Yu-Dong Zhang, and Christoph M. Flath
- Subjects
Information Systems - Published
- 2022
9. Special issue on the best papers of DaMoN 2020
- Author
-
Danica Porobic
- Subjects
Hardware and Architecture, Information Systems - Published
- 2022
10. A holistic overview of deep learning approach in medical imaging
- Author
-
Rammah Yousef, Gaurav Gupta, Nabhan Yousef, and Manju Khari
- Subjects
Computer Networks and Communications, Hardware and Architecture, Deep learning (DL), Regular Paper, Medical data augmentation, Media Technology, Medical imaging, Software, Transfer learning, Information Systems - Abstract
Medical images are a rich source of invaluable information used by clinicians. Recent technologies have introduced many advancements for exploiting this information and using it to generate better analyses. Deep learning (DL) techniques have been applied to medical image analysis in computer-assisted imaging contexts, offering many solutions and improvements to the way radiologists and other specialists analyze these images. In this paper, we present a survey of DL techniques used for a variety of tasks across different medical imaging modalities, providing a critical review of recent developments in this direction. We have organized the paper to present the significant traits of deep learning and explain its concepts, which is in turn helpful for non-experts in the medical community. We then present several applications of deep learning (e.g., segmentation, classification, detection) commonly used for clinical purposes at different anatomical sites, and we cover the main key terms for DL attributes such as basic architectures, data augmentation, transfer learning, and feature selection methods. Medical images as inputs to deep learning architectures will be mainstream in the coming years, and novel DL techniques are predicted to become the core of medical image analysis. We conclude by addressing some research challenges and the solutions suggested for them in the literature, as well as promising directions for further development.
- Published
- 2022
11. Multicriteria decision-making based on distance measures and knowledge measures of Fermatean fuzzy sets
- Author
-
Ganie, Abdul Haseeb
- Subjects
Original Paper, Multicriteria decision-making, Fermatean fuzzy set, Artificial Intelligence, Pattern recognition, ComputingMethodologies_GENERAL, t-conorm, Knowledge measure, Pythagorean fuzzy set, Computer Science Applications, Information Systems - Abstract
Fermatean fuzzy sets are more powerful than fuzzy sets, intuitionistic fuzzy sets, and Pythagorean fuzzy sets in handling various problems involving uncertainty. Distance measures in fuzzy and non-standard fuzzy frameworks have found applicability in various areas such as pattern analysis, clustering, and medical diagnosis. Fuzzy and non-standard fuzzy knowledge measures have also played a vital role in computing criteria weights in multicriteria decision-making problems. As there is no study concerning distance and knowledge measures of Fermatean fuzzy sets, in this paper we propose some novel distance measures for Fermatean fuzzy sets using t-conorms. We also discuss their various desirable properties. With the help of the suggested distance measures, we introduce some knowledge measures for Fermatean fuzzy sets. Through numerical comparison and linguistic hedges, we establish the effectiveness of the suggested distance measures and knowledge measures, respectively, over existing measures in the Pythagorean/Fermatean fuzzy setting. Finally, we demonstrate the application of the suggested measures in pattern analysis and multicriteria decision-making.
- Published
- 2022
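The cube-sum condition that distinguishes Fermatean fuzzy values from intuitionistic and Pythagorean ones, mentioned in the abstract above, can be illustrated with a minimal sketch. The function name is ours, for illustration only; the paper's actual distance and knowledge measures are not reproduced here.

```python
def is_fermatean(mu, nu):
    # A Fermatean fuzzy value requires membership^3 + non-membership^3 <= 1,
    # a weaker constraint than the intuitionistic (mu + nu <= 1) and
    # Pythagorean (mu^2 + nu^2 <= 1) conditions, so it admits more values.
    return mu ** 3 + nu ** 3 <= 1.0

# (0.9, 0.6) is a valid Fermatean value (0.729 + 0.216 = 0.945 <= 1)
# but violates both the intuitionistic (0.9 + 0.6 > 1) and the
# Pythagorean (0.81 + 0.36 > 1) conditions.
print(is_fermatean(0.9, 0.6))          # True
print(0.9 ** 2 + 0.6 ** 2 <= 1.0)      # False: not Pythagorean
```

This larger admissible region is what the abstract means by Fermatean fuzzy sets being "more powerful" at handling uncertainty.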
12. A survey of application research based on blockchain smart contract
- Author
-
Lin, Shi-Yi, Zhang, Lei, Li, Jing, Ji, Li-li, and Sun, Yue
- Subjects
Original Paper, Blockchain, Computer Networks and Communications, Smart contract, Industrial internet of things, DAG-based blockchain, Electrical and Electronic Engineering, Industry 4.0, Blockchain Oracle, Information Systems - Abstract
Blockchain technology and industry have developed rapidly all over the world, which is inseparable from continuous innovation and improvement in smart contract technology. By summarizing the working principles and the state of application research of blockchain smart contracts, this paper analyzes the development and challenges of smart contracts. First, we introduce the model and operating principle of blockchain smart contracts within the overall architecture, analyze the deployment process of smart contracts on Ethereum, Hyperledger Fabric, and EOSIO, and make a comparative analysis at the technical level. Taking the Byteball, InterValue, and IOTA platforms as examples, we introduce the deployment process and application potential of DAG-based blockchain smart contracts. We also summarize international application research on smart contracts and on Blockchain Oracles, and discuss innovative applications and future development trends. Second, we introduce the application status of smart contracts on the Ethereum and Hyperledger Fabric platforms in financial transactions, the Internet of things, medical applications, and supply chains, and further discuss EOS (enterprise operation system), Blockchain Oracles, and other application fields. Furthermore, we introduce the application advantages of and challenges to smart contracts for the industrial Internet in manufacturing, the food industry, the industrial Internet of things, and Industry 4.0. Finally, we discuss the technical challenges faced by smart contracts, analyze the impact of large-scale applications and mining systems on the sustainable development of smart contracts, and look forward to future research directions for blockchain smart contracts.
- Published
- 2022
13. A light-weight convolutional Neural Network Architecture for classification of COVID-19 chest X-Ray images
- Author
-
Masud, Mehedi
- Subjects
Computer Networks and Communications, Hardware and Architecture, Special Issue Paper, Media Technology, Software, Information Systems - Abstract
The COVID-19 pandemic has opened numerous challenges for scientists seeking to use massive data to develop an automatic diagnostic tool for COVID-19. Since the outbreak in January 2020, COVID-19 has had a substantial destructive impact on society and human life. Numerous studies have been conducted in search of a suitable solution to test for COVID-19. Artificial intelligence (AI) based research is not behind in this race, and many AI-based models have been proposed. This paper proposes a lightweight convolutional neural network (CNN) model to classify COVID and Non_COVID patients by analyzing the hidden features in X-Ray images. The model has been evaluated with different standard metrics to prove its reliability. The model obtained 98.78%, 93.22%, and 92.7% accuracy in the training, validation, and testing phases. In addition, the model achieved a score of 0.964 on the Area Under Curve (AUC) metric. We compared the model with four state-of-the-art pre-trained models (VGG16, InceptionV3, DenseNet121, and EfficientNetB6). The evaluation results demonstrate that the proposed CNN model is a candidate for an automatic diagnostic tool for the classification of COVID-19 patients using chest X-ray images. This research proposes a technique to classify COVID-19 patients and does not claim any medical diagnosis accuracy.
- Published
- 2022
14. Pythagorean fuzzy entropy measure-based complex proportional assessment technique for solving multi-criteria healthcare waste treatment problem
- Author
-
Chaurasiya, Rishikesh and Jain, Divya
- Subjects
Original Paper, Health care waste treatment, Artificial Intelligence, PF-COPRAS, Intuitionistic fuzzy set, Pythagorean fuzzy set, Computer Science Applications, Information Systems - Abstract
With increasing risks to human health and environmental issues, the selection of appropriate management and treatment of healthcare waste has become a major problem, especially in developing countries. There are various alternatives for disposing of healthcare waste; the important task is to assess the best alternative among them. The assessment of each alternative should be done based on public health, psychological, economic, environmental, technological, and operational aspects. The selection of the best healthcare waste treatment (HCWT) alternative is a complicated, multi-criteria decision-making (MCDM) problem involving numerous disparate qualitative and quantitative features. Hence, in this research article, an MCDM method is presented for estimating and choosing the best HCWT alternative using the COPRAS technique on Pythagorean fuzzy sets (PFSs). First, a new entropy measure on PFSs is proposed and its validity is studied. Thereafter, the MCDM technique Complex Proportional Assessment (COPRAS) is discussed, in which the criteria weights are assessed by the proposed entropy measure and a score function to enhance the efficacy and efficiency of the proposed technique. Furthermore, the technique is employed to resolve a real-life problem of obtaining the best treatment alternative for the disposal of healthcare waste. Finally, a sensitivity analysis is presented to justify the proposed viewpoint for prioritizing HCWT alternatives.
- Published
- 2022
15. Named entity disambiguation in short texts over knowledge graphs
- Author
-
Bouarroudj, Wissem, Boufaida, Zizette, and Bellatreche, Ladjel
- Subjects
Human-Computer Interaction, Entity linking, Semantic and syntactic features, Artificial Intelligence, Hardware and Architecture, Regular Paper, Named entity disambiguation, Queries, Linked open data, Short texts, Software, Information Systems - Abstract
The ever-growing usage of knowledge graphs (KGs) positions named entity disambiguation (NED) at the heart of designing accurate KG-driven systems such as query answering systems (QASs). According to current research, most studies dealing with NED on KGs involve long texts, which is not the case for short text fragments, identified by their limited contexts. The accuracy of QASs strongly depends on the management of such short texts. This limitation motivates this paper, which studies the NED problem on KGs involving only short texts. We propose a NED approach comprising the following steps: (i) context expansion using WordNet to measure similarity to the resource context; (ii) exploiting coherence between entities in queries that contain more than one entity, such as “Is Michelle Obama the wife of Barack Obama?”; (iii) taking into account the relations between words to calculate their similarity with the properties of a resource; and (iv) the use of syntactic features. The NED approach is compared to state-of-the-art approaches using five datasets. The experimental results show that our approach outperforms these systems by 27% in F-measure. A system called Welink, implementing our proposal, is available on GitHub and is also accessible via a REST API.
- Published
- 2022
16. Privacy-preserving association rule mining based on electronic medical system
- Author
-
Xu, Wenju, Zhao, Qingqing, Zhan, Yu, Wang, Baocang, and Hu, Yupu
- Subjects
Association rule mining, Cooperative computation, Original Paper, Computer Networks and Communications, Privacy-preserving, Electrical and Electronic Engineering, Homomorphic encryption, Information Systems - Abstract
Privacy protection during collaborative distributed association rule mining is an important research topic that has been widely applied in market prediction, medical research, and other fields. In medical research, Domadiya et al. (Sadhana 43(8):127, 2018) focused on mining association rules from horizontally distributed healthcare data to diagnose heart disease, and claimed to propose a more effective privacy-preserving distributed association rule mining (PPDARM) scheme. However, a careful security scrutiny of the scheme shows that it fails to protect the supports of the itemsets from any electronic health record (EHR) system, which is the most important parameter Domadiya et al. tried to protect. In this paper, we first present a cryptanalysis of the PPDARM scheme proposed by Domadiya et al. as well as some revised performance analyses. We then propose a new PPDARM scheme with fewer interactions that averts the shortcomings of Domadiya et al.'s scheme, using the homomorphic properties of the distributed Paillier cryptosystem to accomplish the cooperative computation. Our scheme allows the designated authority (miner) to obtain the final results rather than all cooperative EHR systems, guarding against semi-honest but pseudo EHR systems. Moreover, security analysis and performance evaluation demonstrate that our proposal is efficient and feasible.
- Published
- 2022
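The abstract above relies on the additive homomorphism of the Paillier cryptosystem: multiplying ciphertexts adds the underlying plaintexts. A toy, deliberately insecure sketch (tiny primes and a single key, rather than the distributed variant the paper uses) shows how a miner could aggregate encrypted itemset supports without seeing any individual EHR system's count:

```python
import math
import random

# Toy key generation; real deployments use moduli of 2048 bits or more.
p, q = 293, 433
n = p * q
n2 = n * n
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)            # valid because the generator is g = n + 1

def encrypt(m):
    # c = (1 + n)^m * r^n mod n^2, with r random and coprime to n
    r = random.randrange(2, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    # m = L(c^lam mod n^2) * mu mod n, where L(x) = (x - 1) // n
    x = pow(c, lam, n2)
    return ((x - 1) // n * mu) % n

# Each EHR system encrypts its local support count; the miner multiplies
# the ciphertexts, which sums the plaintexts homomorphically.
local_supports = [17, 5, 23]
agg = 1
for s in local_supports:
    agg = (agg * encrypt(s)) % n2
total = decrypt(agg)
print(total)  # 45
```

The miner learns only the aggregate support (45), never the individual contributions, which is the property the cryptanalysis in the abstract found the original scheme failed to guarantee.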
17. The Effects of Digital Technology on Opportunity Recognition
- Author
-
Kreuzer, Thomas, Lindenthal, Anna-Katharina, Oberländer, Anna Maria, Röglinger, Maximilian, and Publica
- Subjects
Digital innovation, Opportunity recognition, Digital technology, ddc:004, Digital technology effects, Digital entrepreneurship, Research Paper, Information Systems - Abstract
Recognizing opportunities enabled by digital technology (DT) has become a competitive necessity in today’s digital world. However, opportunity recognition is a major challenge given the influence of DT, which not only disperses agency across various actors, but also blurs boundaries between customers, companies, products, and industries. As a result, traditional entrepreneurship knowledge needs to be rethought and the effects of DT on opportunity recognition need to be better understood. Drawing from opportunity recognition theory – as one of the central theories in the entrepreneurship domain – this study builds on a structured literature review to identify and explain three direct as well as three transitive effects of DT on opportunity recognition. These effects have been validated with real-world cases as well as interviews with academics and practitioners. In sum, this study contributes to descriptive and explanatory knowledge on the evolution from traditional to digital entrepreneurship. As a theory for explaining, the findings extend opportunity recognition theory by illuminating how and why DT influences opportunity recognition. This supports research and practice in investigating and managing opportunities more effectively. Supplementary Information The online version contains supplementary material available at 10.1007/s12599-021-00733-9.
- Published
- 2022
18. Fake news detection based on news content and social contexts: a transformer-based approach
- Author
-
Raza, Shaina and Ding, Chen
- Subjects
Transformer, Zero shot learning, Fake news, Computational Theory and Mathematics, Applied Mathematics, Modeling and Simulation, Regular Paper, Social contexts, User credibility, Concept drift, Weak supervision, Computer Science Applications, Information Systems - Abstract
Fake news is a real problem in today’s world, and it has become more extensive and harder to identify. A major challenge in fake news detection is to detect it in the early phase. Another challenge in fake news detection is the unavailability or the shortage of labelled data for training the detection models. We propose a novel fake news detection framework that can address these challenges. Our proposed framework exploits the information from the news articles and the social contexts to detect fake news. The proposed model is based on a Transformer architecture, which has two parts: the encoder part to learn useful representations from the fake news data and the decoder part that predicts the future behaviour based on past observations. We also incorporate many features from the news content and social contexts into our model to help us classify the news better. In addition, we propose an effective labelling technique to address the label shortage problem. Experimental results on real-world data show that our model can detect fake news with higher accuracy within a few minutes after it propagates (early detection) than the baselines.
- Published
- 2022
19. BDCNet: multi-classification convolutional neural network model for classification of COVID-19, pneumonia, and lung cancer from chest radiographs
- Author
-
Malik, Hassaan, Anees, Tayyaba, and Mui-zzud-din
- Subjects
Coronavirus, Chest radiographs, Computer Networks and Communications, Hardware and Architecture, Regular Paper, Media Technology, COVID-19, Deep learning, Software, Information Systems - Abstract
Globally, coronavirus disease (COVID-19) has badly affected the medical system and the economy. The deadly COVID-19 sometimes has the same symptoms as other chest diseases such as pneumonia and lung cancer and can mislead doctors in diagnosing coronavirus. Frontline doctors and researchers are working assiduously to find a rapid and automatic process for the detection of COVID-19 at an early stage, to save human lives. However, the clinical diagnosis of COVID-19 is highly subjective and variable. The objective of this study is to implement a multi-classification algorithm based on a deep learning (DL) model for identifying COVID-19, pneumonia, and lung cancer from chest radiographs. In the present study, we propose a model named BDCNet that combines Vgg-19 and convolutional neural networks (CNNs), and apply it to several publicly available benchmark databases to diagnose COVID-19 and other chest tract diseases. To the best of our knowledge, this is the first study to diagnose these three chest diseases in a single deep learning model. We also computed and compared the classification accuracy of our proposed model with four well-known pre-trained models: ResNet-50, Vgg-16, Vgg-19, and Inception-v3. Our proposed model achieved an AUC of 0.9833 (with an accuracy of 99.10%, a recall of 98.31%, a precision of 99.9%, and an f1-score of 99.09%) in classifying the different chest diseases. Moreover, the CNN-based pre-trained models VGG-16, VGG-19, ResNet-50, and Inception-v3 achieved multi-disease classification accuracies of 97.35%, 97.14%, 97.15%, and 95.10%, respectively. The results reveal that our proposed model delivers remarkable performance compared to competing approaches, thus providing significant assistance to diagnostic radiographers and health experts.
- Published
- 2022
20. Usability of a telehealth solution based on TV interaction for the elderly: the VITASENIOR-MT case study
- Author
-
Gabriel Pires, Ana Lopes, Pedro Correia, Luis Almeida, Luis Oliveira, Renato Panda, Dario Jorge, Diogo Mendes, Pedro Dias, Nelson Gomes, and Telmo Pereira
- Subjects
Remote patient monitoring, User-centred design, Computer Networks and Communications, TV interaction, Usability, Heart rate, Weight, Human-Computer Interaction, Telehealth, Elderly, ComputerApplications_MISCELLANEOUS, Blood pressure, Long Paper, Oximetry, Glycaemia, Biometric and environmental data, Software, Information Systems - Abstract
Remote monitoring of biometric data in the elderly population is an important asset for improving the quality of life and level of independence of elderly people living alone. However, the design and implementation of health technology solutions often disregard the physiological and psychological abilities of the elderly, leading to low adoption of these technologies. We evaluate the usability of a remote patient monitoring solution, VITASENIOR-MT, which is based on interaction with a television set. Twenty senior participants (over 64 years) and a control group of 20 participants underwent systematic tests with the health platform and assessed its usability through several questionnaires. Elderly participants scored the platform's usability highly, very close to the evaluation of the control group. Sensory, motor and cognitive limitations were the issues that most contributed to the difference in usability assessment between the elderly group and the control group. The solution showed high usability and acceptance regardless of age, digital literacy, education and impairments (sensory, motor and cognitive), which shows its effective viability for use and implementation as a consumer product in the senior market. This work has been financially supported by the Portuguese foundation for science and technology (FCT) and European funds through Project VITASENIOR-MT with grant CENTRO-01-0145-FEDER-023659.
- Published
- 2022
21. The impact of information sources on COVID-19 knowledge accumulation and vaccination intention
- Author
-
Alin Coman and Madalina Vlasceanu
- Subjects
Knowledge management, Coronavirus disease 2019 (COVID-19), business.industry, Applied Mathematics, COVID-19, Belief change, Computer Science Applications, Vaccination, Text mining, Computational Theory and Mathematics, Modeling and Simulation, Regular Paper, Vaccination intention, business, Psychology, Source credibility, Information Systems - Abstract
During a global health crisis, people are exposed to vast amounts of information from a variety of sources. Here, we assessed which information source could increase knowledge about COVID-19 (Study 1) and COVID-19 vaccines (Study 2). In Study 1, a US census matched sample of 1060 participants rated the accuracy of a set of statements and then were randomly assigned to one of 10 between-subjects conditions of varying sources providing belief-relevant information: a political leader (Trump/Biden), a health authority (Fauci/CDC), an anecdote (Democrat/Republican), a large group of prior participants (Democrats/Republicans/Generic), or no source (Control). Finally, they rated the accuracy of the initial set of statements again. Study 2 involved a replication with a sample of 1876 participants and focused on the COVID-19 vaccine. We found that knowledge increased most when the source of information was a generic group of people, irrespective of participants’ political affiliation. We also found that while expert communications were most successful at increasing Democrats’ vaccination intentions, no source was successful at increasing Republicans’ vaccination intention. We discuss these findings in the context of the current misinformation epidemic. Supplementary Information The online version contains supplementary material available at 10.1007/s41060-021-00307-8.
- Published
- 2022
22. Conspiracy theories on Twitter: emerging motifs and temporal dynamics during the COVID-19 pandemic
- Author
-
Veronika Batzdorfer, Holger Steinmetz, Marco Biella, and Meysam Alizadeh
- Subjects
Twitter structural break analysis, Conspiracy beliefs, Computational Theory and Mathematics, Applied Mathematics, Modeling and Simulation, Regular Paper, Word embedding, COVID-19, Time series analysis, Computer Science Applications, Information Systems - Abstract
The COVID-19 pandemic resulted in an upsurge in the spread of diverse conspiracy theories (CTs) with real-life impact. However, the dynamics of user engagement remain under-researched. In the present study, we leverage Twitter data across 11 months in 2020 from the timelines of 109 CT posters and a comparison group (non-CT group) of equal size. Within this approach, we used word embeddings to distinguish non-CT content from CT-related content and analysed which elements of CT content emerged in the pandemic. Subsequently, we applied time series analyses at the aggregate and individual levels to investigate whether there is a difference between CT posters and non-CT posters in non-CT tweets, as well as the temporal dynamics of CT tweets. In this regard, we provide a description of the aggregate and individual series, conduct an STL decomposition into trends, seasons, and errors as well as an autocorrelation analysis, and apply generalised additive mixed models to analyse nonlinear trends and their differences across users. The narrative motifs, characterised by word embeddings, address pandemic-specific motifs alongside broader motifs and can be related to several psychological needs (epistemic, existential, or social). Overall, the comparison of the CT group and non-CT group showed a substantially higher level of overall COVID-19-related tweets in the non-CT group and a higher level of random fluctuations. Focussing on conspiracy tweets, we found a slight positive trend but, more importantly, an increase in users in 2020. Moreover, the aggregate series of CT content revealed two breaks in 2020 and a significant albeit weak positive trend since June. At the individual level, the series showed strong differences in temporal dynamics and a high degree of randomness and day-specific sensitivity. The results stress the importance of Twitter as a means of communication during the pandemic and illustrate that these beliefs travel very fast and are quickly endorsed.
Supplementary Information The online version contains supplementary material available at 10.1007/s41060-021-00298-6.
- Published
- 2021
23. Mining subgraph coverage patterns from graph transactions
- Author
-
A. Srinivas Reddy, Anirban Mondal, U. Deva Priyakumar, and P. Krishna Reddy
- Subjects
Computer science, Applied Mathematics, Scale-invariant feature transform, computer.software_genre, Frequency, Computer Science Applications, Set (abstract data type), Management information systems, Subgraph mining, Computational Theory and Mathematics, Modeling and Simulation, Bio-informatics, Regular Paper, Graph (abstract data type), Data mining, Graph mining, Subgraph coverage patterns, Transaction data, computer, Information Systems - Abstract
Pattern mining from graph transactional data (GTD) is an active area of research with applications in the domains of bioinformatics, chemical informatics and social networks. Existing works address the problem of mining frequent subgraphs from GTD. However, the knowledge concerning the coverage aspect of a set of subgraphs is also valuable for improving the performance of several applications. In this regard, we introduce the notion of subgraph coverage patterns (SCPs). Given a GTD, a subgraph coverage pattern is a set of subgraphs subject to relative frequency, coverage and overlap constraints provided by the user. We propose the Subgraph ID-based Flat Transactional (SIFT) framework for the efficient extraction of SCPs from a given GTD. Our performance evaluation using three real datasets demonstrates that our proposed SIFT framework is indeed capable of efficiently extracting SCPs from GTD. Furthermore, we demonstrate the effectiveness of SIFT through a case study in computer-aided drug design.
- Published
- 2021
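The coverage notion in the abstract above can be made concrete with a small sketch. We abstract each graph transaction as the set of candidate subgraph ids it contains (the SIFT framework mines these from actual graph data); the function name, the overlap definition, and the toy data are ours, for illustration only:

```python
def coverage_and_overlap(pattern, transactions):
    """pattern: set of subgraph ids; transactions: list of sets of subgraph ids.

    Coverage = fraction of transactions containing at least one subgraph of
    the pattern. Overlap (one illustrative definition) = average number of
    extra pattern members per covered transaction beyond the first.
    """
    covered = [t for t in transactions if pattern & t]
    coverage = len(covered) / len(transactions)
    extra = sum(len(pattern & t) - 1 for t in covered)
    overlap = extra / len(covered) if covered else 0.0
    return coverage, overlap

# Four toy graph transactions over candidate subgraphs g1..g4
db = [{"g1", "g2"}, {"g2"}, {"g3"}, {"g1", "g4"}]
cov, ov = coverage_and_overlap({"g1", "g3"}, db)
print(cov, ov)  # 0.75 0.0
```

The pattern {g1, g3} covers three of the four transactions with no redundancy; an SCP miner searches for such sets subject to user-given relative frequency, coverage, and overlap thresholds.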
24. Safe and secure system architectures for cyber-physical systems
- Author
-
Frank J. Furrer
- Subjects
Computer Science Applications ,Information Systems - Abstract
Cyber-physical systems are at the core of our current civilization. Countless examples dominate our daily life and work, such as driverless cars that will soon master our roads, implanted medical devices that will improve many lives, and industrial control systems that control production and infrastructure. Because cyber-physical systems manipulate the real world, they constitute a danger for many applications. Therefore, their safety and security are essential properties of these indispensable systems. The long history of systems engineering has demonstrated that the system quality properties—such as safety and security—strongly depend on the underlying system architecture. Satisfactory system quality properties can only be ensured if the fundamental system architecture is sound! The development of dependable cyber-physical architectures in recent years suggests that two harmonized architectures are required: a design-time architecture and a run-time architecture. The design-time architecture defines and specifies all parts and relationships, assuring the required system quality properties. However, in today’s complex systems, ensuring all quality properties in all operating conditions during design time will never be possible. Therefore, an additional line of defense against safety accidents and security incidents is indispensable: This must be provided by the run-time architecture. The run-time architecture primarily consists of a protective shell that monitors the run-time system during operation. It detects anomalies in system behavior, interface functioning, or data—often using artificial intelligence algorithms—and takes autonomous mitigation measures, thus attempting to prevent imminent safety accidents or security incidents before they occur. This paper’s core is the protective shell as a run-time protection mechanism for cyber-physical systems. The paper has the form of an introductory tutorial and includes focused references.
- Published
- 2023
25. Parallel border tracking in binary images for multicore computers
- Author
-
Victor M. Garcia-Molla and Pedro Alonso-Jordá
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
Border tracking in binary images is an important operation in many computer vision applications. The problem consists of finding the borders in a 2D binary image (where every pixel is either 0 or 1). Several algorithms are available for this problem, but most of them are sequential. In a former paper, a parallel border tracking algorithm was proposed. This algorithm was designed to run on Graphics Processing Units (GPUs) and was based on the sequential algorithm known as the Suzuki algorithm. In this paper, we adapt the previously proposed GPU algorithm so that it can be executed on multicore computers. The resulting algorithm is evaluated against its GPU counterpart. The results show that the performance of the GPU algorithm worsens (or the algorithm even fails) for very large images or images with many borders. The proposed multicore algorithm, on the other hand, copes efficiently with large images.
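Suzuki-style hierarchical border following is more involved than the abstract can convey, but the row-parallel flavour of a multicore adaptation can be illustrated with a much simpler, hypothetical sketch: here a pixel counts as a border pixel if it is 1 and has a 0 (or out-of-image) 4-neighbour, and image rows are scanned concurrently. The function names and thread-pool choice are illustrative assumptions, not the paper's implementation.

```python
from concurrent.futures import ThreadPoolExecutor

def border_row(img, r):
    """Border pixels in row r: 1-pixels with a 0 or out-of-image 4-neighbour."""
    h, w = len(img), len(img[0])
    pts = []
    for c in range(w):
        if img[r][c] != 1:
            continue
        nbrs = [(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)]
        if any(not (0 <= rr < h and 0 <= cc < w) or img[rr][cc] == 0
               for rr, cc in nbrs):
            pts.append((r, c))
    return pts

def parallel_borders(img, workers=4):
    """Scan rows concurrently and collect all border pixels."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        rows = pool.map(lambda r: border_row(img, r), range(len(img)))
    return [p for row in rows for p in row]
```

Because each row is processed independently, the per-row work distributes naturally over cores; a real implementation must additionally link border pixels into ordered contours and build the border hierarchy, which is where the Suzuki algorithm's complexity lies.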
- Published
- 2023
26. On computing exact means of time series using the move-split-merge metric
- Author
-
Jana Holznigenkemper, Christian Komusiewicz, and Bernhard Seeger
- Subjects
FOS: Computer and information sciences ,Computer Networks and Communications ,Computer Science - Data Structures and Algorithms ,Data Structures and Algorithms (cs.DS) ,Computer Science Applications ,Information Systems - Abstract
Computing an accurate mean of a set of time series is a critical task in applications like nearest-neighbor classification and clustering of time series. While there are many distance functions for time series, the most popular distance function used for the computation of time series means is the non-metric dynamic time warping (DTW) distance. A recent algorithm for the exact computation of a DTW-Mean has a running time of $\mathcal{O}(n^{2k+1}2^kk)$, where $k$ denotes the number of time series and $n$ their maximum length. In this paper, we study the mean problem for the move-split-merge (MSM) metric, which not only offers high practical accuracy for time series classification but also carries the advantages of the metric properties, enabling further diverse applications. The main contribution of this paper is an exact and efficient algorithm for the MSM-Mean problem of time series. The running time of our algorithm is $\mathcal{O}(n^{k+3}2^k k^3)$, and thus better than that of the previous DTW-based algorithm. The results of an experimental comparison confirm the running time superiority of our algorithm over the DTW-Mean competitor. Moreover, we introduce a heuristic that improves the running time significantly without sacrificing much accuracy.
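The paper's exact mean algorithm is substantially more involved, but as background, the pairwise MSM distance itself admits a simple quadratic dynamic program (the usual MSM recurrence with split/merge penalty c). The sketch below is illustrative and is not the authors' code:

```python
def msm_cost(new, x, y, c):
    # Cost of a split/merge step that introduces value `new` between x and y.
    if x <= new <= y or x >= new >= y:
        return c
    return c + min(abs(new - x), abs(new - y))

def msm_distance(a, b, c=1.0):
    """Dynamic-programming MSM distance between time series a and b."""
    n, m = len(a), len(b)
    INF = float("inf")
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[1][1] = abs(a[0] - b[0])
    for i in range(2, n + 1):  # first column: repeated split/merge on a
        D[i][1] = D[i - 1][1] + msm_cost(a[i - 1], a[i - 2], b[0], c)
    for j in range(2, m + 1):  # first row: repeated split/merge on b
        D[1][j] = D[1][j - 1] + msm_cost(b[j - 1], a[0], b[j - 2], c)
    for i in range(2, n + 1):
        for j in range(2, m + 1):
            D[i][j] = min(
                D[i - 1][j - 1] + abs(a[i - 1] - b[j - 1]),            # move
                D[i - 1][j] + msm_cost(a[i - 1], a[i - 2], b[j - 1], c),  # split/merge
                D[i][j - 1] + msm_cost(b[j - 1], a[i - 1], b[j - 2], c),  # split/merge
            )
    return D[n][m]
```

Unlike DTW, the resulting distance satisfies the triangle inequality, which is exactly the metric property the paper exploits for further applications.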
- Published
- 2023
27. Recent advances in domain-driven data mining
- Author
-
Chuanren Liu, Ehsan Fakharizadi, Tong Xu, and Philip S. Yu
- Subjects
Computational Mathematics ,Computational Theory and Mathematics ,Artificial Intelligence ,Applied Mathematics ,Modeling and Simulation ,Engineering (miscellaneous) ,Computer Science Applications ,Information Systems - Abstract
Data mining research has been significantly motivated by and benefited from real-world applications in novel domains. This special issue was proposed and edited to draw attention to domain-driven data mining and disseminate research in foundations, frameworks, and applications for data-driven and actionable knowledge discovery. Along with this special issue, we also organized a related workshop to continue the previous efforts on promoting advances in domain-driven data mining. This editorial report will first summarize the selected papers in the special issue, then discuss various industrial trends in the context of the selected papers, and finally document the keynote talks presented by the workshop. Although many scholars have made prominent contributions with the theme of domain-driven data mining, there are still various new research problems and challenges calling for more research investigations in the future. We hope this special issue is helpful for scholars working along this critically important line of research.
- Published
- 2022
28. Performance evaluation of multi-exaflops machines using Equality network topology
- Author
-
Chi-Hsiu Liang, Chun-Ho Cheng, Hong-Lin Wu, Chao-Chin Li, Po-Lin Huang, and Chi-Chuan Hwang
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
In modern computing architectures, graph theory plays the leading role due to rising core counts: it is indispensable to keep finding better ways to connect the cores. A novel chordal-ring interconnect topology system, Equality, is revisited in this paper and compared with several previous works. This paper details the procedures for constructing Equality interconnects, its special routing procedures, the strategies for selecting a configuration, and the evaluation of its performance using the open-source cycle-accurate BookSim package. Four scenarios representing small- to large-scale computing facilities are presented to assess the network performance. This work shows that in 16,384-endpoint systems, the Equality network turns out to be the most efficient. The results also show the steady scalability of Equality networks extending to 48K–320K and a million endpoints. Equality networks are adjustable to fit commodity hardware and resilient under ten common traffic models. It is suggested that the Equality network topology can be used in constructing efficient multi-exaflops supercomputers and data centers.
- Published
- 2022
29. Extracting information and inferences from a large text corpus
- Author
-
Sandhya Avasthi, Ritu Chauhan, and Debi Prasanna Acharjya
- Subjects
Computational Theory and Mathematics ,Artificial Intelligence ,Computer Networks and Communications ,Applied Mathematics ,Electrical and Electronic Engineering ,Computer Science Applications ,Information Systems - Abstract
The usage of various software applications has grown tremendously due to the onset of Industry 4.0, giving rise to the accumulation of all forms of data. The scientific, biological, and social media text collections demand efficient machine learning methods for data interpretability, which organizations need in decision-making of all sorts. Topic models can be applied in text mining of biomedical articles, scientific articles, Twitter data, and blog posts. This paper analyzes and compares the performance of Latent Dirichlet Allocation (LDA), Dynamic Topic Model (DTM), and Embedded Topic Model (ETM) techniques. An incremental topic model with word embedding (ITMWE) is proposed that processes large text data in an incremental environment and extracts latent topics that best describe the document collections. Experiments in both offline and online settings on large real-world document collections such as CORD-19, NIPS papers, and Tweet datasets show that, while LDA and DTM are good models for discovering word-level topics, ITMWE discovers better document-level topic groups more efficiently in a dynamic environment, which is crucial in text mining applications.
- Published
- 2022
30. An adaptive timing mechanism for urban traffic pre-signal based on hybrid exploration strategy to improve double deep Q network
- Author
-
Minglei Liu, Huizhen Zhang, Youqing Chen, Hui Xie, and Yubiao Pan
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
With increasing traffic congestion in cities, giving priority to public transit has become a consensus for the development and management of urban transportation. The traffic pre-signal mechanism, which gives buses priority in time and space by optimizing road right-of-way allocation, has gained wide attention and application. This paper addresses the problem that the existing urban traffic pre-signal mechanism cannot adaptively adjust the advance time, and proposes a traffic pre-signal adaptive timing mechanism based on a Hybrid Exploration Strategy Double Deep Q Network (HES-DDQN) that combines the ε-greedy strategy and the Boltzmann strategy. To broaden the agent's action exploration range and keep the pre-signal decision from settling into a suboptimal or locally optimal strategy, the exploration strategy of the DDQN algorithm is modified: the probability of directly selecting the locally optimal action is reduced and the probability of selecting non-greedy actions is increased, following the principle that "an action with a larger value function is more likely to be selected." We used the traffic simulation software VISSIM to conduct simulation experiments on an intersection. The experimental results show that, compared with setting no pre-signal and with the formula-based method of setting the pre-signal, the HES-DDQN pre-signal mechanism can significantly reduce the average delay of buses, the waiting queue length, and the number of stops of social vehicles.
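One plausible way to hybridize an ε-greedy and a Boltzmann strategy for action selection can be sketched as follows; the paper's exact combination rule is not given in the abstract, so this is an illustrative assumption rather than the HES-DDQN rule:

```python
import math
import random

def hybrid_action(q_values, eps=0.1, tau=1.0, rng=random):
    """Hybrid exploration: act greedily with probability 1-eps; otherwise
    sample from a Boltzmann (softmax) distribution over Q-values, so that
    higher-valued actions remain more likely than under uniform-random
    exploration."""
    if rng.random() > eps:
        return max(range(len(q_values)), key=lambda a: q_values[a])
    # Boltzmann exploration: softmax with temperature tau (max-shifted
    # for numerical stability).
    zmax = max(q_values)
    weights = [math.exp((q - zmax) / tau) for q in q_values]
    r, acc = rng.random() * sum(weights), 0.0
    for a, w in enumerate(weights):
        acc += w
        if r <= acc:
            return a
    return len(q_values) - 1
```

The temperature `tau` controls how strongly the exploratory draws favour high-value actions, which matches the stated principle that "an action with a larger value function is more likely to be selected."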
- Published
- 2022
31. Real-time monitoring solution with vibration analysis for industry 4.0 ventilation systems
- Author
-
Rubén Muñiz, Fernando Nuño, Juan Díaz, María González, Miguel J. Prieto, and Óliver Menéndez
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
Predictive maintenance has emerged as one of the paradigms of Industry 4.0. This paper presents a complete system for the acquisition, computation, monitoring, and communication of data from ventilation equipment in underground tunnels, based on the TCP/IP protocol and accessible via web services. Not only does the proposed system collect data from different sensors (temperatures, vibrations, pressures, tilt angles, and rotational speed), it performs local data processing as well. This feature is the newest and most important of all those provided by the system design; no equipment in current ventilation systems offers similar functionality. This paper shows the design and implementation of the equipment (system architecture and processing), as well as the experimental results obtained.
- Published
- 2022
32. Evaluation of e-learners’ concentration using recurrent neural networks
- Author
-
Young-Sang Jeong and Nam-Wook Cho
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
Recently, interest in e-learning has increased rapidly owing to the lockdowns imposed by COVID-19. A major disadvantage of e-learning is the difficulty in maintaining concentration because of the limited interaction between teachers and students. The objective of this paper is to develop a methodology to predict e-learners' concentration by applying recurrent neural network models to eye gaze and facial landmark data extracted from e-learners' video data. One hundred eighty-four videos of ninety-two e-learners were obtained, and their frame data were extracted using the OpenFace 2.0 toolkit. Recurrent neural networks, long short-term memory, and gated recurrent units were utilized to predict the concentration of e-learners. A set of comparative experiments was conducted; as a result, gated recurrent units exhibited the best performance. The main contribution of this paper is a methodology to predict e-learners' concentration in a natural e-learning environment.
- Published
- 2022
33. Low-latency and High-Reliability FBMC Modulation scheme using Optimized Filter design for enabling NextG Real-time Smart Healthcare Applications
- Author
-
Abhinav Adarsh, Shashwat Pathak, Digvijay Singh Chauhan, and Basant Kumar
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
This paper presents a prototype filter design using the orthant optimization technique to assist a filter bank multicarrier (FBMC) modulation scheme for a NextG smart e-healthcare network framework. Low latency and very high reliability are among the main requirements of a real-time e-healthcare system. In recent times, FBMC modulation has received more attention due to its spectral efficiency. The characteristics of a filter bank are determined by its prototype filter. A prototype filter cannot be designed to achieve arbitrary time localization (for low latency) and frequency localization (for spectral efficiency), as time and frequency spreading are conflicting goals; hence, an optimal design must be achieved. In this paper, a constraint for perfect or nearly perfect reconstruction is formulated for the prototype filter design, and an orthant-based enriched sparse ℓ1-optimization method is applied to achieve optimum performance in terms of higher availability of subcarrier spacing for a given signal-to-interference ratio requirement. Larger subcarrier spacing ensures lower latency and better performance in real-time applications. The proposed FBMC system, based on an optimum prototype filter design, also supports a higher data rate than traditional FBMC and OFDM systems, which is another requirement of real-time communication. In this paper, solutions for different technical issues of the physical layer design are provided. The proposed prototype-filter-based FBMC modulation scheme can suppress the side-lobe energy of the constituted filters to a large extent without compromising signal recovery at the receiver end. The proposed system provides very high spectral efficiency; it can sacrifice large guard-band frequencies to increase the subcarrier spacing and thereby provide low-latency communication to support the real-time e-healthcare network.
- Published
- 2022
34. Anti-aliasing convolution neural network of finger vein recognition for virtual reality (VR) human–robot equipment of metaverse
- Author
-
Nghi C. Tran, Jian‑Hong Wang, Toan H. Vu, Tzu-Chiang Tai, and Jia-Ching Wang
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
Metaverse, which is anticipated to be the future of the internet, is a 3D virtual world in which users interact via highly customizable computer avatars. It is considerably promising for several industries, including gaming, education, and business. However, it still has drawbacks, particularly in privacy and identity threats. When a person joins the metaverse via virtual reality (VR) human-robot equipment, their avatar, digital assets, and private information may be compromised by cybercriminals. This paper introduces a finger vein recognition approach for the VR human-robot equipment of the metaverse to prevent others from misappropriating it. The finger vein is a biometric feature hidden beneath our skin. It is considerably more secure for person verification than other hand-based biometric characteristics, such as fingerprints and palm prints, since it is difficult to imitate. Most conventional finger vein recognition systems that use hand-crafted features are ineffective, especially for images with low quality, low contrast, scale variation, translation, and rotation. Deep learning methods have been demonstrated to be more successful than traditional methods in computer vision. This paper develops a finger vein recognition system based on a convolution neural network and an anti-aliasing technique. We employ a contrast image enhancement algorithm in the preprocessing step to improve the performance of the system. The proposed approach is evaluated on three publicly available finger vein datasets. Experimental results show that our proposed method outperforms the current state-of-the-art methods, achieving 97.66% accuracy on the FVUSM dataset, 99.94% accuracy on the SDUMLA dataset, and 88.19% accuracy on the THUFV2 dataset.
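The anti-aliasing idea, in the general blur-before-subsample spirit, can be sketched independently of the paper's network: low-pass filter with a small binomial kernel before stride-2 subsampling, which makes downsampled features far more robust to small input shifts than plain strided subsampling. This is a generic single-channel NumPy illustration, not the authors' architecture:

```python
import numpy as np

def blur_downsample(x, stride=2):
    """Anti-aliased downsampling: blur with a separable 3x3 binomial
    kernel, then subsample. Plain strided subsampling aliases high
    frequencies; blurring first suppresses them."""
    k = np.array([1.0, 2.0, 1.0])
    k = np.outer(k, k)
    k /= k.sum()                      # normalised binomial (Gaussian-like) kernel
    h, w = x.shape
    padded = np.pad(x, 1, mode="edge")
    out = np.zeros_like(x, dtype=float)
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * k)
    return out[::stride, ::stride]
```

In a CNN this operation would replace each strided pooling or strided convolution layer; here the loop form is kept for clarity rather than speed.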
- Published
- 2022
35. Generative knowledge-based transfer learning for few-shot health condition estimation
- Author
-
Weijie Kang, Jiyang Xiao, and Junjie Xue
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
In the field of high-end manufacturing, few-shot health condition estimation is a valuable research topic. Although transfer learning and other methods have effectively improved few-shot learning, they still cannot remedy the lack of prior knowledge. In this paper, by combining data enhancement, knowledge reasoning, and transfer learning, a generative knowledge-based transfer learning model is proposed to achieve few-shot health condition estimation. First, building on the effectiveness of data enhancement in machine learning, a novel batch monotonic generative adversarial network (BM-GAN) is designed for few-shot health condition data generation, which can address the problem of insufficient data and generate simulated training data. Second, a generative knowledge-based transfer learning model is proposed that exploits the performance advantages of the belief rule base (BRB) method on few-shot learning: it combines expert knowledge and simulated training data to obtain a generalized BRB model and then fine-tunes the generalized model with real data to obtain a dedicated BRB model. Third, through uniform sampling of NASA lithium battery data to simulate few-shot conditions, the generative transfer-belief rule base (GT-BRB) method proposed in this paper is verified to be feasible for few-shot health condition estimation, improving the estimation accuracy of the BRB method by approximately 17.3%.
- Published
- 2022
36. Parallelization of Swarm Intelligence Algorithms: Literature Review
- Author
-
Breno Augusto de Melo Menezes, Herbert Kuchen, and Fernando Buarque de Lima Neto
- Subjects
Software ,Information Systems ,Theoretical Computer Science - Abstract
Swarm Intelligence (SI) algorithms are frequently applied to tackle complex optimization problems, especially when good solutions to NP-hard problems are requested within a reasonable response time. When such problems possess a very high dimensionality, a dynamic nature, or intrinsically complex, intertwined independent variables, the computational costs of SI algorithms may still be too high, so new approaches and hardware support are needed to speed up processing. Nowadays, with the popularization of GPUs and multi-core processing, parallel versions of SI algorithms can provide the required performance on those tough problems. This paper aims to describe the state of the art of such approaches, to summarize the key points addressed, and to identify the research gaps that could be better addressed. The scope of this review covers recent papers, mainly focusing on parallel implementations of the most frequently used SI algorithms. The use of nested parallelism is of particular interest, since one level of parallelism is often not sufficient to exploit the computational power of contemporary parallel hardware. The sources were the main scientific databases, filtered according to the requirements set for this literature review.
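The simplest parallelization level such reviews cover, concurrent evaluation of the swarm's fitness, can be sketched with a toy particle swarm optimizer; this is a generic illustration under assumed parameters, not taken from any surveyed paper, and for genuinely CPU-bound fitness functions a process pool would be preferable to the thread pool shown here:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def sphere(x):
    """Classic benchmark fitness: sum of squares, minimum 0 at the origin."""
    return sum(v * v for v in x)

def pso(f, dim=5, swarm=20, iters=50, seed=1):
    rng = random.Random(seed)
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(swarm)]
    V = [[0.0] * dim for _ in range(swarm)]
    P, Pf = [x[:] for x in X], [float("inf")] * swarm   # personal bests
    gbest, gf = None, float("inf")                       # global best
    with ThreadPoolExecutor(max_workers=4) as pool:
        for _ in range(iters):
            # Fitness of the whole swarm evaluated in parallel.
            fits = list(pool.map(f, X))
            for i, fit in enumerate(fits):
                if fit < Pf[i]:
                    Pf[i], P[i] = fit, X[i][:]
                if fit < gf:
                    gf, gbest = fit, X[i][:]
            # Sequential (cheap) velocity and position updates.
            for i in range(swarm):
                for d in range(dim):
                    V[i][d] = (0.7 * V[i][d]
                               + 1.5 * rng.random() * (P[i][d] - X[i][d])
                               + 1.5 * rng.random() * (gbest[d] - X[i][d]))
                    X[i][d] += V[i][d]
    return gbest, gf
```

Nested parallelism, as discussed in the review, would add a second level, e.g. several such swarms evolving concurrently and exchanging their global bests.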
- Published
- 2022
37. Social network analysis and consensus reaching process-driven group decision making method with distributed linguistic information
- Author
-
Feifei Jin, Yu Yang, Jinpei Liu, and Jiaming Zhu
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
In group decision making with social network analysis (SNA), determining the weights of experts and constructing the consensus-reaching process (CRP) are hot topics. With respect to the generation of expert weights, this paper first develops a distributed linguistic trust propagation operator and a path order weighted averaging (POWA) operator to explore trust propagation and aggregation between indirectly connected experts; the weights of experts can then be derived using relative node in-degree centrality in a complete distributed linguistic trust relationship matrix. Then, three levels of consensus are proposed, in which the most inconsistent evaluation information in distributed linguistic trust decision-making matrices can be pinpointed. Subsequently, the distance between experts' evaluation information and the collective evaluation information is designed to serve as the adjustment cost in CRP. Finally, a novel feedback mechanism supported by the minimum adjustment cost is activated until the group consensus degree reaches the predefined threshold. The novelties of this paper are as follows: (1) the proposed POWA operator considers the trust value as well as the propagation efficiency of the trust path when aggregating trust relationships in SNA; (2) the consensus-reaching mechanism gradually improves the group consensus degree by continuously adjusting the most inconsistent evaluation information.
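The in-degree-centrality weighting step can be illustrated with a minimal sketch, using hypothetical numeric trust values in place of the paper's distributed linguistic trust representation:

```python
def expert_weights(trust):
    """Expert weights from relative in-degree centrality of a complete
    trust matrix: expert j's in-degree is the total trust the other
    experts place in j; weights are normalised to sum to 1."""
    n = len(trust)
    indeg = [sum(trust[i][j] for i in range(n) if i != j) for j in range(n)]
    total = sum(indeg)
    return [d / total for d in indeg]
```

Experts who receive more trust from the group thus carry more weight when individual evaluations are aggregated into the collective opinion.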
- Published
- 2022
38. Highly private blockchain-based management system for digital COVID-19 certificates
- Author
-
M. Magdalena Payeras Capellà, Macià Mut-Puigserver, and Rosa Pericàs Gornals
- Subjects
Computer Networks and Communications ,Safety, Risk, Reliability and Quality ,Software ,Information Systems - Abstract
As a result of the declaration of the COVID-19 pandemic, several blockchain-based solutions for digital COVID-19 certificates have been proposed. Considering that health data have high privacy requirements, a health data management system must fulfil several strict privacy and security requirements. On the one hand, the confidentiality of the medical data must be assured, with the data owner (the patient) maintaining control over the privacy of their certificates. On the other hand, the entities involved in the generation and validation of certificates must be supervised by a regulatory authority. This set of requirements is generally not achieved together in previous proposals. Moreover, a digital COVID-19 certificate management protocol must provide an easy verification process and strongly limit the risk of forgery. In this paper, we present the design and implementation of a protocol to manage digital COVID-19 certificates in which individual users decide how to share their private data in a hierarchical system. To achieve this, we put together two different technologies: a proxy re-encryption (PRE) service in conjunction with a blockchain-based protocol. Additionally, our protocol introduces an authority to control and regulate the centers that can generate digital COVID-19 certificates, and it offers two kinds of certificate validation, for registered and non-registered verification entities. Therefore, the paper achieves all the requirements: data sovereignty, high privacy, forgery avoidance, regulation of entities, security, and easy verification.
- Published
- 2022
39. Solving arithmetic word problems by synergizing syntax-semantics extractor for explicit relations and neural network miner for implicit relations
- Author
-
Xinguo Yu, Xiaopan Lyu, Rao Peng, and Jun Shen
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
This paper presents a relation-centric algorithm for solving arithmetic word problems (AWPs) by synergizing a syntax-semantics extractor for extracting explicit relations and a neural network miner for mining implicit relations. This is the first algorithm that has a specific component to acquire implicit knowledge items for solving AWPs. This paper proposes a three-phase scheme to decompose the challenging task of designing an algorithm for solving AWPs into three smaller tasks. The first phase proposes a state-action paradigm; the second phase instantiates the paradigm into a relation-centric approach; and the third phase implements a relation-centric algorithm for solving AWPs. There are two main steps in the proposed algorithm: problem understanding and symbolic solving. By adopting the relation-centric approach, problem understanding becomes a task of relation acquisition. For this task, a relaxed syntax-semantics method first extracts a group of explicit relation candidates. In parallel, a neural network miner acquires implicit relation candidates; the miner computes vectors encoded by BERT to determine which implicit relations should be added. Thus, problem understanding can acquire both explicit and implicit relations, which addresses the challenge of building a problem understanding method that can acquire all the knowledge items needed to find the solution. In the subsequent symbolic solving step, a fusion procedure forms a distilled set of relations from all the candidates by discarding unnecessary relations. Experimentation on nine benchmark datasets validates the superiority of the proposed algorithm, which outperforms the state-of-the-art algorithms.
- Published
- 2022
40. Angle of arrival estimation in a multi-antenna software defined radio system: impact of hardware and radio environment
- Author
-
Marcin Wachowiak and Pawel Kryszkiewicz
- Subjects
Computer Networks and Communications ,Electrical and Electronic Engineering ,Information Systems - Abstract
Contemporary wireless communication transceivers can utilize their multiple antennas to improve positioning abilities. Angle of Arrival (AoA) estimation exploits the phase shifts between the signal of interest arriving at different receiving antennas. A Software Defined Radio (SDR) USRP B210 and GNU Radio implementing the Root Multiple Signal Classification (Root-MUSIC) algorithm are used in this paper. However, this setup requires the consideration of errors caused by the hardware and the radio environment. The impact of these effects is shown using an analytical model, assessed mostly by measurements, and relevant solutions are proposed. The hardware distortions are caused mostly by synchronization errors. The intricacies of the most problematic issue, phase synchronization, are investigated with a wired setup, showing, e.g., changes with frequency or gain. Both synchronization via a calibration tone emitted from the same radio front end (resulting in cross-talk) and via a separate front end are tested. Recommendations enhancing the performance of the system and alleviating hardware imperfections are provided in this paper. The accuracy of AoA estimation is also degraded by multipath propagation and radio interference, so a signal processing scheme including filtering has been proposed. A set of over-the-air experiments assessed the performance of the system. The presented and publicly available software package, scalable to a higher number of RX antennas, enables straightforward implementation by other researchers.
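A textbook Root-MUSIC estimator for a uniform linear array can be sketched in a few lines. This is the standard algorithm in NumPy, not the paper's GNU Radio implementation, and the steering-vector sign convention is an assumption:

```python
import numpy as np

def root_music_aoa(X, n_sources=1, d_over_lambda=0.5):
    """Root-MUSIC AoA estimate for a uniform linear array.
    X: (antennas, snapshots) complex baseband samples.
    Assumed steering convention: a_n(theta) = exp(-1j*2*pi*(d/lambda)*n*sin(theta))."""
    M, N = X.shape
    R = X @ X.conj().T / N                       # sample covariance
    w, v = np.linalg.eigh(R)                     # eigenvalues ascending
    En = v[:, :M - n_sources]                    # noise subspace
    C = En @ En.conj().T
    # a(z)^H C a(z) = sum_l b_l z^l with b_l = sum of the l-th diagonal of C;
    # multiply by z^(M-1) to get a polynomial for np.roots (highest degree first).
    coeffs = [np.trace(C, offset=l) for l in range(M - 1, -M, -1)]
    roots = np.roots(coeffs)
    roots = roots[np.abs(roots) < 1]             # keep one of each conjugate-reciprocal pair
    roots = roots[np.argsort(np.abs(np.abs(roots) - 1))][:n_sources]  # closest to unit circle
    sin_theta = -np.angle(roots) / (2 * np.pi * d_over_lambda)
    return np.degrees(np.arcsin(sin_theta))
```

The hardware effects the paper studies (phase-synchronization offsets between RX chains) enter exactly here: a per-antenna phase error distorts the covariance R and shifts the polynomial roots, which is why calibration is essential in the SDR setup.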
- Published
- 2022
41. A multi-objective particle swarm optimization with density and distribution-based competitive mechanism for sensor ontology meta-matching
- Author
-
Aifeng Geng and Qing Lv
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
Sensor ontology is a standard conceptual model that describes information about sensor devices, including the concepts of various sensor modules and the relationships between them. The problem of heterogeneity between sensor ontologies arises because different sensor ontology engineers describe sensor devices in different ways and construct sensor ontologies with different structures. Addressing the heterogeneity of sensor ontologies helps facilitate the semantic fusion of two sensor ontologies, enabling the sharing and reuse of sensor information. To solve the above problem, an ontology meta-matching method is proposed in this paper to find correspondences between entities in distinct sensor ontologies. How to measure the degree of similarity between entities with a set of suitable similarity measures, and how best to integrate multiple measures to determine equivalent entities, are the challenges of the ontology meta-matching problem. In this paper, two approximate methods for measuring the quality of ontology matching results are designed, and a multi-objective optimization model for the ontology meta-matching problem is constructed based on these methods. Eventually, a multi-objective particle swarm optimization (MOPSO) algorithm, named the density and distribution-based competitive mechanism multi-objective particle swarm algorithm (D²CMOPSO), is proposed to solve the problem and optimize the quality of ontology meta-matching results. The effectiveness of the D²CMOPSO-based sensor ontology meta-matching method is verified through experiments: compared with other matching systems and advanced systems of the Ontology Alignment Evaluation Initiative (OAEI), the proposed method improves the quality of matching results more effectively.
- Published
- 2022
42. Alternating complexity of counting first-order logic for the subword order
- Author
-
Dietrich Kuske and Christian Schwarz
- Subjects
Computer Networks and Communications ,Software ,Information Systems - Abstract
This paper considers the structure consisting of the set of all words over a given alphabet together with the subword relation, regular predicates, and constants for every word. We are interested in the counting extension of first-order logic by threshold counting quantifiers. The main result shows that the two-variable fragment of this logic can be decided in twofold exponential alternating time with linearly many alternations (and therefore in particular in twofold exponential space as announced in the conference version (Kuske and Schwarz, in: MFCS’20, Leibniz International Proceedings in Informatics (LIPIcs) vol. 170, pp 56:1–56:13. Schloss Dagstuhl - Leibniz-Zentrum für Informatik, 2020) of this paper) provided the regular predicates are restricted to piecewise testable ones. This result improves prior insights by Karandikar and Schnoebelen by extending the logic and saving one exponent in the space bound. Its proof consists of two main parts: First, we provide a quantifier elimination procedure that results in a formula with constants of bounded length (this generalises the procedure by Karandikar and Schnoebelen for first-order logic). From this, it follows that quantification in formulas can be restricted to words of bounded length, i.e., the second part of the proof is an adaptation of the method by Ferrante and Rackoff to counting logic and deviates significantly from the path of reasoning by Karandikar and Schnoebelen.
- Published
- 2022
43. Land consolidation through parcel exchange among landowners using a distributed Spark-based genetic algorithm
- Author
-
Diego Teijeiro, Margarita Amor, Ramón Doallo, Eduardo Corbelle, Juan Porta, and Jorge Parapar
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
Land consolidation is an essential tool for public administrations to reduce the fragmentation of land ownership. In particular, parcel exchange shows promising potential for restructuring parcel holdings, even more when the number of parcels and owners involved is large. Unfortunately, the number of possible exchange combinations grows very quickly with the number of participating landowners and parcels, with the associated challenge of finding an acceptable solution. In this paper, we present a high-performance solution for parcel exchange based on genetic algorithms. Our proposal, using the Apache Spark framework, is based on exploiting easily accessible distributed-memory systems in order to reduce the execution time. This also allows increasing the search width through multiple populations that share their advances, without compromising the search depth, thanks to the larger amount of resources available in distributed-memory systems. Our proposal achieves better solutions in less time than previous works, showing that genetic algorithms on a high-performance system can be used to propose fair parcel exchanges under strict time constraints, even in complex scenarios. The performance achieved allows for fast trials of several options, reducing the time usually needed to perform administrative procedures associated with land fragmentation problems. Specifically, our proposal combines the benefits of both depth-focused and width-focused multithreaded parallelization: it matches the speedup gains of depth-focused multithreaded parallelization, while the width-focused parallelization provides resilience against local minima and potential for further fitness value reduction. In this paper, both multithreading solutions and Spark-based solutions are tested.
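The parcel-exchange search itself can be illustrated with a deliberately tiny, single-machine evolutionary sketch: parcels sit at positions on a line, a candidate solution reassigns owner labels (keeping each owner's parcel count), and fitness is the total spatial spread of each owner's holdings. All names and modelling choices here are illustrative assumptions; the paper's Spark-distributed, multi-population genetic algorithm is far richer:

```python
import random

def spread(assignment, positions):
    """Total spread (max - min position) of each owner's parcels; lower means
    less fragmented holdings."""
    by_owner = {}
    for pos, owner in zip(positions, assignment):
        by_owner.setdefault(owner, []).append(pos)
    return sum(max(ps) - min(ps) for ps in by_owner.values())

def evolve_exchange(owners, positions, pop=30, gens=200, seed=0):
    """(mu+lambda)-style evolutionary search over parcel exchanges.
    A candidate is a shuffling of the owner labels (each owner keeps the
    same number of parcels); mutation swaps two parcels' owners."""
    rng = random.Random(seed)
    population = []
    for _ in range(pop):
        cand = owners[:]
        rng.shuffle(cand)
        population.append(cand)
    for _ in range(gens):
        population.sort(key=lambda a: spread(a, positions))
        parents = population[:pop // 2]           # elitist selection
        children = []
        for p in parents:
            child = p[:]
            i, j = rng.randrange(len(child)), rng.randrange(len(child))
            child[i], child[j] = child[j], child[i]
            children.append(child)
        population = parents + children
    return min(population, key=lambda a: spread(a, positions))
```

A realistic formulation would additionally constrain exchanges by parcel area and value so that the proposal stays fair to every landowner; that is precisely where the combinatorial explosion the paper tackles comes from.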
- Published
- 2022
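The multi-population scheme described in this abstract (several populations evolving in parallel and sharing their advances) resembles a classic island-model genetic algorithm. As an illustrative sketch only, not the authors' Spark implementation, here is a minimal pure-Python version in which islands evolve independently and periodically migrate their best individuals; the fitness function and all parameters are assumptions chosen for demonstration:

```python
import random

def evolve_islands(fitness, n_islands=4, pop_size=20, gens=50,
                   migrate_every=10, seed=42):
    """Minimal island-model GA: several populations evolve independently
    and periodically exchange their best individuals, widening the
    search without sacrificing depth on any single island."""
    rng = random.Random(seed)
    islands = [[rng.uniform(-10, 10) for _ in range(pop_size)]
               for _ in range(n_islands)]
    for gen in range(gens):
        for pop in islands:
            # Truncation selection + Gaussian mutation.
            pop.sort(key=fitness, reverse=True)
            parents = pop[:pop_size // 2]
            pop[:] = parents + [p + rng.gauss(0, 0.5) for p in parents]
        if gen % migrate_every == 0:
            # Share advances: each island's best replaces a neighbour's worst.
            bests = [max(pop, key=fitness) for pop in islands]
            for i, pop in enumerate(islands):
                pop.sort(key=fitness)
                pop[0] = bests[(i - 1) % n_islands]
    return max((ind for pop in islands for ind in pop), key=fitness)

# Toy objective with its maximum at x = 3.
best = evolve_islands(lambda x: -(x - 3.0) ** 2)
```

In a distributed setting such as Spark, each island would map to a partition or executor, with migration implemented as a periodic shuffle of elite individuals.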
44. Artificial data in sports forecasting: a simulation framework for analysing predictive models in sports
- Author
-
Marc Garnica-Caparrós, Daniel Memmert, and Fabian Wunderlich
- Subjects
Information Systems - Abstract
Far-reaching decisions in organizations often rely on sophisticated methods of data analysis. However, data availability is not always given in complex real-world systems, and even available data may not fully reflect all the underlying processes. In these cases, artificial data can help shed light on pitfalls in decision making and provide insights into optimized methods. The present paper uses the example of forecasts targeting the outcomes of sports events, a domain where, despite the increasing complexity and coverage of models, the proposed methods may fail to identify the main sources of inaccuracy. While the actual outcome of the events provides a basis for validation, it remains unknown whether inaccurate forecasts stem from misestimating the strength of each competitor, from inaccurate forecasting methods, or simply from inherently random processes. To untangle these factors, the present paper proposes the design of a comprehensive simulation framework that models the sports forecasting process while retaining full control of all the underlying unknowns. A generalized model of the sports forecasting process is presented as the conceptual basis of the system and is grounded in the main challenges of real-world data applications. The framework aims to provide a better understanding of rating procedures and forecasting techniques that will foster new developments and serve as a robust validation system accounting for the predictive quality of forecasts. As a proof of concept, a full data generation process is showcased together with the main analytical advantages of using artificial data.
- Published
- 2022
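The core idea of such a simulation framework, generating outcomes from latent strengths that the experimenter fully controls, can be sketched briefly. This is a hypothetical illustration, not the paper's actual framework: team strengths, the logistic outcome model, and the Brier-score evaluation are all assumptions chosen to show how forecast error can be separated from inherent randomness when the truth is known:

```python
import math
import random

def simulate_season(strengths, n_matches=1000, seed=7):
    """Generate artificial match results from known latent strengths."""
    rng = random.Random(seed)
    teams = list(strengths)
    results = []
    for _ in range(n_matches):
        a, b = rng.sample(teams, 2)
        # Logistic win probability from the (hidden) strength gap.
        p_a = 1 / (1 + math.exp(strengths[b] - strengths[a]))
        results.append((a, b, 1 if rng.random() < p_a else 0))
    return results

def brier(results, forecast):
    """Mean squared error of forecast probabilities against outcomes."""
    return sum((forecast(a, b) - w) ** 2 for a, b, w in results) / len(results)

truth = {"A": 1.5, "B": 0.5, "C": -0.5, "D": -1.5}
games = simulate_season(truth)
# The oracle forecast uses the true probabilities and so lower-bounds
# the achievable Brier score; the residual error is pure randomness.
oracle = lambda a, b: 1 / (1 + math.exp(truth[b] - truth[a]))
naive = lambda a, b: 0.5
```

Because the generating strengths are known, any gap between a candidate forecast and the oracle can be attributed to rating or modeling error rather than to chance.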
45. Advanced encryption schemes in multi-tier heterogeneous internet of things: taxonomy, capabilities, and objectives
- Author
-
Mahdi R. Alagheband and Atefeh Mashatan
- Subjects
Hardware and Architecture ,Software ,Information Systems ,Theoretical Computer Science - Abstract
The Internet of Things (IoT) is increasingly widespread in areas such as healthcare, transportation, and manufacturing. IoT networks comprise many diverse entities, including small smart devices that capture sensitive information and may be attainable targets for malicious parties; security and privacy are therefore of utmost importance. To protect the confidentiality of data handled by IoT devices, conventional cryptographic primitives have generally been used in various IoT security solutions. While these primitives provide an acceptable level of security, they typically neither preserve privacy nor support advanced functionalities, and by design they rely heavily on trusted third parties. This multidisciplinary survey paper connects the dots and explains how some advanced cryptosystems can achieve these ambitious goals. We begin by describing a multi-tiered heterogeneous IoT architecture that supports cloud, edge, fog, and blockchain technologies, along with the assumptions and capabilities of each layer. We then elucidate advanced encryption primitives, namely wildcarded, break-glass, proxy re-encryption, and registration-based encryption schemes, as well as IoT-friendly cryptographic accumulators, and illustrate how they can provide the features mentioned above while simultaneously satisfying the architectural IoT requirements. We provide comparison tables and diverse IoT-based use cases for each advanced cryptosystem, a guideline for selecting the best one in different scenarios, and a depiction of how they can be integrated.
- Published
- 2022
46. When Self-Humanization Leads to Algorithm Aversion
- Author
-
Pascal Oliver Heßler, Jella Pfeiffer, and Sebastian Hafenbrädl
- Subjects
Information Systems - Abstract
Decision support systems are increasingly being adopted by various digital platforms. However, prior research has shown that certain contexts can induce algorithm aversion, leading people to reject their decision support. This paper investigates how and why the context in which users make decisions (for-profit versus prosocial microlending decisions) affects their degree of algorithm aversion and, ultimately, their preference for more human-like (versus computer-like) decision support systems. The study proposes that contexts vary in their affordances for self-humanization. Specifically, people perceive prosocial decisions as more relevant to self-humanization than for-profit ones and, in consequence, ascribe more importance to empathy and autonomy when making decisions in prosocial contexts. This increased importance of empathy and autonomy leads to a higher degree of algorithm aversion. At the same time, it also leads to a stronger preference for human-like decision support, which could therefore serve as a remedy for algorithm aversion induced by the need for self-humanization. The results from an online experiment support this theorizing. The paper discusses both theoretical and design implications, especially for the potential of anthropomorphized conversational agents on platforms for prosocial decision-making.
- Published
- 2022
47. Multifactorial evolutionary algorithm with adaptive transfer strategy based on decision tree
- Author
-
Wei Li, Xinyu Gao, and Lei Wang
- Subjects
Computational Mathematics ,Artificial Intelligence ,Engineering (miscellaneous) ,Information Systems - Abstract
Multifactorial optimization (MFO) is a kind of optimization problem that has attracted considerable attention in recent years. The multifactorial evolutionary algorithm utilizes an implicit genetic transfer mechanism, characterized by knowledge transfer, to conduct evolutionary multitasking simultaneously. The effectiveness of knowledge transfer therefore significantly affects the performance of the algorithm. To achieve positive knowledge transfer, this paper proposes an evolutionary multitasking optimization algorithm with an adaptive transfer strategy based on a decision tree (EMT-ADT). To evaluate the useful knowledge contained in transferred individuals, the paper defines an evaluation indicator that quantifies the transfer ability of each individual. Furthermore, a decision tree is constructed to predict the transfer ability of transferred individuals. Based on the prediction results, promising positive-transfer individuals are selected to transfer knowledge, which can effectively improve the performance of the algorithm. Finally, the CEC2017 MFO, WCCI20-MTSO, and WCCI20-MaTSO benchmark problems are used to verify the performance of the proposed EMT-ADT algorithm. Experimental results demonstrate the competitiveness of EMT-ADT compared with some state-of-the-art algorithms.
- Published
- 2023
48. Improved content recommendation algorithm integrating semantic information
- Author
-
Ran Huang
- Subjects
Information Systems and Management ,Computer Networks and Communications ,Hardware and Architecture ,Information Systems - Abstract
Content-based recommendation technology is widely used in e-commerce and education because its recommendations are intuitive and easy to explain. However, owing to the TF-IDF vector space model's inherent lack of semantic analysis, traditional content-based recommendation analyzes semantics insufficiently during item modeling, fails to consider the role of semantic information in knowledge representation and similarity calculation, and is not accurate enough when computing item content similarity, so items whose content is semantically related cannot be mined well. The goal of this paper is to improve the semantic analysis ability of the traditional content-based recommendation algorithm by integrating semantic information into TF-IDF-based item modeling and similarity calculation; to this end, we propose an improved content recommendation algorithm integrating semantic information. To demonstrate the effectiveness of the proposed method, several groups of experiments were carried out. The experimental results show that the proposed algorithm achieves the best overall performance and is relatively stable, which verifies the validity of our method.
- Published
- 2023
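The TF-IDF vector space model that this abstract builds on can be shown in a few lines. The following is a generic textbook sketch of TF-IDF weighting and cosine similarity for item modeling, not the paper's improved semantics-aware algorithm; the toy corpus is invented for illustration:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Build sparse TF-IDF vectors for a small corpus of tokenised docs."""
    n = len(docs)
    # Document frequency: in how many documents each term appears.
    df = Counter(term for doc in docs for term in set(doc))
    idf = {t: math.log(n / df[t]) for t in df}
    vectors = []
    for doc in docs:
        tf = Counter(doc)
        vectors.append({t: (tf[t] / len(doc)) * idf[t] for t in tf})
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse term-weight vectors."""
    dot = sum(u[t] * v.get(t, 0.0) for t in u)
    norm = (math.sqrt(sum(x * x for x in u.values()))
            * math.sqrt(sum(x * x for x in v.values())))
    return dot / norm if norm else 0.0

docs = [["deep", "learning", "course"],
        ["machine", "learning", "course"],
        ["cooking", "pasta", "recipe"]]
vecs = tfidf_vectors(docs)
```

The limitation the paper targets is visible here: two items with related meanings but no shared surface terms get a similarity of zero, which is why integrating semantic information into the vectors can help.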
49. IT Professionals in the Gig Economy
- Author
-
Lisa Gussek and Manuel Wiesche
- Subjects
Information Systems - Abstract
When IT work is performed through digital labor markets, IT professionals have a high degree of personal responsibility for their careers and must use appropriate strategies to be successful. This paper investigates the success of IT freelancers on digital labor platforms. Drawing on signaling theory, a dataset of 7166 IT freelancers is used to examine how activating, pointing, and supporting signals lead to success. Analysis was carried out using negative binomial regression. The results indicate that the three signaling types positively influence the objective career success of IT freelancers. This paper contributes to the literature by testing signaling theory in the new context of digital labor platforms, investigating IT specifics, and proposing support as a new type of signal for IT professionals on digital labor platforms. In practice, the results provide guidelines for IT freelancers to improve their success within their careers.
- Published
- 2023
50. Exploring high scientific productivity in international co-authorship of a small developing country based on collaboration patterns
- Author
-
Irena Mitrović, Marko Mišić, and Jelica Protić
- Subjects
Information Systems and Management ,Computer Networks and Communications ,Hardware and Architecture ,Information Systems - Abstract
The number of published scientific papers grows rapidly each year, totaling more than 2.9 million annually. New methodologies and systems have been developed to analyze scientific production and performance indicators from the large quantities of data available in scientific databases such as Web of Science or Scopus. In this paper, we analyze international scientific production and co-authorship patterns for the most productive authors from Serbia, based on a Web of Science dataset covering the period 2006–2013. We performed bibliometric and scientometric analyses together with statistical and collaboration network analysis to reveal the causes of the extraordinary publishing performance of some authors. For such authors, using the Gini coefficient and Lorenz curves, we found significant inequality in the distribution of papers over journals and over the countries of co-authors. Most of the papers belong to multidisciplinary, interdisciplinary, and applied sciences. We discovered three specific collaboration patterns that lead to high productivity in international collaboration. The first pattern corresponds to mega-authorship papers with hundreds of co-authors gathered in specific research groups. The other two collaboration patterns were found in mathematics and multidisciplinary science, mainly the application of graph theory and of computational methods in physical chemistry. The former results in a star-shaped collaboration network with mostly individual collaborators; the latter includes multiple actors with high betweenness centrality and identified brokerage roles. The results are compared with the later period, 2014–2023, in which high scientific production has been observed in other fields, such as biology and food science and technology.
- Published
- 2023
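The Gini coefficient used in this study to measure inequality in paper distributions is a standard statistic derived from the Lorenz curve. A minimal sketch of the discrete formula follows; the example distributions (papers per journal) are invented for illustration and are not the study's data:

```python
def gini(values):
    """Gini coefficient of a distribution: 0 means perfect equality,
    values approaching 1 mean extreme concentration."""
    xs = sorted(values)
    n = len(xs)
    total = sum(xs)
    if total == 0:
        return 0.0
    # Standard rank-weighted formula over the discrete Lorenz curve.
    weighted = sum((i + 1) * x for i, x in enumerate(xs))
    return (2 * weighted) / (n * total) - (n + 1) / n

equal = [10, 10, 10, 10]   # papers spread evenly over four journals
skewed = [0, 0, 0, 40]     # all papers concentrated in one journal
```

For a sample of n units, the maximum attainable value of this discrete Gini is (n - 1)/n, reached when one unit holds everything, as in the `skewed` example.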