161 results for "streaming analytics"
Search Results
2. A Multi-Tier Streaming Analytics Model of 0-Day Ransomware Detection Using Machine Learning.
- Author
-
Zuhair, Hiba, Selamat, Ali, and Krejcar, Ondrej
- Subjects
RANSOMWARE, MACHINE learning, INFORMATION storage & retrieval systems, RIVERS, MALWARE
- Abstract
Desktop and portable platform-based information systems have become the most tempting targets of crypto- and locker-ransomware attacks over the last decades. Hence, researchers have developed anti-ransomware tools to help the Windows platform thwart ransomware attacks, protect information, preserve users' privacy, and secure the inter-related information systems across the Internet. Furthermore, they have utilized machine learning to develop anti-ransomware tools that detect sophisticated versions. However, such anti-ransomware tools remain sub-optimal in efficacy: they analyze ransomware traits only partially, fail to learn from significant and imbalanced data streams, are limited in attributing versions to their ancestor families, and are indecisive about fusing multi-descent versions. In this paper, we propose a hybrid machine-learning model, a multi-tiered streaming analytics model that classifies ransomware versions of 14 families by learning 24 static and dynamic traits. The proposed model classifies ransomware versions to their ancestor families numerically and fuses those of multi-descent families statistically. Thus, it classifies ransomware versions among a 40K corpus of ransomware, malware, and goodware versions in both semi-realistic and realistic environments. The superiority of this ransomware streaming analytics model over competing anti-ransomware technologies is demonstrated experimentally and justified critically, with an average of 97% classification accuracy, a 2.4% mistake rate, and a 0.34% miss rate under comparative and realistic tests. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
3. The Development of Simulation and Optimisation Tools with an Intuitive User Interface to Improve the Operation of Electric Arc Furnaces.
- Author
-
Tomažič, Simon, Škrjanc, Igor, Andonovski, Goran, and Logar, Vito
- Subjects
ELECTRIC furnaces, ELECTRIC arc, MASS transfer, DECISION support systems, PROCESS optimization, ARC furnaces
- Abstract
The paper presents a novel decision support system designed to improve the efficiency and effectiveness of decision-making for electric arc furnace (EAF) operators. The system integrates two primary tools: the EAF Simulator, which is based on advanced mechanistic models, and the EAF Optimiser, which uses data-driven models trained on historical data. These tools enable the simulation and optimisation of furnace settings in real time and provide operators with important insights. A key objective was to develop a user-friendly interface with the Siemens Insights Hub Cloud Service and Node-RED that enables interactive management and support. The interface allows operators to analyse and compare past and simulated batches by adjusting the input data and parameters, resulting in improved optimisation and reduced costs. In addition, the system focuses on the collection and pre-processing of input data for the simulator and optimiser and uses Message Queuing Telemetry Transport (MQTT) communication between the user interfaces and models to ensure seamless data exchange. The EAF Simulator uses a comprehensive mathematical model to simulate the complex dynamics of heat and mass transfer, while the EAF Optimiser uses a fuzzy logic-based approach to predict optimal energy consumption. The integration with Siemens Edge Streaming Analytics ensures robust data collection and real-time responsiveness. The dual-interface design improves user accessibility and operational flexibility. This system has significant potential to reduce energy consumption by up to 10% and melting times by up to 15%, improving the efficiency and sustainability of the entire process. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
4. Empowering Clinical Engineering and Evidence-Based Maintenance with IoT and Indoor Navigation †.
- Author
-
Luschi, Alessio, Daino, Giovanni Luca, Ghisalberti, Gianpaolo, Mezzatesta, Vincenzo, and Iadanza, Ernesto
- Subjects
COMPUTERIZED maintenance management systems, BIOMEDICAL engineering, ARTIFICIAL intelligence, DIGITAL technology, USER interfaces, SYSTEM downtime
- Abstract
The OHIO (Odin Hospital Indoor cOmpass) project received funding from the European Union's Horizon 2020 research and innovation action program, via ODIN–Open Call, which is issued and executed under the ODIN project and focuses on enhancing hospital safety, productivity, and quality by introducing digital solutions, such as the Internet of Things (IoT), robotics, and artificial intelligence (AI). OHIO aims to enhance the productivity and quality of medical equipment maintenance activities within the pilot hospital, "Le Scotte" in Siena (Italy), by leveraging internal informational resources. OHIO will also be completely integrated with the ODIN platform, taking advantage of the available services and functionalities. OHIO exploits Bluetooth Low Energy (BLE) tags and antennas together with the resources provided by the ODIN platform to develop a complex ontology-based IoT framework, which acts as a central cockpit for the maintenance of medical equipment through a central management web application and an indoor real-time location system (RTLS) for mobile devices. The application programming interfaces (APIs) are based on REST architecture for seamless data exchange and integration with the hospital's existing computer-aided facility management (CAFM) and computerized maintenance management system (CMMS) software. The outcomes of the project are assessed both with quantitative and qualitative methods, by evaluating key performance indicators (KPIs) extracted from the literature and performing a preliminary usability test on both the whole system and the graphic user interfaces (GUIs) of the developed applications. The test implementation demonstrates improvements in maintenance timings, including a reduction in maintenance operation delays, duration of maintenance tasks, and equipment downtime. Usability post-test questionnaires show positive feedback regarding the usability and effectiveness of the applications.
The OHIO framework enhanced the effectiveness of medical equipment maintenance by integrating existing software with newly designed, enhanced interfaces. The research also indicates possibilities for scaling up the developed methods and applications to additional large-scale pilot hospitals within the ODIN network. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Data Lakes: A Survey of Concepts and Architectures.
- Author
-
Azzabi, Sarah, Alfughi, Zakiya, and Ouda, Abdelkader
- Subjects
LITERATURE reviews, DATA warehousing, DATA management, INTERNET of things, RESEARCH personnel
- Abstract
This paper presents a comprehensive literature review on the evolution of data-lake technology, with a particular focus on data-lake architectures. By systematically examining the existing body of research, we identify and classify the major types of data-lake architectures that have been proposed and implemented over time. The review highlights key trends in the development of data-lake architectures, identifies the primary challenges faced in their implementation, and discusses future directions for research and practice in this rapidly evolving field. We have developed diagrammatic representations to highlight the evolution of various architectures. These diagrams use consistent notations across all architectures to further enhance the comparative analysis of the different architectural components. We also explore the differences between data warehouses and data lakes. Our findings provide valuable insights for researchers and practitioners seeking to understand the current state of data-lake technology and its potential future trajectory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. Advancing Hydrology through Machine Learning: Insights, Challenges, and Future Directions Using the CAMELS, Caravan, GRDC, CHIRPS, PERSIANN, NLDAS, GLDAS, and GRACE Datasets.
- Author
-
Hasan, Fahad, Medley, Paul, Drake, Jason, and Chen, Gang
- Subjects
HUMAN activity recognition, MACHINE learning, WATER management, HYDROLOGY, CAMELS, WATER table
- Abstract
Machine learning (ML) applications in hydrology are revolutionizing our understanding and prediction of hydrological processes, driven by advancements in artificial intelligence and the availability of large, high-quality datasets. This review explores the current state of ML applications in hydrology, emphasizing the utilization of extensive datasets such as CAMELS, Caravan, GRDC, CHIRPS, NLDAS, GLDAS, PERSIANN, and GRACE. These datasets provide critical data for modeling various hydrological parameters, including streamflow, precipitation, groundwater levels, and flood frequency, particularly in data-scarce regions. We discuss the types of ML methods used in hydrology and significant successes achieved through those ML models, highlighting their enhanced predictive accuracy and the integration of diverse data sources. The review also addresses the challenges inherent in hydrological ML applications, such as data heterogeneity, spatial and temporal inconsistencies, issues regarding downscaling the LSH, and the need for incorporating human activities. In addition to discussing the limitations, this article highlights the benefits of utilizing high-resolution datasets compared to traditional ones. Additionally, we examine the emerging trends and future directions, including the integration of real-time data and the quantification of uncertainties to improve model reliability. We also place a strong emphasis on incorporating citizen science and the IoT for data collection in hydrology. By synthesizing the latest research, this paper aims to guide future efforts in leveraging large datasets and ML techniques to advance hydrological science and enhance water resource management practices. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
7. Federated Learning-Oriented Edge Computing Framework for the IIoT.
- Author
-
Liu, Xianhui, Dong, Xianghu, Jia, Ning, and Zhao, Weidong
- Subjects
ARTIFICIAL intelligence, FEDERATED learning, EDGE computing, TRAIN delays & cancellations, COMPUTER systems, ENERGY consumption
- Abstract
With the maturity of artificial intelligence (AI) technology, applications of AI in edge computing will greatly promote the development of industrial technology. However, the existing studies on the edge computing framework for the Industrial Internet of Things (IIoT) still face several challenges, such as deep hardware and software coupling, diverse protocols, difficult deployment of AI models, insufficient computing capabilities of edge devices, and sensitivity to delay and energy consumption. To solve the above problems, this paper proposes a software-defined AI-oriented three-layer IIoT edge computing framework and presents the design and implementation of an AI-oriented edge computing system, aiming to support device access, enable the acceptance and deployment of AI models from the cloud, and allow the whole process from data acquisition to model training to be completed at the edge. In addition, this paper proposes a time series-based method for device selection and computation offloading in the federated learning process, which selectively offloads the tasks of inefficient nodes to the edge computing center to reduce the training delay and energy consumption. Finally, experiments carried out to verify the feasibility and effectiveness of the proposed method are reported. The model training time with the proposed method is generally 30% to 50% less than that with the random device selection method, and the training energy consumption under the proposed method is generally 35% to 55% less. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
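The time series-based device selection described in the abstract above could look roughly like the following plain-Python sketch: keep a window of each device's recent training-round times and offload devices whose moving average exceeds a latency budget. The class and parameter names (`DeviceSelector`, `budget_s`) are illustrative inventions, not the paper's API.

```python
from collections import deque
from statistics import mean

class DeviceSelector:
    """Toy sketch: track each device's recent training-round times and
    offload devices whose moving average exceeds a latency budget."""

    def __init__(self, window=3, budget_s=10.0):
        self.budget_s = budget_s
        self.window = window
        self.history = {}  # device id -> deque of recent round times

    def record(self, device, round_time_s):
        self.history.setdefault(device, deque(maxlen=self.window)).append(round_time_s)

    def plan(self):
        """Split devices into those that train locally and those whose
        tasks are offloaded to the edge computing center."""
        local, offload = [], []
        for device, times in self.history.items():
            (offload if mean(times) > self.budget_s else local).append(device)
        return sorted(local), sorted(offload)

selector = DeviceSelector(window=3, budget_s=10.0)
for t in (4.0, 5.0, 6.0):
    selector.record("dev-a", t)      # fast node: keeps training locally
for t in (12.0, 15.0, 14.0):
    selector.record("dev-b", t)      # slow node: offloaded to the edge center
local, offload = selector.plan()
```

A real implementation would also weigh energy consumption, which the paper reports reducing alongside training delay.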
8. Enhanced Network Intrusion Detection System for Internet of Things Security Using Multimodal Big Data Representation with Transfer Learning and Game Theory.
- Author
-
Ullah, Farhan, Turab, Ali, Ullah, Shamsher, Cacciagrano, Diletta, and Zhao, Yue
- Subjects
BIG data, INTERNET security, INTERNET of things, GAME theory, EDUCATIONAL games, MULTIMODAL user interfaces, EMAIL security, MULTISPECTRAL imaging
- Abstract
Internet of Things (IoT) applications and resources are highly vulnerable to flood attacks, including Distributed Denial of Service (DDoS) attacks. These attacks overwhelm the targeted device with numerous network packets, making its resources inaccessible to authorized users. Such attacks may comprise attack references, attack types, sub-categories, host information, malicious scripts, etc. These details assist security professionals in identifying weaknesses, tailoring defense measures, and responding rapidly to possible threats, thereby improving the overall security posture of IoT devices. Developing an intelligent Intrusion Detection System (IDS) is highly complex due to the numerous network features involved. This study presents an improved IDS for IoT security that employs multimodal big data representation and transfer learning. First, the Packet Capture (PCAP) files are crawled to retrieve the necessary attacks and bytes. Second, Spark-based big data optimization algorithms handle the huge volumes of data. Third, a transfer learning approach such as word2vec retrieves semantically based features. Fourth, an algorithm is developed to convert network bytes into images, and texture features are extracted by configuring an attention-based Residual Network (ResNet). Finally, the trained text and texture features are combined and used as multimodal features to classify various attacks. The proposed method is thoroughly evaluated on three widely used IoT-based datasets: CIC-IoT 2022, CIC-IoT 2023, and Edge-IIoT. The proposed method achieves excellent classification performance, with an accuracy of 98.2%. In addition, we present a game theory-based process to validate the proposed approach formally. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
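The byte-to-image step mentioned in the abstract above (network bytes rearranged into a 2D grayscale matrix before texture features are extracted) can be illustrated with a minimal stdlib-only sketch. The function name and fixed row width are assumptions, not the authors' implementation.

```python
import math

def bytes_to_image(payload: bytes, width: int = 8):
    """Toy sketch: lay raw network bytes out row by row as a grayscale
    'image' (a list of rows of 0-255 pixel values), zero-padding the
    final row so the matrix is rectangular."""
    rows = math.ceil(len(payload) / width)
    padded = payload + b"\x00" * (rows * width - len(payload))
    return [list(padded[r * width:(r + 1) * width]) for r in range(rows)]

# 20 bytes of (hypothetical) packet payload become a 3x8 pixel matrix.
img = bytes_to_image(bytes(range(20)), width=8)
```

The resulting matrix is what a CNN such as the attention-based ResNet in the paper would consume after resizing and normalization.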
9. Secure IoT Communication: Implementing a One-Time Pad Protocol with True Random Numbers and Secure Multiparty Sums.
- Author
-
Fenner, Julio, Galeas, Patricio, Escobar, Francisco, and Neira, Rail
- Subjects
RANDOM numbers, CYBERTERRORISM, INTERNET of things, DISTRIBUTED computing, PROCESS capability, CYBER intelligence (Computer security)
- Abstract
We introduce an innovative approach for secure communication in the Internet of Things (IoT) environment using a one-time pad (OTP) protocol. This protocol is augmented by incorporating a secure multiparty sum protocol to produce OTP keys from genuine random numbers obtained from the physical phenomena observed in each device. We have implemented our method using ZeroC-Ice v.3.7, a dependable middleware for distributed computing, demonstrating its practicality in various hybrid IoT scenarios, particularly in devices with limited processing capabilities. The security features of our protocol are evaluated under the Dolev–Yao threat model, providing a thorough assessment of its defense against potential cyber threats. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
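The arithmetic behind the key-derivation idea in the abstract above (a per-byte sum of each party's random contribution modulo 256, with XOR as the one-time pad) can be sketched as follows. This toy version omits the secure multiparty machinery that hides individual contributions from the other parties, and all names are illustrative.

```python
import secrets

def multiparty_sum_key(contributions, n_bytes):
    """Each party contributes a locally generated random byte string;
    the per-byte sum mod 256 becomes the shared one-time-pad key.
    (The real protocol computes this sum without revealing any single
    party's contribution; here the arithmetic alone is shown.)"""
    key = bytearray(n_bytes)
    for contrib in contributions:
        for i, b in enumerate(contrib):
            key[i] = (key[i] + b) % 256
    return bytes(key)

def otp_xor(data: bytes, key: bytes) -> bytes:
    assert len(key) >= len(data), "one-time pad must cover the message"
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"sensor:23.5C"
# Three parties each contribute randomness the length of the message.
parties = [secrets.token_bytes(len(msg)) for _ in range(3)]
key = multiparty_sum_key(parties, len(msg))
ciphertext = otp_xor(msg, key)
recovered = otp_xor(ciphertext, key)   # XOR with the same pad inverts itself
```

As with any OTP, the key must be as long as the message and never reused; the paper's contribution is sourcing that randomness from per-device physical phenomena.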
10. Progressive Adoption of RINA in IoT Networks: Enhancing Scalability and Network Management via SDN Integration.
- Author
-
Sarabia-Jácome, David, Giménez-Antón, Sergio, Liatifis, Athanasios, Grasa, Eduard, Catalán, Marisa, and Pliatsios, Dimitrios
- Subjects
INTERNET of things, SCALABILITY
- Abstract
Thousands of devices are connected to the Internet as part of Internet of Things (IoT) ecosystems. The next generation of IoT networks is expected to support this growing number of intelligent IoT devices and tactile Internet solutions that provide real-time applications. In view of this, IoT networks require innovative network architectures that offer scalability, security, and adaptability. The Recursive InterNetwork Architecture (RINA) is a clean-slate network architecture that provides a scalable, secure, and flexible framework for interconnecting computers. Meanwhile, SDN technology has become a de facto solution for meeting network requirements, which makes RINA adoption difficult. This paper presents an architecture for integrating RINA with SDN technologies to lower the barriers to adopting RINA in IoT environments. The architecture relies on a RINA-based distributed application facility (DAF), a RINA southbound (SBI) driver, and the RINA L2VPN. The RINA-based DAF manages RINA nodes along the edge–fog–cloud continuum. The SDN SBI driver enables the hybrid centralized management of SDN switches and RINA nodes. Meanwhile, the RINA L2VPN allows seamless communication between edge nodes and the cloud to facilitate data exchange between network functions (NFs). Such integration has enabled a progressive deployment of RINA in current IoT networks without affecting their operations and performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
11. eMIFS: A Normalized Hyperbolic Ransomware Deterrence Model Yielding Greater Accuracy and Overall Performance.
- Author
-
Alqahtani, Abdullah and Sheldon, Frederick T.
- Subjects
RANSOMWARE, FEATURE selection, HYPERBOLIC functions, TANGENT function
- Abstract
Early detection of ransomware attacks is critical for minimizing the potential damage caused by these malicious attacks. Feature selection plays a significant role in the development of an efficient and accurate ransomware early detection model. In this paper, we propose an enhanced Mutual Information Feature Selection (eMIFS) technique that incorporates a normalized hyperbolic function for ransomware early detection models. The normalized hyperbolic function is utilized to address the challenge of perceiving common characteristics among features, particularly when there are insufficient attack patterns contained in the dataset. The Term Frequency–Inverse Document Frequency (TF–IDF) was used to represent the features in numerical form, making them ready for feature selection and modeling. By integrating the normalized hyperbolic function, we improve the estimation of redundancy coefficients and effectively adapt the MIFS technique for early ransomware detection, i.e., before encryption takes place. Our proposed method, eMIFS, involves evaluating candidate features individually using the hyperbolic tangent function (tanh), which provides a suitable representation of the features' relevance and redundancy. Our approach enhances the performance of existing MIFS techniques by considering the individual characteristics of features rather than relying solely on their collective properties. The experimental evaluation of the eMIFS method demonstrates its efficacy in detecting ransomware attacks at an early stage, providing a more robust and accurate ransomware detection model compared to traditional MIFS techniques. Moreover, our results indicate that the integration of the normalized hyperbolic function significantly improves the feature selection process and ultimately enhances ransomware early detection performance. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
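A toy rendering of the normalized-hyperbolic idea from the abstract above: greedy MIFS-style selection where each pairwise redundancy coefficient passes through tanh before penalizing a candidate. The feature names and mutual-information values below are invented for illustration, and the paper's actual estimator is not reproduced.

```python
import math

def emifs_select(relevance, redundancy, k=2, beta=0.5):
    """Toy greedy selection in the spirit of MIFS: repeatedly pick the
    feature maximizing relevance minus a redundancy penalty, where each
    pairwise redundancy coefficient is squashed through tanh (the
    normalized-hyperbolic idea the abstract describes)."""
    selected = []
    candidates = set(relevance)
    while candidates and len(selected) < k:
        def score(f):
            penalty = sum(math.tanh(redundancy[frozenset((f, s))]) for s in selected)
            return relevance[f] - beta * penalty
        best = max(candidates, key=score)
        selected.append(best)
        candidates.remove(best)
    return selected

# Hypothetical mutual-information values for three ransomware features.
relevance = {"entropy": 0.9, "api_calls": 0.8, "file_ops": 0.5}
redundancy = {
    frozenset(("entropy", "api_calls")): 2.0,   # highly redundant pair
    frozenset(("entropy", "file_ops")): 0.1,
    frozenset(("api_calls", "file_ops")): 0.1,
}
picked = emifs_select(relevance, redundancy, k=2)
```

Because tanh saturates large redundancy values near 1, "entropy" and "api_calls" are not both selected despite their high individual relevance.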
12. ComPipe: A Novel Flow Placement and Measurement Algorithm for Programmable Composite Pipelines.
- Author
-
Ran, Dengyu, Chen, Xiao, and Song, Lei
- Subjects
FLOW measurement, ALGORITHMS, SOFTWARE-defined networking, EVICTION
- Abstract
Programmable networks comprise heterogeneous network devices based on both hardware and software. Hardware devices provide superior bandwidth and low latency but encounter challenges in managing large table entries. Conversely, software devices offer abundant flow tables but have a limited forwarding capacity. To overcome this limitation, some commercial switches offer implementations that combine both hardware and software devices. In this context, this paper presents the Composite Pipeline (ComPipe), an algorithm for high-performance and high-precision flow placement and measurement. ComPipe utilizes a multi-level hashing algorithm for the real-time identification of heavy hitters, incorporates a unique flow eviction strategy, and is implemented on commercial programmable hardware. For non-heavy flows, ComPipe employs sketch structures to accomplish a high-performance flow synopsis within limited memory constraints. This design allows flow rules to be replaced entirely in the data plane, ensuring the timely detection and offloading of heavy-hitter flows, and offering a unified interface to the controller. The ComPipe prototype has been implemented in both testbed and simulation environments. The results indicate that ComPipe is an effective solution for dynamic flow placement in programmable networks, distinguished by its low cost, high performance, and high accuracy. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
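The "sketch structures" the abstract above mentions for non-heavy flows are typically count-min-style summaries: a few hash rows of counters whose minimum gives an upper-bound estimate of a flow's size. A minimal stdlib-only version (not ComPipe's actual data structure) looks like this:

```python
import hashlib

class CountMinSketch:
    """Minimal count-min sketch: a compact flow synopsis that fits a
    fixed memory budget, at the cost of possible overestimation."""

    def __init__(self, depth=4, width=512):
        self.depth, self.width = depth, width
        self.table = [[0] * width for _ in range(depth)]

    def _index(self, row, flow_id):
        # One independent hash per row, derived by salting with the row number.
        h = hashlib.blake2b(flow_id.encode(), salt=bytes([row]), digest_size=8)
        return int.from_bytes(h.digest(), "big") % self.width

    def add(self, flow_id, count=1):
        for row in range(self.depth):
            self.table[row][self._index(row, flow_id)] += count

    def estimate(self, flow_id):
        # Minimum over rows: an upper bound on the true count.
        return min(self.table[row][self._index(row, flow_id)]
                   for row in range(self.depth))

cms = CountMinSketch()
for _ in range(100):
    cms.add("10.0.0.1->10.0.0.2:443")   # heavy-hitter flow
cms.add("10.0.0.3->10.0.0.4:80")        # mouse flow
heavy = cms.estimate("10.0.0.1->10.0.0.2:443")
```

In a ComPipe-like design, flows whose estimate crosses a threshold would be promoted to exact hardware table entries, while the rest stay summarized.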
13. Edge-Distributed IoT Services Assist the Economic Sustainability of LEO Satellite Constellation Construction.
- Author
-
Zhang, Meng, Shi, Hongjian, and Ma, Ruhui
- Abstract
There are thousands or even tens of thousands of satellites in Low Earth Orbit (LEO). How to ensure the economic sustainability of LEO satellite constellation construction is currently an important issue. In this article, we envision integrating the popular and promising Internet of Things (IoT) technology with LEO satellite constellations to indirectly provide economic support for LEO satellite construction through paid IoT services. Of course, this can also benefit the development of the IoT. LEO satellites can provide networks for IoT products in areas with difficult conditions, such as deserts and oceans, and Satellite Edge Computing (SEC) can help reduce the service latency of the IoT. Many IoT products rely on Convolutional Neural Networks (CNNs) to provide services, and it is difficult to perform CNN inference on a single edge server. Therefore, in this article, we use edge-distributed inference to enable IoT services in the SEC scenario. How to perform edge-distributed inference so as to shorten inference time is a challenge. To shorten the inference latency of CNNs, we propose a framework based on a joint partition, named EDIJP. We use a joint partition method combining data partition and model partition. We model the data partition as a Linear Programming (LP) problem. To address the challenge of trading off computation latency against communication latency, we designed an iterative algorithm to obtain the final partitioning result. By maintaining the original structure and parameters, our framework ensures that inference accuracy is not affected. We simulated the SEC environment and verified the performance of our method on two popular CNN models, VGG16 and AlexNet. Compared with local inference, EdgeFlow, and CoEdge, the inference latency using EDIJP is shorter. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
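The computation/communication trade-off at the heart of model partitioning can be illustrated with an exhaustive search over a single cut point, assuming a simple pipelined two-server latency model. This is far simpler than EDIJP's joint LP-plus-iterative scheme, and every number and name below is invented for illustration.

```python
def best_split(layer_flops, layer_out_bytes, speed_a, speed_b, bandwidth):
    """Toy model-partition search: place layers [0:i) on server A and
    [i:n) on server B, shipping the intermediate activation across the
    link at the cut. Pipelined stages overlap, so per-item latency is
    approximated as max(stage A, stage B) plus the transfer time."""
    n = len(layer_flops)
    best_i, best_t = 0, float("inf")
    for i in range(n + 1):
        t_a = sum(layer_flops[:i]) / speed_a
        t_b = sum(layer_flops[i:]) / speed_b
        # No transfer when everything runs on one server (toy simplification).
        comm = (layer_out_bytes[i - 1] / bandwidth) if 0 < i < n else 0.0
        total = max(t_a, t_b) + comm
        if total < best_t:
            best_i, best_t = i, total
    return best_i, best_t

# Hypothetical 4-layer CNN: per-layer FLOPs and activation sizes.
flops = [8.0, 8.0, 2.0, 2.0]
out_bytes = [100.0, 10.0, 10.0, 1.0]
cut, latency = best_split(flops, out_bytes, speed_a=2.0, speed_b=2.0, bandwidth=100.0)
```

Even in this toy model, the best cut balances stage compute against the size of the activation that must cross the link, which is the tension EDIJP's iterative algorithm resolves jointly with data partitioning.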
14. A Model for Enhancing Unstructured Big Data Warehouse Execution Time.
- Author
-
Farhan, Marwa Salah, Youssef, Amira, and Abdelhamid, Laila
- Subjects
DATA warehousing, BIG data, DECISION support systems
- Abstract
Traditional data warehouses (DWs) have played a key role in business intelligence and decision support systems. However, the rapid growth of the data generated by the current applications requires new data warehousing systems. In big data, it is important to adapt the existing warehouse systems to overcome new issues and limitations. The main drawbacks of traditional Extract–Transform–Load (ETL) are that a huge amount of data cannot be processed over ETL and that the execution time is very high when the data are unstructured. This paper focuses on a new model consisting of four layers: Extract–Clean–Load–Transform (ECLT), designed for processing unstructured big data, with specific emphasis on text. The model aims to reduce execution time through experimental procedures. ECLT is applied and tested using Spark, which is a framework employed in Python. Finally, this paper compares the execution time of ECLT with different models by applying two datasets. Experimental results showed that for a data size of 1 TB, the execution time of ECLT is 41.8 s. When the data size increases to 1 million articles, the execution time is 119.6 s. These findings demonstrate that ECLT outperforms ETL, ELT, DELT, ELTL, and ELTA in terms of execution time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
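The layer ordering that distinguishes ECLT from classic ETL in the abstract above (clean before loading, transform after) can be sketched in plain Python. The paper's implementation uses Spark; the record schema and function names here are invented stand-ins showing only the stage ordering.

```python
import json

def extract(raw_records):
    """Pull semi-structured text records from a source (here, JSON lines)."""
    return [json.loads(line) for line in raw_records]

def clean(records):
    """Drop malformed or empty rows and normalize text BEFORE loading,
    so the warehouse never stores dirty data."""
    return [
        {**r, "text": r["text"].strip().lower()}
        for r in records
        if r.get("text")
    ]

def load(records, store):
    """Land cleaned rows in the staging store as-is."""
    store.extend(records)
    return store

def transform(store):
    """Run the expensive transformation AFTER loading (here, tokenizing),
    the step ECLT defers to cut end-to-end execution time."""
    return [{**r, "tokens": r["text"].split()} for r in store]

raw = ['{"id": 1, "text": "  Big Data "}', '{"id": 2, "text": ""}']
staged = load(clean(extract(raw)), store=[])
result = transform(staged)
```

Deferring the transform until after the load is what lets a distributed engine such as Spark parallelize the heavy step over already-staged data.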
15. Data-Driven Process Monitoring and Fault Diagnosis: A Comprehensive Survey.
- Author
-
Melo, Afrânio, Câmara, Maurício Melo, and Pinto, José Carlos
- Subjects
FAULT diagnosis, ARTIFICIAL neural networks, MANUFACTURING processes, PRINCIPAL components analysis, SYSTEMS engineering
- Abstract
This paper presents a comprehensive review of the historical development, the current state of the art, and prospects of data-driven approaches for industrial process monitoring. The subject covers a vast and diverse range of works, which are compiled and critically evaluated based on the different perspectives they provide. Data-driven modeling techniques are surveyed and categorized into two main groups: multivariate statistics and machine learning. Representative models, namely principal component analysis, partial least squares and artificial neural networks, are detailed in a didactic manner. Topics not typically covered by other reviews, such as process data exploration and treatment, software and benchmarks availability, and real-world industrial implementations, are thoroughly analyzed. Finally, future research perspectives are discussed, covering aspects related to system performance, the significance and usefulness of the approaches, and the development environment. This work aims to be a reference for practitioners and researchers navigating the extensive literature on data-driven industrial process monitoring. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. A Dam Safety State Prediction and Analysis Method Based on EMD-SSA-LSTM.
- Author
-
Yang, Xin, Xiang, Yan, Wang, Yakun, and Shen, Guangze
- Subjects
DAM safety, DAM failures, HILBERT-Huang transform, OPTIMIZATION algorithms, TIME series analysis, ENGINEERING management
- Abstract
The safety monitoring information of a dam is an indicator reflecting the dam's operational status. It is a crucial source for analyzing and assessing the safety state of reservoir dams, with strong real-time capability to detect anomalies in the dam at the earliest possible time. When using neural networks to predict and provide warnings from dam safety monitoring data, there are issues such as redundant model parameters, difficulty in tuning, and long computation times. This study addresses real-time dam safety warning issues by first employing the Empirical Mode Decomposition (EMD) method to decompose the effective time-dependent factors and construct a dam in-service state analysis model; it also establishes a multi-dimensional time series analysis equation for dam seepage monitoring. Simultaneously, by applying the Sparrow Search Algorithm (SSA) to optimize the LSTM neural network computation process, it reduces the complexity of model parameter selection. The method is compared to other approaches such as RNN, GRU, BP neural networks, and multivariate linear regression, demonstrating high practicality. It can serve as a valuable reference for reservoir dam state prediction and engineering operation management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
17. A New Approach to Data Analysis Using Machine Learning for Cybersecurity.
- Author
-
Hiremath, Shivashankar, Shetty, Eeshan, Prakash, Allam Jaya, Sahoo, Suraj Prakash, Patro, Kiran Kumar, Rajesh, Kandala N. V. P. S., and Pławiak, Paweł
- Subjects
INFORMATION technology, MACHINE learning, DATA analysis, DIGITAL transformation, INTERNET security
- Abstract
The internet has become an indispensable tool for organizations, permeating every facet of their operations. Virtually all companies leverage Internet services for diverse purposes, including the digital storage of data in databases and cloud platforms. Furthermore, the rising demand for software and applications has led to a widespread shift toward computer-based activities within the corporate landscape. However, this digital transformation has exposed the information technology (IT) infrastructures of these organizations to a heightened risk of cyber-attacks, endangering sensitive data. Consequently, organizations must identify and address vulnerabilities within their systems, with a primary focus on scrutinizing customer-facing websites and applications. This work aims to tackle this pressing issue by employing data analysis tools, such as Power BI, to assess vulnerabilities within a client's application or website. Through a rigorous analysis of data, valuable insights and information will be provided, which are necessary to formulate effective remedial measures against potential attacks. Ultimately, the central goal of this research is to demonstrate that clients can establish a secure environment, shielding their digital assets from potential attackers. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
18. Internet-of-Things Edge Computing Systems for Streaming Video Analytics: Trails Behind and the Paths Ahead.
- Author
-
Ravindran, Arun A.
- Subjects
INTERNET of things, EDGE computing, STREAMING video & television, ARTIFICIAL intelligence, BANDWIDTH allocation
- Abstract
The falling cost of IoT cameras, the advancement of AI-based computer vision algorithms, and powerful hardware accelerators for deep learning have enabled the widespread deployment of surveillance cameras with the ability to automatically analyze streaming video feeds to detect events of interest. While streaming video analytics is currently largely performed in the cloud, edge computing has emerged as a pivotal component due to its advantages of low latency, reduced bandwidth, and enhanced privacy. However, a distinct gap persists between state-of-the-art computer vision algorithms and the successful practical implementation of edge-based streaming video analytics systems. This paper presents a comprehensive review of more than 30 research papers published over the last 6 years on IoT edge streaming video analytics (IE-SVA) systems. The papers are analyzed across 17 distinct dimensions. Unlike prior reviews, we examine each system holistically, identifying their strengths and weaknesses in diverse implementations. Our findings suggest that certain critical topics necessary for the practical realization of IE-SVA systems are not sufficiently addressed in current research. Based on these observations, we propose research trajectories across short-, medium-, and long-term horizons. Additionally, we explore trending topics in other computing areas that can significantly impact the evolution of IE-SVA systems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
19. Exploring the State of Machine Learning and Deep Learning in Medicine: A Survey of the Italian Research Community.
- Author
-
Bottrighi, Alessio and Pennisi, Marzio
- Subjects
DEEP learning, MACHINE learning, EXPERT systems, SCIENTIFIC community, ARTIFICIAL intelligence, INFORMATION storage & retrieval systems
- Abstract
Artificial intelligence (AI) is becoming increasingly important, especially in the medical field. While AI has been used in medicine for some time, its growth in the last decade is remarkable. Specifically, machine learning (ML) and deep learning (DL) techniques in medicine have been increasingly adopted due to the growing abundance of health-related data, the improved suitability of such techniques for managing large datasets, and more computational power. ML and DL methodologies are fostering the development of new "intelligent" tools and expert systems to process data, to automatize human–machine interactions, and to deliver advanced predictive systems that are changing every aspect of the scientific research, industry, and society. The Italian scientific community was instrumental in advancing this research area. This article aims to conduct a comprehensive investigation of the ML and DL methodologies and applications used in medicine by the Italian research community in the last five years. To this end, we selected all the papers published in the last five years with at least one of the authors affiliated to an Italian institution that in the title, in the abstract, or in the keywords present the terms "machine learning" or "deep learning" and reference a medical area. We focused our research on journal papers under the hypothesis that Italian researchers prefer to present novel but well-established research in scientific journals. We then analyzed the selected papers considering different dimensions, including the medical topic, the type of data, the pre-processing methods, the learning methods, and the evaluation methods. As a final outcome, a comprehensive overview of the Italian research landscape is given, highlighting how the community has increasingly worked on a very heterogeneous range of medical problems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
20. Spatiotemporal Analytics of Environmental Sounds and Influencing Factors Based on Urban Sensor Network Data.
- Author
-
Zhao, Yanjie, Cheng, Jin, Wang, Shaohua, Qin, Lei, and Zhang, Xueyan
- Subjects
SENSOR networks ,ACOUSTICS ,SPATIOTEMPORAL processes ,URBAN planning ,URBAN research ,SOUNDS - Abstract
Urban construction has accelerated the deterioration of the urban sound environment, which has constrained urban development and harmed people's health. This study aims to explore the spatiotemporal patterns of environmental sound and determine the influencing factors on the spatial differentiation of sound, thus supporting sustainable urban planning and decision-making. Fine-grained sound data are used in most urban sound-related research, but such data are difficult to obtain. For this problem, this study analyzed sound trends using Array of Things (AoT) sensing data. Additionally, this study explored the influences on the spatial differentiation of sound using GeoDetector (version number: 1.0-4), thus addressing the limitation of previous studies that neglected to explore the influences on spatial heterogeneity. Our experimental results showed that sound levels in different areas of Chicago fluctuated irregularly over time. During the morning peak on weekdays: the four southern areas of Chicago have a high–high sound gathering mode, and the remaining areas are mostly randomly distributed; the sound level of a certain area has a significant negative correlation with population density, park area, and density of bike route; park area and population density are the main factors affecting the spatial heterogeneity of Chicago's sound; and population density and park area play an essential role in factor interaction. This study has some theoretical significance and practical value. Residents can choose areas with lower noise for leisure activities according to the noise map of this study. While planning urban development, urban planners should pay attention to the single and interactive effects of factors in the city, such as parks, road network structures, and points of interest, on the urban sound environment. Researchers can build on this study to conduct studies on larger time scales. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
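The core of GeoDetector's factor detector, used above to rank influences such as park area and population density, is the q-statistic: the share of the total variance of a variable explained by a spatial stratification. A minimal pure-Python sketch (the grid values and stratum labels below are illustrative, not the study's data):

```python
from collections import defaultdict

def geodetector_q(values, strata):
    """Factor-detector q-statistic: fraction of the variance of `values`
    explained by the stratification `strata` (0 = none, 1 = all)."""
    n = len(values)
    mean = sum(values) / n
    sst = sum((v - mean) ** 2 for v in values)          # total sum of squares
    groups = defaultdict(list)
    for v, s in zip(values, strata):
        groups[s].append(v)
    ssw = 0.0                                           # within-stratum sum of squares
    for g in groups.values():
        m = sum(g) / len(g)
        ssw += sum((v - m) ** 2 for v in g)
    return 1.0 - ssw / sst

# A factor that cleanly separates quiet grids from loud ones explains
# almost all of the spatial variance of the sound level (dB):
q = geodetector_q([40, 41, 42, 70, 71, 72],
                  ["park", "park", "park", "road", "road", "road"])
```

A q near 1 marks a dominant factor; interaction effects are assessed by comparing q of a combined stratification against the single-factor values.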
21. Machine Learning as a Strategic Tool for Helping Cocoa Farmers in Côte D'Ivoire.
- Author
-
Ferraris, Stefano, Meo, Rosa, Pinardi, Stefano, Salis, Matteo, and Sartor, Gabriele
- Subjects
ARTIFICIAL neural networks ,MACHINE learning ,CACAO growers ,CONVOLUTIONAL neural networks ,ARTIFICIAL intelligence ,DEEP learning - Abstract
Machine learning can be used for social good. The employment of artificial intelligence in smart agriculture has many benefits for the environment: it helps small farmers (at a local scale) and policymakers and cooperatives (at regional scale) to take valid and coordinated countermeasures to combat climate change. This article discusses how artificial intelligence in agriculture can help to reduce costs, especially in developing countries such as Côte d'Ivoire, employing only low-cost or open-source tools, from hardware to software and open data. We developed machine learning models for two tasks: the first is improving agricultural farming cultivation, and the second is water management. For the first task, we used deep neural networks (YOLOv5m) to detect healthy plants and pods of cocoa and damaged ones only using mobile phone images. The results confirm it is possible to distinguish well the healthy from damaged ones. For actions at a larger scale, the second task proposes the analysis of remote sensors, coming from the GRACE NASA Mission and ERA5, produced by the Copernicus climate change service. A new deep neural network architecture (CIWA-net) is proposed with a U-Net-like architecture, aiming to forecast the total water storage anomalies. The model quality is compared to a vanilla convolutional neural network. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
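Detectors such as YOLOv5 emit many overlapping candidate boxes per cocoa pod and rely on non-maximum suppression (NMS) to keep one box per object. A minimal NMS in pure Python; the box format and the 0.45 threshold are conventional illustrations, not the paper's exact settings:

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms(detections, iou_thresh=0.45):
    """detections: list of (box, score, label), e.g. label 'healthy' or 'damaged'.
    Greedily keep the highest-scoring box and drop boxes overlapping it."""
    kept = []
    for det in sorted(detections, key=lambda d: d[1], reverse=True):
        if all(iou(det[0], k[0]) < iou_thresh for k in kept):
            kept.append(det)
    return kept
```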
22. Social Media Zero-Day Attack Detection Using TensorFlow.
- Author
-
Topcu, Ahmet Ercan, Alzoubi, Yehia Ibrahim, Elbasi, Ersin, and Camalan, Emre
- Subjects
SECURITY systems ,NATURAL languages ,MACHINE learning ,ANTI-malware (Computer software) - Abstract
In the current information era, knowledge can pose risks in the online realm. It is imperative to proactively recognize potential threats, as unforeseen dangers cannot be eliminated entirely. Often, malware exploits and other emerging hazards are only identified after they have occurred. These types of risks are referred to as zero-day attacks since no pre-existing anti-malware measures are available to mitigate them. Consequently, significant damages occur when vulnerabilities in systems are exploited. The effectiveness of security systems, such as IPS and IDS, relies heavily on the prompt and efficient response to emerging threats. Failure to address these issues promptly hinders the effectiveness of security system developers. The purpose of this study is to analyze data from the Twitter platform and deploy machine learning techniques, such as word categorization, to identify vulnerabilities and counteract zero-day attacks swiftly. TensorFlow was utilized to handle the processing and conversion of raw Twitter data, resulting in significant efficiency improvements. Moreover, we integrated the Natural Language Toolkit (NLTK) tool to extract targeted words in various languages. Our results indicate that we have achieved an 80% success rate in detecting zero-day attacks by using our tool. By utilizing publicly available information shared by individuals, relevant security providers can be promptly informed. This approach enables companies to patch vulnerabilities more quickly. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
23. Cleaning Big Data Streams: A Systematic Literature Review.
- Author
-
Alotaibi, Obaid, Pardede, Eric, and Tomy, Sarath
- Subjects
DATA scrubbing ,BIG data ,DATABASE searching ,EVALUATION methodology ,ELECTRONIC data processing - Abstract
In today's big data era, cleaning big data streams has become a challenging task because of the different formats of big data and the massive amount of big data which is being generated. Many studies have proposed different techniques to overcome these challenges, such as cleaning big data in real time. This systematic literature review presents recently developed techniques that have been used for the cleaning process and for each data cleaning issue. Following the PRISMA framework, four databases are searched, namely IEEE Xplore, ACM Library, Scopus, and Science Direct, to select relevant studies. After selecting the relevant studies, we identify the techniques that have been utilized to clean big data streams and the evaluation methods that have been used to examine their efficiency. Also, we define the cleaning issues that may appear during the cleaning process, namely missing values, duplicated data, outliers, and irrelevant data. Based on our study, the future directions of cleaning big data streams are identified. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
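The four cleaning issues the review catalogs, missing values, duplicated data, outliers, and irrelevant data, can all be handled in a single pass over a stream. A minimal sketch with illustrative field names and a running 3-sigma outlier test (Welford's online mean/variance); real systems would bound the dedup state:

```python
def clean_stream(records, keep_fields=("sensor", "value")):
    """One-pass stream cleaner: project away irrelevant fields, skip missing
    values and duplicates, and drop outliers via an online 3-sigma test."""
    seen = set()
    n, mean, m2 = 0, 0.0, 0.0                       # Welford running statistics
    for rec in records:
        rec = {k: rec.get(k) for k in keep_fields}  # irrelevant data
        if rec["value"] is None:                    # missing values
            continue
        key = tuple(sorted(rec.items()))
        if key in seen:                             # duplicated data
            continue
        seen.add(key)
        x = rec["value"]
        if n >= 2:
            std = (m2 / (n - 1)) ** 0.5
            if std and abs(x - mean) > 3 * std:     # outliers
                continue
        n += 1
        delta = x - mean
        mean += delta / n
        m2 += delta * (x - mean)
        yield rec
```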
24. Towards Efficient Resource Allocation for Federated Learning in Virtualized Managed Environments.
- Author
-
Nikolaidis, Fotis, Symeonides, Moysis, and Trihinas, Demetris
- Subjects
RESOURCE allocation ,MACHINE learning ,POWER resources ,EDGE computing ,COMPUTATIONAL complexity ,WORKFLOW - Abstract
Federated learning (FL) is a transformative approach to Machine Learning that enables the training of a shared model without transferring private data to a central location. This decentralized training paradigm has found particular applicability in edge computing, where IoT devices and edge nodes often possess limited computational power, network bandwidth, and energy resources. While various techniques have been developed to optimize the FL training process, an important question remains unanswered: how should resources be allocated in the training workflow? To address this question, it is crucial to understand the nature of these resources. In physical environments, the allocation is typically performed at the node level, with the entire node dedicated to executing a single workload. In contrast, virtualized environments allow for the dynamic partitioning of a node into containerized units that can adapt to changing workloads. Consequently, the new question that arises is: how can a physical node be partitioned into virtual resources to maximize the efficiency of the FL process? To answer this, we investigate various resource allocation methods that consider factors such as computational and network capabilities, the complexity of datasets, as well as the specific characteristics of the FL workflow and ML backend. We explore two scenarios: (i) running FL over a finite number of testbed nodes and (ii) hosting multiple parallel FL workflows on the same set of testbed nodes. Our findings reveal that the default configurations of state-of-the-art cloud orchestrators are sub-optimal when orchestrating FL workflows. Additionally, we demonstrate that different libraries and ML models exhibit diverse computational footprints. Building upon these insights, we discuss methods to mitigate computational interferences and enhance the overall performance of the FL pipeline execution. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
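Whatever the resource partitioning, the FL training round itself aggregates client updates with FedAvg-style weighted averaging, where each client's parameters count in proportion to its local dataset size. A minimal sketch with plain lists standing in for model parameter vectors:

```python
def fedavg(client_weights, client_sizes):
    """FedAvg aggregation: size-weighted average of per-client parameters.
    client_weights: one flat parameter vector per client."""
    total = sum(client_sizes)
    dim = len(client_weights[0])
    agg = [0.0] * dim
    for w, n in zip(client_weights, client_sizes):
        for i in range(dim):
            agg[i] += w[i] * n / total
    return agg

# Two clients, the second holding three times as much data:
# fedavg([[0.0, 2.0], [4.0, 2.0]], [1, 3]) → [3.0, 2.0]
```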
25. A Real-Time Streaming System for Customized Network Traffic Capture †.
- Author
-
Costin, Adrian-Tiberiu, Zinca, Daniel, and Dobrota, Virgil
- Subjects
TECHNOLOGICAL innovations ,INFORMATION networks ,PROBLEM solving - Abstract
Logging network traffic offers valuable insights into data flow, enabling the proactive analysis and troubleshooting of issues as they arise. Moreover, it provides a means to access and examine the exchanged information among network users that would otherwise be inaccessible. To enhance network traffic analysis, the integration of innovative technologies that facilitate real-time querying and pattern matching proves indispensable. This research paper presents a system that exemplifies such advancements—an innovative network traffic logging tool. The tool specifically focuses on performing real-time network packet transfer to Apache Kafka and ksqlDB, leveraging their capabilities to ensure swift and dependable storage of network packets in Apache Kafka topics. By showcasing this solution, the paper demonstrates the benefits and effectiveness of employing modern technologies for network traffic analysis and management. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
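Before ksqlDB can query packets, each capture must be serialized into a record for an Apache Kafka topic. A sketch of that serialization step; the field names and the topic name are hypothetical, and actual publishing would go through a client library such as kafka-python:

```python
import json
import time

def packet_to_record(src, dst, proto, length, ts=None):
    """Build the JSON value for one captured packet, ready to publish to Kafka."""
    return json.dumps({
        "ts": ts if ts is not None else time.time(),
        "src": src, "dst": dst,
        "proto": proto, "length": length,
    }, sort_keys=True)

# With kafka-python, each record would then be published along the lines of:
#   producer.send("network-packets", value=packet_to_record(...).encode())
# and a ksqlDB STREAM can be declared over the "network-packets" topic
# to enable real-time querying and pattern matching.
```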
26. Improving the Accuracy of Coal-Rock Dynamic Hazard Anomaly Detection Using a Dynamic Threshold and a Depth Auto-Coding Gaussian Hybrid Model.
- Author
-
Kong, Yulei and Luo, Zhengshan
- Abstract
A coal-rock dynamic disaster is a rapid instability and failure process with dynamic effects and huge disastrous consequences that occurs in coal-rock mass under mining disturbance. Disasters lead to catastrophic consequences, such as mine collapse, equipment damage, and casualties. Early detection can prevent the occurrence of disasters. However, due to the low accuracy of anomaly detection, disasters still occur frequently. In order to improve the accuracy of anomaly detection for coal-rock dynamic disasters, this paper proposes an anomaly detection method based on a dynamic threshold and a deep self-encoded Gaussian mixture model. First, pre-mining data were used as the initial threshold, and the subsequent continuously arriving flow data were used to dynamically update the threshold to solve the impact of artificially setting the threshold on the detection accuracy. Second, feature dimensionality reduction and reorganization of the data were carried out, and low-dimensional feature representation and feature reconstruction error modeling were used to solve the difficulty of extracting features from high-dimensional data in real time. Finally, through the end-to-end optimization calculation of the energy probability distribution between different categories for anomaly detection, the problem that key abnormal information may be lost due to dimensionality reduction was solved and accurate detection of monitoring data was realized. Experimental results showed that this method has better performance than other methods. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
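The dynamic-threshold idea, seed the threshold from pre-mining data, then keep updating it from the arriving stream, can be sketched with an exponentially weighted mean/variance and a k-sigma alarm. This is only the thresholding half; the deep auto-encoding Gaussian mixture energy model is omitted, and the constants are illustrative:

```python
class DynamicThreshold:
    """k-sigma anomaly test whose mean/variance track the stream (EWMA updates)."""

    def __init__(self, baseline, k=3.0, alpha=0.05):
        n = len(baseline)                      # pre-mining data sets the start
        self.mean = sum(baseline) / n
        self.var = sum((x - self.mean) ** 2 for x in baseline) / n
        self.k, self.alpha = k, alpha

    def check(self, x):
        """True if x is anomalous; normal points also update the threshold,
        so the alarm level adapts as the monitored process drifts."""
        anomalous = abs(x - self.mean) > self.k * self.var ** 0.5
        if not anomalous:                      # adapt only to normal data
            d = x - self.mean
            self.mean += self.alpha * d
            self.var = (1 - self.alpha) * (self.var + self.alpha * d * d)
        return anomalous
```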
27. Towards a Unified Architecture Powering Scalable Learning Models with IoT Data Streams, Blockchain, and Open Data.
- Author
-
Debauche, Olivier, Nkamla Penka, Jean Bertin, Hani, Moad, Guttadauria, Adriano, Ait Abdelouahid, Rachida, Gasmi, Kaouther, Ben Hardouz, Ouafae, Lebeau, Frédéric, Bindelle, Jérôme, Soyeurt, Hélène, Gengler, Nicolas, Manneback, Pierre, Benjelloun, Mohammed, and Bertozzi, Carlo
- Subjects
MACHINE learning ,INTERNET of things ,BLOCKCHAINS ,DATA modeling ,ALGORITHMS ,EDGE computing - Abstract
The huge amount of data produced by the Internet of Things needs to be validated and curated to be prepared for the selection of relevant data in order to prototype models, train them, and serve the model. On the other side, blockchains and open data are also important data sources that need to be integrated into the proposed integrative models. It is difficult to find a sufficiently versatile and agnostic architecture based on the main machine learning frameworks that facilitates model development and allows continuous training to continuously improve models from the data streams. The paper describes the conceptualization, implementation, and testing of a new architecture that proposes a use-case-agnostic processing chain. The proposed architecture is mainly built around Apache Submarine, a unified Machine Learning platform that facilitates the training and deployment of algorithms. Here, Internet of Things data are collected and formatted at the edge level. They are then processed and validated at the fog level. On the other hand, open data and blockchain data via the Blockchain Access Layer are directly processed at the cloud level. Finally, the data are preprocessed to feed scalable machine learning algorithms. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
28. Exploring Clustering Techniques for Analyzing User Engagement Patterns in Twitter Data.
- Author
-
Kanavos, Andreas, Karamitsos, Ioannis, and Mohasseb, Alaa
- Subjects
SOCIAL media ,MICROBLOGS ,SOCIAL networks ,INFORMATION sharing - Abstract
Social media platforms have revolutionized information exchange and socialization in today's world. Twitter, as one of the prominent platforms, enables users to connect with others and express their opinions. This study focuses on analyzing user engagement levels on Twitter using graph mining and clustering techniques. We measure user engagement based on various tweet attributes, including retweets, replies, and more. Specifically, we explore the strength of user connections in Twitter networks by examining the diversity of edges. Our approach incorporates graph mining models that assign different weights to evaluate the significance of each connection. Additionally, clustering techniques are employed to group users based on their engagement patterns and behaviors. Statistical analysis was conducted to assess the similarity between user profiles, as well as attributes, such as friendship, followings, and interactions within the Twitter social network. The findings highlight the discovery of closely linked user groups and the identification of distinct clusters based on engagement levels. This research emphasizes the importance of understanding both individual and group behaviors in comprehending user engagement dynamics on Twitter. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
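The approach above can be approximated in two steps: score each user by weighting interaction types, then cluster the scores. The weights and the tiny one-dimensional 2-means below are illustrative stand-ins for the paper's graph-mining edge weights and clustering techniques:

```python
# Hypothetical per-interaction weights (the paper learns edge significance).
WEIGHTS = {"retweets": 2.0, "replies": 1.5, "likes": 1.0}

def engagement_score(user, weights=WEIGHTS):
    """Weighted sum of a user's interaction counts."""
    return sum(w * user.get(k, 0) for k, w in weights.items())

def two_means_1d(scores, iters=20):
    """Split 1-D engagement scores into a low and a high cluster (2-means
    with deterministic min/max initialization)."""
    lo, hi = min(scores), max(scores)
    low, high = list(scores), []
    for _ in range(iters):
        low = [s for s in scores if abs(s - lo) <= abs(s - hi)]
        high = [s for s in scores if abs(s - lo) > abs(s - hi)]
        if not high:                           # all scores identical
            break
        lo, hi = sum(low) / len(low), sum(high) / len(high)
    return low, high
```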
29. Enhancing IoT Device Security through Network Attack Data Analysis Using Machine Learning Algorithms.
- Author
-
Koirala, Ashish, Bista, Rabindra, and Ferreira, Joao C.
- Subjects
MACHINE learning ,COMPUTER network security ,BOTNETS ,INTERNET of things ,SMART devices ,DATA analysis - Abstract
The Internet of Things (IoT) shares the idea of an autonomous system responsible for transforming physical computational devices into smart ones. Contrarily, storing and operating information and maintaining its confidentiality and security is a concerning issue in the IoT. Throughout the whole operational process, considering transparency in its privacy, data protection, and disaster recovery, it needs state-of-the-art systems and methods to tackle the evolving environment. This research aims to improve the security of IoT devices by investigating the likelihood of network attacks utilizing ordinary device network data and attack network data acquired from similar statistics. To achieve this, IoT devices dedicated to smart healthcare systems were utilized, and botnet attacks were conducted on them for data generation. The collected data were then analyzed using statistical measures, such as the Pearson coefficient and entropy, to extract relevant features. Machine learning algorithms were implemented to categorize normal and attack traffic with data preprocessing techniques to increase accuracy. One of the most popular datasets, known as BoT-IoT, was cross-evaluated with the generated dataset for authentication of the generated dataset. The research provides insight into the architecture of IoT devices, the behavior of normal and attack networks on these devices, and the prospects of machine learning approaches to improve IoT device security. Overall, the study adds to the growing body of knowledge on IoT device security and emphasizes the significance of adopting sophisticated strategies for detecting and mitigating network attacks. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
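The two statistical measures named above for feature extraction, the Pearson coefficient and entropy, are straightforward to compute over traffic features. A pure-Python sketch (the example inputs are illustrative):

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length feature series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

def entropy(symbols):
    """Shannon entropy in bits of e.g. destination ports in a traffic window;
    botnet scanning tends to raise it, a fixed service keeps it near zero."""
    n = len(symbols)
    counts = {}
    for s in symbols:
        counts[s] = counts.get(s, 0) + 1
    return -sum(c / n * math.log2(c / n) for c in counts.values())
```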
30. Deep Neural Networks for Spatial-Temporal Cyber-Physical Systems: A Survey.
- Author
-
Musa, Abubakar Ahmad, Hussaini, Adamu, Liao, Weixian, Liang, Fan, and Yu, Wei
- Subjects
ARTIFICIAL neural networks ,CYBER physical systems ,CONVOLUTIONAL neural networks ,ELECTRONIC data processing ,ANOMALY detection (Computer security) - Abstract
Cyber-physical systems (CPS) refer to systems that integrate communication, control, and computational elements into physical processes to facilitate the control of physical systems and effective monitoring. The systems are designed to interact with the physical world, monitor and control the physical processes while in operation, and generate data. Deep Neural Networks (DNN) comprise multiple layers of interconnected neurons that process input data to produce predictions. Spatial-temporal data represents the physical world and its evolution over time and space. The generated spatial-temporal data is used to make decisions and control the behavior of CPS. This paper systematically reviews the applications of DNNs, namely convolutional, recurrent, and graphs, in handling spatial-temporal data in CPS. An extensive literature survey is conducted to determine the areas in which DNNs have successfully captured spatial-temporal data in CPS and the emerging areas that require attention. The research proposes a three-dimensional framework that considers: CPS (transportation, manufacturing, and others), Target (spatial-temporal data processing, anomaly detection, predictive maintenance, resource allocation, real-time decisions, and multi-modal data fusion), and DNN schemes (CNNs, RNNs, and GNNs). Finally, research areas that need further investigation are identified, such as performance and security. Addressing data quality, strict performance assurance, reliability, safety, and security resilience challenges are the areas that are required for further research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
31. A Dynamic Trust-Related Attack Detection Model for IoT Devices and Services Based on the Deep Long Short-Term Memory Technique.
- Author
-
Alghofaili, Yara and Rassam, Murad A.
- Subjects
INTERNET of things ,SMART cities ,TECHNOLOGICAL innovations ,TRUST ,INTELLIGENT transportation systems - Abstract
The integration of the cloud and Internet of Things (IoT) technology has resulted in a significant rise in futuristic technology that ensures the long-term development of IoT applications, such as intelligent transportation, smart cities, smart healthcare, and other applications. The explosive growth of these technologies has contributed to a significant rise in threats with catastrophic and severe consequences. These consequences affect IoT adoption for both users and industry owners. Trust-based attacks are the primary selected weapon for malicious purposes in the IoT context, either through leveraging established vulnerabilities to act as trusted devices or by utilizing specific features of emerging technologies (i.e., heterogeneity, dynamic nature, and a large number of linked objects). Consequently, developing more efficient trust management techniques for IoT services has become urgent in this community. Trust management is regarded as a viable solution for IoT trust problems. Such a solution has been used in the last few years to improve security, aid decision-making processes, detect suspicious behavior, isolate suspicious objects, and redirect functionality to trusted zones. However, these solutions remain ineffective when dealing with large amounts of data and constantly changing behaviors. As a result, this paper proposes a dynamic trust-related attack detection model for IoT devices and services based on the deep long short-term memory (LSTM) technique. The proposed model aims to identify the untrusted entities in IoT services and isolate untrusted devices. The effectiveness of the proposed model is evaluated using different data samples with different sizes. The experimental results showed that the proposed model obtained a 99.87% and 99.76% accuracy and F-measure, respectively, in the normal situation, without considering trust-related attacks. Furthermore, the model effectively detected trust-related attacks, achieving a 99.28% and 99.28% accuracy and F-measure, respectively. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
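Leaving the deep LSTM itself aside, sequence models like it consume fixed-length windows of the monitored trust metrics with a target value per window. The standard sliding-window framing can be sketched as:

```python
def make_windows(series, length, horizon=1):
    """Turn a 1-D stream of trust metrics into (sequence, future-value) pairs
    for a sequence model such as an LSTM.
    horizon: how many steps ahead the target lies."""
    xs, ys = [], []
    for i in range(len(series) - length - horizon + 1):
        xs.append(series[i:i + length])
        ys.append(series[i + length + horizon - 1])
    return xs, ys

# make_windows([1, 2, 3, 4, 5], length=3) → ([[1, 2, 3], [2, 3, 4]], [4, 5])
```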
32. Horizontal IoT Platform EMULSION.
- Author
-
Ganchev, Ivan, Ji, Zhanlin, and O'Droma, Máirtín
- Subjects
INTERNET of things ,ARCHITECTURAL details ,ARCHITECTURAL design ,SYSTEMS design ,EMULSIONS ,SCALABILITY - Abstract
This article presents an overview of an Internet of Things (IoT) platform design based on a horizontal architectural principle. The goal in applying this principle is to overcome many of the disadvantages associated with the default design approach which, within this context, could be classed as "vertical" in that the IoT system and service are usually designed as stand-alone "silo-like" entities on their own autonomous platform. In a pure sense, each new IoT system and service is a new design ab initio. With the "horizontal principle", the goal is that in the creation of a new IoT system and service, the provider needs only provide or adapt relevant architectural elements within a horizontal slice of an existing IoT architecture to enable the delivery of the desired IoT service. This article shows how embedding the horizontal principle into an IoT platform design brings the benefits of system design efficiency, effectiveness, and flexibility, together with at least the same scalability attributes inherent in the existing platform, an easily accessible adjustment, fine-tuning, and an openness to new use cases and application scenarios. The vision is the enabling of the realization of a potential multipurpose use of the IoT systems and services built on top of such platforms. The article presents a selective survey on the state of the art in IoT domains of application and in IoT platform architectural design solutions and lines of development from a vertical–horizontal categorization perspective. It presents examples of both IoT platform design solution types in use today. Within the context of strongly recommending the application of the horizontal design principle, the multitiered structure of the authors' own EMULSION IoT platform based on this horizontal principle is presented in detail in the final part of the article. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
33. Implementing Digital Twins That Learn: AI and Simulation Are at the Core.
- Author
-
Biller, Bahar and Biller, Stephan
- Subjects
DIGITAL twins ,INDUSTRY 4.0 ,SUPPLY chains ,SUPPLY chain management ,DIGITAL technology - Abstract
As companies are trying to build more resilient supply chains using digital twins created by smart manufacturing technologies, it is imperative that senior executives and technology providers understand the crucial role of process simulation and AI in quantifying the uncertainties of these complex systems. The resulting digital twins enable users to replay history, gain predictive visibility into the future, and identify corrective actions to optimize future performance. In this article, we define process digital twins and their four foundational elements. We discuss how key digital twin functions and enabling AI and simulation technologies integrate to describe, predict, and optimize supply chains for Industry 4.0 implementations. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
34. Transforming IoT Events to Meaningful Business Events on the Edge: Implementation for Smart Farming Application.
- Author
-
Gkoulis, Dimitris, Bardaki, Cleopatra, Kousiouris, George, and Nikolaidou, Mara
- Subjects
INTERNET of things ,AGRICULTURE ,TWO-way communication ,BIPARTITE graphs ,TRANSFORMATION groups ,ELECTRONIC data processing - Abstract
This paper focuses on Internet of Things (IoT) architectures and knowledge generation out of streams of events as the primary elements concerning the creation of user-centric IoT services. We provide a general, symmetrical IoT architecture, which enables bidirectional communication between things and users within an application domain. We focus on two main components of the architecture (i.e., Event Engine and Process Engine) that handle event transformation by implementing parametric Complex Event Processing (CEP). More specifically, we describe and implement the transformation cycle of events starting from raw IoT data to their processing and transformation of events for calculating information that we need in an IoT-enabled application context. The implementation includes a library of composite transformations grouping the gradual and sequential steps for transforming basic IoT events into business events, which include ingestion, event splitting, and calculation of measurements' average value. The appropriateness and possibility of inclusion and integration of the implementation in an IoT environment are demonstrated by providing our implementation for a smart farming application domain with four scenarios that each reflect a user's requirements. Further, we discuss the quality properties of each scenario. Ultimately, we propose an IoT architecture and, specifically, a parametric CEP model and implementation for future researchers and practitioners who aspire to build IoT applications. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
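The transformation cycle described above, ingestion, event splitting, and average-value calculation, maps naturally onto composable generators. A minimal sketch; the event shape and field names are illustrative, not the paper's schema:

```python
def split_events(stream):
    """Event splitting: one composite IoT event fans out to one event per reading."""
    for event in stream:
        for sensor, value in event["readings"].items():
            yield {"device": event["device"], "sensor": sensor, "value": value}

def rolling_average(stream, sensor, window=3):
    """Average-value calculation over the last `window` readings of one sensor,
    producing the business-level events an application consumes."""
    buf = []
    for e in stream:
        if e["sensor"] != sensor:
            continue
        buf.append(e["value"])
        buf = buf[-window:]
        yield {"sensor": sensor, "avg": sum(buf) / len(buf)}

# Two ingested events from a hypothetical smart-farming field node:
raw = [{"device": "field-1", "readings": {"temp": 20.0, "soil": 0.31}},
       {"device": "field-1", "readings": {"temp": 22.0, "soil": 0.30}}]
averages = list(rolling_average(split_events(raw), "temp"))
```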
35. A Study on the Deployment of Mesoscale Chemical Hazard Area Monitoring Points by Combining Weighting and Fireworks Algorithms.
- Author
-
Shi, Yimeng, Zhang, Hongyuan, Chen, Zheng, Sun, Yueyue, Liu, Xuecheng, and Gu, Jin
- Abstract
In order to address the problems of redundancy and waste of resources in the deployment of monitoring points in mesoscale chemical hazard areas, we propose a method for the deployment of monitoring points in mesoscale chemical hazard areas by combining weight and fireworks algorithms. Taking the mesoscale chemical hazard monitoring area as the research background, we take the probabilistic sensing model of telemetry sensor nodes as the research object, make a reasonable grid division of the mesoscale monitoring area, calculate the importance of each grid and perform clustering, utilize the diversity of the fireworks algorithm and the rapidity of the solution to solve the monitoring point deployment model and discuss the relevant factors affecting the deployment scheme. The simulation results show that the proposed algorithm can achieve the optimal coverage monitoring for monitoring areas with different importance and reduce the number of monitoring nodes and redundancy; meanwhile, the relevant factors such as the grid edge length, the number of clusters, and the average importance of monitoring areas have different degrees of influence on the complexity of the algorithm and the deployment scheme. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
36. Vital Signs Prediction and Early Warning Score Calculation Based on Continuous Monitoring of Hospitalised Patients Using Wearable Technology.
- Author
-
Youssef Ali Amer, Ahmed, Wouters, Femke, Vranken, Julie, de Korte-de Boer, Dianne, Smit-Fun, Valérie, Duflot, Patrick, Beaupain, Marie-Hélène, Vandervoort, Pieter, Luca, Stijn, Aerts, Jean-Marie, and Vanrumste, Bart
- Subjects
VITAL signs ,WEARABLE technology ,FORECASTING ,PATIENT monitoring ,BLENDED learning ,RESPIRATION - Abstract
In this prospective, interventional, international study, we investigate continuous monitoring of hospitalised patients' vital signs using wearable technology as a basis for real-time early warning scores (EWS) estimation and vital signs time-series prediction. The collected continuous monitored vital signs are heart rate, blood pressure, respiration rate, and oxygen saturation of a heterogeneous patient population hospitalised in cardiology, postsurgical, and dialysis wards. Two aspects are elaborated in this study. The first is the high-rate (every minute) estimation of the statistical values (e.g., minimum and mean) of the vital signs components of the EWS for one-minute segments in contrast with the conventional routine of 2 to 3 times per day. The second aspect explores the use of a hybrid machine learning algorithm of kNN-LS-SVM for predicting future values of monitored vital signs. It is demonstrated that a real-time implementation of EWS in clinical practice is possible. Furthermore, we showed a promising prediction performance of vital signs compared to the most recent state of the art of a boosted approach of LSTM. The reported mean absolute percentage errors of predicting one-hour averaged heart rate are 4.1, 4.5, and 5% for the upcoming one, two, and three hours respectively for cardiology patients. The obtained results in this study show the potential of using wearable technology to continuously monitor the vital signs of hospitalised patients as the real-time estimation of EWS in addition to a reliable prediction of the future values of these vital signs is presented. Ultimately, both approaches of high-rate EWS computation and vital signs time-series prediction is promising to provide efficient cost-utility, ease of mobility and portability, streaming analytics, and early warning for vital signs deterioration. [ABSTRACT FROM AUTHOR]
- Published
- 2020
- Full Text
- View/download PDF
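The high-rate EWS idea, compute statistical values (minimum, mean) over each one-minute segment and score them every minute instead of 2 to 3 times a day, can be sketched as follows. The scoring bands are illustrative NEWS-like thresholds, not the study's exact protocol:

```python
def score_band(value, bands):
    """Map a vital-sign value to a sub-score via ascending (upper_bound, score) bands."""
    for upper, score in bands:
        if value <= upper:
            return score
    return bands[-1][1]

# Illustrative NEWS-like bands (hypothetical, for the sketch only).
HR_BANDS = [(40, 3), (50, 1), (90, 0), (110, 1), (130, 2), (float("inf"), 3)]
RR_BANDS = [(8, 3), (11, 1), (20, 0), (24, 2), (float("inf"), 3)]
SPO2_BANDS = [(91, 3), (93, 2), (95, 1), (float("inf"), 0)]

def minute_ews(hr_samples, rr_samples, spo2_samples):
    """EWS for one minute of wearable data, from per-segment statistics."""
    hr_mean = sum(hr_samples) / len(hr_samples)
    rr_mean = sum(rr_samples) / len(rr_samples)
    spo2_min = min(spo2_samples)            # worst-case saturation in the minute
    return (score_band(hr_mean, HR_BANDS)
            + score_band(rr_mean, RR_BANDS)
            + score_band(spo2_min, SPO2_BANDS))
```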
37. Detection of DoH Traffic Tunnels Using Deep Learning for Encrypted Traffic Classification.
- Author
-
Alzighaibi, Ahmad Reda
- Subjects
TRAFFIC monitoring ,DEEP learning ,INTERNET domain naming system ,BOOSTING algorithms ,MACHINE learning ,FISHER discriminant analysis ,TUNNELS - Abstract
Currently, the primary concerns on the Internet are security and privacy, particularly in encrypted communications to prevent snooping and modification of Domain Name System (DNS) data by hackers who may attack using the HTTP protocol to gain illegal access to the information. DNS over HTTPS (DoH) is the new protocol that has made remarkable progress in encrypting Domain Name System traffic to prevent modifying DNS traffic and spying. To alleviate these challenges, this study explored the detection of DoH traffic tunnels of encrypted traffic, with the aim to determine the gained information through the use of HTTP. To implement the proposed work, state-of-the-art machine learning algorithms were used including Random Forest (RF), Gaussian Naive Bayes (GNB), Logistic Regression (LR), k-Nearest Neighbor (KNN), the Support Vector Classifier (SVC), Linear Discriminant Analysis (LDA), Decision Tree (DT), Adaboost, Gradient Boost (SGD), and LSTM neural networks. Moreover, ensemble models consisting of multiple base classifiers were utilized to carry out a series of experiments and conduct a comparative study. The CIRA-CIC-DoHBrw2020 dataset was used for experimentation. The experimental findings showed that the detection accuracy of the stacking model for binary classification was 99.99%. In the multiclass classification, the gradient boosting model scored maximum values of 90.71%, 90.71%, 90.87%, and 91.18% in Accuracy, Recall, Precision, and AUC. Moreover, the micro average ROC curve for the LSTM model scored 98%. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
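The stacking model highlighted in this abstract combines the predictions of several base classifiers through a meta-level learner. The following pure-Python sketch shows the stacked-generalization idea in miniature; the flow features, thresholds, and the majority-vote meta-level are all hypothetical simplifications (the paper's pipeline uses trained classifiers on the CIRA-CIC-DoHBrw2020 features).

```python
# Minimal sketch of stacked generalization for DoH-tunnel detection:
# base learners emit predictions, and a meta-level combines them.
# Feature names and thresholds are made up for illustration.

def rule_packet_length(x):   # base learner 1: large packets suggest tunneling
    return 1 if x["mean_packet_length"] > 500 else 0

def rule_duration(x):        # base learner 2: long-lived flows suggest tunneling
    return 1 if x["duration"] > 30.0 else 0

def rule_rate(x):            # base learner 3: high byte rate suggests tunneling
    return 1 if x["bytes_per_sec"] > 1000 else 0

BASE_LEARNERS = [rule_packet_length, rule_duration, rule_rate]

def stack_predict(x):
    """Meta-level: majority vote over base outputs (0 = benign DoH,
    1 = DoH tunnel). A trained meta-classifier would replace this vote."""
    votes = [clf(x) for clf in BASE_LEARNERS]
    return 1 if sum(votes) >= 2 else 0

flow = {"mean_packet_length": 620, "duration": 45.0, "bytes_per_sec": 800}
label = stack_predict(flow)
```

In practice the base level would hold the RF, GNB, LR, and similar models named in the abstract, with their out-of-fold predictions used to train the meta-classifier.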
38. Organic Memristor with Synaptic Plasticity for Neuromorphic Computing Applications.
- Author
-
Zeng, Jianmin, Chen, Xinhui, Liu, Shuzhi, Chen, Qilai, and Liu, Gang
- Subjects
NEUROPLASTICITY ,TRIPHENYLAMINE ,COMPLEMENTARY metal oxide semiconductors ,OXIDATION-reduction reaction - Abstract
Memristors have been considered to be more efficient than traditional Complementary Metal Oxide Semiconductor (CMOS) devices in implementing artificial synapses, which are fundamental yet critical components of neurons as well as neural networks. Compared with their inorganic counterparts, organic memristors have many advantages, including low cost, easy manufacture, high mechanical flexibility, and biocompatibility, making them applicable in more scenarios. Here, we present an organic memristor based on an ethyl viologen diperchlorate [EV(ClO4)2]/triphenylamine-containing polymer (BTPA-F) redox system. The device, with bilayer organic materials as the resistive switching layer (RSL), exhibits memristive behaviors and excellent long-term synaptic plasticity. Additionally, the device's conductance states can be precisely modulated by consecutively applying voltage pulses between the top and bottom electrodes. A three-layer perceptron neural network with in situ computing enabled was then constructed utilizing the proposed memristor and trained on the basis of the device's synaptic plasticity characteristics and conductance modulation rules. Recognition accuracies of 97.3% and 90% were achieved, respectively, for the raw and 20% noisy handwritten-digit images from the Modified National Institute of Standards and Technology (MNIST) dataset, demonstrating the feasibility and applicability of implementing neuromorphic computing applications with the proposed organic memristor. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
39. An Ensemble Tree-Based Model for Intrusion Detection in Industrial Internet of Things Networks.
- Author
-
Awotunde, Joseph Bamidele, Folorunso, Sakinat Oluwabukonla, Imoize, Agbotiname Lucky, Odunuga, Julius Olusola, Lee, Cheng-Chi, Li, Chun-Ta, and Do, Dinh-Thuan
- Subjects
INTRUSION detection systems (Computer security) ,FEATURE selection ,SMART devices ,MACHINE learning ,INTELLIGENT sensors ,RANDOM forest algorithms ,TELEMETRY ,INTERNET of things - Abstract
With less human involvement, the Industrial Internet of Things (IIoT) connects billions of heterogeneous and self-organized smart sensors and devices. IIoT-based technologies are now widely employed to enhance the user experience across numerous application domains. However, heterogeneity in the node sources poses security concerns affecting the IIoT system, and due to device vulnerabilities, IIoT has encountered several attacks. Therefore, security features, such as encryption, authorization control, and verification, have been applied in IIoT networks to secure network nodes and devices. However, existing machine learning models require considerable time to detect attacks because of the diverse properties of IIoT network traffic. Therefore, this study proposes ensemble models enabled with a feature selection classifier for intrusion detection in IIoT networks. The Chi-Square statistical method was used for feature selection, and various ensemble classifiers, such as eXtreme Gradient Boosting (XGBoost), Bagging, Extra Trees (ET), Random Forest (RF), and AdaBoost, were applied to detect intrusions in the telemetry data of the TON_IoT datasets. The performance of these models is appraised based on accuracy, recall, precision, F1-score, and the confusion matrix. The results indicate that the XGBoost ensemble showed superior performance, with the highest accuracy across the datasets in detecting and classifying IIoT attacks. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
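The pipeline described above ranks telemetry features by a Chi-Square statistic against the attack label before training the ensembles. A self-contained sketch of that scoring-and-selection step follows; the feature names are illustrative, not the TON_IoT schema, and a real pipeline would pass the selected features to XGBoost or another ensemble.

```python
# Sketch of Chi-Square feature selection for intrusion detection: score each
# binary telemetry feature against the binary attack label, keep the top-k.
# Feature names below are hypothetical examples.

def chi2_score(feature, labels):
    """Chi-square statistic of a binary feature against binary labels."""
    n = len(feature)
    obs = [[0, 0], [0, 0]]           # observed 2x2 contingency counts
    for f, y in zip(feature, labels):
        obs[f][y] += 1
    row = [sum(obs[i]) for i in (0, 1)]
    col = [obs[0][j] + obs[1][j] for j in (0, 1)]
    stat = 0.0
    for i in (0, 1):
        for j in (0, 1):
            exp = row[i] * col[j] / n  # expected count under independence
            if exp:
                stat += (obs[i][j] - exp) ** 2 / exp
    return stat

def select_k_best(features, labels, k):
    """Return the names of the k highest-scoring features."""
    ranked = sorted(features,
                    key=lambda name: chi2_score(features[name], labels),
                    reverse=True)
    return ranked[:k]

labels = [1, 1, 1, 0, 0, 0]          # 1 = attack, 0 = normal telemetry
features = {
    "temp_anomaly": [1, 1, 1, 0, 0, 0],  # perfectly aligned with attacks
    "random_noise": [1, 0, 1, 0, 1, 0],  # uninformative
}
best = select_k_best(features, labels, 1)
```

A perfectly label-aligned feature scores the maximum statistic (equal to the sample count here), so it is retained while the noise feature is discarded.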
40. A Comprehensive Review on Food Waste Reduction Based on IoT and Big Data Technologies.
- Author
-
Ahmadzadeh, Sahar, Ajmal, Tahmina, Ramanathan, Ramakrishnan, and Duan, Yanqing
- Abstract
Food waste reduction, as a major application area of the Internet of Things (IoT) and big data technologies, has become one of the most pressing issues. In recent years, there has been an unprecedented increase in food waste, which has had a negative impact on economic growth in many countries. Food waste has also caused serious environmental problems. Agricultural production, post-harvest handling, and storage, as well as food processing, distribution, and consumption, can all lead to food wastage. This wastage is primarily caused by inefficiencies in the food supply chain and a lack of information at each stage of the food cycle. In order to minimize such effects, the Internet of Things, big data-based systems, and various management models are used to reduce food waste in food supply chains. This paper provides a comprehensive review of IoT and big data-based food waste management models, algorithms, and technologies with the aim of improving resource efficiency and highlights the key challenges and opportunities for future research. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
41. A Bibliometric and Word Cloud Analysis on the Role of the Internet of Things in Agricultural Plant Disease Detection.
- Author
-
Patil, Rutuja Rajendra, Kumar, Sumit, Rani, Ruchi, Agrawal, Poorva, and Pippal, Sanjeev Kumar
- Published
- 2023
- Full Text
- View/download PDF
42. At the Confluence of Artificial Intelligence and Edge Computing in IoT-Based Applications: A Review and New Perspectives.
- Author
-
Bourechak, Amira, Zedadra, Ouarda, Kouahla, Mohamed Nadjib, Guerrieri, Antonio, Seridi, Hamid, and Fortino, Giancarlo
- Subjects
ARTIFICIAL intelligence ,DEEP learning ,SWARM intelligence ,EDGE computing ,MACHINE learning ,INTELLIGENT networks ,INTERNET of things - Abstract
Given its advantages in low latency, fast response, context-aware services, mobility, and privacy preservation, edge computing has emerged as the key support for intelligent applications and 5G/6G Internet of Things (IoT) networks. This technology extends the cloud by providing intermediate services at the edge of the network and improves the quality of service for latency-sensitive applications. Many AI-based solutions using machine learning, deep learning, and swarm intelligence have exhibited high potential to perform intelligent cognitive sensing, intelligent network management, big data analytics, and security enhancement for edge-based smart applications. Despite its many benefits, there are still concerns about whether intelligent edge computing can handle the computational complexity of machine learning techniques for big IoT data analytics. Resource constraints of edge computing, distributed computing, efficient orchestration, and synchronization of resources are all factors that require attention for quality-of-service improvement and cost-effective development of edge-based smart applications. In this context, this paper explores the confluence of AI and edge computing across many application domains in order to leverage the potential of existing research around these factors and identify new perspectives. The confluence of edge computing and AI improves the quality of user experience in emergency situations, such as in the Internet of Vehicles, where critical inaccuracies or delays can lead to damage and accidents; these are the same factors that most studies have used to evaluate the success of an edge-based application. In this review, we first provide an in-depth analysis of the state of the art of AI in edge-based applications, with a focus on eight application areas: smart agriculture, smart environment, smart grid, smart healthcare, smart industry, smart education, smart transportation, and security and privacy. We then present a qualitative comparison that emphasizes the main objective of the confluence, the roles and uses of artificial intelligence at the network edge, and the key enabling technologies for edge analytics. Next, open challenges, future research directions, and perspectives are identified and discussed, and conclusions are drawn. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
43. Parallel Processing of Sensor Data in a Distributed Rules Engine Environment through Clustering and Data Flow Reconfiguration.
- Author
-
Alexandrescu, Adrian
- Subjects
DISTRIBUTED computing ,PARALLEL processing ,SENSOR networks ,PARALLEL algorithms ,GENETIC algorithms ,PARALLEL programming - Abstract
An emerging reality is the development of smart buildings and cities, which improve residents' comfort. These environments employ multiple sensor networks whose data must be acquired and processed in real time by multiple rule engines, which trigger events that enable specific actuators. The problem is how to handle those data in a scalable manner, using multiple processing instances to maximize system throughput. This paper considers the types of sensors used in these scenarios and proposes a model for abstracting the information flow as a weighted dependency graph. Two parallel computing methods are then proposed for obtaining an efficient data flow: a variation of the parallel k-means clustering algorithm and a custom genetic algorithm. Simulation results show that the two proposed flow-reconfiguration algorithms reduce rule processing times and provide an efficient solution for increasing the scalability of the considered environment. The paper also discusses using an open-source cloud solution to manage the system and how the two algorithms can be applied to increase efficiency. These methods allow for a seamless increase in the number of sensors in the environment by making smart use of the available resources. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
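The clustering approach in this abstract partitions sensors among rule-engine instances. As a hedged illustration of the underlying k-means step (not the paper's parallel variant or its weighted dependency graph), the sketch below groups sensors by a plain Lloyd iteration, with each resulting cluster standing in for the workload of one processing instance; coordinates and initial centroids are invented.

```python
# Sketch: partition sensors into k groups (one per rule-engine instance)
# with plain Lloyd's k-means. Sensor coordinates are illustrative only.

def kmeans(points, centroids, iterations=10):
    """Return (assignments, centroids) after a few Lloyd iterations."""
    for _ in range(iterations):
        # Assign each point to its nearest centroid (squared distance).
        assign = [min(range(len(centroids)),
                      key=lambda c: (p[0] - centroids[c][0]) ** 2
                                  + (p[1] - centroids[c][1]) ** 2)
                  for p in points]
        # Recompute each centroid as the mean of its assigned points.
        for c in range(len(centroids)):
            members = [p for p, a in zip(points, assign) if a == c]
            if members:
                centroids[c] = (sum(p[0] for p in members) / len(members),
                                sum(p[1] for p in members) / len(members))
    return assign, centroids

# Two obvious sensor groups; k = 2 rule-engine instances.
sensors = [(0.0, 0.0), (0.1, 0.2), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9)]
assign, cents = kmeans(sensors, [(0.0, 0.0), (1.0, 1.0)])
```

The paper's variation would additionally weight edges of the dependency graph so that tightly coupled sensors land on the same instance, which a genetic algorithm then refines.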
44. Lightweight and Energy-Efficient Deep Learning Accelerator for Real-Time Object Detection on Edge Devices.
- Author
-
Kim, Kyungho, Jang, Sung-Joon, Park, Jonghee, Lee, Eunchong, and Lee, Sang-Seol
- Subjects
DEEP learning ,OBJECT recognition (Computer vision) ,MACHINE learning ,IMAGING systems ,INTERNET of things ,IMAGE sensors - Abstract
Tiny machine learning (TinyML) has become an emerging field with the rapid growth of the Internet of Things (IoT). However, most deep learning algorithms are too complex, require a lot of memory to store data, and consume an enormous amount of energy for computation and data movement; the algorithms are therefore unsuitable for IoT devices such as sensors and imaging systems. Furthermore, typical hardware accelerators cannot be embedded in these resource-constrained edge devices, and they struggle to support real-time inference. To perform real-time processing on these battery-operated devices, deep learning models should be compact and hardware-optimized, and hardware accelerator designs must be lightweight and consume extremely little energy. Therefore, we present a network model optimized through model simplification and compression for the hardware implementation, and propose a hardware architecture for a lightweight and energy-efficient deep learning accelerator. The experimental results demonstrate that our optimized model successfully performs object detection, and the proposed hardware design achieves 1.25× and 4.27× smaller logic and BRAM size, respectively, while its energy consumption is approximately 10.37× lower than that of previous similar works, at 43.95 fps in real time under an operating frequency of 100 MHz on a Xilinx ZC702 FPGA. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
45. An Offline Weighted-Bagging Data-Driven Evolutionary Algorithm with Data Generation Based on Clustering.
- Author
-
Guo, Zongliang, Lin, Sikai, Suo, Runze, and Zhang, Xinming
- Subjects
EVOLUTIONARY algorithms ,RADIAL basis functions - Abstract
In recent years, a variety of data-driven evolutionary algorithms (DDEAs) have been proposed to solve time-consuming and computationally intensive optimization problems. DDEAs are usually divided into offline and online DDEAs, with offline DDEAs being the most widely studied and proven to display excellent performance. However, most offline DDEAs suffer from three disadvantages. First, they require many surrogates to build a relatively accurate model, a process that is redundant and time-consuming. Second, when the available fitness evaluations are insufficient, their performance tends to be unsatisfactory. Finally, to cope with the second problem, many algorithms use data generation methods, which significantly increase the algorithm's runtime. To overcome these problems, we propose a brand-new DDEA with radial basis function networks as its surrogates. First, we devised a fast data generation algorithm based on clustering to enlarge the dataset and reduce fitting errors. Then, we trained the radial basis function networks and adaptively designed their parameters. We aggregated the networks using a unique model management framework and demonstrated its accuracy and stability. Finally, fitness evaluations were obtained and used for optimization. Through numerical experiments and comparisons with other algorithms, the proposed method is shown to be an excellent DDEA suited to such data-driven optimization problems. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
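The data-generation step in this abstract enlarges a small evaluated dataset before surrogate training. The sketch below illustrates one plausible reading of that idea under stated simplifications: new pseudo-samples are jittered copies of evaluated points, labelled with the mean fitness of their cluster, and the "clustering" here is a trivial integer-bucket stand-in for k-means. This is not the paper's algorithm, only a hedged illustration of the principle.

```python
import random

# Sketch of clustering-based data generation for an offline DDEA:
# jitter evaluated points and label the new samples with their cluster's
# mean fitness, enlarging the training set for the RBF surrogates.

def generate(evaluated, per_point=3, jitter=0.05, seed=0):
    """evaluated: list of (x, fitness). Returns the augmented list."""
    rng = random.Random(seed)
    # "Cluster" points by the integer part of x (stand-in for k-means).
    clusters = {}
    for x, f in evaluated:
        clusters.setdefault(int(x), []).append((x, f))
    augmented = list(evaluated)
    for members in clusters.values():
        mean_f = sum(f for _, f in members) / len(members)
        for x, _ in members:
            for _ in range(per_point):
                # New sample near x, labelled with the cluster mean fitness.
                augmented.append((x + rng.uniform(-jitter, jitter), mean_f))
    return augmented

data = [(0.1, 1.0), (0.2, 1.2), (1.1, 4.0)]
aug = generate(data)
```

The enlarged set (here 12 samples from 3 evaluations) would then be fed to the radial basis function surrogates in place of expensive real fitness evaluations.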
46. An In-Depth Survey Demystifying the Internet of Things (IoT) in the Construction Industry: Unfolding New Dimensions.
- Author
-
Khurshid, Kiran, Danish, Aamar, Salim, Muhammad Usama, Bayram, Muhammed, Ozbakkaloglu, Togay, and Mosaberpanah, Mohammad Ali
- Abstract
In this digital era, many industries have widely adopted the Internet of Things (IoT), yet its implementation in the construction industry remains relatively limited. Integrating Construction 4.0 drivers, such as building information modeling (BIM), procurement, construction safety, and structural health monitoring (SHM), with IoT devices provides an effective framework for applications that enhance construction and operational efficiency. Research on integrating IoT with Construction 4.0 drivers, however, is still in its infancy. It is necessary to understand the present state of IoT adoption in the Construction 4.0 context. This paper presents a comprehensive review identifying the status of IoT adoption across the Construction 4.0 areas. Furthermore, this work highlights the potential roadblocks to the seamless adoption of IoT that are unique to Construction 4.0 in developing countries. Altogether, 257 research articles were reviewed to present the current state of IoT adoption in developed and developing countries, as well as the topmost barriers encountered in integrating IoT with the key Construction 4.0 drivers. This study aims to provide a reference for construction managers to observe challenges, for professionals to explore the hybridization possibilities of IoT in the context of Construction 4.0, and for laymen to understand the high-level scientific research that underpins IoT in the construction industry. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
47. Application of Internet of Things (IoT) in Sustainable Supply Chain Management.
- Author
-
Khan, Yasser, Su'ud, Mazliham Bin Mohd, Alam, Muhammad Mansoor, Ahmad, Syed Fayaz, Ahmad, Ahmad Y. A. Bani, and Khan, Nasir
- Abstract
Traditional supply chain systems have incorporated smart objects to enhance intelligence, automation capabilities, and intelligent decision-making. Internet of Things (IoT) technologies provide unprecedented opportunities to enhance the efficiency and reduce the cost of existing supply chain systems. This article studies the prevailing supply chain system and explores the benefits obtained once smart objects and embedded IoT networks are implanted. Short-range communication technologies, radio frequency identification (RFID), middleware, and cloud computing are comprehensively examined to conceptualize a smart supply chain management system. Moreover, manufacturers achieve maximum benefits in terms of safety, cost, intelligent inventory management, and decision-making. This study also offers concepts of smart carriage, loading/unloading, transportation, warehousing, and packaging for the secure distribution of products. Furthermore, the tracking of customers to encourage more purchases and the modification of shops with the assistance of the Internet of Things are thoroughly idealized. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
48. An Overview of Fog Data Analytics for IoT Applications.
- Author
-
Bhatia, Jitendra, Italiya, Kiran, Jadeja, Kuldeepsinh, Kumhar, Malaram, Chauhan, Uttam, Tanwar, Sudeep, Bhavsar, Madhuri, Sharma, Ravi, Manea, Daniela Lucia, Verdes, Marina, and Raboaca, Maria Simona
- Subjects
REAL-time computing ,INTERNET of things ,ELECTRONIC data processing ,TELECOMMUNICATION systems ,BIG data ,NEXT generation networks - Abstract
With the rapid growth of data and processing over the cloud, it has become easier to access those data. On the other hand, this poses many technical and security challenges to the users of these provisions. Fog computing makes such technical issues manageable to some extent and is one of the promising solutions for handling the big data produced by the IoT, which are often security-critical and time-sensitive. Massive IoT data analytics in a fog computing structure is emerging and requires extensive research to enable more proficient knowledge extraction and smart decisions. Although big data analytics is advancing, it does not yet consider fog data analytics. There remain many challenges, including heterogeneity, security, accessibility, resource sharing, network communication overhead, and the real-time processing of complex data. This paper explores these research challenges and their solutions using next-generation fog data analytics and IoT networks. We also performed an experimental analysis based on fog computing and cloud architecture; the results show that fog computing outperforms the cloud in terms of network utilization and latency. Finally, the paper concludes with future trends. [ABSTRACT FROM AUTHOR]
- Published
- 2023
- Full Text
- View/download PDF
49. Self-Sensing Antenna for Soil Moisture: Beacon Approach.
- Author
-
Škiljo, Maja, Blažević, Zoran, Dujić-Rodić, Lea, Perković, Toni, and Šolić, Petar
- Subjects
ANTENNAS (Electronics) ,ANTENNA design ,SPIRAL antennas ,INTERNET of things ,REFLECTANCE ,SOIL moisture ,SOIL physics - Abstract
On the way from the Internet of Things (IoT) to the Internet of Underground Things (IoUT), the main challenge is antenna design. The enabling technologies still rely on simple design and low cost, but the systems are more complex. A LoRa-based system combined with a machine learning approach can estimate soil moisture from signal strength data, but to improve system performance we propose optimizing the antenna for underground use. Soil properties are frequency-dependent and time-varying, which may cause variations in the signal wavelength and in the input impedance of the buried antenna. Instead of the wideband antenna designs or the standard helical antenna provided with the LoRa module, which are typical choices in the IoUT research community for communication links, we propose a narrow-band antenna design for soil moisture sensing. It is shown that simply matching the antenna buried in dry sand can provide a substantial difference in reflection coefficient, ranging from approximately 10 dB (achieved in proof-of-concept measurements) to as much as 40 dB (calculated with a full-wave simulator), when the moisture content is increased by 20%. This can ensure more reliable radio sensing in a novel sensorless technology where soil moisture information is extracted from the signal strength of a transmitting device. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF
50. What Is (Not) Big Data Based on Its 7Vs Challenges: A Survey.
- Author
-
González García, Cristian and Álvarez-Fernández, Eva
- Subjects
BIG data ,DATA mining - Abstract
Big Data has changed how enterprises and people manage knowledge and make decisions. However, when talking about Big Data, definitions of what it is and what it is used for often differ, as there are many interpretations and disagreements. For these reasons, we have reviewed the literature to compile and propose a possible resolution of the existing discrepancies between the terms Data Analysis, Data Mining, Knowledge Discovery in Databases, and Big Data. In addition, we have gathered the patterns used in Data Mining, the different phases of Knowledge Discovery in Databases, and some definitions of Big Data according to important companies and organisations. Moreover, Big Data has challenges that are sometimes the same as its own characteristics, known as the Vs. Nonetheless, depending on the author, the number of Vs ranges from 3 to 5, or even 7, and the 4Vs or 5Vs are not always the same ones. Therefore, in this survey, we reviewed the literature to explain how many Vs have been identified and explained in relation to different existing problems. We detected 7Vs, three of which had subtypes. [ABSTRACT FROM AUTHOR]
- Published
- 2022
- Full Text
- View/download PDF