13,784 results for "CLOUD"
Search Results
2. A Model of Cloud-Based System for Monitoring Air Quality in Urban Traffic Environment
- Author
-
Stojanov, Zeljko, Brtka, Vladimir, Jotanovic, Gordana, Jausevac, Goran, Perakovic, Dragan, Dobrilovic, Dalibor, Perakovic, Dragan, editor, and Knapcikova, Lucia, editor
- Published
- 2025
- Full Text
- View/download PDF
3. Hybrid Optimization Model for Secure Task Scheduling in Cloud: Combining Seagull and Black Widow Optimization.
- Author
-
Verma, Garima
- Abstract
Task scheduling is the act of allocating tasks in a certain way to make the best use of the resources at hand. Users of the service must make their demands online, since cloud computing is the method used to offer services through the internet. In this paper, a new hybrid optimization model is introduced for secure task scheduling in the cloud, built on a six-fold objective function covering makespan, execution time, Quality of Service (QoS), utilization cost, and security. Within the security constraint, trust evaluation and risk probability were determined. The Black Widow Combined Seagull Optimization (BWCSO) algorithm was proposed to obtain the best optimization result by combining Black Widow Optimization (BWO) and the Seagull Optimization Algorithm (SOA). Cycle crossover (CX) was introduced to produce an offspring from its parents in which each slot is filled by an element from one of the parents. Finally, the suggested algorithm's performance was assessed, and the best outcome was found with respect to makespan. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
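The cycle crossover (CX) step mentioned in the abstract above is a standard permutation operator. The following is a minimal Python sketch of the textbook CX operator, shown only to make the idea concrete; it is not necessarily the exact variant used inside BWCSO, and the example parents are hypothetical task orderings.

```python
# Minimal sketch of cycle crossover (CX): each position of the offspring is
# inherited from one of the two parents, following the cycles formed by
# their permutations.

def cycle_crossover(parent1, parent2):
    """Return one offspring produced by cycle crossover of two permutations."""
    size = len(parent1)
    offspring = [None] * size
    index_of = {value: i for i, value in enumerate(parent1)}

    # Copy the first cycle from parent1.
    i = 0
    while offspring[i] is None:
        offspring[i] = parent1[i]
        i = index_of[parent2[i]]   # follow the cycle via parent2

    # Fill the remaining slots from parent2.
    for j in range(size):
        if offspring[j] is None:
            offspring[j] = parent2[j]
    return offspring

# Example: two candidate task orderings (hypothetical task IDs).
print(cycle_crossover([1, 2, 3, 4, 5, 6, 7, 8], [8, 5, 2, 1, 3, 6, 4, 7]))
```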
4. An autonomous blockchain-based workflow execution broker for e-science.
- Author
-
Alimoğlu, Alper and Özturan, Can
- Subjects
- *
COMPUTER workstation clusters , *CLOUD computing , *BLOCKCHAINS , *WORKFLOW , *CONTRACTS , *SCHEDULING - Abstract
Scientific workflows are essential for many applications, enabling the configuration and execution of complex tasks across distributed resources. In this paper, we contribute an Ethereum blockchain-based scientific workflow execution manager, which distributes workflows to run on cluster computing providers that utilize the Slurm workload manager to execute them. We extended our blockchain-based autonomous resource broker called eBlocBroker, which is a DAO-based decentralized coordinator, by providing distributed workflow execution via blockchain. Through various tests, we demonstrate how our eBlocBroker autonomous organization, which is programmed as a smart contract, can manage scientific workflow submission, scheduling, and execution on cluster computing providers. The utilization of blockchain for distributed workflow execution is a new concept. We are motivated because our system has been developed with e-Science, where scientific workflows are widely utilized, in mind. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
5. Testing Cloud Adjustment Hypotheses for the Maintenance of Earth's Hemispheric Albedo Symmetry With Natural Experiments.
- Author
-
Diamond, Michael S., Gristey, Jake J., and Feingold, Graham
- Subjects
- *
PARTICULATE matter , *AIR pollution , *VOLCANIC eruptions , *ALBEDO , *CLOUDINESS - Abstract
Earth's Northern and Southern Hemispheres reflect essentially equal amounts of sunlight. How—and whether—this hemispheric albedo symmetry is maintained remains a mystery. We decompose Earth's hemispheric albedo symmetry into components associated with the surface, clear‐sky atmosphere, and different cloud types as defined by cloud effective pressure and optical thickness. Climatologically, greater reflection by the surface, aerosols, and high clouds in the Northern Hemisphere is balanced by greater low and mid‐level cloudiness in the Southern Hemisphere. Both hemispheres have darkened at similar rates over the past two decades; whether the darkening from more rapidly declining aerosol in the Northern Hemisphere is causing a departure from all‐sky symmetry remains uncertain. Natural experiments including long‐term trends, sea ice loss, and volcanic eruptions provide strong evidence against the hypothesis that extratropical low clouds compensate changing clear‐sky asymmetries on annual‐to‐decadal timescales but some evidence that tropical high cloud shifts may do so. Plain Language Summary: Mysteriously, the Northern and Southern Hemispheres reflect the same amount of sunlight as each other, but scientists are not yet sure why, how, or even whether this phenomenon is sustained by the Earth system. The Northern Hemisphere is brighter in clear skies because it contains more pollution particles in the atmosphere and has more land area, whereas the Southern Hemisphere is cloudier. We break down this cloudiness contrast into components related to different cloud types defined by their height and thickness. Tropical high‐altitude clouds increase reflection preferentially in the Northern Hemisphere but are overcompensated by low‐ and mid‐level clouds in the Southern Hemisphere, especially in the subtropics and midlatitudes. Both hemispheres have darkened over the past two decades, but whether the Northern Hemisphere is darkening faster than the Southern Hemisphere due to decreasing particulate pollution or if they are darkening at the exact same rate remains uncertain. Based on long‐term trends and "natural experiments" like sea ice loss and volcanic eruptions, we can rule out the hypothesis that low‐level clouds in the Southern Ocean act to balance out clear‐sky asymmetries at yearly‐to‐decadal timescales, but we cannot rule out the hypothesis that high‐altitude tropical clouds do so. Key Points: Greater reflection from Northern Hemisphere clear‐skies and high clouds is balanced by Southern Hemisphere low and mid‐level clouds. Evidence points against extratropical low cloud adjustments compensating for clear‐sky albedo changes on decadal timescales. Limited and conflicting evidence for whether shifts in tropical high clouds compensate clear‐sky changes at annual to decadal timescales. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
6. A collaborative system for recommending a service within the cloud using deep learning.
- Author
-
Bourenane, Djihene, Sad-Houari, Nawal, and Taghezout, Noria
- Abstract
The purpose of this article is to conceive an advanced decision support environment that assists a company in selecting the best service under several constraints (expert or machine). The proposed contribution integrates a deep learning-based recommendation technique deployed in the cloud. The first step involves preprocessing data using deep learning techniques, particularly NMT and BERT encoders, while the second step uses the neural network to generate recommendations. The neural network's architecture includes two hidden layers consisting of 16 and 8 neurons, configured with the "ReLU" activation function, while the output layer uses the "Softmax" function. The experiments have been conducted on a dataset of 20,000 services. Results demonstrate that migrating the DL-based approach to cloud computing significantly reduces response time and memory consumption by approximately 15% and 10% for classification tasks, and 50% and 30% for recommendation tasks, respectively. Compared to an earlier version of the proposed approach based on machine learning, the DL-based approach improves recommendation accuracy by approximately 3%. However, the results in terms of response time and memory usage remain variable, suggesting that deep learning requires considerable computing resources. The proposed method was benchmarked using evaluation metrics such as accuracy, MAE, response time, and user satisfaction, demonstrating its practical superiority and applicability across various industries. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
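The network architecture stated in the abstract above (two hidden layers of 16 and 8 ReLU neurons, a Softmax output) can be sketched as follows. The input dimension, number of service classes, loss, and optimizer are assumptions for illustration; the BERT/NMT preprocessing is taken to happen upstream.

```python
# Minimal sketch of the described recommendation network: Dense(16, ReLU),
# Dense(8, ReLU), Softmax output over service classes.
import tensorflow as tf

n_features = 768      # e.g., a BERT sentence embedding (assumption)
n_services = 50       # number of recommendable service classes (assumption)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(n_features,)),
    tf.keras.layers.Dense(16, activation="relu"),
    tf.keras.layers.Dense(8, activation="relu"),
    tf.keras.layers.Dense(n_services, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```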
7. An efficient localization-based secure resource allocation using E-FSO with SS-DDNN-based CM-LSGEO techniques.
- Author
-
Y R, Sampath Kumar and N, Champa H
- Subjects
ARTIFICIAL neural networks ,OPTIMIZATION algorithms ,RESOURCE allocation ,ENERGY consumption ,ALGORITHMS ,DATA transmission systems - Abstract
Complex applications, as well as multimedia services, are increasing swiftly with the substantial growth of Mobile Users (MUs) and IoT devices, thus demanding extra computation along with higher data communication. Nevertheless, with limited computation power and energy, these devices are still resource-constrained. Numerous methodologies have been developed for efficient Resource Allocation (RA) in the cloud; however, owing to the loss of network resources, energy consumption, and lower battery lifetime, the prevailing methodologies are deemed ineffective. So, by utilizing an Entropy-centric Fish Swarm Optimization Algorithm (E-FSO) with a Soft Swish Deep Dense Neural Network (SS-DDNN)-guided Crossover Mutation – Linear Scaling-centered Golden Eagle Optimization (CM-LSGEO) technique, a practical localization-centric secure RA scheme has been suggested to address the previously mentioned issues. Initially, the mobile's optimal location is detected by utilizing the E-FSO algorithm. Next, for the secure transmission of packets, the failure node is detected effectively by employing the SS-DDNN algorithm. By utilizing the CM-LSGEO algorithm, a model has been constructed to perform efficient RA. The performance of the suggested model is compared to that of the current models in the experimental assessment, which takes several metrics into account. The outcomes show that the proposed framework is more efficient than the prevailing methodologies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
8. Workflow scheduling and optimization using evolutionary method and deep learning algorithm in cloud.
- Author
-
Lalitha, S. P. and Murugan, A.
- Subjects
MACHINE learning ,OPTIMIZATION algorithms ,GABOR transforms ,CONVOLUTIONAL neural networks ,GENETIC algorithms ,DEEP learning - Abstract
The cloud environment is used for its highly efficient utilization of bandwidth and its high processing speed. In a 5G mobile communication environment, most users send and receive large volumes of bandwidth-heavy multimedia information. Hence, it is important to handle such a cloud environment with minimal bandwidth and maximum speed. Therefore, it is mandatory to have a proper schedule for the processing of multimedia videos over the cloud environment. In this paper, a Workflow Scheduling and Optimization Algorithm (WSOA) using a deep learning model is proposed for the processing of multimedia video. The frames in each multimedia video are separated and processed individually. The Gabor transform is applied on each spatial frame to convert it into a time–frequency frame. From this time–frequency frame, various frame parameters and features such as Local Binary Pattern (LBP), Local Ternary Pattern (LTP) and statistical features are computed to form the feature set. This feature set is scrutinized using an evolutionary approach, the Genetic Algorithm (GA), and the final optimized feature set is classified by the proposed Convolutional Neural Network (CNN) architecture, which produces the priority results. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
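A minimal sketch of the per-frame feature extraction described above: Gabor (time-frequency) filtering of a frame followed by a Local Binary Pattern (LBP) histogram. The LTP and statistical features, the GA-based feature selection, and the CNN classifier are omitted, and all parameter values are illustrative assumptions rather than the paper's settings.

```python
# Per-frame feature sketch: Gabor filtering, then an LBP histogram.
import numpy as np
from skimage.filters import gabor
from skimage.feature import local_binary_pattern

def frame_features(frame, frequency=0.2, lbp_points=8, lbp_radius=1):
    """Return an LBP histogram computed on the Gabor-filtered frame."""
    gabor_real, _ = gabor(frame, frequency=frequency)    # time-frequency view
    lbp = local_binary_pattern(gabor_real, lbp_points, lbp_radius, "uniform")
    n_bins = lbp_points + 2                               # bins for 'uniform' LBP
    hist, _ = np.histogram(lbp, bins=n_bins, range=(0, n_bins), density=True)
    return hist

# Example on a random grayscale frame.
print(frame_features(np.random.rand(64, 64)))
```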
9. Development of a Storm-Tracking Algorithm for the Analysis of Radar Rainfall Patterns in Athens, Greece.
- Author
-
Bournas, Apollon and Baltas, Evangelos
- Subjects
FLOOD warning systems ,THUNDERSTORMS ,STORMS ,TRACKING radar ,RAINFALL ,RADAR meteorology - Abstract
This research work focuses on the development and application of a storm-tracking algorithm for identifying and tracking storm cells. The algorithm first identifies storm cells on the basis of reflectivity thresholds and then matches the cells in the tracking procedure on the basis of their geometrical characteristics and the distance within the weather radar image. A sensitivity analysis was performed to evaluate the preferable thresholds for each case and test the algorithm's ability to perform in different time step resolutions. Following this, we applied the algorithm to 54 rainfall events recorded by the National Technical University X-Band weather radar, the rainscanner system, from 2018 to 2023 in the Attica region of Greece. Testing of the algorithm demonstrated its efficiency in tracking storm cells over various time intervals and reflecting changes such as merging or dissipation. The results reveal the predominant southwest-to-east storm directions in 40% of cases examined, followed by northwest-to-east and south-to-north patterns. Additionally, stratiform storms showed slower north-to-west trajectories, while convective storms exhibited faster west-to-east movement. These findings provide valuable insights into storm behavior in Athens and highlight the algorithm's potential for integration into nowcasting systems, particularly for flood early warning systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
10. Evaluation of Autoconversion Representation in E3SMv2 Using an Ensemble of Large‐Eddy Simulations of Low‐Level Warm Clouds.
- Author
-
Ovchinnikov, Mikhail, Ma, Po‐Lun, Kaul, Colleen M., Pressel, Kyle G., Huang, Meng, Shpund, Jacob, and Tang, Shuaiqi
- Subjects
- *
CLOUD droplets , *ATMOSPHERIC models , *PRECIPITATION variability , *WEATHER , *LOCAL foods - Abstract
In numerical atmospheric models that treat cloud and rain droplet populations as separate condensate categories, precipitation initiation in warm clouds is often represented by an autoconversion rate ($Au$), which is the rate of formation of new rain droplets through the collisions of cloud droplets. Being a function of the cloud droplet size distribution (DSD), the local $Au$ is commonly parameterized as a function of DSD moments: cloud droplet number ($n_c$) and mass ($q_c$) concentrations. When applied in a large‐scale model, the grid‐mean $Au$ must also include a correction, or enhancement factor, to account for the horizontal variability of the cloud properties across the model grid. In this study, we evaluate the $Au$ representation in the Energy Exascale Earth System Model version 2 (E3SMv2) climate model using large‐eddy simulations (LES), which explicitly resolve cloud droplet spectra, and therefore the local $Au$, as well as its spatial variability. The analysis of an ensemble of warm low‐level cloud cases shows that the E3SMv2 formulation represents the $Au$ reasonably well compared to the horizontally averaged explicitly computed rate from LES. The agreement, however, comes from a combination of an underestimated E3SM‐tuned local $Au$ rate and an overestimated subgrid cloud variability enhancement factor. The latter bias is traced to neglecting the horizontal variability of $n_c$ and its co‐variability with $q_c$ in parameterizing the grid‐mean $Au$. Plain Language Summary: When representing clouds in large‐scale weather and climate systems, models often compute the rate of formation of rain droplets through cloud droplet collisions as a function of cloud droplet number concentration and liquid water content averaged over the model horizontal grid, typically extending to 10–100 km. This rate, also known as an autoconversion rate, is then multiplied by an enhancement factor that accounts for the non‐linearity of that function and the cloud variability at smaller unresolved scales. We assess how well this rain initiation process is captured in the Energy Exascale Earth System Model (E3SM) by utilizing advanced high‐resolution cloud simulations that explicitly resolve cloud droplet spectra and spatial variability on scales from a few tens of meters to 10 km. Our findings reveal that, while E3SM reasonably represents the overall rain droplet formation rate, it stems from a combination of underestimating the local rate and overestimating the enhancement factor accounting for unresolved subgrid cloud variability. This latter overestimation is traced to neglecting horizontal variability in the cloud droplet number concentration, offering valuable insights for improving the performance of global atmospheric models. Key Points: Formulations of a local autoconversion rate ($Au$) and its enhancement factor in E3SM are comprehensively evaluated using large‐eddy simulations. A reasonable grid‐mean $Au$ in E3SM is produced by an underestimated local rate combined with an overestimated enhancement factor. Droplet number concentration variability and co‐variability with liquid water content must be considered to improve the enhancement factor. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
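For orientation, a commonly used power-law form of the quantities discussed above (in the style of Khairoutdinov and Kolmogorov, 2000) is shown below; it is illustrative only and not necessarily the exact E3SMv2 formulation.

```latex
% Illustrative power-law autoconversion and its subgrid enhancement factor.
\begin{align}
  Au_{\mathrm{local}} &= C\, q_c^{\,a}\, n_c^{-b},
      \qquad a \approx 2.47,\quad b \approx 1.79,\\
  \overline{Au} &= E\, C\, \overline{q_c}^{\,a}\, \overline{n_c}^{\,-b},
      \qquad
  E \equiv \frac{\overline{q_c^{\,a}\, n_c^{-b}}}
                {\overline{q_c}^{\,a}\, \overline{n_c}^{\,-b}}
\end{align}
```

Here overbars denote grid means and $E$ is the subgrid enhancement factor; neglecting the variability of $n_c$ and its covariance with $q_c$ reduces $E$ to $\overline{q_c^{\,a}}/\overline{q_c}^{\,a}$, which is the source of the bias discussed in the abstract.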
11. Optimizing multi-tenant database architecture for efficient software as a service delivery.
- Author
-
Pippal, Sanjeev Kumar, Kumar, Sumit, and Rani, Ruchi
- Subjects
- *
DATABASES , *DATABASE design , *SOFTWARE as a service , *SOFTWARE architecture , *MEMORY - Abstract
A multi-tenant database (MTDB) is the backbone for any cloud app that employs a software as a service (SaaS) delivery paradigm. Every cloud-based SaaS delivery strategy relies heavily on the architecture of multitenant databases. The quicker query execution and space savings provided by MTDB architectures come at the cost of hardware and performance during implementation. All tenants’ data may be kept in a single table with a common schema and database format, making it the most cost-effective MTDB configuration. The arrangement becomes congested if tenants have varying storage needs. In this research, we present a space-saving architecture that improves transactional query execution while avoiding the waste of space due to different attribute needs. The proposed system is compared against the state of the art using Extensible Markup Language (XML) and JavaScript Object Notation (JSON). The suggested multitenant database architecture reduces unnecessary space and improves query performance. The experimental findings show that the suggested system outperforms the state-of-the-art extension table method. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
12. An innovative method using data acquisition and MATLAB for the electrochemical oxidation of formalin and the conversion of the oxidized products into a sound signal.
- Author
-
Duraikannu, Gajalakshmi
- Subjects
- *
CHEMICAL processes , *ELECTRIC batteries , *NEGATIVE electrode , *BINDING agents , *NANOPARTICLES - Abstract
Aim: Herein, the oxidation of chemical compounds, represented as sound signals and prepared by chemical, physical, mechanical, or biological methods, is reported. Objectives: To fabricate the synthesized material (for example, a nanoparticle, ceramic, or electrocatalyst) as an electrode by mixing the synthesized material with a suitable binder, or to mix it with a solvent so that it functions as an electrolyte. In this case, 40% formalin as the electrolyte and platinum and calomel electrodes as the positive and negative electrodes, respectively, have been used to formulate an electrochemical cell. Methodology: This cell is connected to a sound card to process the sound signals, which are analyzed using SIGVIEW software. The sound signals, after noise reduction, were further processed using MATLAB to obtain information about the signals. Results: For example, the frequency, amplitude, etc. of those cells can be obtained. The FFT spectrum obtained by this method correlates well with the FTIR spectrum of formalin. Any conductive chemical oxidation could be processed in this way, and its chemical information could be digitized and saved in the cloud. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
13. Aerosol in the subarctic region impacts on Atlantic meridional overturning circulation under global warming.
- Author
-
Chen, Di, Sun, Qizhen, and Fu, Min
- Subjects
- *
ATLANTIC meridional overturning circulation , *OCEAN temperature , *CLIMATE change , *GLOBAL warming , *CLOUDINESS - Abstract
The Atlantic Meridional Overturning Circulation (AMOC) is a crucial system influencing regional and even global climate changes, with the Subpolar North Atlantic (SPNA) being a key region affecting this system. In the context of global climate change, the impact and mechanisms of aerosols on the AMOC remain unclear. This study, based on state-of-the-art ensemble model simulations paired with latest observations, reveals that aerosols in the SPNA region significantly influence sea surface temperature anomalies by affecting cloud cover, which in turn affects ocean stratification, thus impacting the strength of the AMOC. Based on this, we present the trends and spatiotemporal characteristics of future AMOC under different forcings. Our research contributes to a deeper understanding and prediction of the AMOC. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
14. Optimization enabled deep learning method in container-based architecture of hybrid cloud for portability and interoperability-based application migration.
- Author
-
Hiremath, Tej. C. and K. S., Rekha
- Subjects
- *
OPTIMIZATION algorithms , *HYBRID cloud computing , *VIRTUAL machine systems , *CLOUD computing , *RESOURCE allocation , *DEEP learning - Abstract
Virtualisation is a major part of the cloud as it permits the deployment of several virtual servers over the same physical layer. Due to the adoption of cloud services, the number of applications running on repositories increases, resulting in overload. However, application migration in the cloud with optimal resource allocation is still a challenging task. Application migration is employed to reduce the resource allocation dilemma. Hence, this paper proposes a technique for portability and interoperability-based application migration in the cloud platform. The cloud simulation is done with the Physical Machine (PM), Virtual Machine (VM), and container. The interoperable application migration is provided using the newly devised Lion-based shuffled shepherd (Lion-SS) optimisation algorithm. The Lion-SS algorithm combines the shuffled shepherd optimisation algorithm (SSOA) and the Lion optimisation algorithm (LOA). The new objective function is devised based on predicted load, demand, transmission cost, and resource capacity. Besides, the prediction of the load is performed using Deep long short-term memory (Deep LSTM). The proposed technique obtained a minimal load of 0.007 and a resource capacity of 0.342. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
15. The shape of the cloud: Contesting data centre construction in North Holland.
- Author
-
Rone, Julia
- Subjects
- *
DIGITAL technology , *SERVER farms (Computer network management) , *SOVEREIGNTY , *DECISION making , *CIVIL society - Abstract
The article analyses local contestation of data centres in the Dutch province of North Holland. I explore why and how local councillors and citizen groups mobilized against data centres and demanded democratization of decision-making processes about digital infrastructure. This analysis is used as a vantage point to problematize existing policy and academic narratives on digital sovereignty in Europe. I show, first, that most debates on digital sovereignty so far have overlooked the sub-national level, which is especially relevant for decision making on digital infrastructure. Second, I insist that what matters is not only where digital sovereignty lies, that is, who has the power to decide over digital infrastructural projects: for example, corporations, states, regions, or municipalities. What matters is also how power is exercised. Emphasizing the popular democratic dimension of sovereignty, I argue for a comprehensive democratization of digital sovereignty policies. Democratization in this context is conceived as a multimodal multi-level process, including parliaments, civil society and citizens at the national, regional and local levels alike. The shape of the cloud should be citizens' to decide. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
16. Machine Learning for Cloud Data Classification and Anomaly Intrusion Detection.
- Author
-
Megouache, Leila, Zitouni, Abdelhafid, Sadouni, Salheddine, and Djoudi, Mahieddine
- Subjects
INFORMATION technology security ,MACHINE learning ,ARTIFICIAL intelligence ,K-means clustering ,ANOMALY detection (Computer security) ,INTRUSION detection systems (Computer security) - Abstract
The sheer volume of applications, data and users working in the cloud creates an ecosystem far too large to protect against possible attacks. Several attack detection mechanisms have been proposed to minimize the risk of losing data backed up to the cloud. However, these techniques are not reliable enough to protect it, owing to scalability, distribution and resource limitations. As a result, Information Technology Security experts may feel powerless against the growing threats plaguing the cloud. To address this, we provide a reliable way to detect attackers who want to break into cloud data. In our framework, we have no labels and no predefined classes on historical data, and we wish to identify similar patterns to form homogeneous groups from our observations. Then, we use a k-means clustering algorithm to handle the unlabelled data, together with a combined clustering and classification approach. We start with k-means clustering to generate a labelled dataset from the unlabelled dataset. By harnessing the power of a labelled dataset, we can train the extreme learning machine classifier to become an effective tool for intrusion detection. By utilizing this resampling technique, we can generate additional data sets to significantly enhance the system's capability to identify and thwart attacks. The innovation of this approach stems from its integration of clustering and classification into a unified learning model. The framework has been implemented on the well-known KDD99 dataset, producing numerical results that affirm its accuracy and highlight the significant time-saving advantages of the approach. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
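The clustering-then-classification pipeline described above can be sketched in a few lines: k-means assigns pseudo-labels to unlabelled records, and a classifier is then trained on the resulting labelled set. The sketch below uses a random feature matrix as a stand-in for preprocessed KDD99-style records and an MLP as a stand-in for the extreme learning machine; both substitutions are assumptions for illustration.

```python
# Pseudo-labelling with k-means, then supervised training on the labels.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 20))          # placeholder feature vectors

# Step 1: generate pseudo-labels from the unlabelled data.
pseudo_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

# Step 2: train a classifier on the pseudo-labelled dataset.
X_tr, X_te, y_tr, y_te = train_test_split(X, pseudo_labels, test_size=0.3,
                                          random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=300, random_state=0)
clf.fit(X_tr, y_tr)
print("held-out agreement with pseudo-labels:", clf.score(X_te, y_te))
```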
17. Extended water wave optimization (EWWO) technique: a proposed approach for task scheduling in IoMT and healthcare applications.
- Author
-
Bapuram, Bhasker, Subramanian, Murali, Mahendran, Anand, Ghafir, Ibrahim, Ellappan, Vijayan, and Hamada, Mohammed
- Abstract
The Internet of Medical Things (IoMT) is a version of the Internet of Things. It is getting the attention of researchers because it can be used in a wide range of smart healthcare systems. One of the main advancements employed recently is the IoMT-cloud, which allows users to access cloud services remotely over the internet. These cloud services require an efficient task scheduling approach that satisfies the Quality of Service parameters with low energy consumption. This paper presents an overview of the integration of IoMT and cloud computing technologies. Besides, this work proposes an efficient Extended Water Wave Optimization (EWWO) task scheduling approach in the IoMT cloud for healthcare applications. The EWWO algorithm operates through its propagation, refraction and breaking operations. The proposed EWWO scheduling technique minimizes energy consumption, makespan time and execution time, and increases resource utilization. The CloudSim simulator is used to simulate the IoMT-cloud environment to verify the effectiveness of the EWWO technique. The performance has been evaluated based on various parameters such as energy consumption, makespan time and execution time. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
18. Evaluation of groundwater quality in the rural environment using geostatistical analysis and WebGIS methods in a Hungarian settlement, Báránd.
- Author
-
Balla, Dániel, Kiss, Emőke, Zichar, Marianna, and Mester, Tamás
- Subjects
GROUNDWATER quality ,WATER quality ,WATER pollution ,GROUNDWATER pollution ,ENVIRONMENTAL quality ,WATER quality monitoring - Abstract
The evaluation, visualization of environmental data from long-term monitoring, and making them accessible in a processed form in user-friendly interfaces on the Internet are important tasks of our time. The pollution of groundwater resources in settlements is a global phenomenon, the mitigation of which requires a number of environmental measures. In this study, water quality changes following the construction of a sewerage network were examined in the course of long-term monitoring between 2013 and 2022, during which 40 municipal groundwater wells were regularly sampled. Classifying the monitoring data into pollution categories based on water quality index (WQI) and degree of contamination index (Cd), a high degree of contamination was found in the period before the installation of the sewerage network (2014), as the majority of the wells were classified as contaminated and heavily contaminated. In the monitoring period following the installation of the sewerage network, a significant positive change was found in the case of most of the water chemical parameters tested (EC, NH₄⁺, NO₂⁻, NO₃⁻, PO₄³⁻). Based on interpolated maps, it was found that an increasing part of the area shows satisfactory or good water quality. This was confirmed by the discriminant analysis as well, as it is possible to determine with an accuracy of 80.4% whether the given sample originates from the period before or after the installation of the sewerage network based on the given water chemical parameters. However, 8 years after setting up the sewerage network, the concentration of inorganic nitrogen forms and organic matter remains high, indicating that the accumulated pollutants in the area are still present. To understand the dynamics of purification processes, additional, long-term monitoring is necessary. Making these data available to members of the society can contribute to appropriate environmental measures and strategies. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
19. EdgeAuth: An intelligent token‐based collaborative authentication scheme.
- Author
-
Jiang, Xutong, Dou, Ruihan, He, Qiang, Zhang, Xuyun, and Dou, Wanchun
- Subjects
CLOUD computing security measures ,DENIAL of service attacks ,EDGE computing ,INTERNET of things ,INTERNET - Abstract
Edge computing is regarded as an extension of cloud computing that brings computing and storage resources to the network edge. For some Industrial Internet of Things (IIoT) applications such as supply‐chain supervision and collaboration, Internet of Vehicles, real‐time video analysis and so forth, users should be authenticated before visiting the geographically distributed edge servers. Limited by the considerable latency between the cloud and edge servers, and the limited capacity of edge servers, it is infeasible to copy the authentication method from cloud servers when users are authenticated in edge servers. In view of this challenge, this paper proposes a novel token‐based authentication scheme, named EdgeAuth, that enables fast edge user authentication through collaboration among cloud servers and edge servers. Under the EdgeAuth scheme, edge servers can rapidly verify the credentials of users who have been authenticated by the cloud server. EdgeAuth can also protect users from a series of authentication attacks, for example, the replay attack, DoS attack and man‐in‐the‐middle attack. The results of experiments conducted on a simulated edge computing environment validate the usefulness of EdgeAuth through a comparison in latency and throughput against two baseline schemes. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
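The general idea behind token-based collaborative authentication, a cloud-issued, time-limited credential that an edge server can verify locally, can be illustrated with the minimal HMAC sketch below. This is a generic illustration under the assumption of a pre-shared cloud-edge key; it is not the actual EdgeAuth protocol.

```python
# Generic token flow: the cloud signs a payload, the edge verifies it offline.
import base64, hashlib, hmac, json, time

SHARED_KEY = b"cloud-edge-demo-key"      # provisioned out of band (assumption)

def issue_token(user_id, ttl=300):
    """Cloud side: bind a user id and expiry time, then sign with HMAC."""
    payload = json.dumps({"uid": user_id, "exp": time.time() + ttl}).encode()
    sig = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    return base64.urlsafe_b64encode(payload) + b"." + base64.urlsafe_b64encode(sig)

def verify_token(token):
    """Edge side: check signature and expiry without contacting the cloud."""
    payload_b64, sig_b64 = token.split(b".")
    payload = base64.urlsafe_b64decode(payload_b64)
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64decode(sig_b64), expected):
        return False
    return json.loads(payload)["exp"] > time.time()

token = issue_token("vehicle-42")
print(verify_token(token))               # True while the token is fresh
```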
20. Modelling a Request and Response-Based Cryptographic Model For Executing Data Deduplication in the Cloud.
- Author
-
Kumar, Doddi Suresh and Srinivasu, Nulaka
- Subjects
DATA privacy ,ELLIPTIC curve cryptography ,INTERSTELLAR communication ,ACCESS control ,DATA warehousing ,CLOUD storage - Abstract
Cloud storage is one of the most crucial components of cloud computing because it makes it simpler for users to share and manage their data on the cloud with authorized users. Secure deduplication has attracted much attention in cloud storage because it may remove redundancy from encrypted data to save storage space and communication overhead. Many current secure deduplication systems usually focus on accomplishing the following characteristics regarding security and privacy: access control, tag consistency, data privacy and defence against various attacks. But as far as we know, none can simultaneously fulfil all four conditions. In this research, we offer a secure deduplication method that is effective and provides user-defined access control to address this flaw. Because it only allows the cloud service provider to grant data access on behalf of data owners, our proposed solution (Request-response-based Elliptic Curve Cryptography) may effectively delete duplicates without compromising the security and privacy of cloud users. A thorough security investigation reveals that our authorized secure deduplication solution successfully thwarts brute-force attacks while dependably maintaining tag consistency and data confidentiality. Comprehensive simulations show that our solution outperforms the compared schemes in computation, communication and storage overheads, as well as deduplication efficiency. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
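Deduplication schemes such as the one above ultimately rest on a tag-based duplicate check: the provider keeps at most one copy per content-derived tag. The sketch below shows only that generic step with a plain SHA-256 tag; the paper's ECC-based request/response protocol and access control are not represented.

```python
# Tag-based duplicate detection: one stored copy per content tag.
import hashlib

class DedupStore:
    def __init__(self):
        self._blobs = {}                  # tag -> stored (possibly encrypted) blob

    @staticmethod
    def tag(data: bytes) -> str:
        """Content-derived tag used for the duplicate check."""
        return hashlib.sha256(data).hexdigest()

    def upload(self, data: bytes) -> str:
        t = self.tag(data)
        if t in self._blobs:              # duplicate: store nothing new
            return f"dedup hit: {t[:12]}..."
        self._blobs[t] = data             # first copy: keep it
        return f"stored new object: {t[:12]}..."

store = DedupStore()
print(store.upload(b"backup-block-A"))
print(store.upload(b"backup-block-A"))    # second upload is deduplicated
```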
21. CO2 Dependence in Global Estimation of All‐Sky Downwelling Longwave: Parameterization and Model Comparison.
- Author
-
Kawaguchi, Koh, Shakespeare, Callum J., and Roderick, Michael L.
- Subjects
- *
SURFACE of the earth , *STANDARD deviations , *TEMPERATURE inversions , *RADIATIVE forcing , *CARBON dioxide , *ATMOSPHERIC carbon dioxide - Abstract
The downwelling longwave radiation at the surface (DLR) is a key component of the Earth's surface energy budget. We present a novel set of equations that explicitly account for both clouds and the CO₂ effect to calculate the all‐sky DLR. This paper first extends the clear‐sky DLR model of Shakespeare and Roderick (2021, https://doi.org/10.1002/qj.4176) to include temperature inversions and clouds. We parameterize relevant cloud properties through theoretical and empirical considerations to formulate an all‐sky model. Our model is more accurate than existing methods (reduces Root Mean Squared Error by 2.1–8.7 W/m² and 1.2–10.1 W/m² compared to ERA5 reanalysis and in‐situ data respectively), and provides a strong physical basis for the estimation of the downwelling longwave from near‐surface information. We highlight the important role of CO₂ dependence by showing our model largely captures the change in atmospheric emissivity purely due to CO₂ (i.e., the instantaneous radiative forcing) in CMIP6 models. Plain Language Summary: The downwelling longwave radiation (DLR) at the surface is a key component of the energy balance at the Earth's surface. Understanding how the DLR will change under future climate conditions is vital. For the first time, we explicitly write a set of equations to calculate the DLR that sufficiently account for the impact of CO₂ and clouds simultaneously. Our model is more accurate than existing methods, and provides a much stronger physical basis for the estimation of the downwelling longwave from near‐surface information. In this paper, we extend an existing method for estimating the DLR under clear‐sky conditions (i.e., no clouds) to operate under all sky conditions. This method can be used to inform models where the DLR is needed, but only basic observations are available. Key Points: Downwelling longwave radiation (DLR) is a poorly estimated element of the surface energy budget by existing analytical models. Explicitly accounting for temperature inversions and cloud emissivities improves the accuracy of DLR estimation. Considering the radiative forcing from increasing CO₂ is necessary to produce unbiased future estimates of DLR. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
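To make the quantities in the abstract concrete, a generic bulk form often used for all-sky DLR estimation is shown below; it is illustrative only, and the paper derives its own explicit equations.

```latex
% Generic bulk all-sky DLR: clear-sky emission plus a cloud contribution
% weighted by cloud fraction.
\[
  \mathrm{DLR}_{\text{all-sky}}
  \;\approx\;
  \Big[\, \varepsilon_{\text{clear}}
        + \big(1-\varepsilon_{\text{clear}}\big)\, f_c\, \varepsilon_{\text{cloud}} \,\Big]\,
  \sigma\, T_a^{4}
\]
```

Here $\varepsilon_{\text{clear}}$ is the clear-sky atmospheric emissivity (which, per the abstract, depends on near-surface temperature, humidity, and CO₂), $\varepsilon_{\text{cloud}}$ an effective cloud emissivity, $f_c$ the cloud fraction, $T_a$ the near-surface air temperature, and $\sigma$ the Stefan–Boltzmann constant; clouds add emission mainly where the clear atmosphere is not already opaque.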
22. Considering the Effects of Horizontal Heterogeneities in Satellite-Based Large-Scale Statistics of Cloud Optical Properties.
- Author
-
Várnai, Tamás and Marshak, Alexander
- Subjects
- *
CUMULUS clouds , *OPTICAL measurements , *OPTICAL properties , *HETEROGENEITY , *HOMOGENEITY - Abstract
This paper explores a new approach to improving satellite measurements of cloud optical thickness and droplet size by considering the radiative impacts of horizontal heterogeneity in boundary-layer cumulus clouds. In contrast to the usual bottom-up approach that retrieves cloud properties for individual pixels and subsequently compiles large-scale statistics, the proposed top-down approach first determines the effect of 3D heterogeneity on large-scale cloud statistics and then distributes the overall effects to individual pixels. The potential of this approach is explored by applying a regression-based scheme to a simulated dataset containing over 3000 scenes generated through large eddy simulations. The results show that the new approach can greatly reduce the errors in widely used bispectral retrievals that assume horizontal homogeneity. Errors in large-scale mean values and cloud variability are typically reduced by factors of two to four for 1 km resolution retrievals—and the reductions remain significant even for a 4 km resolution. The calculations also reveal that, over vegetation, heterogeneity-caused droplet size retrieval biases are often opposite to the biases found over oceans. Ultimately, the proposed approach shows potential for improving the accuracy of both old and new satellite datasets. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
23. Efficient microservices offloading for cost optimization in diverse MEC cloud networks.
- Author
-
Mahesar, Abdul Rasheed, Li, Xiaoping, and Sajnani, Dileep Kumar
- Subjects
MOBILE computing ,EDGE computing ,ARCHITECTURAL style ,MOBILE apps ,CLOUD computing - Abstract
In recent years, mobile applications have proliferated across domains such as E-banking, Augmented Reality, E-Transportation, and E-Healthcare. These applications are often built using microservices, an architectural style where the application is composed of independently deployable services focusing on specific functionalities. Mobile devices cannot process these microservices locally, so traditionally, cloud-based frameworks using cost-efficient Virtual Machines (VMs) and edge servers have been used to offload these tasks. However, cloud frameworks suffer from extended boot times and high transmission overhead, while edge servers have limited computational resources. To overcome these challenges, this study introduces a Microservices Container-Based Mobile Edge Cloud Computing (MCBMEC) environment and proposes an innovative framework, Optimization Task Scheduling and Computational Offloading with Cost Awareness (OTSCOCA). This framework addresses Resource Matching, Task Sequencing, and Task Scheduling to enhance server utilization, reduce service latency, and improve service bootup times. Empirical results validate the efficacy of MCBMEC and OTSCOCA, demonstrating significant improvements in server efficiency, reduced service latency, faster service bootup times, and notable cost savings. These outcomes underscore the pivotal role of these methodologies in advancing mobile edge computing applications amidst the challenges of edge server limitations and traditional cloud-based approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
24. ZTCloudGuard: Zero Trust Context-Aware Access Management Framework to Avoid Medical Errors in the Era of Generative AI and Cloud-Based Health Information Ecosystems.
- Author
-
Al-hammuri, Khalid, Gebali, Fayez, and Kanan, Awos
- Subjects
- *
MACHINE learning , *GENERATIVE artificial intelligence , *HEALTH information systems , *LANGUAGE models , *MEDICAL errors , *TELEMEDICINE - Abstract
Managing access between large numbers of distributed medical devices has become a crucial aspect of modern healthcare systems, enabling the establishment of smart hospitals and telehealth infrastructure. However, as telehealth technology continues to evolve and Internet of Things (IoT) devices become more widely used, they are also increasingly exposed to various types of vulnerabilities and medical errors. In healthcare information systems, about 90% of vulnerabilities emerge from medical error and human error. As a result, there is a need for additional research and development of security tools to prevent such attacks. This article proposes a zero-trust-based context-aware framework for managing access to the main components of the cloud ecosystem, including users, devices, and output data. The main goal and benefit of the proposed framework is to build a scoring system to prevent or alleviate medical errors while using distributed medical devices in cloud-based healthcare information systems. The framework has two main scoring criteria to maintain the chain of trust. First, it proposes a critical trust score based on cloud-native microservices for authentication, encryption, logging, and authorizations. Second, a bond trust scoring system is created to assess the real-time semantic and syntactic analysis of attributes stored in a healthcare information system. The analysis is based on a pre-trained machine learning model that generates the semantic and syntactic scores. The framework also takes into account regulatory compliance and user consent in the creation of the scoring system. The advantage of this method is that it applies to any language and adapts to all attributes, as it relies on a language model, not just a set of predefined and limited attributes. The results show a high F1 score of 93.5%, which proves that it is valid for detecting medical errors. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
25. QoS and reliability aware matched bald eagle task scheduling framework based on IoT-cloud in educational applications.
- Author
-
Chowdhary, Sunil Kumar and Rao, A. L. N.
- Subjects
- *
VIRTUAL machine systems , *BALD eagle , *PRODUCTION scheduling , *SCHEDULING , *INTERNET of things - Abstract
Cloud computing is a popular paradigm that enables on-demand access to shared resources over the internet. Task scheduling is an important aspect of cloud computing that involves allocating resources to tasks in an efficient manner. The rapid growth of cloud computing has led to an increasing demand for efficient task scheduling algorithms. In cloud computing, task scheduling is critical for achieving high performance and resource utilization, while minimizing costs. However, traditional task scheduling algorithms often struggle to handle the intricacy and fluctuation in cloud computing environments. Therefore, a novel task scheduling framework called the Matched Bald Eagle (MABLE) task scheduling framework is proposed for cloud computing to schedule tasks on Virtual Machines (VMs) in a cloud environment. The proposed framework consists of three major phases: matching, sorting and scheduling. The matching phase identifies the most suitable VMs for each task, while the sorting phase prioritizes the tasks based on their requirements and the types of VMs available. Finally, the scheduling phase uses the Enhanced Bald Eagle Optimization (EBEO) algorithm to schedule tasks on the chosen VMs. The MABLE technique is simulated and its performance is compared with existing methods in terms of load balance, cost, resource utilization and makespan under two different scenarios. The outcomes demonstrate that the MABLE method outperforms existing techniques and is an efficient task scheduling framework for cloud computing. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
26. Efficient fog node placement using nature-inspired metaheuristic for IoT applications.
- Author
-
Naouri, Abdenacer, Nouri, Nabil Abdelkader, Khelloufi, Amar, Sada, Abdelkarim Ben, Ning, Huansheng, and Dhelim, Sahraoui
- Subjects
- *
INTELLIGENT networks , *NETWORK performance , *QUALITY of service , *COMPUTATIONAL complexity , *ALGORITHMS - Abstract
Managing the explosion of data from the edge to the cloud requires intelligent supervision, such as fog node deployments, which is an essential task to assess network operability. To ensure network operability, the deployment process must be carried out effectively regarding two main factors: connectivity and coverage. The network connectivity is based on fog node deployment, which determines the network's physical topology, while the coverage determines the network accessibility. Both have a significant impact on network performance and guarantee the network quality of service. Determining an optimum fog node deployment method that minimizes cost, reduces computation and communication overhead, and provides a high degree of network connection coverage is extremely hard. Therefore, maximizing coverage and preserving network connectivity is a non-trivial problem. In this paper, we propose a fog deployment algorithm that can effectively connect the fog nodes and cover all edge devices. Firstly, we formulate fog deployment as an instance of multi-objective optimization problems with a large search space. Then, we leverage the Marine Predator Algorithm (MPA) to tackle the deployment problem and prove that MPA is well-suited for fog node deployment due to its rapid convergence and low computational complexity, compared to other population-based algorithms. Finally, we evaluate the proposed algorithm on a different benchmark of generated instances with various fog scenario configurations. Our algorithm outperforms state-of-the-art methods, providing promising results for optimal fog node deployment. It demonstrates a 50% performance improvement compared to other algorithms, aligning with the No Free Lunch (NFL) Theorem's assertion that no algorithm has a universal advantage across all problem domains. This underscores the significance of selecting tailored algorithms based on specific problem characteristics. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
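The two deployment factors named above, coverage of edge devices and connectivity among fog nodes, can be scored for a candidate placement as in the minimal sketch below. The MPA search itself is not shown, and the radii and coordinates are illustrative assumptions.

```python
# Score a candidate fog-node placement on coverage and connectivity.
import numpy as np

def evaluate_placement(fog_xy, device_xy, cover_r=25.0, link_r=60.0):
    """Return (coverage fraction, are all fog nodes connected?)."""
    # Coverage: fraction of devices within cover_r of at least one fog node.
    d = np.linalg.norm(device_xy[:, None, :] - fog_xy[None, :, :], axis=2)
    coverage = float(np.mean(d.min(axis=1) <= cover_r))

    # Connectivity: simple graph traversal over fog nodes within link_r.
    n = len(fog_xy)
    adj = np.linalg.norm(fog_xy[:, None, :] - fog_xy[None, :, :], axis=2) <= link_r
    seen, stack = {0}, [0]
    while stack:
        i = stack.pop()
        for j in np.flatnonzero(adj[i]):
            if j not in seen:
                seen.add(j)
                stack.append(int(j))
    return coverage, len(seen) == n

rng = np.random.default_rng(1)
fog = rng.uniform(0, 100, size=(5, 2))        # candidate fog node positions
devices = rng.uniform(0, 100, size=(200, 2))  # edge devices to cover
print(evaluate_placement(fog, devices))
```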
27. A background-based new scheduling approach for scheduling the IoT network task with data storage in cloud environment.
- Author
-
Shakya, Santosh and Tripathi, Priyanka
- Subjects
- *
VIRTUAL machine systems , *CLOUD storage , *ENERGY consumption , *DATA warehousing , *CLOUD computing - Abstract
Cloud computing is very popular due to its unique features, such as scalability, flexibility, on-demand service, and security. A competent task scheduler is necessary to boost the efficiency of a cloud system, which executes several tasks at once. With the incorporation of cloud computing, the Internet of Things (IoT) has seen tremendous improvements recently. IoT devices produce various data sets they want to store in particular places on the cloud. Data and resources may be spread across several locations and accessed from the VM through suitable technology. Cloud computing has changed how resources are used, stored, and shared for industrial applications, including data, services, and applications. The main goal of Virtual Machine Placement (VMP) is to map Virtual Machines (VMs) to Physical Machines (PMs), allowing the VMs to be used to their fullest potential without interfering with the PMs that are currently running. The VM performs the selected task and stores the data in a cloud location. Recently, many algorithms have addressed only VM selection or VM migration; in contrast, we also address the data storage related to a particular IoT device in the cloud. Our approach considerably lowers energy usage and offers a list of live VM migrations that must be completed to obtain the best solution and store the data on the cloud in a linked location. The authors present a novel scheduling technique that outperforms other widely recognized scheduling algorithms in terms of load balancing and, specifically, the quality of service parameters. In this research, we propose a Background-based task scheduling (BBTS) algorithm to decrease energy usage and boost throughput while choosing a light VM for any activity. Furthermore, the proposed approach is compared to various task scheduling methodologies, considering multiple performance metrics such as makespan time, resource utilization, success rate, and computation time. The evaluation is conducted on a task set ranging from 100 to 1000 tasks, with makespan time ranging from 55 to 654, resource utilization ranging from 35.45 to 42.13, success rate ranging from 88.63 to 96.48, and computation time ranging from 39.12 to 529.46. These metrics are compared with those of the corresponding algorithms (IWC, WOA, and ALO) used in this research study. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
28. Review of the Usage of Cloud Technologies in the Covid-19 Pandemic Period.
- Author
-
Velinov, Aleksandar, Nikolova, Aleksandra, and Zdravev, Zoran
- Subjects
- *
COVID-19 pandemic , *CLOUD computing , *INFORMATION technology industry - Abstract
Cloud technologies have a huge impact in the IT sphere. With their appearance, they made a significant contribution in various fields. They also provided quick, easy and cost-effective access to services that significantly improved the availability of IT resources for companies and organizations. This was especially evident during the period of the Covid-19 pandemic. During this period, a large number of organizations experienced the benefits offered by the cloud. Many of them migrated their applications to the cloud in order to cope with the increasing number of requests and the insufficient resources of their servers. This paper presents a review of the usage of cloud technologies in different areas during the Covid-19 pandemic. [ABSTRACT FROM AUTHOR]
- Published
- 2024
29. Multidimensional Scenario Calculations Using Cloud‐Based Co‐Simulation.
- Author
-
Wack, Thorsten and Schröder, Andreas
- Subjects
- *
FACTORIES , *CARBON emissions , *SYSTEM integration , *FACTORY design & construction , *MATHEMATICAL models - Abstract
As part of the joint research project Carbon2Chem®, mathematical models are used to simulate operating scenarios of cross‐industry networks, in particular chemical plants in industrial environments with high CO2 emissions. In this context, a cloud‐based distributed co‐simulation is used to enable the parallelized simulation of a large number of different variants. This enables optimal design of the plant networks and the efficient determination of optimal operating conditions. The architecture of this approach is described, and its performance is examined and evaluated. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
30. Surface solar radiation under different cloud conditions in Xinjiang and surrounding "Belt and Road" regions.
- Author
-
孙琳琳, 刘 琼, 黄 观, 陈勇航, 魏 鑫, 郭玉琳, 张太西, 高天一, and 许赟红
- Subjects
SOLAR radiation ,SOLAR surface ,SOLAR oscillations ,EARTH stations ,CLOUDINESS - Abstract
- Published
- 2024
- Full Text
- View/download PDF
31. Efficient Collaborative Edge Computing for Vehicular Network Using Clustering Service.
- Author
-
Al-Allawee, Ali, Lorenz, Pascal, and Munther, Alhamza
- Subjects
VEHICULAR ad hoc networks ,EDGE computing ,CLOUD computing ,DATA privacy ,BANDWIDTHS - Abstract
Internet of Vehicles applications are known to be critical and time-sensitive. The value proposition of edge computing comprises its lower latency, advantageous bandwidth consumption, privacy, management, efficiency of treatments, and mobility, which aim to improve vehicular and traffic services. Success stories have been observed between IoV and edge computing in supporting smooth mobility and the use of local resources. However, vehicle travel, especially due to high-speed movement and intersections, can result in IoV devices losing connection and/or processing with high latency. This paper proposes a Cluster Collaboration Vehicular Edge Computing (CCVEC) framework that aims to guarantee and enhance the connectivity between vehicle sensors and the cloud by utilizing the edge computing paradigm in the middle. The objectives are achieved by utilizing the cluster management strategies deployed between cloud and edge computing servers. The framework is implemented in OpenStack cloud servers and evaluated by measuring the throughput, latency, and memory parameters in two different scenarios. The results obtained show promising indications in terms of latency (approximately 390 ms of the ideal status) and throughput (30 kB/s) values, and thus appear acceptable in terms of performance as well as memory. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
32. Adaptive Multi-Objective Resource Allocation for Edge-Cloud Workflow Optimization Using Deep Reinforcement Learning.
- Author
-
Lahza, Husam, B R, Sreenivasa, Lahza, Hassan Fareed M., and J, Shreyas
- Subjects
DEEP reinforcement learning ,REINFORCEMENT learning ,ENERGY conservation ,COST control ,URBAN growth - Abstract
This study investigates the transformative impact of smart intelligence, leveraging the Internet of Things and edge-cloud platforms in smart urban development. Smart urban development, by integrating diverse digital technologies, generates substantial data crucial for informed decision-making in disaster management and effective urban well-being. The edge-cloud platform, with its dynamic resource allocation, plays a crucial role in prioritizing tasks, reducing service delivery latency, and ensuring critical operations receive timely computational power, thereby improving urban services. However, current methods have struggled to meet the strict quality of service (QoS) requirements of complex workflow applications. In this study, these shortcomings in the edge-cloud are addressed by introducing a multi-objective resource optimization (MORO) scheduler for diverse urban setups. This scheduler, with its emphasis on granular task prioritization and consideration of diverse makespan, cost, and energy constraints, underscores the complexity of the task and the need for a sophisticated solution. The multi-objective makespan–energy optimization is achieved by employing a deep reinforcement learning (DRL) model. The simulation results indicate consistent improvements, with average makespan enhancements of 31.6% and 70.09%, average cost reductions of 62.64% and 73.24%, and average energy consumption reductions of 25.02% and 17.77%, respectively, by MORO over reliability enhancement strategies for workflow scheduling (RESWS) and multi-objective priority workflow scheduling (MOPWS) for the SIPHT workflow. Similarly, consistent improvements, with average makespan enhancements of 37.98% and 74.44%, average cost reductions of 65.53% and 74.89%, and average energy consumption reductions of 29.52% and 24.73%, respectively, were observed by MORO over RESWS and MOPWS for the CyberShake workflow, highlighting the proposed model's efficiency gains. These findings substantiate the model's potential to enhance computational efficiency, reduce costs, and improve energy conservation in real-world smart urban scenarios. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
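The entry above frames edge-cloud workflow scheduling as a multi-objective (makespan, cost, energy) problem solved with DRL. Below is a minimal sketch of one common way to fold such objectives into a single scalar reward for a DRL agent; the weights and normalisation constants are illustrative assumptions, not values from the MORO paper.

```python
def scalar_reward(makespan_s: float, cost_usd: float, energy_kwh: float,
                  w_time: float = 0.4, w_cost: float = 0.3, w_energy: float = 0.3,
                  ref_time: float = 3600.0, ref_cost: float = 10.0,
                  ref_energy: float = 5.0) -> float:
    """Weighted negative sum of normalised objectives: higher reward = better schedule.

    The ref_* values are assumed normalisation scales so that the three
    terms are comparable in magnitude.
    """
    return -(w_time * makespan_s / ref_time
             + w_cost * cost_usd / ref_cost
             + w_energy * energy_kwh / ref_energy)

# Example: a schedule finishing in 30 min, costing $4, using 2 kWh
print(scalar_reward(1800, 4.0, 2.0))   # -> -0.44
```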
33. MobileNet based secured compliance through open web application security projects in cloud system.
- Author
-
Vallabhaneni, Rohith, Vaddadi, Srinivas A, Vadakkethil Somanathan Pillai, Sanjaikanth E., Addula, Santosh Reddy, and Ananthan, Bhuvanesh
- Subjects
VIRTUAL machine systems ,WEB-based user interfaces ,WEBSITES ,CYBERTERRORISM ,DATA integrity - Abstract
Sophisticated cyber-attacks on all kinds of organizations and applications are among the most daunting issues faced worldwide. The development of cloud computing has pushed organizations to shift their business to virtual machines in the cloud. However, the lack of security at the programmatic and declarative levels leaves cloud platforms prone to cyber-attacks. The exploitation of web pages and the cloud stems from unaddressed Open Web Application Security Project (OWASP) vulnerabilities and from weaknesses in cloud containers and network resources. Using advanced hacking vectors, attackers target data integrity, confidentiality, and availability. Hence, an application-security technique for reducing such attacks is indispensable. To this end, we propose a novel deep-learning-based secured advanced web application firewall to compensate for the missing programmatic- and declarative-level security in the application. For this, we adopt a MobileNet-based technique to provide security assurance. Simulations are carried out, robustness is analyzed with statistical parameters such as accuracy, precision, sensitivity, and specificity, and a comparative study with existing works is conducted. Our proposed technique surpasses all the other techniques and provides better security in the cloud. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
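To make the MobileNet-based idea in the entry above concrete, the sketch below builds a small Keras classifier with a MobileNet backbone for a benign/attack decision. It assumes HTTP requests have already been encoded as 224x224x3 tensors; that encoding step, the class count, and the training setup are assumptions rather than the paper's exact pipeline.

```python
import tensorflow as tf

def build_request_classifier(num_classes: int = 2) -> tf.keras.Model:
    """MobileNet backbone with a small dense head for benign/attack classification.

    Assumes each request has been encoded as a 224x224x3 array (e.g. byte
    values reshaped into an image-like tensor); that encoding is not shown.
    """
    backbone = tf.keras.applications.MobileNet(
        input_shape=(224, 224, 3), include_top=False, weights=None)
    x = tf.keras.layers.GlobalAveragePooling2D()(backbone.output)
    out = tf.keras.layers.Dense(num_classes, activation="softmax")(x)
    model = tf.keras.Model(backbone.input, out)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_request_classifier()
model.summary()
```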
34. Measuring the Effectiveness of Carbon-Aware AI Training Strategies in Cloud Instances: A Confirmation Study.
- Author
-
Vergallo, Roberto and Mainetti, Luca
- Subjects
NATURAL language processing ,ARTIFICIAL intelligence ,COMPUTER vision ,CARBON emissions ,APPLICATION software - Abstract
While the massive adoption of Artificial Intelligence (AI) threatens the environment, new research efforts are beginning to measure and mitigate the carbon footprint of both the training and inference phases. In this domain, two carbon-aware training strategies have been proposed in the literature: Flexible Start and Pause & Resume. Such strategies, which are natively cloud-based, use time as a resource to postpone or pause the training algorithm when the carbon intensity reaches a threshold. While these strategies have achieved interesting results on a benchmark of modern models covering Natural Language Processing (NLP) and computer vision applications and a wide range of model sizes (up to 6.1B parameters), it is still unclear whether such results also hold with different algorithms and in different geographical regions. In this confirmation study, we use the same methodology as the state-of-the-art strategies to recompute the savings in carbon emissions of Flexible Start and Pause & Resume in the Anomaly Detection (AD) domain. Results confirm their effectiveness under two specific conditions, but the percentage reduction behaves differently from what is stated in the existing literature. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
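The Pause & Resume strategy in the entry above can be sketched as a training loop that checks a carbon-intensity signal between epochs and waits while intensity is above a threshold. get_carbon_intensity, the 300 gCO2eq/kWh threshold, and the polling interval are placeholders; a real deployment would query an external carbon-intensity service instead.

```python
import time
import random

def get_carbon_intensity() -> float:
    """Placeholder: return current grid carbon intensity in gCO2eq/kWh.
    A real system would query an external carbon-intensity API."""
    return random.uniform(100, 500)

def train_one_epoch(epoch: int) -> None:
    print(f"training epoch {epoch} ...")   # stand-in for the real training step

def pause_and_resume_training(epochs: int, threshold: float = 300.0,
                              poll_seconds: float = 1.0) -> None:
    for epoch in range(epochs):
        # Pause & Resume: wait until the grid is "clean enough" before each epoch
        while get_carbon_intensity() > threshold:
            print("carbon intensity above threshold, pausing ...")
            time.sleep(poll_seconds)
        train_one_epoch(epoch)

if __name__ == "__main__":
    pause_and_resume_training(epochs=3)
```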
35. Relative Humidity Correction Method of Microwave Radiometer Combined with Cloud Radar
- Author
-
Zhang Ting, Jiao Zhimin, Mao Jiajia, Zhang Xuefen, Wang Yanfei, Chen Peiyu, and Jin Long
- Subjects
microwave radiometer ,millimeter-wave cloud radar ,relative humidity ,cloud ,Meteorology. Climatology ,QC851-999 - Abstract
The microwave radiometer can detect and retrieve temperature and humidity profiles with high spatial and temporal resolution throughout the day. However, microwave radiometers have few detection frequencies in the middle and upper layers, making them easily affected by clouds. Even after integrating cloud information into brightness temperature data, the improvement in detection accuracy in the middle and upper layers remains insufficient, failing to meet the accuracy standards required for relative humidity. With the construction of a national ground-based remote sensing vertical profile system, continuous observation by cloud radar and microwave radiometer at the same site has been achieved, enhancing the spatial and temporal resolution. Combined with the relationship between humidity and cloud formation, a comprehensive quality control method for relative humidity using cloud radar and microwave radiometer is proposed. It plays a crucial role in enhancing the accuracy of the microwave radiometer humidity profile under cloudy conditions. By analyzing the characteristic relationship between the cloud radar reflectivity factor and the radiosonde relative humidity, a piecewise correction method for microwave radiometer relative humidity combined with cloud radar is proposed. Error correction results are analyzed using radiosonde and ERA5 data. They show a positive linear correlation between relative humidity and the reflectivity factor, that the relative humidity in the middle of the cloud region is approximately saturated, and that the relative humidity variation with height is approximately symmetric about a certain height between the cloud entry and cloud exit regions. Under stratiform cloud conditions, the root mean square error of the microwave radiometer relative humidity decreases by 7.99% and 8.91% when compared with radiosonde and ERA5, and the absolute value of the median deviation decreases by 12.62% and 13.05%, respectively. Further investigation indicates the method is also effective under convective cloud conditions, but the corrected relative humidity in the cloud region is larger than that of the sounding and ERA5, and the median deviation changes from negative to positive. Therefore, the piecewise relative humidity correction method combining cloud radar can realize continuous real-time correction of the microwave radiometer relative humidity profile, which partly improves the observation quality of the microwave radiometer under cloudy conditions.
- Published
- 2024
- Full Text
- View/download PDF
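The piecewise correction described above ties radiometer relative humidity inside radar-detected cloud layers to the reflectivity factor and keeps it near saturation mid-cloud. The sketch below is only schematic: the linear coefficients, the saturation cap, and the example profile are assumptions, not the coefficients fitted in the paper.

```python
import numpy as np

def correct_rh_profile(rh_percent: np.ndarray, reflectivity_dbz: np.ndarray,
                       in_cloud: np.ndarray) -> np.ndarray:
    """Piecewise RH correction: keep the radiometer value outside cloud;
    inside cloud nudge RH upward with reflectivity and cap near saturation.

    The coefficients (85 % base, 0.8 %/dBZ slope, 98 % cap) are assumed
    for illustration only.
    """
    rh = rh_percent.astype(float).copy()
    cloudy = in_cloud.astype(bool)
    corrected = 85.0 + 0.8 * reflectivity_dbz[cloudy]      # assumed linear relation
    rh[cloudy] = np.clip(np.maximum(rh[cloudy], corrected), 0.0, 98.0)
    return rh

# One profile: indices 3-5 flagged as in-cloud by the radar
rh = np.array([60, 70, 75, 80, 82, 78, 55], dtype=float)
dbz = np.array([0, 0, 0, -5, 5, 10, 0], dtype=float)
cloud_mask = np.array([0, 0, 0, 1, 1, 1, 0])
print(correct_rh_profile(rh, dbz, cloud_mask))
```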
36. Adaptive Multi-Objective Resource Allocation for Edge-Cloud Workflow Optimization Using Deep Reinforcement Learning
- Author
-
Husam Lahza, Sreenivasa B R, Hassan Fareed M. Lahza, and Shreyas J
- Subjects
cloud ,cost ,energy ,optimization ,task execution ,time ,Engineering design ,TA174 - Abstract
This study investigates the transformative impact of smart intelligence, leveraging the Internet of Things and edge-cloud platforms in smart urban development. Smart urban development, by integrating diverse digital technologies, generates substantial data crucial for informed decision-making in disaster management and effective urban well-being. The edge-cloud platform, with its dynamic resource allocation, plays a crucial role in prioritizing tasks, reducing service delivery latency, and ensuring critical operations receive timely computational power, thereby improving urban services. However, current methods have struggled to meet the strict quality-of-service (QoS) requirements of complex workflow applications. In this study, these shortcomings in edge-cloud are addressed by introducing a multi-objective resource optimization (MORO) scheduler for diverse urban setups. This scheduler, with its emphasis on granular task prioritization and its consideration of diverse makespan, cost, and energy constraints, underscores the complexity of the task and the need for a sophisticated solution. The multi-objective makespan–energy optimization is achieved by employing a deep reinforcement learning (DRL) model. The simulation results indicate consistent improvements by MORO over reliability enhancement strategies for workflow scheduling (RESWS) and multi-objective priority workflow scheduling (MOPWS) for the SIPHT workflow, with average makespan enhancements of 31.6% and 70.09%, average cost reductions of 62.64% and 73.24%, and average energy consumption reductions of 25.02% and 17.77%, respectively. Similarly, MORO achieves consistent improvements over RESWS and MOPWS for the CyberShake workflow, with average makespan enhancements of 37.98% and 74.44%, average cost reductions of 65.53% and 74.89%, and average energy consumption reductions of 29.52% and 24.73%, respectively, highlighting the proposed model's efficiency gains. These findings substantiate the model's potential to enhance computational efficiency, reduce costs, and improve energy conservation in real-world smart urban scenarios.
- Published
- 2024
- Full Text
- View/download PDF
37. Efficient Collaborative Edge Computing for Vehicular Network Using Clustering Service
- Author
-
Ali Al-Allawee, Pascal Lorenz, and Alhamza Munther
- Subjects
cloud ,edge computing ,vehicle ,clustering ,VEC ,Computer engineering. Computer hardware ,TK7885-7895 ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
Internet of Vehicles applications are known to be critical and time-sensitive. The value proposition of edge computing lies in its lower latency, reduced bandwidth consumption, privacy, manageability, processing efficiency, and mobility support, all of which aim to improve vehicular and traffic services. Success stories combining IoV and edge computing have demonstrated support for smooth mobility and the use of local resources. However, vehicle travel, especially high-speed movement and intersections, can cause IoV devices to lose connectivity or to process data with high latency. This paper proposes a Cluster Collaboration Vehicular Edge Computing (CCVEC) framework that aims to guarantee and enhance connectivity between vehicle sensors and the cloud by placing the edge computing paradigm in the middle. The objectives are achieved through cluster management strategies deployed between the cloud and edge computing servers. The framework is implemented on OpenStack cloud servers and evaluated by measuring throughput, latency, and memory in two different scenarios. The results show promising indications in terms of latency (approximately 390 ms in the ideal case) and throughput (30 kB/s), and the framework thus appears acceptable in terms of performance as well as memory usage.
- Published
- 2024
- Full Text
- View/download PDF
38. Efficient microservices offloading for cost optimization in diverse MEC cloud networks
- Author
-
Abdul Rasheed Mahesar, Xiaoping Li, and Dileep Kumar Sajnani
- Subjects
Mobile edge computing ,Cloud ,Task scheduling ,Microservices ,Optimization ,Container ,Computer engineering. Computer hardware ,TK7885-7895 ,Information technology ,T58.5-58.64 ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
In recent years, mobile applications have proliferated across domains such as E-banking, Augmented Reality, E-Transportation, and E-Healthcare. These applications are often built using microservices, an architectural style where the application is composed of independently deployable services focusing on specific functionalities. Mobile devices cannot process these microservices locally, so traditionally, cloud-based frameworks using cost-efficient Virtual Machines (VMs) and edge servers have been used to offload these tasks. However, cloud frameworks suffer from extended boot times and high transmission overhead, while edge servers have limited computational resources. To overcome these challenges, this study introduces a Microservices Container-Based Mobile Edge Cloud Computing (MCBMEC) environment and proposes an innovative framework, Optimization Task Scheduling and Computational Offloading with Cost Awareness (OTSCOCA). This framework addresses Resource Matching, Task Sequencing, and Task Scheduling to enhance server utilization, reduce service latency, and improve service bootup times. Empirical results validate the efficacy of MCBMEC and OTSCOCA, demonstrating significant improvements in server efficiency, reduced service latency, faster service bootup times, and notable cost savings. These outcomes underscore the pivotal role of these methodologies in advancing mobile edge computing applications amidst the challenges of edge server limitations and traditional cloud-based approaches.
- Published
- 2024
- Full Text
- View/download PDF
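The entry above chooses where to run each microservice so that both latency and cost stay acceptable. The toy placement rule below illustrates that trade-off under assumed prices, speed-ups, and transfer times; it is a sketch of the general idea, not the OTSCOCA framework itself.

```python
from dataclasses import dataclass

@dataclass
class Site:
    name: str
    price_per_s: float   # assumed cost of compute on this site ($/s)
    speedup: float       # relative processing speed vs the mobile device
    rtt_s: float         # assumed network round-trip / transfer time (s)

def place_microservice(local_runtime_s: float, sites: list[Site],
                       latency_budget_s: float) -> str:
    """Pick the cheapest site whose total completion time fits the latency budget."""
    feasible = []
    for s in sites:
        total_time = local_runtime_s / s.speedup + s.rtt_s
        if total_time <= latency_budget_s:
            cost = s.price_per_s * local_runtime_s / s.speedup
            feasible.append((cost, s.name))
    return min(feasible)[1] if feasible else "run-locally"

sites = [Site("edge-container", 0.0004, 4.0, 0.02),
         Site("cloud-vm", 0.0001, 10.0, 0.25)]
print(place_microservice(local_runtime_s=2.0, sites=sites, latency_budget_s=0.6))
```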
39. Context-aware resource allocation for IoRT-aware business processes based on decentralized multi-agent reinforcement learning.
- Author
-
Fattouch, Najla, Ben Lahmar, Imen, and Boukadi, Khouloud
- Abstract
In Industry 4.0 (I4.0), IoRT-aware Business Processes aim to automate classic Business Processes (BPs) by integrating IoT and robotics capabilities. However, executing these processes inside the enterprise may be costly due to the required software and hardware components. To overcome this deficiency, the Business Process Outsourcing (BPO) strategy can be used to execute an IoRT-aware BP in external environments such as the Fog and the Cloud. In these environments, heterogeneous resources may have different specifications, which makes allocating Fog and Cloud resources challenging. Therefore, this work addresses the resource allocation (RA) issue for outsourcing an IoRT-aware BP. Toward this objective, we propose an optimal context-aware RA approach based on the decentralized multi-agent reinforcement learning (MARL) technique. The effectiveness and feasibility of the proposed context-aware, decentralized MARL-based RA approach are demonstrated through a set of experiments. The preliminary experimental evaluation demonstrates a high precision value and an encouraging recall value compared with other RA approaches. [ABSTRACT FROM AUTHOR]
- Published
- 2025
- Full Text
- View/download PDF
40. Secure and efficient content-based image retrieval using dominant local patterns and watermark encryption in cloud computing.
- Author
-
Sucharitha, G., Godavarthi, Deepthi, Ramesh, Janjhyam Venkata Naga, and Khan, M. Ijaz
- Subjects
- *
CONTENT-based image retrieval , *IMAGE encryption , *DATABASES , *IMAGE retrieval , *IMAGE databases - Abstract
The relevance of images in people's daily lives is growing, and content-based image retrieval (CBIR) has received a lot of attention in research. Images are much better at communicating information than text documents. This paper deals with the security and efficient retrieval of images in the cloud based on texture features extracted by the dominant local patterns of an image. We propose a method that supports secure and efficient image retrieval over the cloud. The images are encrypted with the watermark before the image database is deployed to the cloud; this prevents the outflow of sensitive information to the cloud. A reduced-dimension feature vector database is created for all the images using relative directional edge patterns (RDEP), facilitating efficient storage and retrieval. The significance of the specified local pattern for effectively extracting texture information has been demonstrated. A notable level of accuracy has been established when compared to existing algorithms in terms of precision and recall. Additionally, a watermark-based system is proposed to prevent unauthorized query users from illicitly copying and distributing the retrieved images to others. A unique watermark is embedded into the image by the encryption module before it is stored in the cloud. Hence, when a copy of an image is discovered, watermark extraction can be used to track down the query user who illegally circulated the image. The proposed method's significance is assessed by comparing it to other existing feature extractors incorporating watermark encryption. Additionally, the effectiveness of the method is demonstrated across various numbers of watermark bits. Trials and security analyses affirm that the suggested approach is both robust and efficient. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
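The entry above embeds a user-specific watermark into each image before encryption and upload so that leaked copies can be traced back to the query user. The sketch below shows a generic least-significant-bit embedding and extraction; it is a stand-in illustration, not the paper's RDEP features or its specific watermark and encryption scheme.

```python
import numpy as np

def embed_watermark_lsb(image: np.ndarray, bits: list[int]) -> np.ndarray:
    """Write the watermark bits into the least-significant bit of the first pixels."""
    flat = image.astype(np.uint8).copy().ravel()
    for i, b in enumerate(bits):
        flat[i] = (flat[i] & 0xFE) | (b & 1)
    return flat.reshape(image.shape)

def extract_watermark_lsb(image: np.ndarray, n_bits: int) -> list[int]:
    """Read the watermark bits back from the first pixels."""
    return [int(v & 1) for v in image.ravel()[:n_bits]]

img = np.random.randint(0, 256, size=(8, 8), dtype=np.uint8)
wm = [1, 0, 1, 1, 0, 1, 0, 0]
marked = embed_watermark_lsb(img, wm)
assert extract_watermark_lsb(marked, len(wm)) == wm
print("watermark recovered:", extract_watermark_lsb(marked, len(wm)))
```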
41. A novel hybrid Artificial Gorilla Troops Optimizer with Honey Badger Algorithm for solving cloud scheduling problem.
- Author
-
Hussien, Abdelazim G., Chhabra, Amit, Hashim, Fatma A., and Pop, Adrian
- Subjects
- *
UBIQUITOUS computing , *COMPUTER systems , *CLOUD computing , *NUMBER systems , *PRODUCTION scheduling , *METAHEURISTIC algorithms , *HEURISTIC algorithms - Abstract
Cloud computing has revolutionized the way a variety of ubiquitous computing resources are provided to users with ease and on a pay-per-usage basis. The task scheduling problem is an important challenge, which involves assigning resources to users' Bag-of-Tasks applications in a way that maximizes either system provider or user performance, or both. With the increase in system size and the number of applications, the Bag-of-Tasks scheduling (BoTS) problem becomes more complex due to the expansion of the search space. Such a problem falls in the category of NP-hard optimization challenges, which are often effectively tackled by metaheuristics. However, standalone metaheuristics generally suffer from certain deficiencies that affect their searching efficiency, resulting in deteriorated final performance. This paper aims to introduce an optimal hybrid metaheuristic algorithm by leveraging the strengths of both the Artificial Gorilla Troops Optimizer (GTO) and the Honey Badger Algorithm (HBA) to find an approximate scheduling solution for the BoTS problem. While the original GTO has demonstrated effectiveness since its inception, it possesses limitations, particularly in addressing composite and high-dimensional problems. To address these limitations, this paper proposes a novel approach by introducing a new updating equation inspired by the HBA, specifically designed to enhance the exploitation phase of the algorithm. Through this integration, the goal is to overcome the drawbacks of the GTO and improve its performance in solving complex optimization problems. The initial performance of the GTOHBA algorithm, tested on the standard CEC2017 and CEC2022 benchmarks, shows significant improvement over the baseline metaheuristics. Later on, we applied the proposed GTOHBA to the BoTS problem using standard parallel workloads (CEA-Curie and HPC2N) to optimize makespan and energy objectives. The obtained outcomes of the proposed GTOHBA are compared to scheduling techniques based on well-known metaheuristics under the same experimental conditions using standard statistical measures and box plots. In the case of the CEA-Curie workloads, GTOHBA produced makespan and energy consumption reductions in the ranges of 8.12–22.76% and 6.2–18.00%, respectively, over the compared metaheuristics. For the HPC2N workloads, GTOHBA achieved 8.46–30.97% makespan reduction and 8.51–33.41% energy consumption reduction against the tested metaheuristics. In conclusion, the proposed hybrid metaheuristic algorithm provides a promising solution to the BoTS problem that can enhance the performance and efficiency of cloud computing systems. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
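The hybridisation described above swaps an HBA-inspired update into the GTO's exploitation phase. The sketch below illustrates the general pattern of such a hybrid, alternating an exploratory random move with an exploitative pull toward the best-so-far solution on a toy objective; the update rules and the sphere objective are simplified stand-ins, not the published GTO or HBA equations or a real scheduling cost model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x: np.ndarray) -> float:
    """Toy continuous objective standing in for a scheduling cost model."""
    return float(np.sum(x ** 2))

def hybrid_search(dim: int = 10, pop: int = 20, iters: int = 200,
                  lo: float = -5.0, hi: float = 5.0) -> float:
    """Population search alternating exploration and exploitation moves
    (simplified stand-ins for the GTO exploration and the HBA-inspired
    exploitation update)."""
    X = rng.uniform(lo, hi, size=(pop, dim))
    best = min(X, key=sphere).copy()
    for t in range(iters):
        w = 1.0 - t / iters                          # shrinking step size
        for i in range(pop):
            if rng.random() < 0.5:
                # exploration: random jump around the current solution
                cand = X[i] + w * rng.uniform(-1, 1, dim) * 0.1 * (hi - lo)
            else:
                # exploitation: move toward the best solution found so far
                cand = best + w * rng.normal(0, 0.1, dim) * (best - X[i])
            cand = np.clip(cand, lo, hi)
            if sphere(cand) < sphere(X[i]):
                X[i] = cand
                if sphere(cand) < sphere(best):
                    best = cand.copy()
    return sphere(best)

print("best objective found:", hybrid_search())
```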
42. CLOUD COMPUTING WEB-BASED LMS: INTERACTIVE LEARNING MEDIA TO ENHANCE STUDENTS' ACCOUNTING PARTICIPATION AND SKILLS
- Author
-
Said Nur Octavianto, Jarot Tri Bowo Santoso, Sandy Arief, and Satsya Yoga Baswara
- Subjects
cloud ,lms ,web ,participation ,skills ,partisipasi ,keterampilan ,Education ,Education (General) ,L7-991 ,Accounting. Bookkeeping ,HF5601-5689 - Abstract
ABSTRACT The low competitiveness of high school students can be attributed to the dearth of interactive learning media. This study aims to assess the efficacy of integrating LMS-based Cloud Computing Web Media with a scientific approach to enhance student engagement and accounting proficiency. Adopting a quantitative methodology with a quasi-experimental design, the study employs Class XII IPS 1 (the experimental class) and Class XII IPS 2 (the control class) as the sample set. Data were collected using HOTS-based multiple-choice pre-tests and post-tests, and analyzed with t-tests, including the Independent Samples t-test and the Paired Samples t-test, supported by the N-Gain test. This study asserts that LMS-based cloud computing web media integrated with a scientific approach is effective in increasing student participation and accounting skills. Future research may develop LMS-based cloud learning media presented in the form of a website to support online accounting practicum learning. ABSTRAK The low competitiveness of senior high school students stems from the limited use of interactive learning media. This study analyzes the effectiveness of using LMS-based Cloud Computing Web Media integrated with a scientific approach to improve students' participation and accounting skills. The study uses a quantitative approach with a quasi-experimental method. The sample comprises Class XII IPS 1 (experimental class) and Class XII IPS 2 (control class). Data were collected through HOTS-based multiple-choice pre-tests and post-tests and analyzed with t-tests (Independent Samples t-test and Paired Samples t-test) supported by the N-Gain test. The study finds that LMS-based Cloud Computing Web Media integrated with a scientific approach is effective in increasing students' participation and accounting skills. Future research may develop LMS-based cloud learning media delivered as a website to support online accounting practicum learning.
- Published
- 2024
- Full Text
- View/download PDF
43. The Future of Pain Medicine: Emerging Technologies, Treatments, and Education
- Author
-
Slitzky M, Yong RJ, Lo Bianco G, Emerick T, Schatman ME, and Robinson CL
- Subjects
virtual reality ,artificial intelligence ,wearables ,cloud ,neuromodulation. ,Medicine (General) ,R5-920 - Abstract
Matthew Slitzky,1 R Jason Yong,2 Giuliano Lo Bianco,3 Trent Emerick,4 Michael E Schatman,5,6 Christopher L Robinson2 1Burke Rehabilitation, Montefiore Health System, White Plains, NY, USA; 2Department of Anesthesiology, Perioperative, and Pain Medicine, Harvard Medical School, Brigham and Women’s Hospital, Boston, MA, USA; 3Anesthesiology and Pain Department, Fondazione Istituto G. Giglio Cefalù, Palermo, Italy; 4Department of Anesthesiology and Perioperative Medicine, Chronic Pain Division, University of Pittsburgh Medical Center, Pittsburgh, PA, USA; 5Department of Anesthesiology, Perioperative Care, and Pain Medicine, NYU Grossman School of Medicine, New York, NY, USA; 6Department of Population Health-Division of Medical Ethics, NYU Grossman School of Medicine, New York, NY, USA. Correspondence: Christopher L Robinson, Department of Anesthesiology, Perioperative, and Pain Medicine, Harvard Medical School, Brigham and Women’s Hospital, 75 Francis Street, Boston, MA 02118, USA. Email: crobinson48@bwh.harvard.edu
- Published
- 2024
44. ZTCloudGuard: Zero Trust Context-Aware Access Management Framework to Avoid Medical Errors in the Era of Generative AI and Cloud-Based Health Information Ecosystems
- Author
-
Khalid Al-hammuri, Fayez Gebali, and Awos Kanan
- Subjects
access management ,zero-trust ,distributed medical devices ,cloud ,health information system ,medical errors ,Electronic computers. Computer science ,QA75.5-76.95 - Abstract
Managing access between large numbers of distributed medical devices has become a crucial aspect of modern healthcare systems, enabling the establishment of smart hospitals and telehealth infrastructure. However, as telehealth technology continues to evolve and Internet of Things (IoT) devices become more widely used, they are also increasingly exposed to various types of vulnerabilities and medical errors. In healthcare information systems, about 90% of vulnerabilities emerge from medical error and human error. As a result, there is a need for additional research and development of security tools to prevent such attacks. This article proposes a zero-trust-based context-aware framework for managing access to the main components of the cloud ecosystem, including users, devices, and output data. The main goal and benefit of the proposed framework is to build a scoring system to prevent or alleviate medical errors while using distributed medical devices in cloud-based healthcare information systems. The framework has two main scoring criteria to maintain the chain of trust. First, it proposes a critical trust score based on cloud-native microservices for authentication, encryption, logging, and authorizations. Second, a bond trust scoring system is created to assess the real-time semantic and syntactic analysis of attributes stored in a healthcare information system. The analysis is based on a pre-trained machine learning model that generates the semantic and syntactic scores. The framework also takes into account regulatory compliance and user consent in the creation of the scoring system. The advantage of this method is that it applies to any language and adapts to all attributes, as it relies on a language model, not just a set of predefined and limited attributes. The results show a high F1 score of 93.5%, which proves that it is valid for detecting medical errors.
- Published
- 2024
- Full Text
- View/download PDF
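The two-score design described above can be sketched as a weighted combination of a critical trust score (how many baseline zero-trust controls a request satisfies) and a bond trust score (an ML-derived semantic/syntactic plausibility value), gated by a threshold. The weights and threshold below are illustrative assumptions, not the framework's calibrated values.

```python
def critical_trust_score(authenticated: bool, encrypted: bool,
                         logged: bool, authorized: bool) -> float:
    """Fraction of baseline zero-trust controls satisfied by this request."""
    checks = [authenticated, encrypted, logged, authorized]
    return sum(checks) / len(checks)

def access_decision(critical: float, bond: float,
                    w_critical: float = 0.6, w_bond: float = 0.4,
                    threshold: float = 0.75) -> str:
    """Weighted chain-of-trust score; grant access only above the threshold.
    Weights and threshold are illustrative assumptions."""
    score = w_critical * critical + w_bond * bond
    return "allow" if score >= threshold else "deny"

crit = critical_trust_score(True, True, True, True)
bond = 0.82   # e.g. a semantic/syntactic plausibility score from an ML model
print(access_decision(crit, bond))   # -> allow (0.6*1.0 + 0.4*0.82 = 0.928)
```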
45. Real-time collection of the functional parameters for a passive house
- Author
-
Eduard Nicolae Pătru, Petru Crăciun, Vladimir Tanasiev, Duong Minh Quan, and Le Xuan Chau
- Subjects
passive house ,sensors ,smart meter ,real-time data ,cloud ,Technology - Abstract
This paper explores the concept of real-time collection of functional parameters for passive houses and its significance in achieving energy efficiency and occupant comfort. In this case, both the specific parameters of the HVAC system and the electrical parameters are analyzed with high accuracy. In addition, the microgrid power system includes wind power, solar power, and batteries connected online to the grid to help ensure a continuous power supply for the house. The implemented solution draws on various technologies, sensors, and smart meters for collecting meaningful information in passive houses. It also discusses the role of data analytics and visualization techniques in providing a user-friendly interface. For straightforward analysis, all collected information is stored both in the cloud and on a personal server.
- Published
- 2024
- Full Text
- View/download PDF
46. Development and Implementation of a Cognitive Cloud Assistant for Optimal Cloud Service Provider Selection
- Author
-
Vaľko Dávid, Kapa Miroslav, Ádám Norbert, and Khorshidiyeh Heidar
- Subjects
aws ,cloud ,cloud assistant ,microsoft azure ,reference architectures ,Electrical engineering. Electronics. Nuclear engineering ,TK1-9971 - Abstract
This paper explores the design and implementation of a cognitive assistant powered by cloud computing. We begin by reviewing existing cloud service providers and assistant solutions. Building on this foundation, we define a set of requirements for our cognitive assistant and use them to guide the development process. To validate the assistant's functionality, we compare its outputs with established reference architectures.
- Published
- 2024
- Full Text
- View/download PDF
47. Evidence of linear relationships between clear‑sky indices in photosynthetically active radiation and broadband ranges
- Author
-
William Wandji Nyamsi, Yves-Marie Saint-Drenan, John A. Augustine, Antti Arola, and Lucien Wald
- Subjects
broadband irradiance ,clear‑sky ,clear‑sky index ,cloud ,photosynthetically active radiation ,radiative transfer simulations ,Meteorology. Climatology ,QC851-999 - Abstract
This study provides empirical relationships between photosynthetically active radiation (PAR) and broadband clear-sky indices at ground level for both the PAR global irradiance and its direct component. Once multiplied by the irradiance in clear-sky conditions, the clear-sky index yields the irradiance under cloudy conditions. The relationships are developed by means of radiative transfer simulations of various realistic atmospheric states including both ice and water cloud phases. For the direct component, the PAR clear-sky index is equal to the broadband clear-sky index. For global irradiance, several linear relationships are proposed depending on the availability of cloud properties, namely cloud phase and cloud optical depth. The developed relationships are validated numerically and experimentally using ground-based measurements from the SURFRAD network in the U.S.A. Overall, the squared correlation coefficient R² is close to 1.00 and the relative bias (relative root mean square error) is less than 3% (6%) in absolute value with respect to the means of the relevant measurements, demonstrating a high level of accuracy of the proposed relationships.
- Published
- 2024
- Full Text
- View/download PDF
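For reference, the clear-sky index discussed above is the ratio of the all-sky irradiance to its clear-sky counterpart, and the entry reports linear mappings between the PAR and broadband indices. The block below states that relationship schematically; the coefficients a and b are placeholders for the fitted values, which depend on cloud phase and optical depth.

```latex
% Clear-sky index: all-sky irradiance relative to the clear-sky irradiance
K_c^{\mathrm{BB}} = \frac{G^{\mathrm{BB}}}{G^{\mathrm{BB}}_{\mathrm{clear}}}, \qquad
K_c^{\mathrm{PAR}} = \frac{G^{\mathrm{PAR}}}{G^{\mathrm{PAR}}_{\mathrm{clear}}}

% Direct component: the two indices coincide (as stated in the abstract)
K_{c,\mathrm{dir}}^{\mathrm{PAR}} = K_{c,\mathrm{dir}}^{\mathrm{BB}}

% Global irradiance: linear mapping whose coefficients a, b depend on
% cloud phase and optical depth (a and b are placeholders, not fitted values)
K_c^{\mathrm{PAR}} \approx a\, K_c^{\mathrm{BB}} + b
```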
48. Data replication and scheduling in the cloud with optimization assisted work flow management.
- Author
-
Rambabu, D. and Govardhan, A.
- Subjects
DATA replication ,WORKFLOW ,OPTIMIZATION algorithms ,BOTTLENECKS (Manufacturing) ,K-means clustering ,SCHEDULING ,WORKFLOW management - Abstract
Data-intensive applications must be run on systems with high-performance processing and sufficient storage. Compared to conventional distributed systems such as the data grid, cloud computing offers these features on a platform that is more flexible, scalable, and inexpensive. Moreover, retrieving data files is crucial to operating these services. Typically, accessing data causes a bottleneck in the entire cloud workflow system, significantly reducing system performance. Two key strategies that can enhance the efficiency of data-intensive applications are task scheduling and data replication. This research proposes a novel data replication and scheduling approach for the cloud. The workflow management process is performed in three phases: (1) workflow placement, (2) clustering of tasks, and (3) scheduling and replication. First, workflow placement takes place. Then, tasks are clustered via an improved K-means algorithm. Finally, tasks and datasets are replicated during the scheduling and replication phase, which is performed using the Self Modified Pelican Optimization Algorithm (SM-POA) based on execution cost, migration cost, storage cost, and replication. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
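The clustering phase described above groups workflow tasks before scheduling and replication. The sketch below uses plain scikit-learn K-means on assumed task features (estimated runtime, input-data size); the paper's improved K-means variant and the SM-POA scheduler are not reproduced here.

```python
import numpy as np
from sklearn.cluster import KMeans

# Assumed task features: [estimated runtime (s), input data size (MB)]
tasks = np.array([
    [10,   50], [12,  60], [11,  55],     # small, short tasks
    [300, 900], [280, 950], [310, 870],   # large, data-heavy tasks
    [60,  400], [75,  380],               # medium tasks
], dtype=float)

# Cluster tasks so that each group can be scheduled and replicated together
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(tasks)
for task, label in zip(tasks, km.labels_):
    print(f"task {task} -> cluster {label}")
```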
49. Decentralized System Synchronization among Collaborative Robots via 5G Technology.
- Author
-
Celik, Ali Ekber, Rodriguez, Ignacio, Ayestaran, Rafael Gonzalez, and Yavuz, Sirma Cekirdek
- Subjects
- *
INDUSTRIAL robots , *PROGRAMMABLE controllers , *NETWORK performance , *ROBOTIC assembly , *ASSEMBLY line methods - Abstract
In this article, we propose a distributed synchronization solution to achieve decentralized coordination in a system of collaborative robots. This is done by leveraging cloud-based computing and 5G technology to exchange causal ordering messages between the robots, eliminating the need for centralized control entities or programmable logic controllers in the system. The proposed solution is described, mathematically formulated, implemented in software, and validated over realistic network conditions. Further, the performance of the decentralized solution via 5G technology is compared to that achieved with traditional coordinated/uncoordinated cabled control systems. The results indicate that the proposed decentralized solution leveraging cloud-based 5G wireless is scalable to systems of up to 10 collaborative robots with comparable efficiency to that from standard cabled systems. The proposed solution has direct application in the control of producer–consumer and automated assembly line robotic applications. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF
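The causal-ordering messages mentioned above are the classic mechanism for decentralized coordination. The sketch below shows a textbook vector-clock exchange between two robots: each robot stamps messages with its clock and merges clocks on receipt. This is the generic technique, not the authors' 5G/cloud implementation.

```python
from dataclasses import dataclass, field

@dataclass
class Robot:
    rid: int
    n: int                                  # number of robots in the system
    clock: list = field(default_factory=list)

    def __post_init__(self):
        self.clock = [0] * self.n

    def send(self) -> tuple[int, list]:
        """Local event + send: tick own entry and ship a copy of the clock."""
        self.clock[self.rid] += 1
        return self.rid, self.clock.copy()

    def receive(self, sender: int, msg_clock: list) -> None:
        """Merge clocks element-wise, then tick own entry."""
        self.clock = [max(a, b) for a, b in zip(self.clock, msg_clock)]
        self.clock[self.rid] += 1

r0, r1 = Robot(0, 2), Robot(1, 2)
sender, stamp = r0.send()        # r0: [1, 0]
r1.receive(sender, stamp)        # r1: [1, 1] -- causally after r0's send
print(r0.clock, r1.clock)
```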
50. A Survey on IoT Application Architectures.
- Author
-
Dauda, Abdulkadir, Flauzac, Olivier, and Nolot, Florent
- Subjects
- *
COMPUTER network traffic , *DATA privacy , *PROCESS capability , *MICROSOFT Azure (Computing platform) , *DATA warehousing - Abstract
The proliferation of the IoT has led to the development of diverse application architectures to optimize IoT systems' deployment, operation, and maintenance. This survey provides a comprehensive overview of the existing IoT application architectures, highlighting their key features, strengths, and limitations. The architectures are categorized based on their deployment models, such as cloud, edge, and fog computing approaches, each offering distinct advantages regarding scalability, latency, and resource efficiency. Cloud architectures leverage centralized data processing and storage capabilities to support large-scale IoT applications but often suffer from high latency and bandwidth constraints. Edge architectures mitigate these issues by bringing computation closer to the data source, enhancing real-time processing and reducing network congestion. Fog architectures combine the strengths of both cloud and edge paradigms, offering a balanced solution for complex IoT environments. This survey also examines emerging trends and technologies in IoT application management, such as the solutions provided by major IoT service providers, including Intel, AWS, Microsoft Azure, and GCP. Through this study, the survey identifies latency, privacy, and deployment difficulties as key areas for future research. It highlights the need to advance IoT edge architectures to reduce network traffic, improve data privacy, and enhance interoperability by developing multi-application and multi-protocol edge gateways for efficient IoT application management. [ABSTRACT FROM AUTHOR]
- Published
- 2024
- Full Text
- View/download PDF