Unchaining Microservice Chains: Machine Learning Driven Optimization in Cloud Native Systems
- Publication Year :
- 2023
Abstract
- As the cloud native landscape flourishes, microservices have emerged as a central pillar of contemporary software development, enabling agility, resilience, and scalability in modern computing environments. While these modular services open up opportunities, particularly in the transformative ecosystem of 5G and beyond, they also introduce a myriad of complexities. Notably, the migration from hardware-centric to software-defined environments, culminating in Virtual Network Functions (VNFs), has enabled dynamic deployments across cloud data centers. In this transition, VNFs are often deployed within cloud native environments as independent services, mirroring the microservices model. However, the flexibility of cloud native systems is overshadowed by bottlenecks in computational resource allocation, sub-optimal service chain placements, and the perpetual quest for performance enhancement. Addressing these concerns is indispensable for harnessing the true potential of microservice chains. This thesis addresses the inherent challenges of cloud native microservice chains through the development and application of several tools and methodologies. NFV-Inspector is introduced as a foundational tool that systematically profiles and analyzes Virtual Network Functions, extracting the system KPIs essential for further modeling. Building on this, a Machine Learning (ML) based, SLA-aware resource recommendation system for cloud native functions is presented, which uses regression modeling to correlate resource allocations with key performance metrics. PerfSim is then proposed, a performance simulation framework designed specifically for cloud native computing environments that improves the accuracy of microservice chain simulations. Further research addresses Service Function Chain (SFC) placement, emphasizing the balance between cost-efficiency and latency optimization. The thesis concludes by integrating Deep Learning (DL) for service chain optimization, employing Graph Attention Networks (GAT) and Deep Q-Learning (DQN), and showcasing the potential of DL for SFC optimization.
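- To make the regression-based, SLA-aware resource recommendation idea concrete, a minimal sketch follows. It assumes a hypothetical dataset of per-service CPU/memory allocations, request rates, and observed p95 latencies; the feature names, SLA threshold, and candidate allocations are illustrative only and are not the model or data used in the thesis.

```python
# Minimal sketch of an SLA-aware resource recommendation step, assuming a
# hypothetical dataset of resource allocations vs. observed latency.
# Feature names, thresholds, and candidate allocations are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Hypothetical training data: [cpu_millicores, memory_mib, request_rate_rps]
X = np.array([
    [250, 256, 100],
    [500, 512, 100],
    [1000, 1024, 100],
    [250, 256, 300],
    [500, 512, 300],
    [1000, 1024, 300],
])
y = np.array([180.0, 95.0, 60.0, 450.0, 220.0, 110.0])  # observed p95 latency (ms)

# Regression model correlating allocations and load with latency
model = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

def recommend(request_rate, sla_ms, candidates):
    """Return the cheapest candidate allocation predicted to meet the SLA."""
    feasible = []
    for cpu, mem in candidates:
        pred = model.predict([[cpu, mem, request_rate]])[0]
        if pred <= sla_ms:
            feasible.append((cpu + mem, (cpu, mem), pred))  # crude cost proxy
    return min(feasible)[1:] if feasible else None

candidates = [(250, 256), (500, 512), (1000, 1024)]
print(recommend(request_rate=300, sla_ms=150.0, candidates=candidates))
```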
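- Similarly, the GAT-plus-DQN combination for service chain optimization can be sketched as a Q-network that encodes the chain graph with graph attention layers and scores candidate host placements per service. The toy chain, feature sizes, and action space below are assumptions for illustration, not the architecture or training setup used in the thesis.

```python
# Sketch of a placement Q-network: GAT layers encode the service chain graph,
# a linear head outputs a Q-value per (service, candidate host) pair.
# Graph, feature sizes, and host count are illustrative assumptions.
import torch
import torch.nn as nn
from torch_geometric.nn import GATConv
from torch_geometric.data import Data

class ChainPlacementQNet(nn.Module):
    def __init__(self, in_dim, hidden_dim, num_hosts):
        super().__init__()
        self.gat1 = GATConv(in_dim, hidden_dim, heads=4, concat=True)
        self.gat2 = GATConv(hidden_dim * 4, hidden_dim, heads=1, concat=False)
        self.q_head = nn.Linear(hidden_dim, num_hosts)  # Q-values per host

    def forward(self, x, edge_index):
        h = torch.relu(self.gat1(x, edge_index))
        h = torch.relu(self.gat2(h, edge_index))
        return self.q_head(h)  # shape: [num_services, num_hosts]

# Toy chain of 3 services (0 -> 1 -> 2); features could be CPU/memory demands.
x = torch.tensor([[0.2, 0.1], [0.5, 0.4], [0.3, 0.2]], dtype=torch.float)
edge_index = torch.tensor([[0, 1],    # source nodes
                           [1, 2]],   # target nodes
                          dtype=torch.long)
chain = Data(x=x, edge_index=edge_index)

net = ChainPlacementQNet(in_dim=2, hidden_dim=16, num_hosts=4)
q_values = net(chain.x, chain.edge_index)
placement = q_values.argmax(dim=1)  # greedy host choice per service
print(placement)
```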
Details
- Database :
- OAIster
- Notes :
- application/pdf, English
- Publication Type :
- Electronic Resource
- Accession number :
- edsoai.on1416071517
- Document Type :
- Electronic Resource