Bhoyar, Manoj (2025) AI-driven cloud optimization: Leveraging machine learning for dynamic resource allocation. World Journal of Advanced Engineering Technology and Sciences, 15 (2). pp. 877-884. ISSN 2582-8266
WJAETS-2025-0608.pdf - Published Version
Available under License Creative Commons Attribution Non-commercial Share Alike.
Abstract
This research paper explores the application of artificial intelligence (AI) and machine learning (ML) techniques in optimizing cloud resource allocation. The study investigates how AI-driven approaches can enhance the efficiency and effectiveness of cloud computing systems through dynamic resource allocation. We present a comprehensive review of existing methodologies, propose novel algorithms, and conduct extensive experiments to validate the effectiveness of our approach. The results demonstrate significant improvements in resource utilization, cost reduction, and overall system performance compared to traditional static allocation methods.
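The record's keywords point to reinforcement-learning techniques (Deep Q-Network) for dynamic allocation. The snippet below is a minimal, purely illustrative sketch of that general idea, a tabular Q-learning loop that learns a VM-scaling policy from utilization feedback. It is not taken from the paper; the simulated workload, the reward shaping, and all parameter values are assumptions.

```python
# Illustrative sketch only -- NOT the paper's algorithm. A tabular Q-learning
# agent that learns when to scale a pool of virtual machines up or down based
# on binned CPU utilization. Workload, reward shaping, and parameters are
# hypothetical placeholders.
import random

ACTIONS = (-1, 0, +1)          # remove a VM, hold, add a VM
UTIL_BINS = 10                 # utilization discretized into 10% buckets
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1

# Q-table: (utilization bin, current VM count) -> one value per action
Q = {}

def q_values(state):
    return Q.setdefault(state, [0.0, 0.0, 0.0])

def choose_action(state):
    """Epsilon-greedy action selection over the Q-table."""
    if random.random() < EPSILON:
        return random.randrange(len(ACTIONS))
    vals = q_values(state)
    return vals.index(max(vals))

def step(vms, demand, action_idx):
    """Apply the scaling action and return (new_vms, utilization, reward)."""
    vms = max(1, min(20, vms + ACTIONS[action_idx]))
    util = min(1.0, demand / vms)
    # Reward high utilization; penalize overload (SLA risk) and idle capacity.
    reward = util - (2.0 if util >= 0.95 else 0.0) - 0.05 * vms
    return vms, util, reward

def train(episodes=200, horizon=100):
    for _ in range(episodes):
        vms, util = 4, 0.5
        for t in range(horizon):
            # Synthetic demand with an occasional burst near the end of a cycle.
            demand = 2.0 + 2.0 * random.random() + (3.0 if t % 50 > 40 else 0.0)
            state = (int(util * (UTIL_BINS - 1)), vms)
            a = choose_action(state)
            vms, util, reward = step(vms, demand, a)
            next_state = (int(util * (UTIL_BINS - 1)), vms)
            # Standard Q-learning update.
            Q[state] = q_values(state)
            Q[state][a] += ALPHA * (reward + GAMMA * max(q_values(next_state)) - Q[state][a])

if __name__ == "__main__":
    train()
    print("learned states:", len(Q))
```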
| Item Type: | Article |
|---|---|
| Official URL: | https://doi.org/10.30574/wjaets.2025.15.2.0608 |
| Uncontrolled Keywords: | Cloud Computing; Resource Allocation; Artificial Intelligence; Machine Learning; Deep Q-Network; LSTM; Genetic Algorithms; Dynamic Optimization; SLA Violations |
| Depositing User: | Editor Engineering Section |
| Date Deposited: | 04 Aug 2025 16:24 |
| Related URLs: | |
| URI: | https://eprint.scholarsrepository.com/id/eprint/3616 |