Incorporating Serverless Computing into P2P Networks for ML Training: In-Database Tasks and Their Scalability Implications (Student Abstract)
DOI: https://doi.org/10.1609/aaai.v38i21.30419
Keywords: AI Architectures, Machine Learning, Multiagent Learning, Multiagent Systems
Abstract
Distributed ML addresses challenges arising from increasing data volumes and model complexity. Peer-to-peer (P2P) networks in distributed ML offer scalability and fault tolerance; however, they also face growing resource consumption and communication overhead as the number of participating peers increases. This research introduces a novel architecture that combines serverless computing with P2P networks for distributed training. Serverless computing enhances this model with parallel processing and cost-effective scalability, making it suitable for resource-intensive tasks. Preliminary results show that peers can offload expensive computational tasks to serverless platforms. However, the inherent statelessness of serverless functions necessitates robust communication methods, suggesting a pivotal role for databases. To this end, we have enhanced an in-memory database to support ML training tasks.
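For illustration only, the sketch below shows one way the offloading pattern described in the abstract could look: a peer stages a mini-batch in an in-memory database, a stateless serverless handler computes a gradient and writes it back, and the peer reads the result. Redis, the key names, and the least-squares gradient task are assumptions made for this example; the abstract does not specify the paper's actual database, APIs, or training workload.

```python
# Minimal sketch of the offloading pattern described in the abstract.
# Assumptions (not specified in the abstract): Redis stands in for the
# enhanced in-memory database, the offloaded task is a least-squares
# gradient computation, and the serverless handler is simulated locally.
import json
import numpy as np
import redis

# Shared in-memory store reachable by peers and stateless serverless workers.
store = redis.Redis(host="localhost", port=6379, decode_responses=True)

def serverless_gradient_task(event):
    """Stateless handler: loads a staged mini-batch and the current weights
    from the database, computes a gradient, and writes the result back."""
    batch = json.loads(store.get(event["batch_key"]))
    X, y = np.array(batch["X"]), np.array(batch["y"])
    w = np.array(json.loads(store.get("model:weights")))
    grad = 2.0 * X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
    store.set(f"grad:{event['task_id']}", json.dumps(grad.tolist()))
    return {"status": "done", "task_id": event["task_id"]}

def peer_offload_step(task_id, X, y, w):
    """Peer-side call: stages data and weights in the database, invokes the
    (here simulated) serverless function, and reads back its gradient."""
    store.set("model:weights", json.dumps(w.tolist()))
    store.set(f"batch:{task_id}", json.dumps({"X": X.tolist(), "y": y.tolist()}))
    serverless_gradient_task({"task_id": task_id, "batch_key": f"batch:{task_id}"})
    return np.array(json.loads(store.get(f"grad:{task_id}")))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, w_true = rng.normal(size=(32, 4)), np.ones(4)
    y = X @ w_true
    print(peer_offload_step("demo-0", X, y, np.zeros(4)))
```

In this sketch the database carries all state between invocations, which is the role the abstract attributes to the in-memory database when serverless functions themselves retain nothing between calls.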
Published
2024-03-24
How to Cite
Barrak, A. (2024). Incorporating Serverless Computing into P2P Networks for ML Training: In-Database Tasks and Their Scalability Implications (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23439-23440. https://doi.org/10.1609/aaai.v38i21.30419
Section
AAAI Student Abstract and Poster Program