The Promise of Serverless Computing within Peer-to-Peer Architectures for Distributed ML Training
DOI:
https://doi.org/10.1609/aaai.v38i21.30392
Keywords:
Distributed Machine Learning, Serverless Computing, Peer-to-Peer Architectures, Parameter Server Architecture, In-Database Model Updates, Fault Tolerance
Abstract
My thesis focuses on integrating serverless computing with Peer-to-Peer (P2P) architectures for distributed Machine Learning (ML). This research aims to harness the decentralized, resilient nature of P2P systems together with the scalability and automation of serverless platforms. We explore using databases not only for communication but also for in-database model updates and gradient averaging, addressing the challenge of statelessness in serverless environments.
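To make the in-database update idea concrete, the sketch below shows one stateless worker invocation that publishes its gradient to a shared database, averages the gradients of its peers, and applies the resulting SGD step directly against the stored parameters. This is a minimal illustration under assumptions made for the example, not the thesis implementation: the sqlite3 backing store, the table names model_params and peer_gradients, and the fixed learning rate are all hypothetical.

# Minimal sketch: model state lives entirely in a database, so a serverless
# function instance can vanish after each call without losing training state.
import pickle
import sqlite3

import numpy as np

LEARNING_RATE = 0.01  # assumed hyperparameter for this sketch


def init_db(db_path: str, initial_params: np.ndarray) -> None:
    # Create the illustrative tables and seed the shared model parameters.
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS model_params (id INTEGER PRIMARY KEY, params BLOB)")
    conn.execute("CREATE TABLE IF NOT EXISTS peer_gradients (worker_id INTEGER PRIMARY KEY, grad BLOB)")
    conn.execute("INSERT OR REPLACE INTO model_params (id, params) VALUES (0, ?)",
                 (pickle.dumps(initial_params),))
    conn.commit()
    conn.close()


def training_step(db_path: str, worker_id: int, grad: np.ndarray) -> None:
    # One serverless invocation: no state survives inside the function instance.
    conn = sqlite3.connect(db_path)

    # 1. Publish this worker's gradient so peers can read it; the database,
    #    rather than a central parameter server, mediates the P2P exchange.
    conn.execute("INSERT OR REPLACE INTO peer_gradients (worker_id, grad) VALUES (?, ?)",
                 (worker_id, pickle.dumps(grad)))
    conn.commit()

    # 2. Average all gradients currently published by peers.
    rows = conn.execute("SELECT grad FROM peer_gradients").fetchall()
    avg_grad = np.mean([pickle.loads(blob) for (blob,) in rows], axis=0)

    # 3. In-database model update: read the parameters, apply the SGD step,
    #    and write the result back for the next invocation to pick up.
    (blob,) = conn.execute("SELECT params FROM model_params WHERE id = 0").fetchone()
    params = pickle.loads(blob) - LEARNING_RATE * avg_grad
    conn.execute("UPDATE model_params SET params = ? WHERE id = 0",
                 (pickle.dumps(params),))
    conn.commit()
    conn.close()

Because all state is externalized to the database, each invocation starts from the latest stored parameters, which is how the stateless, ephemeral nature of serverless functions can be reconciled with an iterative training loop.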
Published
2024-03-24
How to Cite
Barrak, A. (2024). The Promise of Serverless Computing within Peer-to-Peer Architectures for Distributed ML Training. Proceedings of the AAAI Conference on Artificial Intelligence, 38(21), 23383-23384. https://doi.org/10.1609/aaai.v38i21.30392
Section
AAAI Doctoral Consortium Track