TY  - JOUR
AU  - Strubell, Emma
AU  - Ganesh, Ananya
AU  - McCallum, Andrew
PY  - 2020/04/03
Y2  - 2024/03/29
TI  - Energy and Policy Considerations for Modern Deep Learning Research
JF  - Proceedings of the AAAI Conference on Artificial Intelligence
JA  - AAAI
VL  - 34
IS  - 09
SE  - Sister Conference Track
DO  - 10.1609/aaai.v34i09.7123
UR  - https://ojs.aaai.org/index.php/AAAI/article/view/7123
SP  - 13693
EP  - 13696
AB  - The field of artificial intelligence has experienced a dramatic methodological shift towards large neural networks trained on plentiful data. This shift has been fueled by recent advances in hardware and techniques enabling remarkable levels of computation, resulting in impressive advances in AI across many applications. However, the massive computation required to obtain these exciting results is costly both financially, due to the price of specialized hardware and electricity or cloud compute time, and to the environment, as a result of non-renewable energy used to fuel modern tensor processing hardware. In a paper published this year at ACL, we brought this issue to the attention of NLP researchers by quantifying the approximate financial and environmental costs of training and tuning neural network models for NLP (Strubell, Ganesh, and McCallum 2019). In this extended abstract, we briefly summarize our findings in NLP, incorporating updated estimates and broader information from recent related publications, and provide actionable recommendations to reduce costs and improve equity in the machine learning and artificial intelligence community.
ER  - 