SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract)
Keywords: Deep Learning, Dataset Distillation, Privacy Preservation, Synthetic Data
Abstract
We leverage what are typically considered the worst qualities of deep learning algorithms (high computational cost, the need for large datasets, lack of explainability, strong dependence on hyper-parameter choice, overfitting, and vulnerability to adversarial perturbations) to create a method for the secure and efficient training of remotely deployed neural networks over insecure channels.
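The abstract rests on the premise of dataset distillation: a handful of synthetic examples can stand in for a large training set. The sketch below is not the authors' method; it only illustrates that premise on a toy problem, with all data and names invented for the example. A "distilled" set of one hand-picked prototype per class is used to train a small logistic-regression learner, which then classifies a much larger cloud of real test points.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative "distilled" dataset: one synthetic prototype per class.
# (In real dataset distillation these points would be learned, not hand-picked.)
X_distilled = np.array([[-2.0, -2.0], [2.0, 2.0]])
y_distilled = np.array([0.0, 1.0])

# Train logistic regression by gradient descent on the two distilled points only.
w = np.zeros(2)
b = 0.0
lr = 0.5
for _ in range(200):
    z = X_distilled @ w + b
    p = 1.0 / (1.0 + np.exp(-z))          # sigmoid predictions
    grad_w = X_distilled.T @ (p - y_distilled) / len(y_distilled)
    grad_b = np.mean(p - y_distilled)
    w -= lr * grad_w
    b -= lr * grad_b

# Evaluate on a larger "real" test cloud drawn around the two class centers.
X_test = np.vstack([rng.normal(-2, 0.5, (100, 2)),
                    rng.normal(2, 0.5, (100, 2))])
y_test = np.array([0] * 100 + [1] * 100)
preds = (X_test @ w + b > 0).astype(int)
accuracy = np.mean(preds == y_test)
print(f"accuracy: {accuracy:.2f}")
```

On this easy, well-separated toy problem the model trained on just two synthetic points classifies the full test cloud almost perfectly, which is the efficiency property the abstract exploits: only the tiny distilled set, not the original data, needs to be transmitted to the remote learner.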
How to Cite
Sucholutsky, I., & Schonlau, M. (2021). SecDD: Efficient and Secure Method for Remotely Training Neural Networks (Student Abstract). Proceedings of the AAAI Conference on Artificial Intelligence, 35(18), 15897-15898. Retrieved from https://ojs.aaai.org/index.php/AAAI/article/view/17945
AAAI Student Abstract and Poster Program