Swanand Kadhe (UC Berkeley)

Date: Sep 2, 2020

Title and Abstract

FastSecAgg: Scalable Secure Aggregation for Privacy-Preserving Federated Learning

Recent attacks on federated learning demonstrate that model parameters shared by clients can leak information about their training data. Therefore, it is crucial to develop ‘secure aggregation’ protocols that allow the server to aggregate client models in a privacy-preserving manner. In this talk, we present a secure aggregation protocol, FastSecAgg, that is efficient in terms of computation and communication, and robust to client dropouts. The main building block of FastSecAgg is a novel multi-secret sharing scheme, FastShare, based on the fast Fourier transform. For N clients, FastShare enables the server to perform secure aggregation with computational complexity O(N log N), as opposed to the O(N^2) complexity of conventional schemes. FastSecAgg is (i) information-theoretically secure even if the server colludes with ‘any’ subset of some constant fraction (e.g., 10%) of the clients in the honest-but-curious setting; and (ii) tolerant to dropouts of a ‘random’ subset of some constant fraction (e.g., 10%) of the clients. In general, FastShare can achieve a trade-off between the number of secrets, the privacy threshold, and the dropout tolerance. Riding on the capabilities of FastShare, FastSecAgg achieves a smaller computation cost than existing secure aggregation schemes. Further, when model updates are larger than the number of clients (which is typical in federated learning), FastSecAgg achieves the same order-wise communication cost as existing schemes.
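To give a rough feel for why FFT-based share generation scales as O(N log N), the toy sketch below generates Shamir-style shares by evaluating a masking polynomial at the N-th roots of unity in a small prime field using a radix-2 number-theoretic transform. This is only an illustrative sketch, not the FastShare construction from the talk; the field size p, root of unity omega, threshold T, and share layout are assumptions chosen for the example.

# Toy illustration (not the actual FastShare scheme): generate Shamir-style
# shares by evaluating a polynomial at all N roots of unity over GF(p) with a
# radix-2 NTT, costing O(N log N) instead of O(N^2) for naive evaluation.
import random

p = 17       # illustrative prime field size; supports an 8-point NTT
N = 8        # number of clients / shares
omega = 2    # primitive 8th root of unity mod 17 (2**8 % 17 == 1)
T = 3        # privacy threshold: any T shares reveal nothing about the secret

def ntt(coeffs, w, q):
    """Evaluate the polynomial with the given coefficients at
    w**0, w**1, ..., w**(n-1) over GF(q) via a radix-2 FFT recursion."""
    n = len(coeffs)
    if n == 1:
        return coeffs[:]
    even = ntt(coeffs[0::2], w * w % q, q)
    odd = ntt(coeffs[1::2], w * w % q, q)
    out = [0] * n
    wk = 1
    for k in range(n // 2):
        t = wk * odd[k] % q
        out[k] = (even[k] + t) % q
        out[k + n // 2] = (even[k] - t) % q
        wk = wk * w % q
    return out

# Coefficients: the secret, T random masking coefficients, zeros elsewhere.
secret = 5
coeffs = [secret] + [random.randrange(p) for _ in range(T)] + [0] * (N - 1 - T)

# Share i goes to client i; any T + 1 shares suffice to reconstruct the
# secret by Lagrange interpolation (not shown).
shares = ntt(coeffs, omega, p)
print("shares:", shares)

The actual FastShare scheme embeds multiple secrets and random masks in carefully chosen positions to obtain the trade-off between the number of secrets, privacy threshold, and dropout tolerance described in the abstract; the sketch above only conveys the FFT-based O(N log N) share-generation idea.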

Bio

Swanand Kadhe is a postdoctoral researcher in the EECS Department at the University of California, Berkeley, working with Kannan Ramchandran. He is also affiliated with the Berkeley Laboratory for Information and System Sciences (BLISS) and the Berkeley Audio-visual Signal processing and Communication Systems (BASiCS) group. He received his Ph.D. from the ECE Department at Texas A&M University in 2017. His research interests lie in machine learning in distributed and federated environments, blockchains, and distributed computing and storage systems, with a particular focus on the challenges of privacy, security, and scalability.