Towards Fair and Privacy-Preserving Federated Deep Models

Date

2020

Authors

Lyu, Lingjuan
Yu, Jiangshan
Nandakumar, Karthik
Li, Yitong
Ma, Xingjun
Jin, Jiong
Yu, Han
Ng, Kee Siong

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Abstract

The current standalone deep learning paradigm, in which each party trains a model only on its own data, tends to result in overfitting and low utility. This problem can be addressed either by a centralized framework that deploys a central server to train a global model on the joint data of all parties, or by a distributed framework that uses a parameter server to aggregate local model updates. However, server-based solutions are prone to a single point of failure. In this respect, collaborative learning frameworks, such as federated learning (FL), are more robust. Existing FL frameworks nevertheless overlook an important aspect of participation: fairness, because all parties receive the same final model regardless of their contributions. To address these issues, we propose a decentralized Fair and Privacy-Preserving Deep Learning (FPPDL) framework that incorporates fairness into federated deep learning. In particular, we design a local credibility mutual evaluation mechanism to guarantee fairness, and a three-layer onion-style encryption scheme to guarantee both accuracy and privacy. Unlike the existing FL paradigm, under FPPDL each participant receives a different version of the FL model, with performance commensurate with its contributions. Experiments on benchmark datasets demonstrate that FPPDL balances fairness, privacy, and accuracy, and that it enables federated learning ecosystems to detect and isolate low-contribution parties, thereby promoting responsible participation.
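
The abstract does not spell out how the three-layer onion-style encryption is constructed, so the sketch below is only a generic, minimal illustration of onion-style layering, not the paper's actual scheme: a serialized local model update is wrapped under three successive symmetric keys and recovered by peeling the layers in reverse order. The Fernet keys and the wrap/unwrap helpers are assumptions introduced here purely for illustration.

    # Illustrative sketch only: generic onion-style layering with three symmetric
    # keys, standing in for the paper's three layers (not detailed in the abstract).
    from cryptography.fernet import Fernet

    def wrap(update: bytes, keys: list) -> bytes:
        """Encrypt the update under each key in turn (innermost layer first)."""
        token = update
        for key in keys:
            token = Fernet(key).encrypt(token)
        return token

    def unwrap(token: bytes, keys: list) -> bytes:
        """Peel the layers in reverse order to recover the original update."""
        for key in reversed(keys):
            token = Fernet(key).decrypt(token)
        return token

    if __name__ == "__main__":
        layer_keys = [Fernet.generate_key() for _ in range(3)]  # three layers
        update = b"serialized local model update"               # placeholder payload
        sealed = wrap(update, layer_keys)
        assert unwrap(sealed, layer_keys) == update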

Keywords

Federated learning, privacy-preserving, deep learning, fairness, encryption

Source

IEEE Transactions on Parallel and Distributed Systems

Type

Journal article

Access Statement

Open Access
