Privacy-Preserving Multi-Task Learning
Multi-task learning (MTL), which improves learning performance by transferring information between related tasks, has drawn increasing attention in the data mining field. To tackle tasks whose data are stored at different locations (or nodes), distributed MTL was proposed. It not only enhances learning performance but also improves computing efficiency, since it transforms the original centralized computing framework into a distributed one under which computations can be done in parallel. The major drawback of distributed MTL is a potential violation of confidentiality when the data stored at each node contain sensitive information (e.g., medical records). Some distributed MTL algorithms were designed to protect the original data by transferring only aggregate information (e.g., supports or gradients) from each node to a server that combines the received information to produce the desired models. However, since aggregate data may still leak sensitive information, the security guarantees of the existing solutions cannot be formally proved or verified. Thus, the goal of this paper is to develop a provable privacy-preserving multi-task learning (PP-MTL) protocol that incorporates state-of-the-art cryptographic techniques to achieve the best security guarantee. We also conducted experiments to demonstrate the efficiency of our proposed method.
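The aggregate-sharing scheme the abstract critiques can be sketched as follows. This is a minimal illustration, not the paper's protocol: hypothetical nodes each hold private data and send only local gradients to a server, which averages them to update a shared model. All names, dimensions, and the squared-loss objective are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_gradient(w, X, y):
    """Gradient of the squared loss on one node's private data.

    Only this aggregate leaves the node; the raw (X, y) never does.
    """
    return X.T @ (X @ w - y) / len(y)

# Hypothetical setup: three nodes, each holding its own private dataset.
d, n = 5, 40
w_true = rng.normal(size=d)
nodes = []
for _ in range(3):
    X = rng.normal(size=(n, d))
    nodes.append((X, X @ w_true + 0.01 * rng.normal(size=n)))

# Server-side loop: combine the nodes' gradients into one model update.
w = np.zeros(d)
for _ in range(200):
    grads = [local_gradient(w, X, y) for X, y in nodes]
    w -= 0.1 * np.mean(grads, axis=0)

# Note: the transmitted gradients can themselves leak information about
# the private data, which is the gap a provable PP-MTL protocol closes.
```

The sketch shows why the approach is attractive (the server never sees raw records, and nodes compute in parallel), while the final comment marks the leakage channel that motivates adding cryptographic protection.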
K. Liu et al., "Privacy-Preserving Multi-Task Learning," Proceedings of the 2018 IEEE International Conference on Data Mining (2018, Singapore, Singapore), pp. 1128-1133, Institute of Electrical and Electronics Engineers (IEEE), Nov 2018.
The definitive version is available at https://doi.org/10.1109/ICDM.2018.00147
2018 IEEE International Conference on Data Mining, ICDM 2018 (2018: Nov. 17-20, Singapore, Singapore)
Intelligent Systems Center
Keywords and Phrases
Aggregates; Data privacy; Distributed computer systems; Efficiency; Learning systems; Linearization; Computing efficiency; Computing frameworks; Cryptographic techniques; Distributed computing frameworks; Learning performance; Multitask learning; Privacy preserving; Sensitive information; Data mining; Multi-task learning
Article - Conference proceedings
© 2018 Institute of Electrical and Electronics Engineers (IEEE), All rights reserved.
01 Nov 2018