A novel secure federated learning scheme based on cloud-edge collaborative computing

Abstract: In federated learning, efficiently protecting the privacy and integrity of both local and global training models is an urgent problem. Traditional secure federated learning methods based on differential privacy suffer from high computational overhead, high communication energy consumption, and long execution times. To address this, a novel Secure and Efficient Federated Learning scheme (SEFL) based on cloud-edge collaborative computing is proposed. SEFL ensures the security of model aggregation by deploying an Intel SGX-based TEE (Trusted Execution Environment) on the Cloud Server (CS), combines symmetric and asymmetric encryption to protect communication between the CS and Edge Servers (ESs), and improves the security of model storage by constructing a chained storage structure on the ESs. Theoretical analysis and experimental results show that SEFL provides strong security and effectively improves the training efficiency of federated learning.
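
The abstract names three mechanisms: an SGX-based TEE for aggregation on the CS, hybrid (symmetric plus asymmetric) encryption for CS-ES communication, and a chained storage structure on the ESs. The sketch below is only an illustrative reading of the last two ideas, not the paper's implementation: the function names (encrypt_update, decrypt_update), the ChainedModelStore class, and the use of Python's cryptography package (Fernet for the symmetric layer, RSA-OAEP for key wrapping) are assumptions made for illustration.

```python
# Illustrative sketch (not the authors' implementation) of two ideas from the
# abstract: hybrid encryption of a model update sent from an edge server (ES)
# to the cloud server (CS), and a hash-chained record of stored model versions
# on the ES. All names and library choices here are assumptions.
import hashlib
import json

from cryptography.fernet import Fernet                      # symmetric layer
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding


def encrypt_update(update_bytes: bytes, cs_public_key):
    """Encrypt a serialized model update with a fresh symmetric key,
    then wrap that key with the CS's asymmetric (RSA) public key."""
    sym_key = Fernet.generate_key()
    ciphertext = Fernet(sym_key).encrypt(update_bytes)
    wrapped_key = cs_public_key.encrypt(
        sym_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return wrapped_key, ciphertext


def decrypt_update(wrapped_key: bytes, ciphertext: bytes, cs_private_key) -> bytes:
    """Inverse of encrypt_update, run by the CS (inside its TEE in the paper)."""
    sym_key = cs_private_key.decrypt(
        wrapped_key,
        padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                     algorithm=hashes.SHA256(), label=None),
    )
    return Fernet(sym_key).decrypt(ciphertext)


class ChainedModelStore:
    """Hash-chained log of model versions kept on an ES: each entry commits
    to the previous one, so tampering with any stored model breaks every
    later link in the chain."""

    def __init__(self):
        self.entries = []  # list of (prev_hash, model_hash)

    def append(self, model_bytes: bytes) -> str:
        prev_hash = self.entries[-1][1] if self.entries else "0" * 64
        model_hash = hashlib.sha256(prev_hash.encode() + model_bytes).hexdigest()
        self.entries.append((prev_hash, model_hash))
        return model_hash

    def verify(self, models: list) -> bool:
        """Recompute the chain over the stored models and compare."""
        if len(models) != len(self.entries):
            return False
        prev = "0" * 64
        for model_bytes, (rec_prev, rec_hash) in zip(models, self.entries):
            h = hashlib.sha256(prev.encode() + model_bytes).hexdigest()
            if rec_prev != prev or rec_hash != h:
                return False
            prev = h
        return True


if __name__ == "__main__":
    # Toy round: an ES encrypts an update, the CS decrypts it, the ES logs it.
    cs_private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    update = json.dumps({"layer0": [0.1, -0.2, 0.3]}).encode()

    key_blob, ct = encrypt_update(update, cs_private_key.public_key())
    assert decrypt_update(key_blob, ct, cs_private_key) == update

    store = ChainedModelStore()
    store.append(update)
    print("chain valid:", store.verify([update]))
```

Wrapping a per-round symmetric key with the CS's public key keeps the expensive asymmetric operation to a few dozen bytes while the bulk model data uses cheap symmetric encryption, and the hash chain lets an ES detect tampering with any previously stored model version.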

     

