Details
Document type: Journal article
Title (Chinese): 基于差分隐私的自适应个性化联邦学习方法
Title (English): Adaptive personalized federated learning based on differential privacy
Authors: Zhang Fang, Long Shigong, Guo Shengnan, Liu Guangyuan, Zhang Junming
First author: Zhang Fang
Affiliations: [1] State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, Guizhou, China; [2] College of Computer Science and Technology, Guizhou University, Guiyang 550025, Guizhou, China; [3] School of Big Data, Guizhou Institute of Technology, Guiyang 550003, Guizhou, China; [4] Department of Basic Courses, Guizhou Vocational and Technical College of Construction, Guiyang 551400, Guizhou, China
First affiliation: State Key Laboratory of Public Big Data, Guizhou University, Guiyang 550025, Guizhou, China
Year: 2025
Volume: 46
Issue: 9
Pages: 2525-2532
Journal (Chinese): 计算机工程与设计
Journal (English): Computer Engineering and Design
Indexing: PKU Core Journals list (北大核心, 2023 edition)
Funding: National Natural Science Foundation of China (62062020)
Language: Chinese
Keywords: personalized federated learning; privacy protection; differential privacy; non-independent and identically distributed; adaptive regularization; adaptive step size; federated optimization
Abstract: To reduce the risk of user privacy leakage in federated learning and to improve model performance on non-IID data, an adaptive personalized federated learning method based on differential privacy (FedADRL) is proposed. FedADRL dynamically partitions each client's local parameters into personalized and global parameters and adds adaptive regularization terms to the corresponding loss functions, which stabilizes local training and reduces the impact of noisy updates on the global model. User-level localized differential privacy is further incorporated to strengthen privacy protection. Finally, the local training step size is adjusted dynamically by an adaptive step-size rule to accelerate convergence. Experimental results show that FedADRL improves both model accuracy and convergence efficiency across multiple datasets.
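The abstract's first idea can be pictured as follows; this is a minimal illustrative sketch only, not the paper's actual formulation. It assumes each client splits its parameters into a global block w_g (aligned with the server model) and a personalized block w_p, uses a stand-in least-squares task loss, and applies hypothetical adaptive coefficients lam_g and lam_p as the regularization weights.

```python
import numpy as np

# Illustrative local objective for one client (hypothetical names and forms):
# the global block is pulled toward the server model, while the personalized
# block gets its own, typically weaker, stabilizing regularizer.
def local_loss(w_g, w_p, w_server, X, y, lam_g, lam_p):
    theta = np.concatenate([w_g, w_p])           # full local parameter vector
    task = 0.5 * np.mean((X @ theta - y) ** 2)   # stand-in task loss (least squares)
    reg_g = 0.5 * lam_g * np.sum((w_g - w_server) ** 2)  # pull global block to server
    reg_p = 0.5 * lam_p * np.sum(w_p ** 2)               # keep personalized block stable
    return task + reg_g + reg_p

# Toy usage: 5 global dimensions + 3 personalized dimensions on random local data.
rng = np.random.default_rng(0)
X, y = rng.normal(size=(32, 8)), rng.normal(size=32)
w_g, w_p, w_server = np.zeros(5), np.zeros(3), rng.normal(size=5)
print(local_loss(w_g, w_p, w_server, X, y, lam_g=0.1, lam_p=0.01))
```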
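The user-level localized differential privacy step can be pictured as the client clipping its update and adding calibrated noise before upload. The sketch below uses the standard Gaussian mechanism with the usual sigma = C * sqrt(2 * ln(1.25/delta)) / epsilon calibration (valid for epsilon <= 1); the abstract does not say which mechanism FedADRL uses, so the choice here is an assumption.

```python
import numpy as np

def privatize_update(update, clip_norm, epsilon, delta, rng):
    """Clip an update to L2 norm clip_norm, then add Gaussian noise.

    Standard Gaussian-mechanism calibration for (epsilon, delta)-DP with
    sensitivity clip_norm; assumes epsilon <= 1 for this sigma bound.
    """
    norm = np.linalg.norm(update)
    clipped = update * min(1.0, clip_norm / (norm + 1e-12))  # bound sensitivity
    sigma = clip_norm * np.sqrt(2.0 * np.log(1.25 / delta)) / epsilon
    return clipped + rng.normal(scale=sigma, size=update.shape)

rng = np.random.default_rng(1)
raw = rng.normal(size=5)                      # client's un-noised model update
noisy = privatize_update(raw, clip_norm=1.0, epsilon=0.8, delta=1e-5, rng=rng)
print(noisy)                                  # what the server actually receives
```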
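Finally, the adaptive step-size rule: the abstract does not give its form, so the sketch below uses a simple loss-driven heuristic (grow the step while the local loss falls, shrink it otherwise) purely as a placeholder for whatever rule the paper actually derives.

```python
def adapt_step(step, prev_loss, curr_loss, grow=1.1, shrink=0.5,
               lo=1e-4, hi=1.0):
    """Hypothetical loss-driven step-size rule (not the paper's exact scheme):
    enlarge the step while training improves, cut it back when it does not,
    and keep it inside [lo, hi]."""
    step = step * grow if curr_loss < prev_loss else step * shrink
    return min(max(step, lo), hi)

# Example: the loss dropped from 0.9 to 0.7, so the step grows from 0.1 to 0.11.
print(adapt_step(0.1, prev_loss=0.9, curr_loss=0.7))
```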
