RMIT University

FedFTHA: A Fine-Tuning and Head Aggregation Method in Federated Learning

journal contribution
posted on 2024-11-03, 09:25, authored by Yansong Wang, Hui Xu, Waqar Ali, Miaobo Li, Xiangmin Zhou, Jie Shao
Personalized federated learning is a sub-field of federated learning. In contrast to conventional federated learning, which seeks a single general global model, personalized federated learning produces for each client a personalized model adapted to that client's local data distribution. Some existing personalized federated learning methods improve only client-side personalization while discarding server-side generalization capacity. To address this issue, we propose a fine-tuning and head aggregation method in federated learning (FedFTHA). It allows each client to maintain a personalized model head and fine-tune it after each local update, yielding a local model that contains the personalized head. During FedFTHA training, these personalized heads are aggregated to form a generalized head for the global model. FedFTHA thus meets the needs of both client-side model personalization and server-side model generalization. In addition, a universal optimization framework is employed to prove its convergence under both convex and non-convex conditions. We verify the personalization ability and generalization performance of FedFTHA under heterogeneous settings on benchmark datasets, and the comparative analysis confirms the effectiveness of our proposal.
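The mechanism the abstract describes — clients fine-tuning a personalized head after each local update, and the server averaging those heads into a generalized head — can be sketched with a toy linear model. This is only an illustrative simulation under simplifying assumptions (the "body" is the shared weights, the "head" is a per-client scalar bias, and aggregation is plain averaging); the paper's actual model split, fine-tuning schedule, and aggregation rule may differ.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 3  # feature dimension (illustrative)

def local_round(body, head, X, y, lr=0.05, steps=50, head_steps=10):
    """One client round: jointly update body and head on the local data,
    then fine-tune only the head to personalize it (FedFTHA-style)."""
    body, head = body.copy(), float(head)
    for _ in range(steps):              # joint local update
        err = X @ body + head - y
        body -= lr * X.T @ err / len(y)
        head -= lr * err.mean()
    for _ in range(head_steps):         # head-only fine-tuning
        err = X @ body + head - y
        head -= lr * err.mean()
    return body, head

# Two clients with shifted label distributions to mimic heterogeneity:
# both share the same true body (all-ones weights) but differ in offset.
clients = []
for offset in (-1.0, 1.0):
    X = rng.normal(size=(40, D))
    y = X @ np.ones(D) + offset
    clients.append((X, y))

body = np.zeros(D)
global_head = 0.0
for _ in range(30):                     # communication rounds
    bodies, heads = [], []
    for X, y in clients:
        b, h = local_round(body, global_head, X, y)
        bodies.append(b)
        heads.append(h)
    body = np.mean(bodies, axis=0)          # FedAvg on the shared body
    global_head = float(np.mean(heads))     # aggregate personalized heads
                                            # into a generalized head

print(np.round(body, 2))      # converges near the shared weights [1, 1, 1]
print(round(global_head, 2))  # near 0, the mean of the client offsets ±1
```

The point of the sketch is the dual outcome the abstract claims: each client keeps a head fitted to its own offset (personalization), while the server's averaged head plus shared body forms a sensible global model (generalization).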

History

Journal

IEEE Internet of Things Journal

Volume

10

Issue

14

Start page

12749

End page

12761

Total pages

13

Publisher

IEEE

Place published

United States

Language

English

Copyright

© 2023 IEEE

Former Identifier

2006123133

Esploro creation date

2023-10-13
