Improving the Learning Performance of Client’s Local Distribution in Cyclic Federated Learning


  • Li Kang
  • Bin Luo, Shenzhen University
  • Jianjun Huang



Keywords: federated learning, medical image processing, transfer learning


Cyclic federated learning based on distribution information sharing and knowledge distillation (CFL_DS_KD) aims to address non-iid data distributions while reducing communication requirements. However, when client data are extremely heterogeneous and scarce, clients struggle to fully learn their local data distributions with GANs, which degrades the overall model performance. To overcome this limitation, we propose a transfer learning approach in which clients first pretrain their generators on a source domain and then fine-tune them on their local datasets. Our results on Alzheimer’s disease classification demonstrate that this method effectively improves the clients’ distribution learning and enhances the overall model performance.
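The pretrain-then-fine-tune idea can be sketched in a few lines. Everything below is illustrative: a hypothetical one-parameter generator and a simple moment-matching loss stand in for actual adversarial GAN training, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

def moment_match_train(w, b, data, steps, lr):
    """Toy 'generator' g(z) = w*z + b, trained to match the mean and std
    of `data`. A stand-in for adversarial GAN training, kept deliberately
    simple so the sketch stays self-contained."""
    for _ in range(steps):
        z = rng.standard_normal(512)
        fake = w * z + b
        d_mean = fake.mean() - data.mean()
        d_std = fake.std() - data.std()
        # Analytic gradients of the squared moment gaps:
        # mean(fake) = w*mean(z) + b, std(fake) = |w|*std(z).
        grad_w = 2 * d_mean * z.mean() + 2 * d_std * z.std() * np.sign(w)
        grad_b = 2 * d_mean
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Abundant source-domain data vs. scarce, shifted local client data.
source = rng.normal(5.0, 2.0, 5000)   # hypothetical source domain
local = rng.normal(8.0, 1.0, 20)      # only 20 local samples

# 1) Pretrain the generator on the source domain.
w, b = moment_match_train(1.0, 0.0, source, steps=300, lr=0.05)

# 2) Fine-tune on the scarce local data, starting from the pretrained weights.
w_ft, b_ft = moment_match_train(w, b, local, steps=100, lr=0.05)

samples = w_ft * rng.standard_normal(100_000) + b_ft
print(samples.mean(), samples.std())  # close to the local data's moments
```

In practice the generator would be a deep network trained adversarially; the point carried over is only the workflow, in which fine-tuning on the scarce local data starts from source-domain weights rather than from scratch.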


da Nóbrega RVM, Rebouças Filho PP, Rodrigues MB, da Silva SPP, Dourado Júnior CMJM, de Albuquerque VHC (2020). Lung nodule malignancy classification in chest computed tomography images using transfer learning and convolutional neural networks. Neural Comput Appl 32: 11065-82.

Gessert N, Bengs M, Wittig L, Drömann D, Keck T, Schlaefer A, Ellebrecht DB (2019). Deep transfer learning methods for colon cancer classification in confocal laser microscopy images. Int J Comput Ass Rad 14: 1837–45.

Hassan M, Ali S, Alquhayz H, Safdar K (2020). Developing intelligent medical image modality classification system using deep transfer learning and LDA. Sci Rep 10: 12868.

Hsu TMH, Qi H, Brown M (2019). Measuring the effects of non-identical data distribution for federated visual classification. arXiv preprint arXiv:1909.06335.

Karimireddy SP, Kale S, Mohri M, Reddi S, Stich S, Suresh AT (2020). Scaffold: Stochastic controlled averaging for federated learning. International Conference on Machine Learning: 5132-43.

Li T, Sahu AK, Zaheer M, Sanjabi M, Talwalkar A, Smith V (2020). Federated optimization in heterogeneous networks. MLSys 2: 429-50.

McMahan HB, Moore E, Ramage D, Hampson S, Agüera y Arcas B (2017). Communication-Efficient Learning of Deep Networks from Decentralized Data. Artificial Intelligence and Statistics: 1273-82.

Miotto R, Wang F, Wang S, Jiang X, Dudley JT (2017). Deep learning for healthcare: review, opportunities and challenges. Brief Bioinform 19: 1236-46.

Pan S, Yang Q (2010). A Survey on Transfer Learning. IEEE T Knowl Data En 22: 1345-59.

Shoham A, Avidor T, Keren A, Israel N, Benditkis D, Mor-Yosef L, Zeitak I (2019). Overcoming forgetting in federated learning on non-iid data. 31st Conference on Neural Information Processing Systems.

Swati ZNK, Zhao Q, Kabir M, Ali F, Ali Z, Ahmed S, Lu J (2019). Brain tumor classification for MR images using transfer learning and fine-tuning. Comput Med Imag Grap 75: 34-46.

Wang H, Yurochkin M, Sun Y, Papailiopoulos D, Khazaeni Y (2020). Federated learning with matched averaging. International Conference on Learning Representations 2020.

Wang J, Liu Q, Liang H, Joshi G, Poor HV (2020). Tackling the objective inconsistency problem in heterogeneous federated optimization. 34th Conference on Neural Information Processing Systems 33: 7611-23.

Xiao J, Du C, Duan Z, Guo W (2021). A novel server-side aggregation strategy for federated learning in non-iid situations. 20th International Symposium on Parallel and Distributed Computing (ISPDC): 17-24.

Yang Q, Liu Y, Chen T, Tong Y (2019). Federated Machine Learning: Concept and Applications. ACM T Intel Syst Tec 10: 1-19.

Yu L, Huang J (2022). Cyclic Federated Learning Method Based on Distribution Information Sharing and Knowledge Distillation for Medical Data. Electronics 11: 4039.




How to Cite

Kang, L., Luo, B., & Huang, J. (2024). Improving the Learning Performance of Client’s Local Distribution in Cyclic Federated Learning. Image Analysis and Stereology, 43(1), 1–8.



Original Research Paper