Electrical and Computer Engineering ETDs

Publication Date

Fall 12-13-2025

Abstract

As machine learning adoption expands, data privacy concerns have grown significantly. Federated Learning (FL) addresses this challenge by allowing clients to train locally and share only their model parameters with the server. However, conventional FL methods such as FedAvg suffer from high communication overhead, as all clients must participate in every global round.

This thesis proposes FedChae (Federated Learning with Client Clustering and Hybrid Adaptive Engagement) to balance communication efficiency and model accuracy. FedChae alternates between Grouping Rounds, in which all clients participate in clustering, and Conventional Rounds, in which only one representative client per cluster updates the model.

Simulations with 100 clients training multi-class logistic regression models on the MNIST dataset show that FedChae maintains accuracy within ±0.38% of FedAvg while reducing communication by up to 77%. These results demonstrate that clustering and adaptive engagement can significantly lower communication costs without degrading global accuracy.
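The communication savings from the alternating scheme can be illustrated with a small round scheduler. This is a minimal sketch: the function name, parameter values, and the shuffle-based "clustering" step are illustrative assumptions, not the thesis's actual implementation.

```python
import random

def schedule_rounds(num_clients=100, num_clusters=10,
                    total_rounds=20, grouping_every=5):
    """Count client-to-server uploads under a FedChae-style schedule
    (hypothetical parameters). Every `grouping_every`-th round is a
    Grouping Round, where all clients participate and clusters are
    (re)formed; the remaining rounds are Conventional Rounds, where
    only one representative per cluster uploads an update."""
    clients = list(range(num_clients))
    uploads = 0
    for r in range(total_rounds):
        if r % grouping_every == 0:
            # Grouping Round: every client communicates with the server.
            uploads += num_clients
            # Stand-in for real clustering: random partition into clusters.
            random.shuffle(clients)
            clusters = [clients[i::num_clusters] for i in range(num_clusters)]
        else:
            # Conventional Round: one representative per cluster uploads.
            uploads += num_clusters
    # Baseline FedAvg: all clients upload in every round.
    fedavg_uploads = num_clients * total_rounds
    return uploads, fedavg_uploads

u, f = schedule_rounds()
print(u, f, 1 - u / f)  # FedChae uploads, FedAvg uploads, fraction saved
```

With these example settings (4 Grouping Rounds of 100 uploads plus 16 Conventional Rounds of 10 uploads, versus 2,000 uploads for FedAvg), the schedule cuts communication by roughly 70%; the exact saving depends on how often Grouping Rounds occur and on the cluster count.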

Keywords

Federated Learning, Client Clustering, Communication Efficiency, Non-IID Distribution, Representative Selection, FedChae

Document Type

Thesis

Language

English

Degree Name

Computer Engineering

Level of Degree

Masters

Department Name

Electrical and Computer Engineering

First Committee Member (Chair)

Michael Devetsikiotis

Second Committee Member

Hyunsang Son

Third Committee Member

Milad Marvian
