AUB ScholarWorks

FedSAM: Sharpness-Aware Minimization for Improved Generalization Under FL Settings

dc.contributor.advisor Nouiehed, Maher
dc.contributor.author Al Kakoun, Razan
dc.date.accessioned 2024-05-08T08:17:04Z
dc.date.available 2024-05-08T08:17:04Z
dc.date.issued 2024-05-08
dc.date.submitted 2024-05-03
dc.identifier.uri http://hdl.handle.net/10938/24412
dc.description.abstract While extensively studied in the machine learning community, the problem of improving generalization in Federated Learning (FL) is still in its infancy. The main challenge stems from the heterogeneous nature of client data and the varying computational capacity of clients. Many researchers have recently linked the generalization gap to the sharpness of the loss landscape of the optimization model. Foret et al., Sun et al., and Qu et al. introduced a Sharpness-Aware Minimization (SAM) framework that seeks flat minima by penalizing sharp regions. In this thesis, we propose a SAM-like approach for improving generalization in FL settings. Unlike several existing methods that incorporate SAM when training local models, our proposed framework penalizes the loss of the global function. To motivate our approach, we first provide a counterexample showing that finding flat minima for local clients does not necessarily result in a flat aggregation for the global model. Furthermore, we develop an efficient sharpness-aware algorithm that adaptively computes global gradient similarity parameters for penalizing sharp regions. Harnessing these similarity parameters, the server shares a distinct sharpness penalty parameter with each client. In particular, clients with different local data distributions receive different penalty terms. We mathematically establish the convergence of the proposed algorithm. To demonstrate the efficiency of our algorithm, we perform several experiments on the MNIST, FMNIST, and CIFAR datasets. Our results show a significant improvement in generalization performance compared to existing approaches.
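For reference, below is a minimal sketch of the SAM update underlying this line of work, assuming the standard two-step approximation from Foret et al. of the objective min_w max_{||eps|| <= rho} L(w + eps): perturb the weights along the normalized ascent direction, then descend using the gradient evaluated at the perturbed point. The similarity-based rule for per-client penalty radii is a hypothetical illustration of the client-specific penalties described in the abstract, not the thesis algorithm itself; the names grad_fn, rho_base, and assign_rhos are introduced here purely for illustration.

    import numpy as np

    def sam_step(w, grad_fn, rho, lr):
        # One SAM update: eps = rho * g / ||g||, then w <- w - lr * grad L(w + eps).
        g = grad_fn(w)                                # gradient at the current weights
        eps = rho * g / (np.linalg.norm(g) + 1e-12)   # ascent perturbation toward the sharp direction
        return w - lr * grad_fn(w + eps)              # descend using the gradient at the perturbed point

    def assign_rhos(client_grads, global_grad, rho_base):
        # Hypothetical rule: clients whose gradients disagree more with the
        # global gradient receive a larger sharpness penalty radius.
        gg = np.linalg.norm(global_grad)
        rhos = []
        for g in client_grads:
            cos = float(g @ global_grad) / (np.linalg.norm(g) * gg + 1e-12)
            rhos.append(rho_base * (1.0 - cos))       # low cosine similarity -> larger rho
        return rhos

In this sketch, grad_fn would return the gradient of a client's local loss, so repeated calls to sam_step implement SAM-style local training; in a federated round, the server would compute the per-client radii from uploaded gradients and broadcast them alongside the global model.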
dc.language.iso en
dc.subject Federated Learning
dc.subject Sharpness-Aware Minimization
dc.subject Generalization
dc.subject SAM
dc.subject FedSAM
dc.title FedSAM: Sharpness-Aware Minimization for Improved Generalization Under FL Settings
dc.type Thesis
dc.contributor.department Graduate Program in Computational Science
dc.contributor.faculty Faculty of Arts and Sciences
dc.contributor.commembers Nassif, Nabil
dc.contributor.commembers Maddah, Bacel
dc.contributor.degree MS
dc.contributor.AUBidnumber 201907039

