Fast converging Federated Learning with Non-IID Data

dc.contributor: Aalto-yliopisto [fi]
dc.contributor: Aalto University [en]
dc.contributor.author: Naas, Si Ahmed [en_US]
dc.contributor.author: Sigg, Stephan [en_US]
dc.contributor.department: Department of Communications and Networking [en]
dc.contributor.department: Department of Information and Communications Engineering [en]
dc.contributor.groupauthor: Ambient Intelligence [en]
dc.date.accessioned: 2024-01-17T08:08:15Z
dc.date.available: 2024-01-17T08:08:15Z
dc.date.issued: 2023 [en_US]
dc.description: Publisher Copyright: © 2023 IEEE.
dc.description.abstract: With the advancement of device capabilities, Internet of Things (IoT) devices can employ built-in hardware to perform machine learning (ML) tasks, extending their horizons in many promising directions. In traditional ML, data are sent to a server for training; however, this approach raises user privacy concerns, and transferring user data to a cloud-centric environment increases latency. Federated learning (FL), a decentralized ML technique, has been proposed to enable devices to train locally on personal data and then send the resulting model updates to a server for aggregation. In such settings, malicious devices, or devices that contribute little to the global model, increase the number of communication rounds and the resource usage. Likewise, heterogeneous data, such as non-independent and identically distributed (non-IID) data, may decrease the accuracy of the FL model. This paper proposes a mechanism to quantify device contributions based on weight divergence, together with an outlier-removal approach that identifies irrelevant device updates. Client selection probabilities are computed using a Bayesian model. To obtain a global model, we employ a novel merging algorithm that utilizes weight-shifting values to ensure convergence towards more accurate predictions. A simulation using the MNIST dataset, with both non-IID and IID data distributed across 10 Jetson Nano devices, shows that our approach converges faster, significantly reduces communication cost, and improves accuracy. [en]
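The abstract mentions quantifying device contributions via weight divergence and removing outlier updates, but the record contains none of the paper's formulas. As an illustration only, one plausible reading is sketched below: an L2 weight-divergence score between each client update and the global model, with outliers flagged by a median-absolute-deviation threshold. The function names and the MAD-based cutoff are assumptions for this sketch, not the authors' method.

```python
import numpy as np

def weight_divergence(client_weights, global_weights):
    """L2 distance between a client's flattened weights and the global model's.

    Illustrative score only; the paper's exact divergence measure is not
    given in this record.
    """
    c = np.concatenate([w.ravel() for w in client_weights])
    g = np.concatenate([w.ravel() for w in global_weights])
    return float(np.linalg.norm(c - g))

def filter_outliers(divergences, k=2.0):
    """Return indices of clients to keep, dropping those whose divergence
    deviates from the median by more than k median absolute deviations
    (an assumed outlier rule, not necessarily the one used in the paper)."""
    d = np.asarray(divergences, dtype=float)
    med = np.median(d)
    mad = np.median(np.abs(d - med)) or 1e-12  # guard against zero MAD
    keep = np.abs(d - med) / mad <= k
    return [i for i, ok in enumerate(keep) if ok]
```

For example, `filter_outliers([0.9, 1.0, 1.1, 9.0])` keeps the first three clients and drops the fourth, whose update diverges far more than the rest.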
dc.description.version: Peer reviewed [en]
dc.format.mimetype: application/pdf [en_US]
dc.identifier.citation: Naas, S A & Sigg, S 2023, Fast converging Federated Learning with Non-IID Data. in 2023 IEEE 97th Vehicular Technology Conference, VTC 2023-Spring - Proceedings. IEEE Vehicular Technology Conference, vol. 2023-June, IEEE, IEEE Vehicular Technology Conference, Florence, Italy, 20/06/2023. https://doi.org/10.1109/VTC2023-Spring57618.2023.10200108 [en]
dc.identifier.doi: 10.1109/VTC2023-Spring57618.2023.10200108 [en_US]
dc.identifier.isbn: 979-8-3503-1114-3
dc.identifier.issn: 1550-2252
dc.identifier.other: PURE UUID: 09b0b96a-3847-4d49-a2dd-654d2f07ae8b [en_US]
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/09b0b96a-3847-4d49-a2dd-654d2f07ae8b [en_US]
dc.identifier.other: PURE LINK: http://www.scopus.com/inward/record.url?scp=85169783103&partnerID=8YFLogxK
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/134171980/Fast_converging_Federated_Learning_with_Non-IID_Data_final.pdf [en_US]
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/125750
dc.identifier.urn: URN:NBN:fi:aalto-202401171425
dc.language.iso: en [en]
dc.relation.ispartof: IEEE Vehicular Technology Conference [en]
dc.relation.ispartofseries: 2023 IEEE 97th Vehicular Technology Conference, VTC 2023-Spring - Proceedings [en]
dc.relation.ispartofseries: IEEE Vehicular Technology Conference ; Volume 2023-June [en]
dc.rights: openAccess [en]
dc.subject.keyword: communication reduction [en_US]
dc.subject.keyword: edge computing [en_US]
dc.subject.keyword: Federated learning [en_US]
dc.subject.keyword: fog networks [en_US]
dc.subject.keyword: Internet of things [en_US]
dc.subject.keyword: non-IID data [en_US]
dc.title: Fast converging Federated Learning with Non-IID Data [en]
dc.type: A4 Artikkeli konferenssijulkaisussa (A4 Article in conference proceedings) [fi]
dc.type.version: acceptedVersion