Enhancing trash classification in smart cities using federated deep learning

Bibliographic Details
Published in: Scientific Reports, Vol. 14, No. 1, p. 11816
Main Authors: Ahmed Khan, Haroon; Naqvi, Syed Saud; Alharbi, Abeer A. K.; Alotaibi, Salihah; Alkhathami, Mohammed
Format: Journal Article
Language: English
Published: London: Nature Publishing Group UK (Nature Portfolio), 23.05.2024

Summary: Efficient waste management plays a crucial role in ensuring a clean and green environment in smart cities. This study investigates the critical role of efficient trash classification in achieving sustainable solid waste management within smart city environments. We conduct a comparative analysis of various trash classification methods utilizing deep learning models built on convolutional neural networks (CNNs). Leveraging the PyTorch open-source framework and the TrashBox dataset, we perform experiments involving ten unique deep neural network models. Our approach aims to maximize training accuracy. Through extensive experimentation, we observe the consistent superiority of the ResNext-101 model over the others, achieving exceptional training, validation, and test accuracies. These findings illuminate the potential of CNN-based techniques in significantly advancing trash classification for optimized solid waste management within smart city initiatives. Lastly, this study presents a distributed framework based on federated learning that can be used to optimize the performance of a combination of CNN models for trash detection.
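
The pipeline described in the abstract (fine-tuning ImageNet-pretrained CNNs in PyTorch on TrashBox, with ResNext-101 as the strongest model) can be illustrated with a minimal sketch. The class count, dataset path, and hyperparameters below are illustrative assumptions, not values reported in the paper:

    import torch
    import torch.nn as nn
    from torch.utils.data import DataLoader
    from torchvision import datasets, models, transforms

    NUM_CLASSES = 7  # assumed number of TrashBox categories; adjust to the split actually used

    # Load an ImageNet-pretrained ResNeXt-101 backbone and replace its classifier head.
    model = models.resnext101_32x8d(weights=models.ResNeXt101_32X8D_Weights.DEFAULT)
    model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

    # Standard preprocessing for ImageNet-pretrained backbones.
    preprocess = transforms.Compose([
        transforms.Resize(256),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
    ])

    # "trashbox/train" is a placeholder path to an ImageFolder-style copy of the dataset.
    train_set = datasets.ImageFolder("trashbox/train", transform=preprocess)
    train_loader = DataLoader(train_set, batch_size=32, shuffle=True)

    device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
    model.to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)
    criterion = nn.CrossEntropyLoss()

    model.train()
    for images, labels in train_loader:  # one epoch; repeat and validate in practice
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

The federated framework mentioned in the last sentence is not detailed in this record; a common baseline it could resemble is FedAvg-style weight averaging, sketched below under that assumption. The client models, sample counts, and aggregation schedule are hypothetical:

    import copy

    def federated_average(client_models, client_sizes):
        """Average client parameters, weighted by local dataset size (FedAvg-style)."""
        total = float(sum(client_sizes))
        averaged = copy.deepcopy(client_models[0].state_dict())
        for key in averaged:
            averaged[key] = sum(
                m.state_dict()[key].float() * (n / total)
                for m, n in zip(client_models, client_sizes)
            )
        return averaged

    # Each round, clients train locally on their own trash images; the server then
    # aggregates the weights and redistributes the result:
    # global_model.load_state_dict(federated_average(client_models, client_sizes))
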
ISSN: 2045-2322
DOI: 10.1038/s41598-024-62003-4