Information Bottleneck Classification in Extremely Distributed Systems

Bibliographic Details
Published in: Entropy (Basel, Switzerland), Vol. 22, no. 11, p. 1237
Main Authors: Ullmann, Denis; Rezaeifar, Shideh; Taran, Olga; Holotyak, Taras; Panos, Brandon; Voloshynovskiy, Slava
Format: Journal Article
Language: English
Published: Switzerland, MDPI AG, 30.10.2020
Summary: We present a new decentralized classification system based on a distributed architecture. This system consists of distributed nodes, each possessing its own dataset and computing module, along with a centralized server, which provides probes for classification and aggregates the responses of the nodes for a final decision. Each node, with access to its own training dataset of a given class, is trained based on an auto-encoder system consisting of a fixed encoder, a pre-trained quantizer, and a class-dependent decoder. Hence, these auto-encoders are highly dependent on the class probability distribution for which the reconstruction distortion is minimized. Conversely, when an encoding-quantizing-decoding node observes data from a different distribution, unseen at training, there is a mismatch, and such decoding is not optimal, leading to a significant increase in the reconstruction distortion. The final classification is performed at the centralized classifier, which votes for the class with the minimum reconstruction distortion. In addition to the system's applicability to applications facing big-data communication problems and/or requiring private classification, the above distributed scheme creates a theoretical bridge to the information bottleneck principle. The proposed system demonstrates very promising performance on basic datasets such as MNIST and FashionMNIST.
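
To make the decision rule concrete, the following is a minimal sketch, in PyTorch, of classification by minimum reconstruction distortion across per-class auto-encoder nodes. The Node class, its layer sizes, the frozen random-projection encoder, and the one-bit sign quantizer are illustrative assumptions standing in for the paper's components; only the encode-quantize-decode node structure and the minimum-distortion vote at the central server come from the summary above.

import torch
import torch.nn as nn


class Node(nn.Module):
    """One per-class node: fixed encoder -> quantizer -> class-dependent decoder.

    Layer sizes and the sign quantizer are illustrative choices, not the
    authors' configuration.
    """

    def __init__(self, dim_in: int = 784, dim_code: int = 32):
        super().__init__()
        # Fixed, data-independent encoder (a frozen random projection here).
        self.encoder = nn.Linear(dim_in, dim_code)
        for p in self.encoder.parameters():
            p.requires_grad = False
        # Class-dependent decoder, trained only on this node's class data.
        self.decoder = nn.Sequential(
            nn.Linear(dim_code, 256), nn.ReLU(), nn.Linear(256, dim_in)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.encoder(x)
        # Crude 1-bit quantizer as a stand-in for the pre-trained quantizer;
        # the encoder is frozen, so blocking gradients here is harmless.
        z_q = torch.sign(z)
        return self.decoder(z_q)


def classify(nodes: list[Node], probe: torch.Tensor) -> int:
    """Central server: vote for the class whose node reconstructs the probe
    with minimum mean-squared distortion."""
    with torch.no_grad():
        distortions = [
            torch.mean((node(probe) - probe) ** 2).item() for node in nodes
        ]
    return min(range(len(nodes)), key=distortions.__getitem__)


if __name__ == "__main__":
    nodes = [Node() for _ in range(10)]  # e.g. one node per MNIST class
    probe = torch.randn(1, 784)          # placeholder probe from the server
    print("predicted class:", classify(nodes, probe))

In a deployment matching the summary, each Node would live on a separate machine holding one class's data, and only the scalar distortions would travel back to the server, which is what makes the scheme attractive for big-data communication and private classification.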
ISSN:1099-4300
DOI:10.3390/e22111237