SENetCount: An Optimized Encoder-Decoder Architecture with Squeeze-and-Excitation for Crowd Counting
Published in | Wireless Communications and Mobile Computing, Vol. 2022, pp. 1-13
---|---
Main Author |
Format | Journal Article
Language | English
Published | Oxford: Hindawi (Hindawi Limited), 20.06.2022
Subjects |
Summary | Crowd management is critical to preventing stampedes and directing crowds, especially in India and China, each home to more than one billion people. As populations continue to grow, crowded events caused by rallies, parades, tourism, and other gatherings occur frequently. Crowd count estimation is the linchpin of any crowd management system and has become an increasingly important task and a challenging research direction. This work proposes an optimized encoder-decoder architecture with squeeze-and-excitation blocks for crowd counting, called SENetCount, which includes two variants: SE-ResNetCount and SE-ResNeXtCount. The deeper and stronger backbone network improves the quality of feature representations. The squeeze-and-excitation block exploits global information to selectively emphasize informative feature representations and suppress less useful ones. The encoder-decoder architecture with a dense atrous spatial pyramid pooling module recovers spatial information and captures contextual information at multiple scales. The modified loss function adds a local consistency measure to the conventional Euclidean loss. Experiments on challenging datasets show that the approach is competitive with existing state-of-the-art methods, and further analysis shows that the architecture is extensible and robust.
---|---
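The abstract describes squeeze-and-excitation recalibration but gives no code. Below is a minimal sketch of the standard SE operation the name refers to (squeeze by global average pooling, excitation through a bottleneck with ReLU and sigmoid, then channel-wise rescaling). NumPy stands in for a deep-learning framework, and the randomly initialized weights `w1`/`w2` are illustrative placeholders, not the paper's trained parameters:

```python
import numpy as np

def se_block(x, reduction=16, w1=None, w2=None):
    """Squeeze-and-excitation recalibration of a feature map x of shape (C, H, W).

    Hedged sketch: real SE blocks learn w1/w2 during training; here they are
    randomly initialized for illustration only.
    """
    c, h, w = x.shape
    # Squeeze: global average pooling -> one descriptor per channel, shape (C,)
    z = x.mean(axis=(1, 2))
    # Excitation: bottleneck FC (C -> C//reduction -> C) with ReLU then sigmoid
    hidden = max(c // reduction, 1)
    rng = np.random.default_rng(0)
    if w1 is None:
        w1 = rng.standard_normal((hidden, c)) * 0.1
    if w2 is None:
        w2 = rng.standard_normal((c, hidden)) * 0.1
    s = np.maximum(w1 @ z, 0.0)            # ReLU
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))    # sigmoid gates, each in (0, 1)
    # Scale: reweight each channel of the input by its gate
    return x * s[:, None, None]
```

Because each gate lies strictly between 0 and 1, the block can only attenuate channels relative to the input, which is what "suppressing less useful feature representations" means in practice.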
ISSN: | 1530-8669 1530-8677 |
DOI: | 10.1155/2022/2964683 |
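The summary mentions a loss that augments the usual Euclidean (pixel-wise MSE) term with a local consistency measure, but the record does not state the exact formulation. A common instantiation in the crowd-counting literature pairs MSE with an SSIM-style structural term computed over local patches; the sketch below assumes that form, with the patch size, SSIM constants, and weight `alpha` chosen purely for illustration:

```python
import numpy as np

def euclidean_loss(pred, gt):
    """Pixel-wise MSE between predicted and ground-truth density maps."""
    return float(np.mean((pred - gt) ** 2))

def local_consistency_loss(pred, gt, patch=4, c1=1e-4, c2=9e-4):
    """SSIM-style local pattern consistency over non-overlapping patches.

    Hedged sketch: the paper's exact local consistency measure may differ;
    c1/c2 are standard SSIM stabilizing constants.
    """
    h, w = pred.shape
    scores = []
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            p = pred[i:i + patch, j:j + patch]
            g = gt[i:i + patch, j:j + patch]
            mp, mg = p.mean(), g.mean()
            vp, vg = p.var(), g.var()
            cov = ((p - mp) * (g - mg)).mean()
            ssim = ((2 * mp * mg + c1) * (2 * cov + c2)) / \
                   ((mp ** 2 + mg ** 2 + c1) * (vp + vg + c2))
            scores.append(ssim)
    # Perfect structural agreement gives SSIM = 1, hence zero loss
    return 1.0 - float(np.mean(scores))

def combined_loss(pred, gt, alpha=0.001):
    """Euclidean term plus a weighted local consistency term (alpha assumed)."""
    return euclidean_loss(pred, gt) + alpha * local_consistency_loss(pred, gt)
```

The local term penalizes predictions whose local density patterns diverge from the ground truth even when their global pixel-wise error is small, which is the stated motivation for moving beyond the plain Euclidean loss.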