RoadNet: An 80-mW Hardware Accelerator for Road Detection

Bibliographic Details
Published in: IEEE Embedded Systems Letters, Vol. 11, No. 1, pp. 21-24
Main Authors: Zhou, Yuteng; Lyu, Yecheng; Huang, Xinming
Format: Journal Article
Language: English
Published: Piscataway: The Institute of Electrical and Electronics Engineers, Inc. (IEEE), 01.03.2019
Summary: As a fundamental feature of intelligent vehicles, vision-based road detection must be executed on a real-time embedded platform with high accuracy. Road detection is often applied in conjunction with lane detection to determine the drivable regions. Although some existing approaches based on large deep learning models have achieved high accuracy on the road detection dataset, they often did not consider the low-power requirement of a typical embedded system. In this letter, an ultralow-power hardware accelerator for road detection is proposed. By adopting a top-down convolutional neural network (CNN) structure, a small CNN, namely RoadNet, is trained that achieves near state-of-the-art detection accuracy. Furthermore, each CNN layer is trimmed to be computationally identical, and every processing element in the architecture is fully utilized. When implemented in a 32-nm process technology, the proposed hardware accelerator requires a chip area of 0.45 mm² and consumes only 80 mW of power, which corresponds to an equivalent power efficiency of about 300 GOP/s/W. The RoadNet chip is capable of processing 241 frames/s at 1080p image resolution. It stands out as an ultralow-power hardware accelerator for road detection in an embedded system.
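
As a quick illustration of the figures quoted in the summary (a minimal back-of-the-envelope sketch; the 80-mW power, ~300-GOP/s/W efficiency, and 241-frames/s rate are taken from the abstract, and all variable names are illustrative, not from the paper), the implied compute throughput and per-frame operation count can be checked with a few lines of Python:

    # Sanity check of the throughput implied by the abstract's figures.
    # Assumed inputs (quoted in the abstract): 80 mW power consumption,
    # ~300 GOP/s/W equivalent power efficiency, 241 frames/s at 1080p.
    power_w = 0.080                # chip power consumption in watts
    efficiency_gops_per_w = 300.0  # quoted equivalent power efficiency
    frames_per_s = 241             # quoted 1080p frame rate

    throughput_gops = power_w * efficiency_gops_per_w   # total throughput in GOP/s
    gop_per_frame = throughput_gops / frames_per_s      # compute budget per frame

    print(f"Implied throughput: {throughput_gops:.1f} GOP/s")
    print(f"Implied compute per frame: {gop_per_frame * 1e3:.0f} MOP/frame")

Under these assumptions, the quoted numbers imply roughly 24 GOP/s of sustained throughput, or on the order of 100 MOP per 1080p frame.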
ISSN: 1943-0663, 1943-0671
DOI: 10.1109/LES.2018.2841199