Robust Lane Detection Based On Convolutional Neural Network and Random Sample Consensus

Bibliographic Details
Published in: Neural Information Processing, pp. 454-461
Main Authors: Kim, Jihun; Lee, Minho
Format: Book Chapter
Language: English
Published: Cham: Springer International Publishing, 2014
Series: Lecture Notes in Computer Science

More Information
Summary: In this paper, we introduce a robust lane detection method that combines a convolutional neural network (CNN) with the random sample consensus (RANSAC) algorithm. First, we compute edges in an image using a hat-shaped kernel, and then detect lanes using the CNN combined with RANSAC. If the road scene is simple, the lanes can be detected using the RANSAC algorithm alone. But if the road scene is complex and includes roadside trees, fences, or intersections, the resulting noisy edges make robust lane detection difficult. To alleviate this problem, we use the CNN in the lane detection process both before and after applying the RANSAC algorithm. In the CNN training process, the input data consist of edge images within a region of interest (ROI), and the target data are images in which only the true lane markings are drawn in white on a black background. The CNN structure consists of 8 layers: 3 convolutional layers, 2 subsampling layers, and a multi-layer perceptron (MLP) comprising 3 fully connected layers. The convolutional and subsampling layers are arranged hierarchically, and this arrangement constitutes a deep structure in the sense of deep learning. As a result, the proposed lane detection algorithm successfully eliminates noise lines, and its performance is found to be better than that of conventional line detection algorithms such as RANSAC alone and the Hough transform.
ISBN: 3319126369; 9783319126364
ISSN: 0302-9743; 1611-3349
DOI: 10.1007/978-3-319-12637-1_57
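The summary above describes two classical stages around the CNN: edge extraction with a hat-shaped kernel and line fitting with RANSAC. The following is a minimal Python/NumPy sketch of those two stages, not the authors' implementation; the kernel half-width, the response threshold, and the RANSAC iteration count and inlier tolerance are illustrative assumptions.

```python
import numpy as np

def hat_kernel_edges(gray, half_width=4, thresh=200.0):
    """Row-wise 'hat'-shaped filter: a positive center flanked by
    negative wings responds to bright lane markings on darker road.
    The kernel sums to zero, so uniform road gives no response."""
    w = half_width
    kernel = np.concatenate([-np.ones(w), 2.0 * np.ones(w), -np.ones(w)])
    resp = np.apply_along_axis(
        lambda row: np.convolve(row, kernel, mode="same"),
        axis=1, arr=gray.astype(np.float32))
    return resp > thresh  # boolean edge mask

def ransac_line(points, n_iter=200, inlier_tol=2.0, seed=None):
    """Fit the line x = a*y + b to edge points with RANSAC.
    Lanes are near-vertical in the ROI, so we regress x on y.
    `points` is an (N, 2) array of (y, x) coordinates, e.g. from
    np.argwhere(edge_mask). Returns the (a, b) with most inliers."""
    rng = np.random.default_rng(seed)
    ys = points[:, 0].astype(np.float64)
    xs = points[:, 1].astype(np.float64)
    best_model, best_inliers = None, 0
    for _ in range(n_iter):
        i, j = rng.choice(len(points), size=2, replace=False)
        if ys[i] == ys[j]:
            continue  # degenerate sample: cannot define x = a*y + b
        a = (xs[i] - xs[j]) / (ys[i] - ys[j])
        b = xs[i] - a * ys[i]
        inliers = int(np.sum(np.abs(xs - (a * ys + b)) < inlier_tol))
        if inliers > best_inliers:
            best_model, best_inliers = (a, b), inliers
    return best_model

# Usage on a grayscale ROI:
#   edges = hat_kernel_edges(roi)
#   model = ransac_line(np.argwhere(edges))
```

The abstract also fixes the CNN topology (8 layers: 3 convolutional, 2 subsampling, and a 3-layer MLP) but not its channel counts or the ROI resolution. Below is a hedged PyTorch sketch of that topology; the 64x192 input size, the channel widths, and the sigmoid lane-mask output are assumptions for illustration.

```python
import torch.nn as nn

class LaneCNN(nn.Module):
    """Sketch of the 8-layer structure from the abstract: 3 conv
    layers, 2 subsampling (max-pool) layers, and a 3-layer MLP that
    maps the features to a per-pixel lane mask. Sizes are assumed."""
    def __init__(self, h=64, w=192):
        super().__init__()
        self.h, self.w = h, w
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, 5, padding=2), nn.ReLU(),   # conv 1
            nn.MaxPool2d(2),                            # subsample 1
            nn.Conv2d(8, 16, 5, padding=2), nn.ReLU(),  # conv 2
            nn.MaxPool2d(2),                            # subsample 2
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), # conv 3
        )
        self.mlp = nn.Sequential(                       # 3-layer MLP
            nn.Flatten(),
            nn.Linear(32 * (h // 4) * (w // 4), 1024), nn.ReLU(),
            nn.Linear(1024, 512), nn.ReLU(),
            nn.Linear(512, h * w), nn.Sigmoid(),        # lane mask
        )

    def forward(self, x):  # x: (batch, 1, h, w) edge image
        return self.mlp(self.features(x)).view(-1, 1, self.h, self.w)
```

In this reading, the network is trained with edge-image inputs and white-lane-on-black targets, as the summary states, and RANSAC then fits lines to the cleaned mask.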