ScatterFormer: Efficient Voxel Transformer with Scattered Linear Attention
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 31.12.2023 |
Summary: | Window-based transformers excel in large-scale point cloud understanding by capturing context-aware representations with affordable attention computation in a more localized manner. However, the sparse nature of point clouds leads to a significant variance in the number of voxels per window. Existing methods group the voxels in each window into fixed-length sequences through extensive sorting and padding operations, resulting in non-negligible computational and memory overhead. In this paper, we introduce ScatterFormer, which, to the best of our knowledge, is the first to directly apply attention to voxels across different windows as a single sequence. The key to ScatterFormer is a Scattered Linear Attention (SLA) module, which leverages the pre-computation of key-value pairs in linear attention to enable parallel computation on the variable-length voxel sequences divided by windows. Leveraging the hierarchical structure of GPUs and shared memory, we propose a chunk-wise algorithm that reduces the SLA module's latency to less than 1 millisecond on moderate GPUs. Furthermore, we develop a cross-window interaction module that improves the locality and connectivity of voxel features across different windows, eliminating the need for extensive window shifting. Our proposed ScatterFormer demonstrates 73.8 mAP (L2) on the Waymo Open Dataset and 72.4 NDS on the NuScenes dataset, running at an outstanding detection rate of 23 FPS. The code is available at \href{https://github.com/skyhehe123/ScatterFormer}{https://github.com/skyhehe123/ScatterFormer}. |
DOI: | 10.48550/arxiv.2401.00912 |
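The SLA idea sketched in the summary follows from the factorization used by linear attention: each window's keys and values can be reduced to a single key-value matrix and a normalizer, after which every voxel attends within its window via two small matrix products. This lets voxels from all windows be processed as one flat sequence with scatter-style reductions, with no sorting or padding. Below is a minimal PyTorch sketch of that formulation; the function name, the `elu`-based feature map, and the tensor layout are illustrative assumptions, not the authors' chunk-wise GPU kernel.

```python
import torch
import torch.nn.functional as F

def scattered_linear_attention(q, k, v, win_id, num_windows):
    """Linear attention over variable-length, window-grouped voxels.

    q, k: (N, D) and v: (N, E) voxel features; win_id: (N,) window index
    of each voxel. Returns (N, E) attended features. Hypothetical sketch,
    not the paper's kernel.
    """
    # Positive feature map, as in standard linear attention.
    q, k = F.elu(q) + 1.0, F.elu(k) + 1.0

    # Per-window sums: KV_w = sum_j phi(k_j) v_j^T and Z_w = sum_j phi(k_j).
    d, e = q.size(1), v.size(1)
    kv = q.new_zeros(num_windows, d, e)
    z = q.new_zeros(num_windows, d)
    kv.index_add_(0, win_id, torch.einsum('nd,ne->nde', k, v))
    z.index_add_(0, win_id, k)

    # Each voxel reads its window's statistics:
    # o_i = phi(q_i) KV_w / (phi(q_i) . Z_w).
    num = torch.einsum('nd,nde->ne', q, kv[win_id])
    den = (q * z[win_id]).sum(-1, keepdim=True).clamp_min(1e-6)
    return num / den

# Example: 1000 voxels scattered over 50 windows of varying occupancy.
q = torch.randn(1000, 64)
k = torch.randn(1000, 64)
v = torch.randn(1000, 64)
win_id = torch.randint(0, 50, (1000,))
out = scattered_linear_attention(q, k, v, win_id, 50)  # (1000, 64)
```

Because the per-window reductions are plain scatter-adds, the cost stays linear in the number of voxels regardless of how unevenly they fill the windows, which is what makes the paper's chunk-wise, shared-memory scheduling of these sums effective on a GPU.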