Neural 3D Strokes: Creating Stylized 3D Scenes with Vectorized 3D Strokes
Main Authors | |
---|---|
Format | Journal Article |
Language | English |
Published | 27.11.2023 |
Summary: | We present Neural 3D Strokes, a novel technique to generate stylized images
of a 3D scene at arbitrary novel views from multi-view 2D images. Unlike
existing methods, which apply stylization to trained neural radiance fields
at the voxel level, our approach draws inspiration from image-to-painting
methods, simulating the progressive painting process of human artwork with
vector strokes. We develop a palette of stylized 3D strokes from basic
primitives and splines, and consider the 3D scene stylization task as a
multi-view reconstruction process based on these 3D stroke primitives. Instead
of directly searching for the parameters of these 3D strokes, which would be
too costly, we introduce a differentiable renderer that allows optimizing
stroke parameters using gradient descent, and propose a training scheme to
alleviate the vanishing gradient issue. Extensive evaluation demonstrates
that our approach effectively synthesizes 3D scenes with significant geometric
and aesthetic stylization while maintaining a consistent appearance across
different views. Our method can be further integrated with style loss and
image-text contrastive models to extend its applications, including color
transfer and text-driven 3D scene drawing. Results and code are available at
http://buaavrcg.github.io/Neural3DStrokes. |
---|---|
DOI: | 10.48550/arxiv.2311.15637 |
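The abstract's core idea is to represent a scene as a set of vectorized 3D strokes whose parameters are fitted to multi-view images through a differentiable renderer and gradient descent. The sketch below is a minimal toy illustration of that optimization loop, not the authors' implementation (their code is linked above): the sphere-shaped strokes, orthographic camera, soft occupancy-to-density model, and single target view are simplifying assumptions made purely for illustration.

```python
# Hypothetical sketch, not the authors' code: fit the parameters of a few
# sphere-shaped 3D "strokes" to a target image with a differentiable volume
# renderer and gradient descent, mirroring the optimization idea in the abstract.
import torch

H, W, S = 64, 64, 48  # image height/width, samples per ray

def make_rays():
    # Orthographic rays: origins on a z=1 plane over [-1, 1]^2, marching toward -z.
    ys, xs = torch.meshgrid(
        torch.linspace(-1, 1, H), torch.linspace(-1, 1, W), indexing="ij")
    origins = torch.stack([xs, ys, torch.ones_like(xs)], dim=-1)        # (H, W, 3)
    t = torch.linspace(0.0, 2.0, S)                                     # depths along the ray
    offsets = torch.stack([torch.zeros_like(t), torch.zeros_like(t), t], dim=-1)
    return origins[..., None, :] - offsets, t                           # (H, W, S, 3), (S,)

def render(pts, t, centers, radii, colors, sharp=20.0):
    # Soft sphere occupancy -> per-sample density and color, then alpha compositing.
    d = torch.linalg.norm(pts[..., None, :] - centers, dim=-1)          # (H, W, S, N)
    occ = torch.sigmoid(sharp * (radii - d))                            # soft inside/outside test
    density = occ.sum(-1).clamp(max=1.0)                                # (H, W, S)
    rgb = (occ[..., None] * colors).sum(-2) / (occ.sum(-1, keepdim=True) + 1e-6)
    alpha = 1.0 - torch.exp(-10.0 * density * (t[1] - t[0]))
    trans = torch.cumprod(
        torch.cat([torch.ones_like(alpha[..., :1]), 1.0 - alpha + 1e-10], -1), -1)[..., :-1]
    return ((alpha * trans)[..., None] * rgb).sum(-2)                   # (H, W, 3)

pts, t = make_rays()

# "Ground-truth" strokes exist only to fabricate a target image for this demo;
# in a real pipeline the targets would be the captured multi-view photographs.
gt_centers = torch.tensor([[-0.3, 0.0, 0.5], [0.4, 0.3, 0.2]])
gt_radii = torch.tensor([0.35, 0.25])
gt_colors = torch.tensor([[0.9, 0.2, 0.2], [0.2, 0.4, 0.9]])
target = render(pts, t, gt_centers, gt_radii, gt_colors).detach()

# Randomly initialized strokes whose parameters we optimize directly.
centers = (0.3 * torch.randn(2, 3) + torch.tensor([0.0, 0.0, 0.4])).requires_grad_()
radii = torch.full((2,), 0.3, requires_grad=True)
colors = torch.rand(2, 3, requires_grad=True)
opt = torch.optim.Adam([centers, radii, colors], lr=2e-2)

for step in range(301):
    opt.zero_grad()
    loss = torch.mean((render(pts, t, centers, radii, colors) - target) ** 2)
    loss.backward()
    opt.step()
    if step % 100 == 0:
        print(f"step {step:3d}  photometric loss {loss.item():.5f}")
```

Per the abstract, the actual method goes well beyond this toy: the stroke palette includes richer primitives and splines, many views are supervised jointly, a dedicated training scheme alleviates vanishing gradients, and style or image-text contrastive losses can be added on top of the reconstruction objective for color transfer and text-driven drawing.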