Panoptic segmentation forecasting for augmented reality


Bibliographic Details
Main Authors: SCHWING, ALEXANDER; FIRMAN, MICHAEL DAVID; TSAI, GRACE SHIN-YEE; GRABER, COLIN; BROSTOW, GABRIEL J
Format: Patent
Language: Chinese, English
Published: 01.02.2023

Summary: Panoptic segmentation forecasting predicts future positions of foreground objects and background objects separately. An egomotion model may be implemented to estimate egomotion of the camera. Pixels in frames of captured video are classified between foreground and background. The foreground pixels are grouped into foreground objects. A foreground motion model forecasts motion of the foreground objects to a future timestamp. A background motion model backprojects the background pixels into point clouds in a three-dimensional space. The background motion model predicts future positions of the point clouds based on egomotion. The background motion model may further generate novel point clouds to fill in occluded space. With the predicted future positions, the foreground objects and the background pixels are combined into a single panoptic segmentation forecast. An augmented reality mobile game may utilize the panoptic segmentation forecast to accurately portray movement of virtual elements in relation to the real world.
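The background-model steps described above (backprojecting pixels into a 3D point cloud, then predicting future positions from camera egomotion) can be sketched roughly as follows. This is a minimal illustration, not the patented implementation: the intrinsics (FX, FY, CX, CY), function names, and the rigid-transform form of egomotion are all assumptions for the example.

```python
import numpy as np

# Hypothetical pinhole-camera intrinsics (focal lengths and principal
# point) -- placeholder values for a 640x480 camera, not from the patent.
FX, FY, CX, CY = 500.0, 500.0, 320.0, 240.0

def backproject(pixels, depths):
    """Backproject 2D background pixels into a 3D point cloud.

    pixels: (N, 2) array of (u, v) image coordinates.
    depths: (N,) array of per-pixel depth values in meters.
    Returns an (N, 3) array of 3D points in camera coordinates.
    """
    u, v = pixels[:, 0], pixels[:, 1]
    x = (u - CX) * depths / FX
    y = (v - CY) * depths / FY
    return np.stack([x, y, depths], axis=1)

def apply_egomotion(points, rotation, translation):
    """Predict future point positions by applying an estimated camera
    egomotion, modeled here as a rigid transform R * p + t."""
    return points @ rotation.T + translation

# Example: two background pixels at 2 m depth; the estimated egomotion
# moves the points 0.5 m closer along the optical axis.
pts = backproject(np.array([[320.0, 240.0], [420.0, 240.0]]),
                  np.array([2.0, 2.0]))
future = apply_egomotion(pts, np.eye(3), np.array([0.0, 0.0, -0.5]))
```

Projecting the transformed points back into the image plane at the future timestamp would then yield the forecast background positions, which are merged with the separately forecast foreground objects into the final panoptic output.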
Bibliography: Application Number: TW202211113142