SCENE EXTRACTION METHOD, DEVICE, AND PROGRAM

To provide a scene extraction method and device that, when a specific posture scene serving as the basis of a target scene is detected, can automatically extract the target scene by tracing back to a related posture scene associated with that specific posture scene.


Bibliographic Details
Main Author: TASAKA KAZUYUKI
Format: Patent
Language: English, Japanese
Published: 16.09.2021
Subjects
Online Access: Get full text

More Information
Summary: To provide a scene extraction method and device that, when a specific posture scene serving as the basis of a target scene is detected, can automatically extract the target scene by tracing back to a related posture scene associated with that specific posture scene. SOLUTION: A camera image acquisition unit 101 acquires camera images from a plurality of cameras Cam that capture a competition field. The camera images are recorded in an image DB 103, and a frame image acquisition unit 102 obtains frame images from them. A posture estimation unit 104 estimates, for each camera, the posture of the person extracted from each frame image. A scene detection unit 105 detects a specific posture scene Qs and a related posture scene Rs on the basis of the posture estimation results. A target scene determination unit 106 determines the target scene on the basis of the reproduction time of the specific posture scene Qs and the reproduction time of the related posture scene Rs. A target scene reproduction unit 107 reproduces the target scene. SELECTED DRAWING: Figure 1
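The determination step described above (unit 106) can be pictured as a simple trace-back over per-frame posture estimates: once a frame with the specific posture is found, earlier frames are searched for the most recent related posture, and the interval between the two reproduction times becomes the target scene. The sketch below is only an illustration of that idea, not the patent's implementation; the posture labels, the trace-back window, and the function and class names are all assumptions introduced here.

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

# Hypothetical posture labels standing in for the output of the posture
# estimation unit (104); the patent does not name concrete labels.
SPECIFIC_POSTURE = "shoot"     # posture defining the specific posture scene Qs
RELATED_POSTURE = "wind_up"    # earlier posture defining the related scene Rs


@dataclass
class FramePosture:
    time: float    # reproduction time of the frame (seconds)
    posture: str   # estimated posture label for the person in the frame


def determine_target_scene(frames: List[FramePosture],
                           max_trace_back: float = 10.0) -> Optional[Tuple[float, float]]:
    """Sketch of the target scene determination step (unit 106).

    When a frame with the specific posture is detected (scene Qs), trace
    back through earlier frames for the most recent related posture
    (scene Rs) and return the interval [Rs time, Qs time] as the target
    scene. max_trace_back is an assumed cap on how far back to search.
    """
    frames = sorted(frames, key=lambda f: f.time)
    for i, f in enumerate(frames):
        if f.posture != SPECIFIC_POSTURE:
            continue
        # Scene Qs found: search backwards for the related posture Rs.
        for g in reversed(frames[:i]):
            if f.time - g.time > max_trace_back:
                break
            if g.posture == RELATED_POSTURE:
                return (g.time, f.time)    # target scene spans Rs..Qs
        return (f.time, f.time)            # no Rs found: fall back to Qs alone
    return None


if __name__ == "__main__":
    # Toy per-frame posture estimates for one camera.
    estimates = [FramePosture(t, p) for t, p in
                 [(0.0, "run"), (1.5, "wind_up"), (2.0, "run"), (3.2, "shoot")]]
    print(determine_target_scene(estimates))   # -> (1.5, 3.2)
```

In a multi-camera setting such as the one described, this determination would run on the merged posture estimates after unit 105 has flagged candidate Qs and Rs scenes per camera; only the trace-back logic is shown here.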
Bibliography: Application Number: JP20200037619