Method and apparatus for tracking an object


Bibliographic Details
Main Authors: Stagg, Matthew; Russell, James; Langley, Kevin
Format: Patent
Language: English
Published: 30.01.2024

Summary: There is provided a computer-implemented method and apparatus for determining temporal behaviour of an object. The method comprises receiving track data indicative of a plurality of tracks. Each track identifies an association between features corresponding to an object in each of a plurality of images forming image data representative of one or more objects. The method further comprises identifying, in at least some of the plurality of tracks, one or more faults, wherein each fault is associated with at least one respective fault image of the image data. The method further comprises iteratively performing a correction process on the track data. The correction process comprises:
- selecting one of the plurality of tracks having a fault identified therein;
- determining one or more candidate features in each respective fault image, wherein each candidate feature is determined as a candidate for correcting at least one fault associated with one or more of the plurality of tracks;
- determining one or more candidate corrections for the selected track, wherein at least some of the candidate corrections are associated with one or more of the candidate features;
- selecting one of the candidate corrections for the selected track in dependence on a metric indicative of an effect of the candidate correction on the plurality of tracks; and
- applying the selected candidate correction to the selected track in dependence on one or more criteria.
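The iterative correction process in the abstract can be sketched in outline. Everything below is an illustrative assumption, not the patent's implementation: a track is modelled as a list of feature ids (one per image) with None marking a fault, `detections[frame]` lists the feature ids observed in that image, and the selection metric is taken to be the total fault count across all tracks.

```python
# Hypothetical sketch of the correction loop; data model and metric are
# assumptions made for illustration only.

def fault_frames(track):
    """Indices of the fault images for this track (no associated feature)."""
    return [i for i, f in enumerate(track) if f is None]

def candidate_features(tracks, detections, frame):
    """Detected features in the fault image not yet claimed by any track."""
    used = {t[frame] for t in tracks if t[frame] is not None}
    return [f for f in detections[frame] if f not in used]

def total_faults(tracks):
    """Assumed metric: effect of a correction measured over all tracks."""
    return sum(len(fault_frames(t)) for t in tracks)

def correct_tracks(tracks, detections):
    """Iteratively select a faulty track, enumerate candidate corrections,
    pick the one that best improves the global metric, and apply it only
    if it satisfies the acceptance criterion (metric must decrease)."""
    while True:
        faulty = [t for t in tracks if fault_frames(t)]
        if not faulty:
            break
        track = faulty[0]                     # select a track with a fault
        best = None
        for frame in fault_frames(track):
            for feat in candidate_features(tracks, detections, frame):
                track[frame] = feat           # trial application
                score = total_faults(tracks)
                track[frame] = None           # undo the trial
                if best is None or score < best[2]:
                    best = (frame, feat, score)
        # criterion: only apply a correction that reduces the metric
        if best is None or best[2] >= total_faults(tracks):
            break
        frame, feat, _ = best
        track[frame] = feat
    return tracks
```

In a real tracker the metric would likely weigh association quality (e.g. appearance or motion consistency) rather than a bare fault count, and the acceptance criteria could be richer, but the loop structure mirrors the steps listed in the summary.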
Bibliography: Application Number: US201917311301