USER TRAJECTORY ESTIMATION BASED ON INERTIAL MEASUREMENT UNIT DATA

Bibliographic Details
Main Authors PIKHLETSKY, Mikhail Viktorovich, GONCHAROV, Alexey, GADAEV, Tamaz, VASILIEV, Ilia, STRIJOV, Vadim, NIKITIN, Filipp, GARTSEEV, Ilya Borisovich, ZHARIKOV, Ilya
Format Patent
Language English, French, German
Published 19.07.2023

Summary: The present disclosure relates generally to the field of network navigation, and particularly to techniques for estimating a user trajectory in a wireless communication network. For this purpose, inertial measurement unit (IMU) data are used, which comprise first-type measurements and second-type measurements from an IMU that is part of a mobile user equipment (UE) carried by a user. The first-type measurements and the second-type measurements are initially taken in an IMU-linked coordinate frame. Further, the first-type measurements or the second-type measurements are used to obtain a representation of the rotational orientation of the IMU. Then, the first-type measurements and the second-type measurements are transferred from the IMU-linked coordinate frame to a user-linked coordinate frame, whereupon velocity vectors of the UE in the user-linked coordinate frame are determined by a pre-trained neural network (NN). The velocity vectors are next transferred from the user-linked coordinate frame to a world coordinate frame by using the representation of the rotational orientation of the IMU. The user trajectory coordinates are then calculated by integrating the velocity vectors in the world coordinate frame. By so doing, it is possible to estimate the user trajectory with high accuracy, especially in indoor scenarios.
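The summary describes a four-step pipeline: estimate the IMU orientation, transfer the raw measurements into a user-linked frame, regress velocity vectors with a pre-trained NN, and integrate those velocities in the world frame. The Python sketch below illustrates that pipeline under loud assumptions, since the patent text here discloses no implementation details: gyroscope readings stand in for the first-type measurements and accelerometer readings for the second-type; naive small-angle gyroscope integration stands in for whatever orientation estimator the patent uses; the user-linked frame is approximated by the world frame for brevity; and predict_velocity is a hypothetical placeholder for the pre-trained NN.

import numpy as np

def gyro_to_rotations(gyro, dt):
    """Integrate angular rates (rad/s, IMU-linked frame) into per-sample
    rotation matrices mapping the IMU frame to the world frame.
    Assumption: naive small-angle integration, not the patent's estimator."""
    R = np.eye(3)
    out = []
    for w in gyro:
        wx, wy, wz = w * dt
        # Small-angle update: R <- R @ (I + [w*dt]_x)
        R = R @ np.array([[1.0, -wz,  wy],
                          [ wz, 1.0, -wx],
                          [-wy,  wx, 1.0]])
        # Re-orthogonalize with QR to limit numerical drift
        q, r = np.linalg.qr(R)
        R = q * np.sign(np.diag(r))
        out.append(R)
    return np.stack(out)

def estimate_trajectory(gyro, accel, dt, predict_velocity, window=200):
    """gyro, accel: (N, 3) arrays in the IMU-linked frame.
    predict_velocity: hypothetical pre-trained model mapping a window of
    user-frame samples, shape (window, 6), to one 3-D velocity (m/s)."""
    R = gyro_to_rotations(gyro, dt)  # per-sample IMU -> world rotations
    # Transfer measurements out of the IMU-linked frame; as a
    # simplification the world rotation plays the role of the user frame.
    gyro_u = np.einsum('nij,nj->ni', R, gyro)
    accel_u = np.einsum('nij,nj->ni', R, accel)

    pos = np.zeros(3)
    trajectory = [pos.copy()]
    for k in range(0, len(gyro_u) - window + 1, window):
        feats = np.hstack([gyro_u[k:k + window], accel_u[k:k + window]])
        v_user = predict_velocity(feats)      # velocity in user frame
        v_world = R[k + window - 1] @ v_user  # rotate to world frame
        pos = pos + v_world * (window * dt)   # integrate velocity
        trajectory.append(pos.copy())
    return np.stack(trajectory)               # user trajectory coordinates

With a 100 Hz IMU, dt = 0.01 and window = 200 would yield one velocity estimate every two seconds; the abstract fixes none of these values, so they are illustrative only.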
Bibliography: Application Number: EP20200828572