Current motion-capture technologies produce continuous streams of 3D human joint trajectories. One of the challenges is to automatically annotate such streams of complex spatio-temporal data in real time. In this paper, we propose an efficient approach to label motion stream data in real time with limited main-memory usage. Based on a set of user-defined motion profiles, each specified by multiple representative samples, the currently visible part of an input motion stream is processed by identifying a moderate number of segments of various lengths. These segments are compared to the profiles to measure their similarity, and segments highly similar to a given motion profile are annotated with the corresponding label. Compared with existing solutions evaluated on real-life data, the proposed approach is fast, allows profiles to be changed dynamically at runtime, and requires no learning procedure.
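As a rough illustration of the segment-and-match idea outlined above (a sketch, not the paper's actual algorithm), the following Python snippet scans the currently visible window with overlapping segments of several lengths and labels those whose similarity to any representative sample of a user-defined profile reaches a threshold. The linear resampling step, the negative-Euclidean-distance similarity, and the `seg_lengths` and `threshold` parameters are all assumptions introduced for illustration:

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class MotionProfile:
    label: str
    samples: list[np.ndarray]  # representative samples, each (frames, joints*3)

def resample(segment: np.ndarray, length: int) -> np.ndarray:
    """Linearly resample a (frames, dims) segment to a fixed frame count."""
    idx = np.linspace(0.0, len(segment) - 1, length)
    lo = np.floor(idx).astype(int)
    hi = np.ceil(idx).astype(int)
    frac = (idx - lo)[:, None]
    return (1.0 - frac) * segment[lo] + frac * segment[hi]

def similarity(segment: np.ndarray, sample: np.ndarray) -> float:
    """Negative mean frame-wise Euclidean distance after length normalization
    (an assumption; the paper does not prescribe a concrete measure here)."""
    seg = resample(segment, len(sample))
    return -float(np.mean(np.linalg.norm(seg - sample, axis=1)))

def annotate_window(window, profiles, seg_lengths, threshold):
    """Scan the currently visible window with overlapping segments of the
    given lengths; return (start, end, label) for segments whose best
    similarity to any sample of a profile reaches the threshold."""
    annotations = []
    for length in seg_lengths:
        step = max(1, length // 2)  # 50% overlap keeps the segment count moderate
        for start in range(0, len(window) - length + 1, step):
            segment = window[start:start + length]
            for profile in profiles:
                best = max(similarity(segment, s) for s in profile.samples)
                if best >= threshold:
                    annotations.append((start, start + length, profile.label))
    return annotations

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # One hypothetical "walk" profile with a single representative sample:
    # 60 frames of 17 joints x 3 coordinates.
    profiles = [MotionProfile("walk", [rng.standard_normal((60, 51))])]
    window = rng.standard_normal((240, 51))   # currently visible stream part
    window[120:180] = profiles[0].samples[0]  # embed one occurrence of the profile
    print(annotate_window(window, profiles, seg_lengths=[40, 60, 90], threshold=-1.0))
    # -> [(120, 180, 'walk')]
```

Because the matcher holds no trained state, the properties claimed in the abstract fall out naturally in this sketch: the window slides with the stream while older frames are discarded, and profiles can be added or removed between calls simply by passing an updated `profiles` list, with no retraining step.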