Detailed Information on Publication Record
2017
Fast Subsequence Matching in Motion Capture Data
SEDMIDUBSKÝ, Jan, Pavel ZEZULA and Jan ŠVEC
Basic information
Original name
Fast Subsequence Matching in Motion Capture Data
Authors
SEDMIDUBSKÝ, Jan (203 Czech Republic, guarantor, belonging to the institution), Pavel ZEZULA (203 Czech Republic, belonging to the institution) and Jan ŠVEC (203 Czech Republic)
Edition
Cham, 21st European Conference on Advances in Databases and Information Systems, p. 59-72, 14 pp., 2017
Publisher
Springer
Other information
Language
English
Type of outcome
Article in proceedings
Field of Study
10201 Computer sciences, information science, bioinformatics
Confidentiality degree
is not subject to a state or trade secret
Publication form
printed version
RIV identification code
RIV/00216224:14330/17:00094760
Organization unit
Faculty of Informatics
ISBN
978-3-319-66916-8
UT WoS
000463611400005
Keywords in English
subsequence matching; motion capture data; content-based retrieval; similarity measure; segmentation; indexing
Tags
International impact, Reviewed
Changed: 14/5/2020 15:14, RNDr. Pavel Šmerk, Ph.D.
Abstract
In the original
Motion capture data digitally represent human movements by sequences of body configurations in time. Subsequence matching in such spatio-temporal data is difficult as query-relevant motions can vary in length and occur arbitrarily within a very long motion. To deal with these problems, we propose a new subsequence matching approach which (1) partitions both the short query and the long data motion into fixed-size segments that overlap only partly, (2) uses an effective similarity measure to efficiently retrieve the data segments most similar to the query segments, and (3) localizes the most query-relevant subsequences within extended and merged retrieved segments in a four-step postprocessing phase. The whole retrieval process is effective and fast in comparison with related work. A real-life 68-minute data motion can be searched in about 1 s with an average precision of 87.98% for 5-NN queries.
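The following is a minimal sketch of the segment-based retrieval idea described in the abstract, not the paper's actual implementation: both the query and the data motion are cut into fixed-size, partly overlapping segments, and the data segments most similar to each query segment are retrieved. The segment length, the overlap shift, the feature dimensionality, and the plain Euclidean distance used here are illustrative assumptions; the paper uses its own similarity measure, indexing, and a four-step postprocessing phase that this sketch only hints at.

import numpy as np

def segment(motion, seg_len=80, shift=40):
    # Partition a (frames x features) motion into fixed-size segments that
    # overlap only partly (seg_len and shift are illustrative values).
    starts = range(0, max(len(motion) - seg_len + 1, 1), shift)
    return [(s, motion[s:s + seg_len]) for s in starts]

def knn_segments(query_seg, data_segments, k=5):
    # Return the k data segments most similar to a single query segment,
    # using plain Euclidean distance as a stand-in similarity measure.
    dists = [(np.linalg.norm(query_seg - seg) if len(seg) == len(query_seg)
              else np.inf, start)
             for start, seg in data_segments]
    return sorted(dists)[:k]

# Usage with synthetic stand-in data (random poses, 93 features per frame):
data_motion = np.random.rand(10_000, 93)   # long data motion
query_motion = np.random.rand(160, 93)     # short query motion

data_segments = segment(data_motion)
for start, q_seg in segment(query_motion):
    hits = knn_segments(q_seg, data_segments)
    # A postprocessing phase would extend, merge, and rank the retrieved
    # segments to localize the query-relevant subsequences in the data motion.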
Links
GBP103/12/G084, research and development project