Detailed Information on Publication Record
2014
MUSE framework 1.5
RUSŇÁK, Vít, Lukáš RUČKA, Pavel KAJABA, Desana DAXNEROVÁ, Matej MINÁRIK et al.
Basic information
Original name
MUSE framework 1.5
Name in Czech
MUSE framework 1.5
Authors
RUSŇÁK, Vít, Lukáš RUČKA, Pavel KAJABA, Desana DAXNEROVÁ, Matej MINÁRIK, Jiří KAREŠ, Michal BÁBEL and Petr HOLUB
Edition
2014
Other information
Language
English
Type of outcome
Software
Field of Study
10201 Computer sciences, information science, bioinformatics
Country of publisher
Czech Republic
Confidentiality degree
not subject to state or trade secrecy
Organization unit
Faculty of Informatics
Keywords in English
multi-touch; human-computer interaction; HCI; interaction framework; gesture recognition; group collaborative environment
Technical parameters
Contact person: Vít Rusňák, Faculty of Informatics, Masaryk University, Botanická 68a, Brno, e-mail: xrusnak@fi.muni.cz; the software is distributed under the BSD licence. The user of the software agrees to the terms of this licence.
Changed: 19/1/2015 15:26, RNDr. Vít Rusňák, Ph.D.
Abstract
In the original
MUSE is a framework for developing cost-affordable interactive environments. It allows for rapid development of interactive systems based on tabletops and interactive tiled-display walls, as well as general interactive spaces, and serves as a testing environment for the development of new interaction techniques for large-scale interactive systems. The framework provides various interaction possibilities for multimodal user interfaces built from multiple low-cost commodity devices (e.g., single- and multi-touch overlay panels and foils, web cameras, depth sensors). Main features of the framework:
a) coupling of multiple low-cost commodity multi-touch sensors so that they are represented as a single seamless interface;
b) functions for distinguishing users and continuously associating them with the touch operations they perform;
c) hand tracking for identifying and distinguishing users' body parts (e.g., hands) associated with touch input events;
d) integration of the Protractor, $1 and $N gesture recognition algorithms, providing experimental gesture recognition of touch events (libreco library);
e) extended semantic description of input events enabling their personalisation, e.g., personalised gesture recognition (libdtuio library);
f) a highly configurable toolkit and application base.
Supported hardware: the majority of existing resistive and capacitive single- and multi-touch overlay panels and devices, both versions of the MS Kinect depth tracker, and web cameras.
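Feature d) refers to the $1, $N and Protractor gesture recognizers integrated via the libreco library. Purely as an illustration of the style of template matching these recognizers perform (and not the MUSE or libreco API; every name below is illustrative), the following is a minimal, self-contained Python sketch of the core of the $1 unistroke recognizer: resample the stroke, rotate it to its indicative angle, scale and translate it into a reference frame, then compare its path distance against stored templates.

# Minimal sketch of the core of the $1 unistroke recognizer (Wobbrock et al., 2007),
# shown only to illustrate the kind of matching integrated via libreco.
# This is NOT the MUSE/libreco API; all names here are illustrative.
import math

N = 64         # number of points each stroke is resampled to
SIZE = 250.0   # side of the reference square used for scaling

def path_length(points):
    return sum(math.dist(points[i - 1], points[i]) for i in range(1, len(points)))

def resample(points, n=N):
    # Redistribute the stroke into n evenly spaced points along its path.
    interval = path_length(points) / (n - 1)
    pts = list(points)
    resampled = [pts[0]]
    accumulated = 0.0
    i = 1
    while i < len(pts):
        d = math.dist(pts[i - 1], pts[i])
        if d > 0 and accumulated + d >= interval:
            t = (interval - accumulated) / d
            q = (pts[i - 1][0] + t * (pts[i][0] - pts[i - 1][0]),
                 pts[i - 1][1] + t * (pts[i][1] - pts[i - 1][1]))
            resampled.append(q)
            pts.insert(i, q)      # continue measuring from the inserted point
            accumulated = 0.0
        else:
            accumulated += d
        i += 1
    while len(resampled) < n:     # guard against floating-point shortfall
        resampled.append(pts[-1])
    return resampled[:n]

def normalize(points):
    # Rotate so the indicative angle (centroid -> first point) is zero, then
    # scale the bounding box to SIZE x SIZE with the centroid at the origin.
    cx = sum(p[0] for p in points) / len(points)
    cy = sum(p[1] for p in points) / len(points)
    angle = math.atan2(cy - points[0][1], cx - points[0][0])
    cos_a, sin_a = math.cos(-angle), math.sin(-angle)
    rotated = [((p[0] - cx) * cos_a - (p[1] - cy) * sin_a,
                (p[0] - cx) * sin_a + (p[1] - cy) * cos_a) for p in points]
    w = (max(p[0] for p in rotated) - min(p[0] for p in rotated)) or 1e-9
    h = (max(p[1] for p in rotated) - min(p[1] for p in rotated)) or 1e-9
    return [(p[0] * SIZE / w, p[1] * SIZE / h) for p in rotated]

def recognize(stroke, templates):
    # templates: dict mapping gesture name -> raw list of (x, y) points.
    # Returns the best-matching gesture name and its average point distance
    # (simplified: no golden-section search over candidate rotations).
    candidate = normalize(resample(stroke))
    best_name, best_score = None, float("inf")
    for name, template_points in templates.items():
        reference = normalize(resample(template_points))
        score = sum(math.dist(a, b) for a, b in zip(candidate, reference)) / N
        if score < best_score:
            best_name, best_score = name, score
    return best_name, best_score

In practice, a caller would register a few template strokes per gesture and feed the recognizer the touch traces delivered by the sensor layer; Protractor replaces the final path-distance step with a closed-form cosine similarity between normalised point vectors, and $N extends the same idea to multi-stroke gestures.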
Links
LM2010005, research and development project
MUNI/A/0855/2013, internal MU code
MUNI/33/19/2013, internal MU code