Detailed Information on Publication Record
2011
GPU-Based Sample-Parallel Context Modeling for EBCOT in JPEG2000
MATELA, Jiří, Vít RUSŇÁK and Petr HOLUB
Basic information
Original name
GPU-Based Sample-Parallel Context Modeling for EBCOT in JPEG2000
Authors
MATELA, Jiří (203 Czech Republic, belonging to the institution), Vít RUSŇÁK (203 Czech Republic, belonging to the institution) and Petr HOLUB (203 Czech Republic, guarantor, belonging to the institution)
Edition
Dagstuhl, Germany, Sixth Doctoral Workshop on Mathematical and Engineering Methods in Computer Science -- Selected Papers, p. 77--84, 8 pp. 2011
Publisher
Schloss Dagstuhl--Leibniz-Zentrum fuer Informatik
Other information
Language
English
Type of outcome
Article in conference proceedings
Field of Study
10201 Computer sciences, information science, bioinformatics
Country of publisher
Germany
Confidentiality degree
is not subject to a state or trade secret
Publication form
printed version ("print")
References:
RIV identification code
RIV/00216224:14330/11:00049746
Organization unit
Faculty of Informatics
ISBN
978-3-939897-22-4
Keywords in English
EBCOT;JPEG2000;Tier-1;GPU;context modeller
Tags
International impact, Reviewed
Changed: 15/2/2013 18:28, RNDr. Jiří Matela, Ph.D.
Abstract
In the original
Embedded Block Coding with Optimal Truncation (EBCOT) is the fundamental and computationally very demanding part of the compression process of the JPEG2000 image compression standard. EBCOT itself consists of two tiers. In Tier-1, image samples are compressed using context modeling and arithmetic coding. The resulting bit-stream is further formatted and truncated in Tier-2. JPEG2000 has a number of applications in various fields where processing speed and/or latency is a crucial attribute and the main limitation of state-of-the-art implementations. In this paper we propose a new parallel approach to EBCOT context modeling that truly exploits the massively parallel capabilities of modern GPUs and enables concurrent processing of individual image samples. Performance evaluation of our prototype shows a speedup of 12 times for the context modeller, and 1.4--5.3 times for the whole EBCOT Tier-1, which includes a not yet optimized arithmetic coder.
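The sample-parallel idea in the abstract can be illustrated with a minimal sketch: in EBCOT Tier-1, a sample's coding context is derived from the significance states of its 8 neighbours, so every sample can in principle be processed concurrently (e.g. one GPU thread per sample). The following NumPy fragment models that data-parallel neighbourhood step on the CPU; it is an illustrative assumption for exposition only, not the authors' implementation, and the function name is hypothetical.

```python
import numpy as np

def neighbor_significance(sig):
    """Count significant 8-neighbours for every sample, all at once.

    `sig` is a boolean significance map of a code-block. Because each
    output element depends only on a fixed local neighbourhood, the
    computation is embarrassingly parallel across samples, which is
    the property a GPU context modeller exploits.
    """
    p = np.pad(sig.astype(np.int32), 1)  # zero significance outside the block
    # Sum of the 8 neighbours of every sample, computed for all samples at once
    return (p[:-2, :-2] + p[:-2, 1:-1] + p[:-2, 2:] +
            p[1:-1, :-2]               + p[1:-1, 2:] +
            p[2:,  :-2] + p[2:,  1:-1] + p[2:,  2:])

sig = np.array([[0, 1, 0],
                [1, 0, 0],
                [0, 0, 1]], dtype=bool)
print(neighbor_significance(sig))
```

In a real Tier-1 coder these neighbour counts (split by horizontal, vertical and diagonal contributions) select one of the standard context labels fed to the arithmetic coder; the sketch only shows why the per-sample work is independent.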
Links
GD102/09/H042, research and development project
MSM0021622419, plan (intention)
MUNI/A/0914/2009, internal MU code