NARISETTI, Narendra, Michael HENKE, Kerstin NEUMANN, Frieder STOLZENBURG, Thomas ALTMANN and Evgeny GLADILIN. Deep Learning Based Greenhouse Image Segmentation and Shoot Phenotyping (DeepShoot). Frontiers in Plant Science. Lausanne: Frontiers Media SA, 2022, vol. 13, July, pp. 1-16. ISSN 1664-462X. Available at: https://dx.doi.org/10.3389/fpls.2022.906410.
Basic information
Original name Deep Learning Based Greenhouse Image Segmentation and Shoot Phenotyping (DeepShoot)
Authors NARISETTI, Narendra, Michael HENKE (276 Germany, belonging to the institution), Kerstin NEUMANN, Frieder STOLZENBURG, Thomas ALTMANN and Evgeny GLADILIN.
Edition Frontiers in Plant Science, Lausanne, Frontiers Media SA, 2022, 1664-462X.
Other information
Original language English
Type of outcome Article in a journal
Field of Study 10600 1.6 Biological sciences
Country of publisher Switzerland
Confidentiality degree is not subject to a state or trade secret
WWW link to the web page
Impact factor 5.600
RIV identification code RIV/00216224:14740/22:00127315
Organization unit Central European Institute of Technology
DOI http://dx.doi.org/10.3389/fpls.2022.906410
UT WoS 000832787800001
Keywords in English greenhouse image analysis; image segmentation; deep learning; U-net; quantitative plant phenotyping
Tags CF PLANT, rivok
Flags International impact, Reviewed
Changed by Mgr. Pavla Foltynová, Ph.D., učo 106624. Changed: 6 Feb 2023, 19:35.
Abstract
Background: Automated analysis of large image data is highly demanded in high-throughput plant phenotyping. Due to large variability in optical plant appearance and experimental setups, advanced machine and deep learning techniques are required for automated detection and segmentation of plant structures in complex optical scenes. Methods: Here, we present a GUI-based software tool (DeepShoot) for efficient, fully automated segmentation and quantitative analysis of greenhouse-grown shoots which is based on pre-trained U-net deep learning models of Arabidopsis, maize, and wheat plant appearance in different rotational side- and top-views. Results: Our experimental results show that the developed algorithmic framework performs automated segmentation of side- and top-view images of different shoots acquired at different developmental stages using different phenotyping facilities with an average accuracy of more than 90% and outperforms shallow as well as conventional and encoder backbone networks in cross-validation tests with respect to both precision and performance time. Conclusion: The DeepShoot tool presented in this study provides an efficient solution for automated segmentation and phenotypic characterization of greenhouse-grown plant shoots suitable also for end users without advanced IT skills. Primarily trained on images of three selected plants, this tool can be applied to images of other plant species exhibiting similar optical properties.
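To illustrate the kind of U-net architecture the abstract refers to, the sketch below shows a minimal encoder-decoder network with skip connections in PyTorch. This is not the DeepShoot implementation; the layer widths, the two-class head (background vs. shoot), and the input size are illustrative assumptions only.

```python
# A minimal U-net-style segmentation sketch in PyTorch. NOT the DeepShoot
# code: channel counts and the binary background/shoot head are assumptions.
import torch
import torch.nn as nn


def conv_block(in_ch, out_ch):
    # Two 3x3 convolutions with ReLU, the standard U-net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, 3, padding=1), nn.ReLU(inplace=True),
    )


class TinyUNet(nn.Module):
    def __init__(self, in_ch=3, n_classes=2):
        super().__init__()
        self.enc1 = conv_block(in_ch, 32)
        self.enc2 = conv_block(32, 64)
        self.pool = nn.MaxPool2d(2)
        self.bottleneck = conv_block(64, 128)
        self.up2 = nn.ConvTranspose2d(128, 64, 2, stride=2)
        self.dec2 = conv_block(128, 64)
        self.up1 = nn.ConvTranspose2d(64, 32, 2, stride=2)
        self.dec1 = conv_block(64, 32)
        self.head = nn.Conv2d(32, n_classes, 1)  # per-pixel class logits

    def forward(self, x):
        e1 = self.enc1(x)                   # full resolution features
        e2 = self.enc2(self.pool(e1))       # 1/2 resolution
        b = self.bottleneck(self.pool(e2))  # 1/4 resolution
        # Decoder: upsample and concatenate encoder features (skip connections).
        d2 = self.dec2(torch.cat([self.up2(b), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)


if __name__ == "__main__":
    model = TinyUNet()
    img = torch.rand(1, 3, 256, 256)   # stand-in for an RGB greenhouse image
    mask = model(img).argmax(dim=1)    # per-pixel label map
    print(mask.shape)                  # torch.Size([1, 256, 256])
```

The skip connections are what make this family of networks suitable for the task described in the abstract: they carry fine spatial detail from the encoder to the decoder, so thin plant structures survive the down- and up-sampling path.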
Links
EF16_026/0008446, research and development project. Name: Signal integration and epigenetic reprogramming for plant productivity