WIESNER, David, Julian SUK, Sven DUMMER, Tereza NEČASOVÁ, Vladimír ULMAN, David SVOBODA and Jelmer WOLTERINK. Generative modeling of living cells with SO(3)-equivariant implicit neural representations. Medical Image Analysis. Netherlands: Elsevier, 2024, vol. 91, p. 102991-103008. ISSN 1361-8415. Available from: https://dx.doi.org/10.1016/j.media.2023.102991.
Basic information
Original name Generative modeling of living cells with SO(3)-equivariant implicit neural representations
Authors WIESNER, David (203 Czech Republic, belonging to the institution), Julian SUK (528 Netherlands), Sven DUMMER (528 Netherlands), Tereza NEČASOVÁ (203 Czech Republic, belonging to the institution), Vladimír ULMAN (203 Czech Republic), David SVOBODA (203 Czech Republic, guarantor, belonging to the institution) and Jelmer WOLTERINK (528 Netherlands).
Edition Medical Image Analysis, Netherlands, Elsevier, 2024, ISSN 1361-8415.
Other information
Original language English
Type of outcome Article in a journal
Field of Study 10201 Computer sciences, information science, bioinformatics
Country of publisher Netherlands
Confidentiality degree is not subject to a state or trade secret
Impact factor 10.900 in 2022
Organization unit Faculty of Informatics
DOI http://dx.doi.org/10.1016/j.media.2023.102991
UT WoS 001171218800001
Keywords in English cell shape modeling; neural networks; implicit neural representations; signed distance function; generative model; interpolation
Tags cbia-web
Tags International impact, Reviewed
Changed by: RNDr. David Wiesner, Ph.D., učo 255597. Changed: 30/4/2024 14:23.
Abstract
Data-driven cell tracking and segmentation methods in biomedical imaging require diverse and information-rich training data. In cases where the number of training samples is limited, synthetic computer-generated data sets can be used to improve these methods. This requires the synthesis of cell shapes as well as corresponding microscopy images using generative models. To synthesize realistic living cell shapes, the shape representation used by the generative model should be able to accurately represent fine details and changes in topology, which are common in cells. These requirements are not met by 3D voxel masks, which are restricted in resolution, and polygon meshes, which do not easily model processes like cell growth and mitosis. In this work, we propose to represent living cell shapes as level sets of signed distance functions (SDFs) which are estimated by neural networks. We optimize a fully-connected neural network to provide an implicit representation of the SDF value at any point in a 3D+time domain, conditioned on a learned latent code that is disentangled from the rotation of the cell shape. We demonstrate the effectiveness of this approach on cells that exhibit rapid deformations (Platynereis dumerilii), cells that grow and divide (C. elegans), and cells that have growing and branching filopodial protrusions (A549 human lung carcinoma cells). A quantitative evaluation using shape features and Dice similarity coefficients of real and synthetic cell shapes shows that our model can generate topologically plausible complex cell shapes in 3D+time with high similarity to real living cell shapes. Finally, we show how microscopy images of living cells that correspond to our generated cell shapes can be synthesized using an image-to-image model.
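For illustration, the sketch below shows what such an implicit neural representation could look like in PyTorch: a fully-connected network maps a 3D+time coordinate and a per-shape latent code to a signed distance value, and the cell surface is recovered as the zero level set of the predicted SDF. The layer sizes, activation, and auto-decoder-style conditioning are assumptions made for this sketch, not the authors' exact architecture, and the SO(3)-equivariant disentanglement of rotation described in the paper is omitted for brevity.

```python
# Minimal sketch (assumptions noted above): an MLP conditioned on a latent
# code that predicts the signed distance at any (x, y, z, t) query point.
import torch
import torch.nn as nn

class CellSDFNet(nn.Module):
    def __init__(self, latent_dim=64, hidden_dim=256, num_layers=6):
        super().__init__()
        in_dim = 4 + latent_dim  # (x, y, z, t) coordinate + latent code
        layers = []
        for i in range(num_layers):
            layers.append(nn.Linear(in_dim if i == 0 else hidden_dim, hidden_dim))
            layers.append(nn.Softplus(beta=100))  # smooth activation; an assumption
        layers.append(nn.Linear(hidden_dim, 1))   # scalar SDF value
        self.net = nn.Sequential(*layers)

    def forward(self, coords, latent):
        # coords: (N, 4) space-time points; latent: (N, latent_dim) shape code
        return self.net(torch.cat([coords, latent], dim=-1))

# Auto-decoder-style usage: one latent code per cell sequence is optimized
# jointly with the network weights against sampled SDF values.
model = CellSDFNet()
latent = nn.Parameter(torch.zeros(1, 64))          # learned per-shape code
coords = torch.rand(1024, 4) * 2 - 1               # query points in [-1, 1]^3 x time
sdf_pred = model(coords, latent.expand(1024, -1))  # (1024, 1) signed distances
```

Given such a trained network, a voxel mask or mesh at any time point can be obtained by evaluating the SDF on a dense grid and extracting the zero level set (e.g. with marching cubes), which is what makes the representation resolution-independent and able to handle topology changes such as mitosis.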
Links
LM2023050, research and development project. Name: National infrastructure for biological and medical imaging (Národní infrastruktura pro biologické a medicínské zobrazování)
Investor: Ministry of Education, Youth and Sports of the CR, Czech BioImaging: National research infrastructure for biological and medical imaging
MUNI/A/1081/2022, internal MU code. Name: Modelling, analysis and verification (2023)
Investor: Masaryk University