Encoding biological metaverse: Advancements and challenges in neural fields from macroscopic to microscopic
| Published in | Innovation (New York, NY) Vol. 5; no. 3; p. 100627 |
|---|---|
| Main Authors | , , , , |
| Format | Journal Article |
| Language | English |
| Published | United States: Elsevier Inc, 06.05.2024 |
| Summary | Neural fields can efficiently encode three-dimensional (3D) scenes, providing a bridge between two-dimensional (2D) images and virtual reality. This method has become a trendsetter in bringing the metaverse to life in vivo. It first captured attention in macroscopic biology, as demonstrated by computed tomography and magnetic resonance imaging, which provide a 3D field of view for diagnostic biological images. Meanwhile, it has also opened up new research opportunities in microscopic imaging, such as achieving clearer de novo protein structure reconstructions. Introducing this method to biology is particularly significant, as it is refining the approach to studying biological images. However, many biologists have yet to fully appreciate the distinctive role of neural fields in transforming 2D images into 3D perspectives. This article discusses the application of neural fields to both microscopic and macroscopic biological images and their practical uses in biomedicine, highlighting the broad prospects of neural fields in the future biological metaverse. We stand at the threshold of an exciting new era, in which advancements in neural field technology herald the dawn of exploring the mysteries of life in innovative ways. |
| ISSN | 2666-6758 |
| DOI | 10.1016/j.xinn.2024.100627 |
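
To make the abstract's opening claim concrete, the sketch below shows, in a minimal and purely illustrative way, what a coordinate-based neural field is: a small MLP that maps continuous (x, y, z) coordinates to an intensity value, so a 3D volume is stored in network weights rather than a voxel grid. This is not the article's method; the PyTorch framework, layer sizes, positional-encoding settings, and the toy spherical target are all assumptions chosen for demonstration.

```python
# Minimal, illustrative sketch of a coordinate-based neural field (not taken
# from the article): an MLP maps continuous 3D coordinates to an intensity,
# so a volume is represented by network weights instead of a voxel grid.
import torch
import torch.nn as nn


def positional_encoding(x: torch.Tensor, num_freqs: int = 6) -> torch.Tensor:
    """Map coordinates to sin/cos features so the MLP can fit fine detail."""
    freqs = 2.0 ** torch.arange(num_freqs, device=x.device) * torch.pi
    angles = x[..., None] * freqs           # (..., 3, num_freqs)
    enc = torch.cat([angles.sin(), angles.cos()], dim=-1)
    return enc.flatten(start_dim=-2)        # (..., 3 * 2 * num_freqs)


class NeuralField(nn.Module):
    """f(x, y, z) -> intensity; the 'field' lives in the weights."""

    def __init__(self, num_freqs: int = 6, hidden: int = 128):
        super().__init__()
        in_dim = 3 * 2 * num_freqs
        self.num_freqs = num_freqs
        self.net = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, coords: torch.Tensor) -> torch.Tensor:
        return self.net(positional_encoding(coords, self.num_freqs))


# Toy fit: a spherical "blob" standing in for a volumetric biological signal.
model = NeuralField()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
for step in range(200):
    coords = torch.rand(1024, 3) * 2 - 1    # sample points in [-1, 1]^3
    target = (coords.norm(dim=-1, keepdim=True) < 0.5).float()
    loss = nn.functional.mse_loss(model(coords), target)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

# After training, the continuous field can be queried at any resolution.
with torch.no_grad():
    value = model(torch.tensor([[0.0, 0.0, 0.0]]))
```

In practice, methods of this family supervise the field indirectly through 2D observations (for example, projections or camera views rendered from the field) rather than through known 3D targets as in this toy example; the sketch only conveys the core idea of a coordinate-to-value network.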