Utilizing masked autoencoder generative models to extract microscopy representation autoencoder embeddings

Bibliographic Details
Main Authors: Beaini, Dominique, Celik, Safiye, Leung, Jessica Wai Yin, Earnshaw, Berton Allen, Saberian, Mohammadsadegh, Khan, Ayla Yasmin, Sharma, Vasudev, Fallah, Maryam, Sypetkowski, Maciej, Kenyon-Dean, Kian Runnels, Mabey, Benjamin John, McLean, Peter Foster, Balakrishnan, Jaichitra, Kraus, Oren Zeev, Morse, Kristen Rose, Cheng, Chi, Makes, Maureen Katherine
Format: Patent
Language: English
Published: 15.10.2024

Summary: The present disclosure relates to systems, non-transitory computer-readable media, and methods for training and utilizing generative machine learning models to generate embeddings from phenomic images (or other microscopy representations). For example, the disclosed systems can train a generative machine learning model (e.g., a masked autoencoder generative model) to generate predicted (or reconstructed) phenomic images from masked versions of ground-truth training phenomic images. In some cases, the disclosed systems utilize a momentum-tracking optimizer while reducing a loss of the generative machine learning model to enable efficient training on large-scale training image batches. Furthermore, the disclosed systems can utilize Fourier transformation losses with multi-stage weighting to improve the accuracy of the generative machine learning model on the phenomic images during training. Indeed, the disclosed systems can utilize the trained generative machine learning model to generate phenomic embeddings from input phenomic images (for various phenomic comparisons).
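
Illustrative sketch (not from the patent): the Python/PyTorch code below shows, under stated assumptions, how a masked-autoencoder-style model might be trained on multi-channel microscopy crops with a combined pixel and Fourier-magnitude reconstruction loss and a generic momentum-based optimizer, and how pooled encoder tokens could serve as phenomic embeddings. The TinyMaskedAutoencoder class, its sizes, the fourier_weight value, and the choice of AdamW are all assumptions for illustration, not the disclosed system's implementation.

# Hedged sketch of masked-autoencoder training with a pixel + Fourier loss.
# All names, sizes, and hyperparameters are illustrative assumptions.
import torch
import torch.nn as nn
import torch.fft


class TinyMaskedAutoencoder(nn.Module):
    """Toy stand-in for a masked autoencoder over patchified phenomic images."""

    def __init__(self, channels=6, patch=16, dim=128):
        super().__init__()
        self.patch = patch
        self.embed = nn.Conv2d(channels, dim, kernel_size=patch, stride=patch)
        self.encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model=dim, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.decode = nn.Linear(dim, channels * patch * patch)

    def forward(self, images, mask_ratio=0.75):
        b, c, h, w = images.shape
        tokens = self.embed(images).flatten(2).transpose(1, 2)  # (B, N, dim)
        n = tokens.shape[1]
        # Randomly mask a fraction of patch tokens (zeroed here for simplicity;
        # a real MAE drops masked tokens from the encoder entirely).
        mask = torch.rand(b, n, device=images.device) < mask_ratio
        tokens = tokens.masked_fill(mask.unsqueeze(-1), 0.0)
        latent = self.encoder(tokens)
        patches = self.decode(latent)                            # (B, N, C*p*p)
        # Fold predicted patches back into an image.
        recon = patches.transpose(1, 2).reshape(b, c * self.patch**2, n)
        recon = nn.functional.fold(
            recon, output_size=(h, w), kernel_size=self.patch, stride=self.patch
        )
        return recon, latent


def reconstruction_loss(recon, target, fourier_weight=0.1):
    """Pixel MSE plus a Fourier-magnitude term; the weight could be stepped
    across training stages to mimic multi-stage weighting."""
    pixel = nn.functional.mse_loss(recon, target)
    freq = nn.functional.l1_loss(
        torch.fft.rfft2(recon).abs(), torch.fft.rfft2(target).abs()
    )
    return pixel + fourier_weight * freq


model = TinyMaskedAutoencoder()
# AdamW is used here only as a generic momentum-tracking optimizer stand-in.
opt = torch.optim.AdamW(model.parameters(), lr=1e-4)

images = torch.rand(2, 6, 64, 64)            # fake 6-channel phenomic crops
recon, latent = model(images)
loss = reconstruction_loss(recon, images, fourier_weight=0.1)
loss.backward()
opt.step()

# After training, mean-pooled encoder tokens could serve as a phenomic embedding.
embedding = latent.mean(dim=1)               # (B, dim)

In this sketch the embedding is a mean pool over encoder tokens of unmasked images; comparisons between phenomic images (e.g., cosine similarity between embeddings) would then operate on these vectors.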
Bibliography: Application Number: US202318545438