Google microscope combines AI, AR for real-time cancer detection

April 18, 2018 // By Rich Pell
Google (Mountain View, CA) has built a prototype microscope with real-time artificial intelligence (AI) capabilities for cancer detection.

The augmented reality (AR) microscope platform consists of a modified light microscope that enables real-time image analysis and presents the results of machine learning algorithms directly in the field of view. Google researchers believe the microscope could help "accelerate and democratize" the adoption of deep learning tools, making pathologists more efficient and saving lives around the world.

Currently, direct viewing of tissue samples using a standard compound optical (or "light") microscope is the predominant means by which pathologists diagnose illness. A critical barrier to the widespread adoption of deep learning in pathology, says Google, is its dependence on having a digital representation of the tissue under the microscope.

The augmented reality microscope (ARM) developed by Google researchers aims to change that. It can be retrofitted into existing light microscopes found in hospitals and clinics around the world using low-cost, readily available components, and without the need for whole-slide digital versions of the tissue being analyzed.

"Modern computational components and deep learning models, such as those built upon TensorFlow, will allow a wide range of pre-trained models to run on this platform," say the researchers. "As in a traditional analog microscope, the user views the sample through the eyepiece. A machine learning algorithm projects its output back into the optical path in real-time. This digital projection is visually superimposed on the original (analog) image of the specimen to assist the viewer in localizing or quantifying features of interest."
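The superimposition step described above can be illustrated with a minimal sketch. The function and its overlay scheme below are assumptions for illustration, not Google's implementation: it blends a model-output heatmap (confidence values in 0..1) onto an RGB camera frame, with a red tint marking regions the model flags.

```python
import numpy as np

def overlay_heatmap(frame, heatmap, alpha=0.4):
    """Blend a model-output heatmap onto an RGB frame.

    frame:   (H, W, 3) uint8 image from the microscope camera (stand-in).
    heatmap: (H, W) float32 confidence map in [0, 1] from a model.
    alpha:   maximum overlay opacity where confidence is 1.0.
    """
    # Hypothetical visualization: the red channel carries the model signal.
    overlay = np.zeros_like(frame, dtype=np.float32)
    overlay[..., 0] = 255.0 * heatmap

    # Per-pixel alpha blend: untouched where the heatmap is zero, so the
    # analog image shows through except at features of interest.
    w = alpha * heatmap[..., None]
    blended = (1.0 - w) * frame + w * overlay
    return blended.astype(np.uint8)

# Tiny demo: a uniform gray "frame" with one region flagged by the model.
frame = np.full((4, 4, 3), 128, dtype=np.uint8)
heatmap = np.zeros((4, 4), dtype=np.float32)
heatmap[1, 1] = 1.0
out = overlay_heatmap(frame, heatmap)
```

In the actual device the blended signal is projected back into the optical path rather than rendered to a screen, but the compositing idea is the same: digital output is mixed with the analog image only where the model has something to show.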

Importantly, the researchers note, the computation and visual feedback update quickly. The current implementation runs at approximately 10 frames per second, so the model output updates seamlessly as the user scans the tissue by moving the slide or changing magnification. In addition, the researchers say, the ARM should be able to offer a range of visual feedback options, such as text, arrows, contours, heatmaps, or animations, and is capable of running many types of machine learning models.
