Aishik Konwer, Ph.D. Research Proficiency Presentation: 'Predicting Disease Trajectory on Medical Imaging'

Dates: 
Thursday, September 2, 2021 - 2:00pm to 3:30pm
Location: 
Zoom - contact events@cs.stonybrook.edu for Zoom info.
Event Description: 
Abstract:

Automated image-based diagnostic tasks are largely focused on a single time point, usually at disease presentation, and do not explicitly consider temporal disease manifestations. However, a clinically relevant question that physicians face is how to predict the future trajectory of a disease, i.e., disease prognosis. We will review an array of frameworks that jointly employ Convolutional and Recurrent neural networks (CNN-RNN) for predictive modeling. Examples like P-net and TempSeq-Net will be used to demonstrate how disease trajectories are captured for lung tumor and ocular images, respectively. However, none of these methods jointly exploit the spatial distribution of disease characteristics within images and the temporal information across timepoints. Moreover, such analysis has mostly been performed on smaller medical imaging datasets, leading to overfitting and poor generalizability, while training deep neural networks on large datasets requires expensive annotations. We will discuss the merits of self-supervised learning for pretraining transformer networks and using them for downstream prediction tasks.
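The abstract does not give implementation details of these CNN-RNN frameworks, but the general pattern — a convolutional encoder per timepoint feeding a recurrent cell across timepoints — can be sketched in a minimal, dependency-free form. Everything below (the single-kernel "encoder", the vanilla RNN cell, all parameter values) is a hypothetical toy illustration, not the P-net or TempSeq-Net architecture:

```python
import math

def conv_encoder(image, kernel):
    """Toy CNN stand-in: one valid 2-D convolution with ReLU, then
    global average pooling, producing a single scalar feature.
    (A real model would stack many learned convolutional layers.)"""
    h, w, k = len(image), len(image[0]), len(kernel)
    acc, n = 0.0, 0
    for i in range(h - k + 1):
        for j in range(w - k + 1):
            s = sum(image[i + a][j + b] * kernel[a][b]
                    for a in range(k) for b in range(k))
            acc += max(0.0, s)  # ReLU nonlinearity
            n += 1
    return acc / n

def rnn_step(h, x, w_h=0.5, w_x=1.0, b=0.0):
    """Vanilla RNN cell on scalars: h' = tanh(w_h*h + w_x*x + b)."""
    return math.tanh(w_h * h + w_x * x + b)

def predict_severity(images, kernel):
    """Encode each timepoint's image, roll the features through the
    recurrent cell, and read the final hidden state as a (toy)
    severity score for the next time point."""
    h = 0.0
    for img in images:
        h = rnn_step(h, conv_encoder(img, kernel))
    return h
```

With a 1x1 kernel `[[1.0]]`, `conv_encoder` reduces to average pooling of the ReLU'd pixels, so the sketch is easy to trace by hand; the point is only the joint CNN-then-RNN structure the talk surveys.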

To overcome the above challenges, we present a two-stage long short-term memory (LSTM) model that encodes a combination of spatial and temporal information. The first stage aggregates spatial information from different locations of the diseased organ, while also accounting for imaging variability across adjoining spatial locations. The second stage learns to aggregate information from temporal images and uses it to predict the severity at a future time point. It has also been clinically shown that certain disease patterns in adjacent spatial locations tend to be related to each other. We propose a multi-scale Gated Recurrent Unit (GRU) framework with a dedicated correlation module within our temporal encoder that exploits this latent-state inter-zone similarity via a novel correlation loss. Experiments were performed on a multi-institutional COVID-19 dataset comprising sequential chest radiographs. Our approaches outperform transfer learning and radiomic feature-based baselines on this dataset.
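The abstract does not specify the form of the correlation loss, but the stated idea — penalizing dissimilarity between the latent states of clinically adjacent zones — can be sketched as follows. The cosine-similarity formulation, the zone indexing, and the adjacency list are all assumptions for illustration; the actual loss in the presented work may differ:

```python
import math

def cosine(u, v):
    """Cosine similarity between two latent-state vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def correlation_loss(zone_states, adjacent_pairs):
    """Hypothetical correlation loss: average (1 - cosine similarity)
    over pairs of adjacent lung zones, so the loss is 0 when adjacent
    zones carry identical latent states and grows as they diverge."""
    return sum(1.0 - cosine(zone_states[i], zone_states[j])
               for i, j in adjacent_pairs) / len(adjacent_pairs)
```

Adding such a term to the training objective nudges the temporal encoder toward latent representations in which neighboring zones evolve coherently, reflecting the clinical observation that adjacent spatial locations exhibit related disease patterns.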
