A Hierarchical Convolutional Neural Network for Vesicle Fusion Event Classification
Quantitative analysis of vesicle exocytosis and classification of the different modes of vesicle fusion from fluorescence microscopy are of primary importance for biomedical research. In this paper, we propose a novel Hierarchical Convolutional Neural Network (HCNN) method to automatically identify vesicle fusion events in time-lapse Total Internal Reflection Fluorescence Microscopy (TIRFM) image sequences. First, a detection and tracking method is developed to extract image patch sequences containing potential fusion events. Then, a Gaussian Mixture Model (GMM) is applied to each image patch in the sequence, with outliers rejected for robust Gaussian fitting. By combining the high-level time-series intensity-change features derived from the GMM with the visual appearance features embedded in key moments of the fusion process, the proposed HCNN architecture classifies each candidate patch sequence into three classes: full fusion event, partial fusion event, and non-fusion event. Finally, we validate our method on 9 challenging datasets annotated by cell biologists, and it achieves better performance than three previous methods.
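The per-frame Gaussian fitting with outlier rejection described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the isotropic single-component Gaussian spot model, the 15×15 patch size, and the 3-sigma residual-rejection rule are all assumptions made for the sketch.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian_2d(coords, amp, x0, y0, sigma, offset):
    """Isotropic 2D Gaussian modelling a vesicle's fluorescent spot (assumed model)."""
    x, y = coords
    return amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + offset

def robust_fit_patch(patch, n_sigma=3.0):
    """Fit a 2D Gaussian to a patch, reject high-residual pixels, and refit."""
    h, w = patch.shape
    yy, xx = np.mgrid[0:h, 0:w]
    x, y, z = xx.ravel().astype(float), yy.ravel().astype(float), patch.ravel()
    p0 = [z.max() - z.min(), w / 2.0, h / 2.0, 2.0, z.min()]
    popt, _ = curve_fit(gaussian_2d, (x, y), z, p0=p0, maxfev=5000)
    # Outlier rejection (assumed rule): drop pixels whose residual
    # exceeds n_sigma * std of the residuals, then refit on the inliers.
    resid = z - gaussian_2d((x, y), *popt)
    keep = np.abs(resid) < n_sigma * resid.std()
    popt, _ = curve_fit(gaussian_2d, (x[keep], y[keep]), z[keep], p0=popt, maxfev=5000)
    return popt  # (amplitude, x0, y0, sigma, offset) -> intensity features per frame

# Synthetic 15x15 patch: a Gaussian spot plus noise and a few hot pixels (outliers).
rng = np.random.default_rng(0)
h = w = 15
yy, xx = np.mgrid[0:h, 0:w]
patch = gaussian_2d((xx.ravel(), yy.ravel()), 100.0, 7.0, 7.0, 2.5, 10.0).reshape(h, w)
patch += rng.normal(0.0, 1.0, patch.shape)
patch[0, 0] = patch[14, 3] = 300.0  # hot pixels that would bias a naive fit
amp, x0, y0, sigma, offset = robust_fit_patch(patch)
```

Tracking the fitted amplitude and sigma across frames yields the kind of time-series intensity features that distinguish full fusion (intensity spreads and vanishes) from partial fusion (intensity partially recovers).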
H. Li et al., "A Hierarchical Convolutional Neural Network for Vesicle Fusion Event Classification," Computerized Medical Imaging and Graphics, vol. 60, pp. 22-34, Elsevier Ltd, Sep 2017.
The definitive version is available at http://dx.doi.org/10.1016/j.compmedimag.2017.04.003
Keywords and Phrases
Convolution; Fluorescence; Fluorescence Microscopy; Gaussian Distribution; Neural Networks; Refractive Index; Biomedical Research; Convolutional Neural Network; Detection and Tracking; Event Classification; Gaussian Mixture Model; TIRFM Image; Total Internal Reflection Fluorescence Microscopy; Vesicle Fusion; Image Fusion; Hierarchical Convolutional Neural Network; Vesicle Fusion Event
Article - Journal
© 2017 Elsevier Ltd, All rights reserved.