Transferring Microscopy Image Modalities with Conditional Generative Adversarial Networks


Abstract

Phase Contrast (PC) and Differential Interference Contrast (DIC) microscopy are two popular non-invasive techniques for monitoring live cells. Each modality has its own advantages and disadvantages for visualizing specimens, so biologists need the two complementary modalities together to analyze specimens. In this paper, we investigate a conditional Generative Adversarial Network (conditional GAN), containing one generator and two discriminators, to transfer microscopy image modalities. Given a training dataset consisting of pairs of images (source and destination) captured on the same set of specimens by DIC and Phase Contrast microscopes, we can train a conditional GAN; with this well-trained GAN, we can generate the corresponding Phase Contrast image from a new DIC image, and vice versa. Preliminary experiments demonstrate that our approach outperforms a state-of-the-art method and can provide biologists with a computational way to switch between microscopy image modalities, so that biologists can combine the advantages of different image modalities to better visualize and analyze specimens over time, without purchasing every type of microscope or switching back and forth between imaging systems during time-lapse experiments.
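The one-generator, two-discriminator setup described above can be sketched in standard conditional-GAN notation. The symbols below — generator G, discriminators D_1 and D_2, source-modality image x, and paired target-modality image y — and the form of the summed adversarial loss are illustrative assumptions based on the usual conditional-GAN formulation, not the paper's exact objective:

```latex
% Hedged sketch: a pix2pix-style conditional adversarial objective,
% extended to two discriminators as the abstract describes. The exact
% inputs each discriminator sees are defined in the paper, not here.
\min_{G}\;\max_{D_1,\,D_2}\;
\sum_{i=1}^{2}\Bigl(
    \mathbb{E}_{x,y}\bigl[\log D_i(x, y)\bigr]
  + \mathbb{E}_{x}\bigl[\log\bigl(1 - D_i(x, G(x))\bigr)\bigr]
\Bigr)
```

In the paired-image setting described in the abstract, x would be (for example) a DIC image and y its registered Phase Contrast counterpart, with the roles swapped for the reverse transfer direction.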

Meeting Name

IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops (2017: Jul. 21-26, Honolulu, HI)


Department(s)

Computer Science

Research Center/Lab(s)

Intelligent Systems Center

Keywords and Phrases

Computer Vision; Pattern Recognition; Adversarial Networks; Differential Interference Contrast Microscopy; Microscopy Images; Noninvasive Technique; Phase Contrast Microscopes; Phase-Contrast Image; State of the Art; Training Dataset; Image Analysis

Document Type

Article - Conference proceedings

© 2017 IEEE Computer Society, All rights reserved.

Publication Date

01 Jul 2017