Dissertation/Thesis Abstract

Multimodal 3-D segmentation of optic nerve head structures from spectral domain OCT volumes and color fundus photographs
by Hu, Zhihong, Ph.D., The University of Iowa, 2011, 144; 3552000
Abstract (Summary)

Currently available methods for managing glaucoma, e.g., planimetry on stereo disc photographs, involve a subjective component contributed by either the patient or the examiner. In addition, structures may overlap on such essentially 2-D images, which can decrease reproducibility. Spectral-domain optical coherence tomography (SD-OCT) provides a 3-D, cross-sectional, microscale depiction of biological tissue. Given the wealth of volumetric information at microscale resolution available with SD-OCT, it is likely that better parameters can be obtained for measuring glaucomatous change, moving beyond what is possible with fundus photography.

The neural canal opening (NCO) is a single 3-D anatomic structure in SD-OCT volumes. It has been proposed as the basis for a stable reference plane from which various optic nerve morphometric parameters can be derived. The overall aim of this Ph.D. project is to develop a framework for segmenting the 3-D NCO and the related retinal vessels, using information from SD-OCT volumes and/or fundus photographs, to aid the management of glaucomatous change.

Based on the mutual positional relationship of the NCO and the vessels, a multimodal 3-D scale-learning-based framework is developed that identifies them iteratively in SD-OCT volumes, each incorporating the other's pre-identified positional information. The algorithm first applies a 3-D wavelet-transform-learning-based layer segmentation and pre-segments the NCO using graph search. To aid NCO detection, the vessels are identified either by an SD-OCT segmentation approach that incorporates the pre-segmented NCO positional information into the vessel classification, or by a multimodal approach that combines complementary features from SD-OCT volumes and fundus photographs (or by a registered-fundus approach based on the original fundus vessel segmentation). The obtained vessel positional information is then used to enhance the NCO segmentation by incorporating it into the cost function of the graph search. Note that the 3-D wavelet transform via the lifting scheme has previously been used to remove high-frequency noise and to extract texture properties from SD-OCT volumes, and graph search has been used to find the optimal solution for multiple 3-D surfaces using edge and, additionally, regional information. In this work, the use of a 3-D wavelet-transform-learning-based cost function for the graph search further extends both the 3-D wavelet transform and the graph search.
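To make the lifting scheme mentioned above concrete, the following is a minimal 1-D Haar wavelet lifting step (split, predict, update) in Python. It is an illustrative sketch only, not the dissertation's implementation: the actual work applies a 3-D lifting-scheme transform over whole SD-OCT volumes, and all function names here are hypothetical.

```python
def haar_lift_forward(x):
    """One level of the 1-D Haar wavelet computed via lifting.

    Split -> predict -> update. Returns (approximation, detail)
    coefficients, each half the length of the input.
    Illustrative only; a 3-D transform applies such 1-D steps
    along each axis of the volume in turn.
    """
    even, odd = x[::2], x[1::2]                          # split into polyphase components
    detail = [o - e for o, e in zip(odd, even)]          # predict odd samples from even ones
    approx = [e + d / 2 for e, d in zip(even, detail)]   # update so the local mean is preserved
    return approx, detail


def haar_lift_inverse(approx, detail):
    """Invert the lifting steps in reverse order (update, then predict)."""
    even = [a - d / 2 for a, d in zip(approx, detail)]
    odd = [e + d for e, d in zip(even, detail)]
    out = []
    for e, o in zip(even, odd):                          # interleave back into one signal
        out.extend([e, o])
    return out
```

Because every lifting step is trivially invertible, the transform is lossless regardless of the predict/update operators chosen, which is one reason the lifting formulation is attractive for texture analysis of noisy OCT data.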

The major contributions of this work include: 1) extending 3-D graph-theoretic segmentation to the use of a 3-D scale-learning-based cost function, 2) developing a graph-theoretic approach for segmenting the NCO in SD-OCT volumes, 3) developing a 3-D wavelet-transform-learning-based graph-theoretic approach for segmenting the NCO in SD-OCT volumes by iteratively utilizing the pre-identified NCO and vessel positional information (from contributions 4 and 5), 4) developing a vessel classification approach in SD-OCT volumes that incorporates the pre-segmented NCO positional information into the vessel classification to suppress NCO false positives, and 5) developing a multimodal concurrent-classification approach and a registered-fundus approach for better identifying vessels in SD-OCT volumes using additional fundus information.
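The graph-theoretic surface segmentation underlying contributions 1–3 can be illustrated with a toy 2-D analogue: choose one depth per column of a cost image so that total cost is minimized subject to a hard smoothness constraint between neighboring columns. The dissertation's actual method solves the 3-D multi-surface problem (via the optimal-surface graph construction), so the dynamic-programming sketch below is a simplified stand-in with hypothetical names; positional priors such as vessel locations would enter by modifying entries of the cost array before the search.

```python
def optimal_surface(cost, max_shift=1):
    """Minimum-cost surface in a 2-D cost slice: one row index per
    column, with |row[c] - row[c-1]| <= max_shift as a smoothness
    constraint. Toy dynamic-programming analogue of 3-D graph search.
    """
    rows, cols = len(cost), len(cost[0])
    INF = float("inf")
    dp = [[INF] * cols for _ in range(rows)]    # dp[r][c]: best cost ending at (r, c)
    back = [[0] * cols for _ in range(rows)]    # backpointers for recovering the surface
    for r in range(rows):
        dp[r][0] = cost[r][0]
    for c in range(1, cols):
        for r in range(rows):
            # only rows within max_shift of r are feasible predecessors
            for pr in range(max(0, r - max_shift), min(rows, r + max_shift + 1)):
                cand = dp[pr][c - 1] + cost[r][c]
                if cand < dp[r][c]:
                    dp[r][c] = cand
                    back[r][c] = pr
    # backtrack from the cheapest row in the last column
    r = min(range(rows), key=lambda i: dp[i][cols - 1])
    surface = [0] * cols
    for c in range(cols - 1, -1, -1):
        surface[c] = r
        r = back[r][c]
    return surface
```

In this framing, "incorporating vessel positional information into the cost function" simply means lowering (or raising) `cost[r][c]` at locations consistent (or inconsistent) with the detected vessels before running the search, so the optimum is steered without changing the optimization machinery.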

Indexing (document details)
Advisor: Garvin, Mona K.
Committee: Abramoff, Michael D., Garvin, Mona K., Reinhardt, Joseph M., Sonka, Milan, Wu, Xiaodong
School: The University of Iowa
Department: Electrical and Computer Engineering
School Location: United States -- Iowa
Source: DAI-B 74/06(E), Dissertation Abstracts International
Subjects: Computer Engineering, Engineering, Electrical engineering
Keywords: Color fundus photographs, Multimodal, Optic nerve head structures, Spectral domain OCT volumes, 3-D segmentation, Wavelet-transform-learning-based
Publication Number: 3552000
ISBN: 978-1-267-90197-2
Copyright © 2020 ProQuest LLC. All rights reserved.