Synthesizing a complete tomographic study with nodules from multiple radiograph views via deep generative models
Chen, Andrew
Permalink
https://hdl.handle.net/2142/108587
Description
Author(s)
Chen, Andrew
Issue Date
2020-07-09
Director of Research (if dissertation) or Advisor (if thesis)
Koyejo, Oluwasanmi
Department of Study
Electrical & Computer Engineering
Discipline
Electrical & Computer Engineering
Degree Granting Institution
University of Illinois at Urbana-Champaign
Degree Name
M.S.
Degree Level
Thesis
Keyword(s)
radiography
machine learning
generative modeling
synthesis
Abstract
Chest radiography encodes 3D anatomy into a complex 2D representation. This projection creates distinctive challenges even for the most experienced radiologists, as many critical findings are superimposed, often resulting in diagnostic error or follow-up imaging. In particular, the volume and density of lung nodules are difficult to assess in chest radiographs. Deep generative models, commonly used to synthesize realistic images, can also be used to perform 2D-to-3D translation. In this thesis, we propose a generative model, optimized using a pixel-wise error, that synthesizes a complete tomographic study containing nodules from frontal and lateral chest radiographs. Additionally, the generated studies preserve the proper chest cavity structure.
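The abstract describes synthesizing a 3D volume from two orthogonal 2D views under a pixel-wise loss, but does not reproduce the model here. As a minimal, hypothetical sketch of that idea (not the thesis's actual architecture), the snippet below builds a crude 3D prior by back-projecting a simulated frontal and lateral view and scores it with a pixel-wise mean-squared error, the kind of objective a learned generator would minimize; all array shapes and axis conventions are illustrative assumptions.

```python
import numpy as np

def backproject(frontal, lateral):
    """Combine two orthogonal 2D views into a crude 3D prior.

    frontal: (H, W) coronal projection (X-ray seen from the front).
    lateral: (H, D) sagittal projection (X-ray seen from the side).
    Each view is broadcast along its missing axis and the two are
    averaged, yielding a volume of shape (H, W, D). A learned
    generative model would replace this naive reconstruction.
    """
    vol_f = frontal[:, :, None]   # repeat frontal slice along depth
    vol_l = lateral[:, None, :]   # repeat lateral slice along width
    return 0.5 * (vol_f + vol_l)

def pixelwise_mse(pred, target):
    """Pixel-wise (voxel-wise) mean-squared error between volumes."""
    return float(np.mean((pred - target) ** 2))

# Toy example: a random 8x8x8 "CT study" and its simulated radiographs.
H, W, D = 8, 8, 8
rng = np.random.default_rng(0)
ct = rng.random((H, W, D))
frontal = ct.mean(axis=2)   # mean projection through depth
lateral = ct.mean(axis=1)   # mean projection through width

pred = backproject(frontal, lateral)
loss = pixelwise_mse(pred, ct)
```

The back-projection baseline recovers coarse structure but blurs detail; the thesis's point is that a deep generative model trained on this pixel-wise objective can do substantially better while keeping the chest cavity anatomically plausible.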