Deformable image registration using convolutional neural networks

Koen A.J. Eppenhof*, Maxime W. Lafarge, Pim Moeskops, Mitko Veta, Josien P.W. Pluim

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › Academic › peer-review

Abstract

Deformable image registration can be time-consuming and often needs extensive parameterization to perform well on a specific application. We present a step towards a registration framework based on a three-dimensional convolutional neural network. The network directly learns transformations between pairs of three-dimensional images. The outputs of the network are three maps for the x, y, and z components of a thin plate spline transformation grid. The network is trained on synthetic random transformations, which are applied to a small set of representative images for the desired application. Training therefore does not require manually annotated ground truth deformation information. The methodology is demonstrated on public data sets of inspiration-expiration lung CT image pairs, which come with annotated corresponding landmarks for evaluation of the registration accuracy. Advantages of this methodology are its fast registration times and its minimal parameterization.
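The training-data generation described above — sampling a smooth random deformation, warping an image with it, and using the (image, warped image) pair with the known deformation as the target — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the grid size, smoothing, and the use of cubic upsampling (as a simple stand-in for thin plate spline interpolation) are assumptions.

```python
import numpy as np
from scipy.ndimage import map_coordinates, zoom

def random_deformation(shape, grid=(4, 4, 4), sigma=2.0, rng=None):
    """Sample a smooth random displacement field by perturbing a coarse
    control-point grid and upsampling it to the image resolution.
    (Hypothetical stand-in for the thin plate spline grid in the paper.)"""
    rng = np.random.default_rng() if rng is None else rng
    # One coarse displacement map per axis: the x, y, z components.
    coarse = rng.normal(0.0, sigma, size=(3,) + grid)
    factors = [s / g for s, g in zip(shape, grid)]
    # Cubic upsampling of each component yields a smooth dense field.
    return np.stack([zoom(c, factors, order=3) for c in coarse])

def warp(image, displacement):
    """Resample `image` at voxel positions shifted by `displacement`,
    which has shape (3, D, H, W)."""
    coords = np.indices(image.shape).astype(float) + displacement
    return map_coordinates(image, coords, order=1, mode="nearest")

# A training pair: fixed image, synthetically deformed moving image,
# with the known displacement field as the regression target.
fixed = np.random.rand(32, 32, 32)
d = random_deformation(fixed.shape)
moving = warp(fixed, d)
```

Because the deformation is generated rather than annotated, arbitrarily many (fixed, moving, target) triplets can be produced from a small set of representative images, which is what makes the training ground-truth-free.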

Original language: English
Title of host publication: Medical Imaging 2018
Subtitle of host publication: Image Processing
Publisher: SPIE
ISBN (Electronic): 9781510616370
DOIs
Publication status: Published - 1 Jan 2018
Externally published: Yes
Event: Medical Imaging 2018: Image Processing - Houston, United States
Duration: 11 Feb 2018 - 13 Feb 2018

Publication series

Name: SPIE Proceedings
Volume: 10574

Conference

Conference: Medical Imaging 2018: Image Processing
Country/Territory: United States
City: Houston
Period: 11/02/18 - 13/02/18

Keywords

  • convolutional networks
  • deformable image registration
  • machine learning
  • thoracic CT

