BY-NC-ND 4.0 license, Open Access. Published by De Gruyter, September 22, 2018

Deep transfer learning for aortic root dilation identification in 3D ultrasound images

  • Jannis Hagenah, Mattias Heinrich and Floris Ernst


Pre-operative planning of valve-sparing aortic root reconstruction relies on the automatic discrimination of healthy and pathologically dilated aortic roots. This classification is based on features extracted from 3D ultrasound images. In previously published approaches, handcrafted features achieved only limited classification accuracy. Learning features from scratch, however, is infeasible given the small data sets available for this specific problem. In this work, we propose transfer learning to enable deep learning on these small data sets. For this purpose, we used the convolutional layers of the pretrained deep neural network VGG16 as a feature extractor. To simplify the problem, we considered only two prominent horizontal slices through the aortic root, the coaptation plane and the commissure plane, stitching the features of both images together and training a Random Forest classifier on the resulting feature vectors. We evaluated this method on a data set of 48 images (24 healthy, 24 dilated) using 10-fold cross validation. Using the deep-learned features, we reached a classification accuracy of 84 %, clearly outperforming the handcrafted features (71 % accuracy). Even though the VGG16 network was trained on RGB photos and for different classification tasks, the learned features remain relevant for ultrasound image analysis in aortic root pathology identification. Hence, transfer learning makes deep learning possible even on very small ultrasound data sets.
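The feature-stitching and classification stage described above can be sketched as follows. This is a minimal illustration, not the authors' code: the per-slice feature dimensionality (512) is a hypothetical stand-in for the flattened VGG16 convolutional activations, and random placeholder vectors are used in place of real extracted features so the pipeline is self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
n_feat = 512  # hypothetical per-slice feature dimensionality

# Placeholder vectors standing in for VGG16 conv-layer features of the
# coaptation-plane and commissure-plane slices (24 healthy, 24 dilated).
coaptation = rng.normal(size=(48, n_feat))
commissure = rng.normal(size=(48, n_feat))
labels = np.array([0] * 24 + [1] * 24)

# Synthetic class shift so the toy classifier has some signal to learn;
# in the real setting this signal comes from the images themselves.
coaptation[24:] += 0.5
commissure[24:] += 0.5

# Stitch the two slices' feature vectors into one vector per aortic root.
features = np.concatenate([coaptation, commissure], axis=1)  # shape (48, 1024)

# 10-fold cross-validated Random Forest, mirroring the evaluation setup.
cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0)
scores = cross_val_score(clf, features, labels, cv=cv)
print(features.shape, round(scores.mean(), 2))
```

In practice, the placeholder arrays would be replaced by activations obtained from a pretrained VGG16 (e.g. via `keras.applications.VGG16(include_top=False)`) applied to the two ultrasound slices.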

Published Online: 2018-09-22
Published in Print: 2018-09-01

© 2018 the author(s), published by Walter de Gruyter Berlin/Boston

This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License.
