
Compared with other modalities such as computed tomography or magnetic resonance imaging, the appearance of ultrasound images depends strongly on the expertise of the sonographer or clinician acquiring them, as well as on the machine used; analysis is further complicated by the frequent presence of artefacts, missing boundaries, attenuation, shadows, and speckle. In addition, manual contouring of the epicardial and endocardial walls exhibits large inconsistencies and variations, as it is strongly dependent on the sonographer's training and expertise. Hence, in this paper we propose a fully automated image analysis framework to ultimately perform wall motion abnormality classification in 2D+T images. We explore both traditional Random Forest classification with handcrafted features and spatio-temporal hierarchical aggregation of information with a deep learning CNN-based approach. For the latter classifier, we also investigate the effect of local phase information retrieval through Feature Asymmetry (FA), and demonstrate that pre-processing videos with FA enables the spatio-temporal CNN to better discover relevant left-ventricle endocardial abstractions, from low-level features to high-level representations, automatically.
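For illustration, the Feature Asymmetry pre-processing mentioned above can be sketched as a local-phase measure computed from the monogenic signal. This is a minimal single-scale variant assuming a log-Gabor bandpass filter and a Riesz transform; the paper's actual filter bank, scales, and noise threshold are not specified here and the parameter choices below are assumptions:

```python
import numpy as np

def feature_asymmetry(img, wavelength=8.0, sigma_on_f=0.55, eps=1e-6):
    """Single-scale Feature Asymmetry sketch via the monogenic signal.

    Assumptions (not from the paper): one log-Gabor scale, Riesz-based
    odd responses, and a geometric-mean noise threshold T.
    Returns a map in [0, 1] that is high on step-like edges.
    """
    rows, cols = img.shape
    u = np.fft.fftfreq(cols)
    v = np.fft.fftfreq(rows)
    U, V = np.meshgrid(u, v)                 # shape (rows, cols)
    radius = np.sqrt(U**2 + V**2)
    radius[0, 0] = 1.0                       # avoid log(0) / divide-by-zero at DC

    # Radial log-Gabor bandpass centred on frequency 1/wavelength
    f0 = 1.0 / wavelength
    log_gabor = np.exp(-(np.log(radius / f0) ** 2)
                       / (2 * np.log(sigma_on_f) ** 2))
    log_gabor[0, 0] = 0.0                    # zero DC response

    # Riesz transform transfer functions (vector-valued Hilbert transform)
    H1 = 1j * U / radius
    H2 = 1j * V / radius

    F = np.fft.fft2(img.astype(float))
    even = np.real(np.fft.ifft2(F * log_gabor))        # even (bandpass) response
    odd1 = np.real(np.fft.ifft2(F * log_gabor * H1))   # odd Riesz components
    odd2 = np.real(np.fft.ifft2(F * log_gabor * H2))
    odd = np.sqrt(odd1**2 + odd2**2)

    # Simple data-driven noise threshold (an assumption, not the paper's T)
    energy = np.sqrt(even**2 + odd**2)
    T = np.exp(np.mean(np.log(energy + eps)))

    # FA: odd-dominated (edge-like) structure, zeroed below the noise floor
    return np.maximum(odd - np.abs(even) - T, 0.0) / (energy + eps)
```

Because FA normalises the filter responses by local energy, it is approximately invariant to intensity and contrast, which is the usual motivation for applying it to attenuation- and speckle-prone ultrasound frames before a CNN.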

Original publication

Publication type: Conference paper

Pages: 3140 - 3143