AI can analyse CMR scans and detect increased risk of stroke, heart attack and heart failure


Artificial intelligence has been used for the first time to instantly and accurately measure blood flow, in a study led by UCL and Barts Health NHS Trust.

The results were found to predict the chances of death, heart attack and stroke, and can be used by doctors to help recommend treatments that could improve a patient’s blood flow.

Heart disease is the leading global cause of death and illness.

Reduced blood flow, which is often treatable, is a common symptom of many heart conditions.

International guidelines therefore recommend a number of assessments to measure a patient’s blood flow, but many are invasive and carry a risk.

Non-invasive blood flow assessments are available, including Cardiovascular Magnetic Resonance (CMR) imaging, but up until now, the scan images have been incredibly difficult to analyse in a manner precise enough to deliver a prognosis or recommend treatment.

In the largest study of its kind, funded by the British Heart Foundation and published in the journal Circulation, researchers took routine CMR scans from more than 1,000 patients attending St Bartholomew’s Hospital and the Royal Free Hospital and used a new automated artificial intelligence technique to analyse the images.

By doing this, the teams were able to precisely and instantaneously quantify the blood flow to the heart muscle and deliver the measurements to the medical teams treating the patients.

By comparing the AI-generated blood flow results with the health outcomes of each patient, the team found that the patients with reduced blood flow were more likely to have adverse health outcomes including death, heart attack, stroke and heart failure.

The AI technique was therefore shown for the first time to be able to predict which patients might die or suffer major adverse events, better than a doctor could on their own with traditional approaches.

Professor James Moon (UCL Institute of Cardiovascular Science and Barts Health NHS Trust) said: “Artificial intelligence is moving out of the computer labs and into the real world of healthcare, carrying out some tasks better than doctors could do alone.

We have tried to measure blood flow manually before, but it is tedious and time-consuming, taking doctors away from where they are needed most, with their patients.”


Myocardial blood flow “perfusion map” created and analysed using AI showing an area of the heart receiving a reduced blood supply (arrow) and putting the patient at risk of heart attacks and other adverse events. Image is credited to Kristopher Knott et al.

Dr Kristopher Knott (UCL Institute of Cardiovascular Science and Barts Health NHS Trust) added:

“The predictive power and reliability of the AI was impressive and easy to implement within a patient’s routine care.

The calculations were happening as the patients were being scanned, and the results were immediately delivered to doctors.

As poor blood flow is treatable, these better predictions ultimately lead to better patient care, as well as giving us new insights into how the heart works.”

Dr Peter Kellman from the National Institutes of Health (NIH) in the US, who, working with Dr Hui Xue at the NIH, developed the automated AI techniques used to analyse the images in the study, said:

“This study demonstrates the growing potential of artificial intelligence-assisted imaging technology to improve the detection of heart disease and may move clinicians closer to a precision medicine approach to optimize patient care. We hope that this imaging approach can save lives in the future.”

Dr Kellman is director of the Medical Signal and Image Processing Program at the National Heart, Lung, and Blood Institute, part of the NIH.

Funding: The study was funded by the British Heart Foundation, National Institute for Health Research, European Regional Development Fund and Barts Charity, and involved additional researchers from the Royal Free Hospital, Queen Mary University of London and the University of Leeds.


The burden of cardiovascular disease (CVD) is rapidly increasing due to higher prevalence of obesity, diabetes, and metabolic syndrome [1].

Our current understanding of multivariate risk factors involved in the etiology of CVD is largely due to prospective population-based research studies such as the Framingham Heart Study [2], the MONICA project [3], and the INTERHEART study [4].

These have established the now well-known major cardiovascular risk factors of hypertension, smoking, lipid profile, obesity, diabetes, and inactivity [5]. These studies demonstrate the value of population-based longitudinal studies for predicting and preventing cardiovascular disease.

Recently, non-invasive imaging has been incorporated into several large-scale prospective longitudinal studies, in order to develop predictive biomarkers derived from cardiac structural and functional measurements [6, 7•].

Longitudinal follow-up and monitoring of events enable examination of the progression of disease from sub-clinical manifestations (e.g., remodeling) to clinical symptoms, and the study of the relationship between imaging-derived biomarkers and adverse events.

In particular, cardiac magnetic resonance (CMR) imaging is increasingly used in cohort-based studies, since it requires no ionizing radiation or anatomical “windows” and has high resolution and reproducibility [8–10].

CMR has a wide range of contrast mechanisms and can provide detailed information on cardiac morphology (size, shape) and function (ventricular pump function, tissue strain and torsion, regional wall motion abnormalities), flow, and microstructure [11, 12].

The combination of non-invasive imaging with epidemiological and clinical data offers a rich source of “big heart data,” which opens up new avenues of exploration to improve our understanding of the progression of sub-clinical disease across different population groups [13].

These studies therefore form a substantial part of the global move to P4 medicine (predictive, preventive, personalized, and participatory) [14] through big data informatics.

A substantial hurdle that must be overcome for this vision to be realized is the prohibitively large resource currently required for quantification of clinically meaningful parameters from the vast amounts of image information available.

Current clinical practice typically requires manual assessment of the images, which is time-consuming and prone to subjective bias in the measurements. Robust and accurate automated image analysis is required for objective assessment of imaging biomarkers. This review examines the challenges involved and recent steps towards this goal.

Large-Scale Cardiovascular Imaging Studies

Examples of large-scale studies which explicitly performed cardiovascular imaging for mechanistic insights into disease progression are summarized below.

Multi-Ethnic Study of Atherosclerosis

The Multi-Ethnic Study of Atherosclerosis (MESA) was designed to investigate the manifestation of sub-clinical disease and the progression to clinical symptoms in several population sub-groups in the USA (African-American, Chinese, Hispanic, and white) [15].

Initiated in 2000, MESA has followed 6814 men and women aged 45–84 years old across six centers for over 10 years. The analysis of 10 years’ follow-up has recently been completed for around 3000 participants [16]. CMR was utilized to assess sub-clinical disease processes [17]. A substantial ancillary study program facilitates data sharing and collaborations.

Jackson Heart Study

The Jackson Heart Study (JHS) was designed to investigate the mechanism of cardiovascular disease in African-Americans living in the southeastern USA (Jackson, MS) [18, 19]. Overall, 5302 people aged 21–84 years participated.

This high-risk group has increased mortality from cardiovascular disease as well as higher incidence of hypertension, obesity, and diabetes. Economic, sociocultural, behavioral, dietary, and physical activity measures were related with cardio-metabolic risk factors [20]. Both CT and CMR examinations were acquired in approximately 3000 participants to measure heart function and calcium scores [21].

UK Biobank

UK Biobank (UKB) is an extensive study that recruited 500,000 people aged between 40 and 69 years in 2006–2010 from across the UK. Questionnaires, physical examinations, and biological samples have been obtained.

An imaging enhancement sub-study has recently begun with the aim of imaging 6000 participants in the pilot phase [7•], with the objective of scaling up to 100,000 participants over a 5–6-year period. Imaging modalities include CMR examinations, abdominal MRI, brain MRI, carotid ultrasound, and DEXA. Data are being made available on request.

Canadian Partnership for Tomorrow Project

The Canadian Partnership for Tomorrow Project (CPTP) aims to develop a comprehensive study to understand how environment, lifestyle, and genetics contribute to chronic diseases [22]. Recently, a $16m initiative to gather detailed information from about 10,000 participants was announced (The Canadian Alliance for Healthy Hearts and Minds), including data on environments, lifestyle, and behaviors that could affect their cardiovascular health. Participants will be assessed by MRI evaluation of the brain, blood vessels, heart, and liver.

ICELAND MI

ICELAND MI is an epidemiologic cohort study of the prevalence of myocardial infarction in older individuals. A total of 936 participants were randomly selected from men and women aged 67–93 years. CMR scans were collected, including gadolinium contrast images to identify scar tissue. This study has shown that a high degree of undiagnosed myocardial infarction exists in this cohort and that CMR was able to detect infarction more readily than standard methods [23].

Framingham Offspring Study

The Framingham Offspring study [24] was initiated in 1971 as a continuation of the highly successful Framingham study. Participants undergo periodic examinations every 3–4 years including comprehensive interim history, physical examination, blood pressure, blood tests, as well as other testing on a cycle-specific basis. Of these participants, 1707 underwent CMR scans during 2002–2006 [25].

The Dallas Heart Study

The Dallas Heart Study was initiated in 2000 and designed as a single-center population-based study of multiethnic cardiovascular disease in Dallas County, TX. Of 6101 participants interviewed, 2971 received imaging examinations including cardiac MRI, electron beam CT, and dual-energy X-ray absorptiometry. Cardiac MRI revealed two to three times higher prevalence of LV hypertrophy in blacks than in whites [26].

Registries

Registries such as the EuroCMR registry [27] seek to evaluate the utility and efficacy of imaging in the clinical context. The main goal is to evaluate the prognostic potential of CMR as well as cost-effectiveness. More than 27,000 consecutive patients have been enrolled from 57 centers in 15 countries in the EuroCMR registry. Similarly, the Global CMR registry has recently been established to collate MRI patient data from around the world with 44,000 cases contributed to date.

Data Sharing Initiatives

Sharing imaging and clinical data with the wider research community is essential to the development of the field [28]. Central to this data sharing framework is a secured protection of private patient data, as required under the Health Insurance Portability and Accountability Act (HIPAA) regulations in the USA and Directive 95/46/EC in the EU. In order to facilitate data sharing, all data must be obtained with institutional review board approval and informed participant consent compatible with data sharing. Data must be de-identified and participant confidentiality and privacy must be protected so that the identity of the participants remains unknown.
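For illustration only, the following minimal Python sketch shows the kind of de-identification step described above, using pydicom to blank direct identifiers before data are shared. The tag list is deliberately incomplete and the function name is hypothetical; a real pipeline would follow the full DICOM confidentiality profile and local governance requirements.

```python
# Minimal de-identification sketch using pydicom (illustrative only; not a
# complete or compliant anonymization procedure).
import pydicom

def deidentify(in_path: str, out_path: str, study_id: str) -> None:
    ds = pydicom.dcmread(in_path)
    # Replace direct identifiers with a study-specific pseudonym.
    ds.PatientName = study_id
    ds.PatientID = study_id
    # Blank a few other potentially identifying attributes if present.
    for keyword in ("PatientBirthDate", "InstitutionName", "ReferringPhysicianName"):
        if keyword in ds:
            setattr(ds, keyword, "")
    # Private tags may carry scanner- or site-specific identifiers.
    ds.remove_private_tags()
    ds.save_as(out_path)
```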

Infrastructure to support data sharing has been developed by the Cardiac Atlas Project (CAP), a worldwide consortium to host large-scale cardiac image data with derived analyses and associated diagnostic information [29]. Over 3000 CMR cases have been contributed to the database from several different studies.

More than 20 research groups worldwide are using this resource for various research activities, including large-scale generalization of cardiac motion for percutaneous coronary intervention, characterization of shape variation for medical device design, learning-based registration to extract morphological information, quantification of local cardiac remodeling for electromechanical simulations, and automatic identification of wall motion abnormalities.

In order to pool data from several disparate studies, any bias in the results due to imaging or analysis protocol must be removed, so that data from all studies can be compared on a level playing field. Atlas-based bias correction methods have been proposed for solving this problem [30].

Patient-specific models of heart shape and motion are used to provide a standard coordinate system, which maps the heart according to anatomical location.

The shape parameters of the models give information on the shape mean and variation across the cohort, as well as the progression of remodeling due to disease or the benefits of treatment.

CAP has developed methods to pool data from different sources in a standardized manner and to correct bias arising from imaging or analysis protocol. CAP is endorsed by the Society for Cardiovascular Magnetic Resonance, which maintains an upload site where cases can be contributed to the atlas project [31].

Atlas-based methods have been applied in the MESA baseline cohort to investigate the shape variation among subcohorts [32•]. Figure 1 shows the analysis pipeline.

Contours derived from the core laboratory were adjusted for breath-hold misregistration and registered to a common coordinate system. Principal component analysis was used to characterize global shape distributions.

After correction for height, the dominant shape component was associated with heart size. After size, the second dominant shape component was sphericity at end-diastole (13 %) and concentricity at end-systole (10 %). The resulting shape components distinguished differences due to ethnicity and risk factors with greater statistical power than traditional mass and volume indices.

Fig. 1 Flow chart of the atlas construction: (a) fiducial landmarks defined on the images (3D view from anterior), (b) contours drawn by the core lab, (c) 3D finite element model showing shape control points (yellow), (d) calculation of remodeling indices, (e) variation …
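As an illustrative sketch of the kind of analysis described above, the fragment below applies principal component analysis to registered shape vectors. The array names and sizes are placeholders for this sketch, not the MESA core-laboratory pipeline itself.

```python
# Minimal sketch: PCA of registered ventricular shape vectors.
# `shapes` stands in for an (n_subjects, n_points * 3) array of surface point
# coordinates already registered to a common coordinate system.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
shapes = rng.normal(size=(500, 3000))      # placeholder for registered shape vectors

pca = PCA(n_components=10)
scores = pca.fit_transform(shapes)         # per-subject shape component scores

# Fraction of shape variance explained by each component
# (cf. the sphericity ~13 % and concentricity ~10 % components reported for MESA).
print(pca.explained_variance_ratio_)

# The component scores can then be related to ethnicity or risk factors with
# standard regression, in place of traditional mass and volume indices.
```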

Challenges

With large population-based studies involving medical imaging, there is an enormous amount of data processing required. Without automated processing methods, this data mountain would be insurmountable. Crucial to the development of such methods is the availability of benchmark datasets with validated ground truth.

These are essential for the validation of algorithms and objective comparison of the strengths and weaknesses of different methods.

Several problem areas are considered below, with emphasis on open “challenges”: community-driven collaborative projects, often held in association with a conference, designed to enable researchers to compare and contrast different methods applied to standardized datasets with common ground truth.

An index of challenges in general biomedical image analysis can be found at http://www.grand-challenge.org/, while more specialized cardiac image and modeling analysis challenges are available at http://www.cardiacatlas.org/web/guest/challenges.

Ventricular Function

Balanced steady-state free precession CMR imaging can provide the most accurate estimates of mass and volume of any imaging modality. Several automated methods have been proposed for locating the inner and outer contours of the left and right ventricles; for a review, see [33]. Common methods include graph cut [34] or level set [35] segmentation methods, and multi-atlas registration and label propagation methods [35].
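For illustration, a minimal Python sketch of the multi-atlas label-propagation idea: each atlas is registered to the target image (abstracted here as a hypothetical `register_and_warp` helper standing in for any deformable registration tool) and the warped labels are fused by majority vote. This is a simplified stand-in for the cited methods, not any specific published algorithm.

```python
# Minimal sketch of multi-atlas label propagation with majority-vote fusion.
import numpy as np

def fuse_labels(warped_labels: list) -> np.ndarray:
    """Majority vote across binary atlas label maps warped into the target space."""
    stacked = np.stack(warped_labels, axis=0)        # (n_atlases, *image_shape)
    votes = stacked.sum(axis=0)                      # per-voxel count of foreground votes
    return (votes > stacked.shape[0] / 2).astype(np.uint8)

# Usage sketch (binary left-ventricular labels assumed):
# warped = [register_and_warp(atlas_img, atlas_lbl, target_img)
#           for atlas_img, atlas_lbl in atlases]
# target_segmentation = fuse_labels(warped)
```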

Benchmarking studies are particularly useful in this area but are limited by the need for validated ground truth. Traditionally, experts draw contours manually on each image, but this process is time-consuming and painstaking, making it impractical for high-volume data. As a result, only limited numbers of cases with expert ground truth are available.

In 2009, a left ventricular segmentation challenge was held using 45 cases from a mixed patient dataset (normal, heart failure, myocardial infarction, and hypertrophy) by using expert-drawn contours as the ground truth at end-diastolic and end-systolic frames [36].

The data are in the public domain and can be accessed directly via the Cardiac Atlas Project website. To leverage the robustness and usability of large-scale data for ventricular function benchmarking, an updated challenge was held in 2011 with more cases (200 patients with myocardial infarction) and ground truth available for all frames in the cine sequence [37].

An interesting feature of this work is the ability to update the consensus contours using statistical fusion methods. If a new dataset meets certain quality requirements, with acceptable bias and precision, the ground truth contours can be updated to incorporate this new information. This resulted in a mechanism by which researchers can continue to upload results and refine the ground truth [37]. As more groups participate, the consensus becomes more robust and less influenced by any particular contributor.

Flow

Blood flow is directly related to the morphology and function of the cardiovascular system. Accurate blood flow measurement remains a challenge because it requires mapping the flow velocity field both within the heart chambers and through the great vessels [38]. Flow through an image slice or within a 3D block of tissue is analyzed by first segmenting the vasculature and then integrating the phase contrast velocity within the vessel over time [39].
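As a rough sketch of this two-step analysis, the fragment below integrates a through-plane phase contrast velocity field over a pre-segmented vessel mask. The array names, units, and interfaces are assumptions for illustration only.

```python
# Minimal sketch of through-plane flow from phase contrast CMR, assuming the
# vessel has already been segmented. `velocity` is an (n_frames, ny, nx) array
# in cm/s and `mask` a binary vessel mask; pixel spacing is given in mm.
import numpy as np

def flow_curve(velocity: np.ndarray, mask: np.ndarray, pixel_spacing_mm: tuple) -> np.ndarray:
    """Instantaneous flow (ml/s) per cardiac phase: velocity * pixel area summed inside the vessel."""
    pixel_area_cm2 = (pixel_spacing_mm[0] / 10.0) * (pixel_spacing_mm[1] / 10.0)
    return (velocity * mask).sum(axis=(1, 2)) * pixel_area_cm2   # cm/s * cm^2 = ml/s

def stroke_volume_ml(flow_ml_per_s: np.ndarray, rr_interval_s: float) -> float:
    """Integrate the flow curve over one cardiac cycle."""
    dt = rr_interval_s / len(flow_ml_per_s)
    return float(flow_ml_per_s.sum() * dt)
```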

Goel et al. [40] developed a method for automatically identifying the ascending and descending aorta and computing flow in phase contrast MRI acquisitions, and applied this to 1884 participants of the Dallas Heart Study. Two challenges relating to computational analysis of blood flow with MRI velocity data have made single-case benchmark data available to the community [41, 42].

Perfusion

The automated analysis of myocardial blood flow (perfusion) remains challenging, since there is no standard perfusion imaging protocol. Quantification of absolute blood flow in milliliters per gram per minute requires detailed knowledge of the pulse sequence parameters, in order to correct for the non-linear relationship between contrast concentration and signal intensity [43]. Several methods are also available for quantitative analysis [44], and some have shown promising results for quantification on a pixel basis [45]. A benchmark dataset was provided for testing motion correction algorithms in the 2014 STACOM perfusion challenge [46].
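For illustration only, a minimal sketch of one common model-independent quantification approach (truncated-SVD deconvolution of the tissue curve by the arterial input function). It assumes signal intensity has already been converted to contrast concentration, and it is not the specific method used in any of the studies cited here; units depend on the conventions of the concentration curves.

```python
# Minimal sketch of perfusion quantification by regularized deconvolution.
# `aif` and `tissue` are hypothetical concentration-time curves sampled at interval dt (s).
import numpy as np
from scipy.linalg import toeplitz

def myocardial_blood_flow(aif: np.ndarray, tissue: np.ndarray, dt: float,
                          sv_threshold: float = 0.1) -> float:
    """Truncated-SVD deconvolution: tissue = dt * (A @ impulse_response)."""
    # Lower-triangular convolution matrix built from the arterial input function.
    A = toeplitz(aif, np.zeros_like(aif)) * dt
    U, s, Vt = np.linalg.svd(A)
    # Suppress small singular values to regularize the ill-posed inversion.
    s_inv = np.where(s > sv_threshold * s.max(), 1.0 / s, 0.0)
    impulse_response = Vt.T @ (s_inv * (U.T @ tissue))
    # Peak of the impulse response is proportional to flow; the factor of 60
    # converts per-second to per-minute, assuming consistent concentration units.
    return float(impulse_response.max() * 60.0)
```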

Landmark Detection

The location and motion of specific landmarks are useful for quantifying cardiac structural and functional characteristics and as a precursor for other analyses such as ventricular mass and volume quantification.

For example, the location of the mitral valve provides a measure of longitudinal shortening and enables the base of the left ventricle to be located. A landmark detection challenge in 2012 made 200 cases with manual ground truth available for validation and benchmarking [42]. Machine learning methods show promise for landmark detection but require large datasets with manual ground truth in order to train the algorithms [47, 48].
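As a toy illustration of the learning-based approach, the PyTorch sketch below defines a small network that regresses a single normalized landmark position from a CMR slice. The architecture and sizes are arbitrary assumptions; real challenge entries use substantially larger models and training sets.

```python
# Minimal sketch of learning-based landmark detection: a small convolutional
# network regressing normalized (x, y) coordinates of one landmark
# (e.g. a mitral valve point) from a single-channel image.
import torch
import torch.nn as nn

class LandmarkRegressor(nn.Module):
    def __init__(self, n_landmarks: int = 1):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 2 * n_landmarks)   # (x, y) per landmark, in [0, 1]

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return torch.sigmoid(self.head(self.features(x).flatten(1)))

# Training would minimize an L2 loss between predicted and manually annotated
# coordinates, which is why large expert-labelled datasets are needed.
model = LandmarkRegressor()
pred = model(torch.randn(4, 1, 128, 128))   # -> (4, 2) normalized coordinates
```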

Scar Quantification

Late gadolinium enhancement provides a robust method for quantifying the scar burden in patients with myocardial infarction [49]. A left ventricular scar identification challenge was held in 2012 [42]. The challenge made 30 late gadolinium enhancement MRI datasets available to participants for segmentation of enhanced regions following myocardial infarction, comprising 15 patients and 15 pigs that had been subjected to myocardial ischemia. Ground truth was established by using manual segmentations from experienced clinical observers.

Left atrial scar burden is also important for evaluation of atrial fibrillation, both for identification of patients at risk and for the evaluation of ablation therapy. A benchmark challenge for left atrial scar burden was performed recently in association with the International Symposium of Biomedical Imaging [50].

Motion Analysis

CMR tissue tagging provides direct measures of tissue function [51]. Myocardial tissue tagging was used in a sub-set of the MESA cohort to evaluate tissue function independently of geometric pump function [52].

The Harmonic Phase method was used for analysis since this is automatic after contours enclosing the heart have been defined [53]. However, phase unwrapping errors and lack of resolution can cause problems. Feature tracking methods can provide robust estimates of global strain from untagged cine images, although regional strain estimates are more variable [54].

These methods were derived from speckle tracking algorithms designed for echocardiographic data [55]. All three of these methods were compared in an open challenge for motion estimation in which data from 15 volunteers and a phantom were made available for benchmarking and validation [56].
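As an illustration of how a global strain value can be derived once myocardial points have been tracked, the sketch below computes global circumferential strain from the change in contour perimeter. The input array is a hypothetical set of tracked contour positions, not the output of any specific tracking method named above.

```python
# Minimal sketch of global Lagrangian strain from tracked contour points.
# `points` is an (n_frames, n_points, 2) array of myocardial contour positions.
import numpy as np

def contour_length(pts: np.ndarray) -> float:
    """Total perimeter of a closed contour defined by ordered 2D points."""
    diffs = np.diff(np.vstack([pts, pts[:1]]), axis=0)
    return float(np.linalg.norm(diffs, axis=1).sum())

def global_strain(points: np.ndarray) -> np.ndarray:
    """Lagrangian strain per frame: (L(t) - L(0)) / L(0); negative values indicate shortening."""
    l0 = contour_length(points[0])
    return np.array([(contour_length(frame) - l0) / l0 for frame in points])
```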

T1 Mapping

Information on myocardial cellular structure can be inferred from the local tissue T1, and non-contrast T1 mapping methods are now available [57] which give information on the extracellular matrix [58]. Non-contrast T1 mapping is being employed in the UK Biobank CMR extension [7•]. Pre- and post-contrast T1 maps can be used to calculate extracellular volume [59]. These methods are not currently standardized, but several methods have been proposed for automated analysis [60, 61].
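For illustration, a minimal sketch of the widely used three-parameter inversion-recovery fit with Look-Locker correction, applied to a single pixel's recovery curve. The starting values and the assumption of polarity-restored magnitude data are simplifications; clinical T1 mapping implementations include additional corrections.

```python
# Minimal sketch of per-pixel T1 estimation from inversion-recovery samples
# (MOLLI-style), assuming polarity-restored signals. Model:
#   S(TI) = A - B * exp(-TI / T1*),  with Look-Locker correction T1 = T1* * (B/A - 1).
import numpy as np
from scipy.optimize import curve_fit

def ir_model(ti, a, b, t1star):
    return a - b * np.exp(-ti / t1star)

def fit_t1(ti_ms: np.ndarray, signal: np.ndarray) -> float:
    """Fit one pixel's recovery curve and return the Look-Locker-corrected T1 (ms)."""
    p0 = (signal.max(), 2.0 * signal.max(), 1000.0)   # rough starting values
    (a, b, t1star), _ = curve_fit(ir_model, ti_ms, signal, p0=p0, maxfev=5000)
    return t1star * (b / a - 1.0)
```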


Source:
UCL
