New AI-based tool can ‘detect heart failure from just ONE heartbeat’ and is 100% accurate, scientists claim

  • Scientists ‘fed’ it with electrocardiograms containing more than 490,000 heartbeats 
  • The technology was then exposed to a series of ‘five minute ECG excerpts’ 
  • The scientists hope their tool will one day help doctors diagnose HF sooner 

A new AI-based tool could detect heart failure from just one heartbeat, research suggests.

Scientists ‘fed’ the system with electrocardiograms (ECGs) containing more than 490,000 heartbeats.

The technology was then exposed to a series of ‘five minute ECG excerpts’ taken from 24-hour recordings. 

Results showed the convolutional neural network, as it is called, was 100 per cent accurate at spotting patients with heart failure. 
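In broad terms, a convolutional neural network slides small learned filters along a signal to pick out local shapes such as the sharp spike of a heartbeat. A toy Python sketch of that core filtering step (the signal and filter values below are invented for illustration and are not taken from the study):

```python
# Toy illustration of the core operation in a 1D convolutional neural
# network: sliding a small filter along an ECG-like signal. The signal
# and filter values are invented for illustration only.

def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation, as in most CNNs)."""
    k = len(kernel)
    return [
        sum(signal[i + j] * kernel[j] for j in range(k))
        for i in range(len(signal) - k + 1)
    ]

def relu(xs):
    """Standard nonlinearity applied between CNN layers."""
    return [max(0.0, x) for x in xs]

# A crude single-heartbeat shape: flat baseline, sharp R-peak, flat again
heartbeat = [0.0, 0.1, 0.0, 1.5, 0.0, 0.1, 0.0]
edge_filter = [-1.0, 0.0, 1.0]  # responds strongly to sharp rises

features = relu(conv1d(heartbeat, edge_filter))
print(features)  # the filter fires where the R-peak rises
```

In a real CNN, many such filters are learned from data rather than hand-written, and their outputs are stacked through further layers before a final classification.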

The University of Surrey team hope their tool will one day help doctors diagnose HF sooner, ‘benefiting patients and easing pressures on NHS resources’.

Heart failure occurs when the organ’s muscles are too weak or stiff to pump blood around the body effectively. 

This can be due to high blood pressure or the arteries narrowing. Drinking too much alcohol can also cause it, the NHS says.  

The condition affects around 26 million people worldwide to some extent, according to the European Society of Cardiology. 

In the most severe cases, up to 40 per cent of patients die from the condition, the researchers wrote. 

It is also one of the main causes of hospitalisation among elderly people, the team added. 

With life expectancy on the rise, the team set out to uncover a more accurate way of diagnosing HF early on.

Existing methods look at heart rate variability (HRV), which describes inconsistencies in the space between consecutive heartbeats.
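For illustration, HRV is typically summarised with statistics over the gaps (RR intervals) between consecutive beats; SDNN and RMSSD are two common metrics, though the article does not specify which ones existing methods rely on. A minimal sketch, with made-up interval values:

```python
# Illustrative sketch: summarising heart rate variability (HRV) from
# RR intervals (the gaps between consecutive heartbeats, in milliseconds).
# SDNN and RMSSD are standard HRV statistics; the interval values below
# are invented for illustration.

def sdnn(rr_intervals):
    """Standard deviation of RR intervals (SDNN)."""
    n = len(rr_intervals)
    mean = sum(rr_intervals) / n
    return (sum((x - mean) ** 2 for x in rr_intervals) / n) ** 0.5

def rmssd(rr_intervals):
    """Root mean square of successive differences (RMSSD)."""
    diffs = [b - a for a, b in zip(rr_intervals, rr_intervals[1:])]
    return (sum(d ** 2 for d in diffs) / len(diffs)) ** 0.5

rr = [812, 790, 805, 830, 798, 815]  # hypothetical RR intervals (ms)
print(round(sdnn(rr), 1))   # 12.8
print(round(rmssd(rr), 1))  # 23.0
```

Because these statistics describe variation over time, they only become meaningful once many beats have been observed, which is why HRV-based diagnosis needs long recordings.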

However, HF can generally only be diagnosed after monitoring a person’s HRV for around 24 hours. 

To overcome this, the researchers, led by Dr Sebastiano Massaro, focused on ECG signals rather than HRV. 

They collected ‘long-term ECG recordings’ from 15 severe HF patients taken from the BIDMC Congestive Heart Failure Database.

The ‘control group’ was made up of ECGs from 18 healthy people from the Normal Sinus Rhythm Database.

Each participant had around 20 hours of ECG recordings, the researchers wrote in the journal Biomedical Signal Processing and Control. 

Dr Massaro said: ‘Our model delivered 100 per cent accuracy. By checking just one heartbeat we are able to detect whether or not a person has heart failure.’

He added that their tool is one of the first proven to identify the morphological features of the ECG linked to the severity of heart failure.

Dr Leandro Pecchia, president of the European Alliance for Medical and Biological Engineering, said it offers a ‘major advancement’ in spotting heart failure.

He added: ‘Enabling clinical practitioners to access an accurate HF detection tool can make a significant societal impact.’

Dr Pecchia said patients could ‘benefit from early and more efficient diagnosis’ and also said the tool may ease ‘pressures on NHS resources’.

HOW DOES ARTIFICIAL INTELLIGENCE LEARN?

AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn.

ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years.

Conventional AI uses input to ‘teach’ an algorithm about a particular subject by feeding it massive amounts of information.   
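As a heavily simplified illustration of that ‘teaching’ process, a single artificial neuron can be nudged toward labelled examples until it classifies them correctly. The task, data and learning rate below are invented; real ANNs stack many such units in layers:

```python
# Minimal sketch of supervised learning: one artificial neuron nudged
# toward labelled examples (the perceptron rule). The toy task, data
# and learning rate are invented for illustration.

def train(examples, epochs=20, lr=0.1):
    w, b = 0.0, 0.0
    for _ in range(epochs):
        for x, label in examples:           # label is 1 or 0
            pred = 1 if w * x + b > 0 else 0
            err = label - pred              # nudge weights toward the label
            w += lr * err * x
            b += lr * err
    return w, b

# Toy task: classify numbers as "large" (label 1) or "small" (label 0)
data = [(1, 0), (2, 0), (3, 0), (6, 1), (7, 1), (9, 1)]
w, b = train(data)
print([1 if w * x + b > 0 else 0 for x, _ in data])
```

Feeding the neuron its examples over and over is what ‘training’ means here; with millions of examples and parameters, the same loop is what makes the process so time-consuming.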

The process of inputting this data can be extremely time consuming, and is limited to one type of knowledge. 

Practical applications include Google’s language translation services, Facebook’s facial recognition software and Snapchat’s image-altering live filters.

A new breed of ANNs called Adversarial Neural Networks pits the wits of two AI bots against each other, which allows them to learn from each other. 

This approach is designed to speed up the process of learning, as well as refining the output created by AI systems. 

Read more at DailyMail.co.uk