New artificial intelligence MRI scans could take just five minutes instead of 90
- Current MRI scans require patients to sit still for up to an hour-and-a-half
- They take a series of 2D images, which are combined to make a 3D picture
- New technique takes just a few 2D images, before AI ‘fills in the gaps’
- AI has been trained on millions of MRI images from thousands of cases
- Project started three years ago, early results show the method may be feasible
Facebook artificial intelligence (AI) is working alongside radiologists to create MRI scans that last as little as five minutes, new research suggests.
Current MRI scans require patients, who are often in pain, to sit perfectly still for up to an hour-and-a-half while a scan is completed. This is because such scans take a series of 2D images of a person’s insides, which are combined to create a 3D picture.
Known as the FastMRI project, the new technique involves taking just a few 2D images, before AI that has been trained on millions of other scans ‘fills in the gaps’.
Early results suggest the technique may be feasible; however, such projects take time. Therefore, to give the innovation a boost, the researchers recruited AI specialists from Facebook.
Dr Daniel Sodickson, from New York University, said: ‘We have some great physicists here and even some hot-stuff mathematicians, but Facebook has some of the leading AI scientists in the world’.
Although the project started three years ago, it still has a long way to go, with doctors being unable to predict if and when such technology may be available.
FastMRI takes just a few 2D images of a person’s internal structures before AI ‘fills in the gaps’.
This AI has been trained on around three million images from 10,000 cases, all of which were anonymised.
The technology notices overall patterns and abnormalities from just a few images and extends these elsewhere in the picture.
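In technical terms, a faster scan collects fewer lines of the scanner’s raw frequency-domain (‘k-space’) data, and the AI learns to restore the detail that the skipped lines would have provided. The sketch below is illustrative only (it is not the FastMRI code; the 64x64 array and the 4x undersampling factor are assumptions) and shows why a naive reconstruction from partial data is degraded, which is the gap the AI is trained to fill:

```python
import numpy as np

# Illustrative sketch: simulate undersampling MRI k-space data and a
# naive "zero-filled" reconstruction. An AI model would be trained to
# turn this degraded reconstruction into a full-quality image.

rng = np.random.default_rng(0)
image = rng.random((64, 64))          # stand-in for a 2D anatomical slice

kspace = np.fft.fft2(image)           # full frequency-domain data

# Keep only every 4th row of k-space (a hypothetical 4x speed-up)
mask = np.zeros(kspace.shape, dtype=bool)
mask[::4, :] = True
undersampled = np.where(mask, kspace, 0)

# Zero-filled reconstruction: blurred and aliased without AI correction
recon = np.fft.ifft2(undersampled).real

error = np.abs(recon - image).mean()
print(f"Kept {mask.mean():.0%} of k-space; mean reconstruction error {error:.3f}")
```

The faster the scan, the fewer k-space lines are kept and the worse the naive reconstruction becomes; the project’s bet is that a trained model can recover the missing structure.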
Dr Sodickson told Tech Crunch: ‘The sense is that already in the first attempts, with relatively simple methods, we can do better than other current acceleration techniques — get better image quality and maybe accelerate further by some percentage, but not by large multiples yet.’
The researchers accept their project is a challenge, with just a few mistakes potentially meaning the difference between an all-clear scan and one highlighting a tumour.
Larry Zitnick, from Facebook, added: ‘We don’t know if we’ll succeed or not. But that’s kind of the fun of it.’
HOW DOES ARTIFICIAL INTELLIGENCE LEARN?
AI systems rely on artificial neural networks (ANNs), which try to simulate the way the brain works in order to learn.
ANNs can be trained to recognise patterns in information – including speech, text data, or visual images – and are the basis for a large number of the developments in AI over recent years.
Conventional AI uses input to ‘teach’ an algorithm about a particular subject by feeding it massive amounts of information.
Practical applications include Google’s language translation services, Facebook’s facial recognition software and Snapchat’s image altering live filters.
The process of inputting this data can be extremely time consuming, and is limited to one type of knowledge.
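The ‘teaching’ loop described above can be sketched in miniature. The example below is a hedged illustration, not the FastMRI model: a single artificial neuron is shown many labelled examples (all toy data invented here) and gradually adjusts its weights until it recognises the pattern:

```python
import numpy as np

# Minimal sketch of supervised learning: a single artificial neuron
# is repeatedly fed labelled examples and adjusts its weights to
# reduce its prediction error.

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))                 # toy input data
y = (X[:, 0] + X[:, 1] > 0).astype(float)     # the pattern to learn

w, b = np.zeros(2), 0.0
for _ in range(500):                          # repeated exposure to the data
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))    # neuron's current prediction
    grad = p - y                              # error signal
    w -= 0.1 * (X.T @ grad) / len(X)          # adjust connection weights
    b -= 0.1 * grad.mean()

pred = (1.0 / (1.0 + np.exp(-(X @ w + b)))) > 0.5
print(f"training accuracy: {(pred == y).mean():.0%}")
```

Real systems use many layers of such neurons and, as the article notes, vastly more data, which is what makes the input process so time consuming.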
A newer approach, generative adversarial networks (GANs), pits two neural networks against each other, which allows them to learn from each other.
This approach is designed to speed up the process of learning, as well as refining the output created by AI systems.
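A toy version of that adversarial contest can be sketched as follows. This is purely illustrative (a one-dimensional ‘generator’ and ‘discriminator’ with made-up numbers, not any production GAN): one network learns to produce fake data resembling the real data, while the other learns to tell them apart, and each improves by playing against the other:

```python
import numpy as np

# Toy 1D adversarial game: the generator (parameters a, b) tries to
# produce numbers that look like real data (drawn near 3.0), while
# the discriminator (parameters w, c) tries to tell real from fake.

rng = np.random.default_rng(2)
a, b = 1.0, 0.0        # generator: fake sample = a*z + b
w, c = 0.1, 0.0        # discriminator: score = sigmoid(w*x + c)
lr = 0.05

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

for _ in range(2000):
    real = rng.normal(3.0, 1.0, size=32)
    z = rng.normal(size=32)
    fake = a * z + b

    # Discriminator step: push scores up on real data, down on fakes
    d_real, d_fake = sigmoid(w * real + c), sigmoid(w * fake + c)
    w += lr * ((1 - d_real) * real - d_fake * fake).mean()
    c += lr * ((1 - d_real) - d_fake).mean()

    # Generator step: nudge fakes toward whatever the discriminator
    # currently scores as "real"
    d_fake = sigmoid(w * (a * z + b) + c)
    a += lr * ((1 - d_fake) * w * z).mean()
    b += lr * ((1 - d_fake) * w).mean()

print(f"generator offset b = {b:.2f} (real data is centred on 3.0)")
```

The point of the contest is that neither network needs a human to grade it: each provides the training signal for the other, which is why the approach can speed up learning.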
Computers could replace doctors in just 10 years
This comes after former health secretary Jeremy Hunt said last September that patients may one day be diagnosed by computers, not doctors.
Mr Hunt said: ‘So what might medicine look like when the NHS is 80 [in 2028]? Well, the first thing is we may well not be going to doctors for a diagnosis, we might be going to computers instead’.
AI could help in diagnosing patients by analysing X-rays and samples to determine conditions such as cancer, according to NHS England bosses.
In as little as a decade’s time, patients may even be diagnosed with disorders before they develop symptoms, as DNA screening becomes accessible to the masses, Mr Hunt said.
The future will also see patients being able to declare their wishes about sensitive subjects, such as organ donation and end-of-life care, through apps, Mr Hunt added.