A new selfie app can screen you for an early sign of pancreatic cancer.
With few detectable symptoms, pancreatic cancer is one of the deadliest cancers: patients have just a nine percent chance of surviving five years.
Now, researchers at the University of Washington have devised a method to spot the only concrete sign: yellowing of the eyes.
BiliScreen uses a smartphone camera to detect increased levels of bilirubin (a yellow substance found in bile) in the white part of your eye – even if you can’t see it in a mirror.
Currently, the standard method to measure bilirubin levels is a blood test, but it is not routinely offered and can be costly.
Experts hope the app, which will be debuted on September 13 at a conference in Hawaii, could dramatically lower the rate of pancreatic cancer deaths.
In adults, the whites of the eyes are more sensitive than skin to changes in bilirubin levels.
This yellowing can be an early warning sign of pancreatic cancer, hepatitis or the generally harmless Gilbert's syndrome, since all of these conditions affect the body's ability to process bilirubin.
And unlike skin color, changes in sclera color are consistent across races and ethnicities.
Yet by the time people notice the yellowish discoloration in the sclera, bilirubin levels are typically already well past the threshold for concern.
‘The problem with pancreatic cancer is that by the time you’re symptomatic, it’s frequently too late,’ said lead author Alex Mariakakis, a doctoral student at the Paul G. Allen School of Computer Science & Engineering.
The UW team wondered if computer vision and machine learning tools could detect those color changes in the eye before humans can see them.
‘The hope is that if people can do this simple test once a month — in the privacy of their own homes — some might catch the disease early enough to undergo treatment that could save their lives,’ Mariakakis added.
The researchers hope the app could also potentially ease the burden on patients with pancreatic cancer who require frequent bilirubin monitoring.
In an initial clinical study of 70 people, the BiliScreen app — used in conjunction with a 3D printed box that controls the eye’s exposure to light — correctly identified cases of concern 89.7 percent of the time, compared to the blood test currently used.
BiliScreen builds on earlier work from the UW’s Ubiquitous Computing Lab, which previously developed BiliCam, a smartphone app that screens for newborn jaundice by taking a picture of a baby’s skin.
A recent study in the journal Pediatrics showed BiliCam provided accurate estimates of bilirubin levels in 530 infants.
Next steps for the research team include testing the app on a wider range of people at risk for jaundice and underlying conditions, as well as continuing to make usability improvements — including removing the need for accessories like the box and glasses.
‘This relatively small initial study shows the technology has promise,’ said co-author Dr. Jim Taylor, a professor in the UW Medicine Department of Pediatrics whose father died of pancreatic cancer at age 70.
‘Pancreatic cancer is a terrible disease with no effective screening right now,’ Taylor said.
‘Our goal is to have more people who are unfortunate enough to get pancreatic cancer to be fortunate enough to catch it in time to have surgery that gives them a better chance of survival.’
Co-authors include Allen School undergraduate student Megan A. Banks, research study coordinator Lauren Phillipi and assistant professor of medicine Lei Yu.
The research was funded by the National Science Foundation, the Coulter Foundation and endowment funds from the Washington Research Foundation.
HOW DOES THE APP WORK?
BiliScreen uses a smartphone’s built-in camera and flash to collect pictures of a person’s eye as they snap a selfie.
The team developed a computer vision system to automatically and effectively isolate the whites of the eyes in the photo, a step that is also valuable for other forms of medical diagnostics.
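To picture what isolating the whites of the eyes might involve, the sketch below thresholds an image for bright, low-saturation (i.e. near-white) pixels. This is a deliberately simplified illustration, not the UW team's actual segmentation method; the function name and thresholds are invented for the example.

```python
import numpy as np

def sclera_mask(rgb, brightness_min=0.6, saturation_max=0.25):
    """Return a boolean mask of pixels that look 'white' (bright, unsaturated).

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    A real system would use a trained segmentation model, not fixed thresholds.
    """
    brightness = rgb.max(axis=-1)                 # value (V) channel
    spread = rgb.max(axis=-1) - rgb.min(axis=-1)  # crude saturation proxy
    saturation = np.where(brightness > 0,
                          spread / np.maximum(brightness, 1e-9), 0.0)
    return (brightness >= brightness_min) & (saturation <= saturation_max)

# Tiny synthetic example: one white pixel, one dark pixel,
# one saturated red pixel, and one slightly yellowish-white pixel.
img = np.array([[[0.95, 0.95, 0.93], [0.05, 0.05, 0.05]],
                [[1.00, 0.10, 0.10], [0.90, 0.88, 0.70]]])
mask = sclera_mask(img)
```

Here only the white and yellowish-white pixels survive the mask; the dark and strongly colored pixels are rejected.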
The app then calculates the color information from the sclera — based on the wavelengths of light that are being reflected and absorbed — and correlates it with bilirubin levels using machine learning algorithms.
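The correlation step can be pictured as a regression from sclera color to a bilirubin estimate. The sketch below fits a least-squares linear model on made-up numbers; BiliScreen's actual features and algorithms are not described in detail here, so all data values and names in this example are illustrative.

```python
import numpy as np

# Hypothetical training data: mean sclera color (R, G, B in [0, 1]) per subject,
# paired with a lab-measured bilirubin level in mg/dL. All values are invented.
colors = np.array([
    [0.95, 0.95, 0.92],   # nearly white sclera -> low bilirubin
    [0.93, 0.90, 0.75],
    [0.92, 0.88, 0.60],
    [0.90, 0.85, 0.45],   # visibly yellow sclera -> high bilirubin
])
bilirubin = np.array([0.8, 2.5, 5.0, 8.0])

# Fit bilirubin ~ w . color + b by ordinary least squares.
X = np.hstack([colors, np.ones((len(colors), 1))])  # add intercept column
coef, *_ = np.linalg.lstsq(X, bilirubin, rcond=None)

def predict_bilirubin(rgb):
    """Estimate a bilirubin level (mg/dL) from a mean sclera color."""
    return float(np.append(rgb, 1.0) @ coef)
```

On this toy model, a yellower sclera (less blue reflected) maps to a higher bilirubin estimate than a whiter one, which is the relationship the app exploits.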
To account for different lighting conditions, the team tested BiliScreen with two different accessories.
First, they tried paper glasses printed with colored squares to help calibrate color.
Second, they tried a 3-D printed box that blocks out ambient lighting.
Using the app with the box accessory — reminiscent of a Google Cardboard headset — led to slightly better results.