Big data, better health

Local researchers are using artificial intelligence to mine complex statistical information and revolutionize the delivery of care

By Joel Schlesinger
Winnipeg Regional Health Authority
Published Monday, June 3, 2019

At first glance, the image of the spine on the computer screen appears to be as clean as a whistle.

The 24 bones making up the spine follow a gentle S-shaped curve, and each looks similar to the one above and below. Most doctors reviewing an image like this one would typically give the patient a clean bill of health.

But take another look at the scan using software developed by Dr. Bill Leslie and his team of computer scientists and medical imaging specialists at the University of Manitoba.

Now the image looks more like a heat map, with red blotches representing “hot spots” along the patient’s spine. And upon closer examination, one can see that one of the bones in the middle of the spine has an irregular top edge. These irregularities indicate fractures, a reliable marker for osteoporosis, a debilitating disease that weakens the bones.

The ability to identify these fractures is no small thing.

“Spine fractures are the most common breaks that we see with osteoporosis, but most of them occur gradually so the pain is usually not that marked,” says Leslie, Director of the Manitoba Bone Density Program and a professor of Internal Medicine and Nuclear Medicine at the University of Manitoba.

In other words, a patient might have early signs of osteoporosis and not even know it. But the software can help identify a potential problem before it worsens or causes a devastating hip fracture. That means better treatment options for the patient, and less need for more expensive care down the road.

Needless to say, the software used by Leslie and his team represents a major step forward in the early detection and treatment of osteoporosis. But it is also something else: a prime example of how advances in computer technology are being used to reshape the delivery of health care.

As Leslie explains, advances in computing power over the last decade have propelled the development of machine learning, a subset of artificial intelligence. As a result, researchers are now able to harness software programs that can analyze complex data-sets, also known as “Big Data.”

Add powerful computers, sophisticated software, and complex data all together and you end up with information that can help improve patient outcomes.

The software used by Leslie and his team illustrates the point.

To better understand how this happens, it helps to have a bit of background.

Let’s start with the software. It is actually better described as an artificial neural network. That means Leslie and his team are using a mathematical model running on a computer that is designed to simulate certain functions of the human brain – in this case, to recognize objects in images.

The process of teaching the neural network by showing it examples is called “training.” Each time the software is trained with an example, the neural network predicts whether the example is normal or abnormal, and tiny adjustments are made to the model to make its answer closer to the truth the next time that example is seen. This process is repeated many thousands of times.
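
To make that idea concrete, here is a minimal sketch in Python of how such training works. Everything in it is invented for illustration – a single artificial "neuron" learning from made-up examples, not the team's actual software:

    import numpy as np

    rng = np.random.default_rng(0)
    features = rng.normal(size=(200, 4))               # 200 invented "scans", four measurements each
    labels = (features.sum(axis=1) > 0).astype(float)  # invented truth: 1 = abnormal, 0 = normal

    weights = np.zeros(4)   # the network's connection strengths start out blank
    bias = 0.0
    learning_rate = 0.1

    for _ in range(1000):   # repeated many thousands of times, as described above
        for x, y in zip(features, labels):
            # The model guesses how likely this example is to be abnormal...
            prediction = 1 / (1 + np.exp(-(x @ weights + bias)))
            error = prediction - y
            # ...and tiny adjustments nudge its next answer closer to the truth.
            weights -= learning_rate * error * x
            bias -= learning_rate * error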

The neural network being used to spot spine fractures has been trained with more than 10,000 spine images, some normal, others with fractures due to osteoporosis. It has encoded what it has “learned” into more than 50 million numbers, each representing the strength of a connection between one artificial neuron and another. The trained model was then asked to identify spine fractures in scans it had not seen before, and it did so with over 90 per cent accuracy.
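
Measuring that accuracy follows the same pattern as the toy sketch above: the trained model is shown examples it has never seen, and its guesses are compared against the truth (again, purely illustrative numbers):

    # Continuing the sketch above: the toy model is scored on examples
    # it was never trained on, which is how accuracy figures are obtained.
    test_x = rng.normal(size=(50, 4))
    test_y = (test_x.sum(axis=1) > 0).astype(float)
    guesses = (1 / (1 + np.exp(-(test_x @ weights + bias)))) > 0.5
    print("accuracy on unseen examples:", (guesses == test_y).mean())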

“If you had two radiologists classifying the same cases, you usually don’t expect to get more than 80 to 90 per cent agreement,” says Leslie. But a computer can analyze hundreds of X-rays in just a few seconds once it has “learned” what to look for in an image.

Dr. Peter Nickerson, Vice-Dean of Research at the University of Manitoba’s Rady Faculty of Health Sciences, says the work being done by Leslie and others has huge implications for the future of health care in this province and elsewhere.

As he explains, continuing advances in computing power will eventually lead to care that is more personalized and targeted to the patient. While health-care providers have been working towards the goal of personalized care for 20 years or more, computer systems that employ artificial intelligence and machine learning elevate that ability to a degree thought impossible only a few years ago.

As a result, Manitoba is at an important tipping point. Our health-care system is increasingly able to gather and organize large amounts of data. And there is now technology available that can analyze this information at lightning speed to reveal findings that otherwise might go unnoticed. The question is: Can we take advantage of the opportunities out there to become a leader in this emerging field?

To help answer that question, the Rady Faculty of Health Sciences at the University of Manitoba has developed a five-year strategic plan in consultation with Manitoba Health, Research Manitoba, the George and Fay Yee Centre for Healthcare Innovation, the Winnipeg Regional Health Authority, the Bioscience Association of Manitoba and other stakeholders. The plan aims to build the technological capacity needed to leverage advances in big data analytics, with a view to producing better health research and better care for patients here in Manitoba.

Among other things, the strategy calls on the partners to:

  • Strengthen alignment and engagement among various partners in the field, including the provincial government and other academic departments.
  • Enhance data quality and access through cleaning, coding, classification and retrieval.
  • Enhance the ability to describe, visualize and model complex data.

It’s no small undertaking, says Dr. Lisa Lix, Director of the Data Science Platform at the George and Fay Yee Centre for Healthcare Innovation, Canada Research Chair in Methods for Electronic Health Data Quality, and senior scientist at the Manitoba Centre for Health Policy.

Both centres are playing a key role in the Rady Faculty of Health Sciences’ complex data strategy at the University of Manitoba – the Centre for Healthcare Innovation drives training and some technological development, while the Manitoba Centre for Health Policy serves as a gathering point for the reams of data from the health-care system and is exploring new methods of analysis.

“Our role is to think about the kinds of data that we can bring together, to help support that bringing together of data, and to think about the methods by which we can analyze data,” says Lix, a professor in the Department of Community Health Sciences at the University of Manitoba. “That’s where our challenges are and the space in which we are increasingly moving.”

As Lix explains, the large and complex data found in health care can be characterized by certain features – often referred to as the three Vs – volume, variety and velocity.

“When we talk about the attributes of big data, we talk about those Vs,” she adds.

Manitoba generates an increasing amount of health information every day, making the managing and curating of it an ongoing concern.

“We’re being inundated with many kinds of data (variety) at increasing volume and speed (velocity),” she says.

Everything from chest X-rays to blood test results to hospital admissions to bacterial analysis of patients’ digestive systems is being compiled and stored and – someday soon – analyzed.

But organizing and analyzing data for useful purposes are fields of expertise unto themselves – and relatively new vocations at that.

Lix says the medical community recognizes the challenges and the potential. That’s why the Rady Faculty of Health Sciences is spearheading a move to develop research, technology and knowledge capacity through partnerships designed to bring together bio-statisticians, bio-informaticians, clinical data specialists, and trainees.

“If we have these data-sets available, we really do need to be accountable for making good use of them,” says Lix.

Certainly, when a test – like blood cholesterol – is done for a patient, it is immediately useful, telling a physician whether a patient needs medication or not. But those tests frequently include other, secondary data that are also potentially useful. And the data are there for the taking, held in storage, but often dormant, waiting to be made useful.

“As an example, at the Manitoba Centre for Health Policy, which is a provincial repository of health databases, there are data that go back to the 1970s that are in electronic form, such as hospital records,” she says. This is a wealth of information that can potentially illustrate over time the progress of health and health-care use for Manitobans.

But how data are collected and stored has obviously changed over the decades.

“You can well appreciate that how we would even code information on patient records in the 1970s would be very different from how we do that today,” she says.

This brings challenges with regard to sorting through, compiling and organizing data so the information can be analyzed. It’s not an easy task. Consider the difficulty involved in gathering valuable data from doctors’ notes, which are notoriously difficult for human eyes to read, let alone a computer. This challenge, by the way, involves another V – veracity.

Indeed, technology has reached the stage where software can do just that: correctly extract relevant data from non-standardized sources like doctors’ handwritten notes. In fact, the technology is at the point where extracting and organizing data in this way is hardly ground-breaking. Postal services, for example, have long used this type of software to sort mail by reading handwritten addresses.
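
The same kind of pattern recognition is easy to demonstrate on a small scale. The sketch below is again illustrative – it uses a sample dataset of handwritten digits that ships with the scikit-learn library, not any medical or postal data – and trains a standard classifier to read handwriting:

    from sklearn.datasets import load_digits
    from sklearn.model_selection import train_test_split
    from sklearn.svm import SVC

    digits = load_digits()  # small 8x8 images of handwritten digits, 0 through 9
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, test_size=0.2, random_state=0)

    model = SVC().fit(X_train, y_train)  # learn from labelled examples
    print("accuracy on unseen digits:", model.score(X_test, y_test))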

But as Lix explains, storing data securely, organizing the information so it can be used, and analyzing it is only part of the overall story. The other challenge is to understand its potential and develop the research questions to which the data can provide answers. To accomplish this, clinicians and data scientists require special training, which Lix has been co-leading.

Funded by the Natural Sciences and Engineering Research Council of Canada (NSERC) for six years, the Visual and Automated Disease Analytics (VADA) training program is co-led by the University of Manitoba’s computer science program. Faculty and students from the University of Victoria are also part of the program.

The students – who are in master’s and PhD programs – come from a variety of disciplines: computer science, psychology, mathematics, statistics and the health sciences, to name just a few.

“Our goal is to train them on how they can present data and use automated analytic tools” in a way that is understandable to other stakeholders, including policymakers, physicians, and patients, says Lix.

By building up expertise in the health-care system in this area of data science, Lix says these individuals can then use advanced computing tools to analyze data-sets with a view to tackling “health problems with a particular focus on chronic diseases and infectious diseases.”

Communication is a part of this initiative, she adds. Health sciences and the other disciplines each have jargon specific to their own areas of knowledge. Because this work draws on so many kinds of expertise, everyone must be able to understand one another in order to develop research questions to which data can provide meaningful answers.

One of the projects students are working on, Lix says, involves patient-reported outcomes for hip and knee replacements that aim to measure the quality of life of patients after surgery. Using a questionnaire, patients answer how they feel physically and emotionally following the procedures, along with what might be limiting their mobility and their quality of life.

“We know being pain-free and functional mobility are as important as living longer, so what predicts quality of life measures?”

The answer may be found by using sophisticated computer programs to combine patient responses with other kinds of data – including doctors’ visits, hospital admissions, prescriptions for painkillers and antibiotics, and potential co-morbidities like cardiovascular disease.
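
As a hypothetical illustration of how such a prediction model might be built – the column names and numbers below are invented, not drawn from any real project data – combining questionnaire answers with administrative records could look something like this:

    import pandas as pd
    from sklearn.linear_model import LogisticRegression

    # Invented table: one row per patient, mixing questionnaire scores
    # with administrative data such as visits and prescriptions.
    patients = pd.DataFrame({
        "pain_score":    [2, 7, 4, 8, 1, 6, 3, 9],
        "doctor_visits": [1, 9, 3, 7, 2, 8, 2, 10],
        "painkiller_rx": [0, 1, 0, 1, 0, 1, 0, 1],
        "heart_disease": [0, 1, 0, 1, 0, 0, 1, 1],
        "good_outcome":  [1, 0, 1, 0, 1, 0, 1, 0],  # quality of life after surgery
    })

    X = patients.drop(columns="good_outcome")
    y = patients["good_outcome"]
    model = LogisticRegression().fit(X, y)  # learn which factors predict a good outcome
    print(dict(zip(X.columns, model.coef_[0].round(2))))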

Of course, in order for the analyses to be useful, you must first figure out what questions are relevant to the data, and what answers potentially can be found within a particular dataset.

“It boils down to, ‘How can we create good prediction models for disease and then present that information in a way that is relevant to decision-makers?’” she says.

While Lix and her team are largely focused on developing expert capacity to use big data analytics in the province, others are already using existing computer technology in the field to improve patient outcomes.

Among them is Nickerson, who is head of the province’s organ transplant program.

Nickerson and his research team have been investigating how to use data analytics to better predict successful outcomes from transplants, particularly kidney transplants, which are among the most common procedures in this field.

Manitoba has one of the highest rates of kidney disease in the country, so it would be beneficial to have diagnostics that could potentially improve outcomes, and reduce costs.

“Right now in transplant, we have protocols for care,” he says. These protocols dictate that all patients undergoing a transplant will be given the same combination of drugs to suppress the immune system and allow the kidney to be accepted by the body. But this is not an ideal approach because every patient is different on a genetic level, and those differences dictate how each patient’s immune system will react to a donor kidney.

“So one of the things that we’ve been looking at in our quality improvement processes is the genomic data that tells us what the quality of the match is between a donor and a recipient.”

So far, their work has uncovered two genetic markers that predict the likelihood of rejection. This gives doctors an idea of how much immune-suppressing medication a patient may require so his or her body does not reject the organ. That’s important because the medication comes with side-effects, such as an increased risk of serious infection while the immune system is suppressed.

But discovering meaningful correlations between donor, patient and outcomes requires artificial intelligence software to review hundreds of transplants – their success rates, use of therapeutics [drugs], and relevant genetic markers – to reveal who may need more medication and who requires less.
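
A rough sketch of that kind of analysis, with entirely simulated data standing in for real transplant records, might look like this – a model sifts candidate genetic markers and reports which ones actually carry a signal:

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(1)
    n = 400  # simulated transplant records, not real patient data

    # Ten candidate genetic markers per record (1 = marker present).
    markers = rng.integers(0, 2, size=(n, 10))
    # In this toy setup, only markers 2 and 7 actually raise rejection risk.
    risk = 0.1 + 0.3 * markers[:, 2] + 0.3 * markers[:, 7]
    rejected = rng.random(n) < risk

    model = RandomForestClassifier(random_state=0).fit(markers, rejected)
    # The model's feature importances point back to the informative markers.
    for i, importance in enumerate(model.feature_importances_):
        print(f"marker {i}: {importance:.2f}")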

“In Manitoba, we’ve been at the forefront of that, publishing some of the cutting-edge literature on molecular tissue matching to inform how much drug a patient might need,” Nickerson says.

Nickerson and his team are now partnering with other centres across North America and overseas to see if the identified genetic markers apply to other populations. If they do, it would mean their findings have universal clinical significance and benefits.

“The thinking is that if it’s true scientifically, it should be true with other populations, which we’re trying to prove now,” Nickerson says. “But this type of analysis we couldn’t have done 10 years ago because we didn’t have the computational tools.”

Leslie agrees. In fact, he says recent advances in computer science have drawn him back to a field he left many years ago.

“I got out of computer science in the early 1980s because I didn’t think artificial intelligence was going anywhere,” he says.

But in the last decade, computing power caught up with the ambitions of medical researchers, programmers and other interested parties. Years ago, artificial intelligence truly was science fiction, says Leslie. Today it is reality.

Like Nickerson’s work, Leslie’s research also involves partnerships – he is working with the Canadian Longitudinal Study on Aging, and the University of Lausanne in Switzerland – to verify whether what works here applies elsewhere.

Looking ahead, Leslie says the potential of machine learning goes beyond osteoporosis diagnosis. It could also be trained to find other potential trouble spots in medical imaging data. For instance, he points to how calcium is visible in X-rays, including in the aorta, the vessel through which oxygenated blood leaves the heart to feed tissues and organs throughout the body.

What’s most notable in an X-ray in this respect is calcium build-up in the aorta, which is linked to cardiovascular disease.

“In fact there’s a linkage – at least that’s one of the theories – between heart disease and osteoporosis,” he says. “As you deplete calcium from the spine, it winds up in vascular structures, so as osteoporosis risk goes up, cardiovascular risk goes up as well.”

Consequently, one of the next steps for the research may involve expanding the computer program’s capability to measure and analyze calcium buildup in the aorta.

“That way, we could scan for osteoporosis and identify cardiovascular risk at the same time.”

Indeed, Leslie says the possibilities for computer-powered data science and analytics are endless.

Nickerson and Lix agree. All three say the field has the potential to affect every aspect of health care, from identifying bacteria in the gut that lead to inflammatory bowel disease to selecting which form of chemotherapy may be more effective in killing off a particular form of cancer.

That’s why building research capacity is so important. Only then will researchers and clinicians be able to use the complex data-sets to their full potential and improve outcomes for patients.

For the time being, however, it remains a work in progress, one that also includes building awareness and understanding among researchers, health-care providers, and policy-makers.

Perhaps most importantly, the public also must grasp the potential impact that things like artificial intelligence and machine learning can have on health care.

“This is sort of like the idea of driverless cars,” says Leslie. “At what point do people feel comfortable enough with the technology to be hands-off with this?”

We’re likely not there yet. Nor is the technology, he adds.

Even with the advances and research findings in Manitoba, human input is still necessary, Leslie says. A physician has the last say, for example, in deciding whether a patient actually has osteoporosis or a spine fracture.

“The human is still the expert and the gold standard, but to have an assistant that never gets tired or misses anything, I think that would probably be welcome,” Leslie says.

“So as [artificial intelligence] moves further into the medical mainstream, it’s important to frame it this way for patients so they are not scared they are being left to a machine.”

The big takeaway is that these technologies are tools to make health-care providers better at what they do, Leslie explains.

“But we’re still the ones in control.”

Joel Schlesinger is a Winnipeg writer.
