Using big data to fight dementia and Alzheimer’s
Scientists striving to cure Alzheimer’s disease and other brain disorders are turning to a powerful new tool they hope will light the way to effective treatments: big data. The idea is to use supercomputers to search through reams of patient data – everything from MRI scans to the results of cognitive testing to lipid levels – for patterns that might reveal the precise cause of neurodegenerative disorders, which have so far proved stubbornly difficult to predict, halt or even slow down.
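To give a flavour of what such a pattern search might look like in practice (this is a minimal sketch, not a description of any system mentioned in the article), researchers could combine imaging-derived measurements, cognitive test scores and blood-lipid values into one table per patient and then look for natural groupings. The column names and the clustering method below are assumptions made purely for illustration.

```python
# Hypothetical sketch: searching multimodal patient data for patterns.
# Column names (hippocampal_volume_ml, memory_score, ldl_mg_dl) are
# invented for illustration; real studies would use many more variables.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Toy patient-level table mixing imaging, cognition and blood-work variables.
patients = pd.DataFrame({
    "patient_id": [1, 2, 3, 4, 5, 6],
    "hippocampal_volume_ml": [3.1, 2.4, 3.0, 2.2, 3.2, 2.3],  # from MRI
    "memory_score": [28, 19, 27, 17, 29, 18],                  # cognitive testing
    "ldl_mg_dl": [110, 160, 115, 170, 105, 165],               # lipid panel
})

# Put all variables on a comparable scale, then look for subgroups that
# might correspond to different disease profiles or trajectories.
features = patients.drop(columns="patient_id")
scaled = StandardScaler().fit_transform(features)
patients["subgroup"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scaled)

print(patients)
```

At real scale, the same idea would run over thousands of patients and far richer feature sets, which is where the supercomputing comes in.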
But figuring out how best to share information from potentially tens of thousands of people around the world is proving to be a legal, ethical and logistical challenge. That is why the Organisation for Economic Co-operation and Development (OECD) convened a workshop on the subject in Toronto on Monday, one of the first gatherings of its kind.
“Up until really quite recently, most of these studies have tried to link together one or two variables. So we have [brain] imaging and we look at cognition or we have genetics and we look at behaviour,” said Michael Strong, dean of the school of medicine and dentistry at the University of Western Ontario in London. But tying together a dozen or more variables across diseases in a single database, “that’s really quite new,” Dr. Strong said. “We’re really on the cusp.”
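To make that "tying together" concrete, a single linked database might join per-study tables (imaging, cognition, genetics) on a shared patient identifier so that all the variables can be analysed side by side. The sketch below is an assumption for illustration only; the table and column names are invented and do not reflect the Ontario project's actual design.

```python
# Hypothetical sketch of linking variables from separate studies into one
# patient-level table; table and column names are invented for illustration.
import pandas as pd

imaging = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "cortical_thickness_mm": [2.5, 2.1, 2.4],
})
cognition = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "mmse_score": [29, 22, 27],
})
genetics = pd.DataFrame({
    "patient_id": [1, 2, 3],
    "apoe_e4_carrier": [False, True, False],
})

# Join everything on the shared identifier so a dozen or more variables
# can sit together in a single analysis-ready table.
linked = (imaging
          .merge(cognition, on="patient_id")
          .merge(genetics, on="patient_id"))

# Once linked, cross-variable questions become simple queries, e.g. how
# imaging and cognition differ between APOE e4 carriers and non-carriers.
print(linked.groupby("apoe_e4_carrier")[["cortical_thickness_mm", "mmse_score"]].mean())
```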
Dr. Strong, the lead investigator for a $28.5-million Ontario research project that intends to harness big data to better understand brain diseases, attended the OECD workshop along with more than 50 other scientists, doctors, policy experts, computing whizzes and patient advocates.
The workshop comes as governments around the world appear to be waking up to the toll neurodegenerative disorders are taking on health-care systems and the economy in general, a toll that is expected to rise as the population ages.
Source: www.theglobeandmail.com

