One pandemic. 5 million people. Now what?
Dionne Aleman’s pandemic management model is built on hundreds of factors — and a big computer
Think back to 2009, when we were told that a new strain of flu — called H1N1 — was about to besiege the population of the Greater Toronto Area (GTA).
“Get vaccinated,” said the authorities. And then the line-ups began and you could feel the tension throughout the GTA.
Fortunately, H1N1 didn’t turn out to be the massive killer that was feared. But it — like the SARS outbreak of 2002–2003 — reinforced the need for public health agencies to find ways to help large populations cope with a pandemic. And, as we are seeing in so many aspects of modern living, computing is playing a huge role.
“We could plan for a pandemic without computing, but it would be wildly impractical,” says Dionne Aleman, a professor in Mechanical and Industrial Engineering. “It would be like flipping a coin to determine how a disease would spread. It could take two days of flipping coins over and over to simulate a day of activity.”
Aleman and her research team in the Medical Operations Research Lab are far past the coin-flipping methodology. The lab is in the final stages of building a computer model that will be of vital assistance to Ontario public health agencies.
How? “The simple answer is that we ask and answer as many ‘What if?’ questions as possible,” says Aleman. “That may sound simplistic, but it takes a lot of computational effort to make it happen.”
In building the model, Aleman’s team focuses on the GTA’s five million residents. Each resident is considered to be “an individual agent who has various properties, such as age, gender, where they live, do they ride public transit, what is their route, where do they work or go to school?” The team draws this information from sources such as the TTC, Statistics Canada and census studies.
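To make that idea concrete, here is a minimal sketch of what one such agent might look like as a data record. The field names and types are illustrative assumptions, not the lab’s actual schema.

```python
# A hypothetical stand-in for one simulated GTA resident.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Agent:
    agent_id: int
    age: int
    gender: str
    home_zone: str                   # e.g. a census tract or neighbourhood code
    uses_transit: bool               # does this person ride the TTC?
    transit_route: Optional[str]     # which route, if any
    daytime_location: Optional[str]  # workplace or school, if any
    infected: bool = False
    vaccinated: bool = False


# Example: a 34-year-old downtown commuter who rides Line 1.
rider = Agent(1, 34, "F", "M5S", True, "Line 1", "downtown office")
```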
The goal is to model whether — and how — these agents will come into contact with each other, thus spreading the disease at the core of the pandemic. “Each agent is like a block of memory. So to model the GTA, we need a computer with a lot of memory. Every time one agent has contact with another, that’s a computational effort that has to be made. We keep adding up these contacts and at the end of a simulated day, we want to see what’s the probability of each person being infected tomorrow.”
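The daily bookkeeping Aleman describes might look something like the sketch below: count each susceptible agent’s contacts with infectious agents, then turn that tally into a probability of being infected tomorrow. The per-contact transmission probability and the contact lists are assumptions for illustration, not values from the lab’s model.

```python
# Hedged sketch of one simulated day of contact tallying.
import random
from typing import Dict, List, Set

P_TRANSMIT = 0.02  # assumed chance that one contact with an infectious person transmits


def infection_probability(infectious_contacts: int) -> float:
    """Chance of at least one successful transmission across today's contacts."""
    return 1.0 - (1.0 - P_TRANSMIT) ** infectious_contacts


def step_one_day(contacts: Dict[int, List[int]],
                 infectious: Set[int],
                 immune: Set[int]) -> Set[int]:
    """contacts maps each agent ID to the IDs it met today; returns tomorrow's new infections."""
    newly_infected = set()
    for agent, met in contacts.items():
        if agent in infectious or agent in immune:
            continue
        n = sum(1 for other in met if other in infectious)
        if random.random() < infection_probability(n):
            newly_infected.add(agent)
    return newly_infected


# Tiny example: agent 1 is infectious, agent 3 is vaccinated.
contacts_today = {1: [2, 3], 2: [1, 3], 3: [1, 2]}
print(step_one_day(contacts_today, infectious={1}, immune={3}))
```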
Aleman’s team runs the model on a small cluster of 256 processors, wired together in a dedicated room. “With the level of detail we have in our model, it takes about one to two seconds to simulate one day. We can simulate about 60 days in just a few minutes. This is great, because planning for a pandemic is all about probabilities.”
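Part of the reason many processors help is that the per-agent contact tally can be split across workers. The sketch below uses Python’s multiprocessing as a rough stand-in for that idea; the lab’s actual cluster setup is not described in that detail here, and the contact counts are invented.

```python
# Illustrative sketch: splitting the per-agent risk calculation across cores.
from multiprocessing import Pool

P_TRANSMIT = 0.02  # assumed per-contact transmission probability


def risk_for_agent(item):
    agent_id, infectious_contacts = item
    # probability this agent is infected tomorrow, given today's contacts
    return agent_id, 1.0 - (1.0 - P_TRANSMIT) ** infectious_contacts


if __name__ == "__main__":
    # Toy input: agent ID -> number of infectious contacts today.
    contacts = {i: i % 7 for i in range(1_000_000)}
    with Pool() as pool:  # one worker per available core
        risks = dict(pool.map(risk_for_agent, contacts.items(), chunksize=10_000))
    print(risks[3])  # risk for agent 3, who had 3 infectious contacts today
```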
So, the researchers can introduce new information — as they are doing now in adding statistics about the number of children in various Toronto school systems — and then have the computer reassess how the disease is likely to spread given these additions.
One of the key questions Ontario public health officials asked the team to analyze is how much the speed of vaccination matters in halting a pandemic. “We targeted 60 per cent of the population and asked if it mattered if those people were vaccinated over 10 weeks or 15 weeks. Then we ran a simulation and we found that, yes, if people got vaccinated earlier, it would stop the spread of disease much more quickly.”
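A toy version of that “what if?” comparison is sketched below: vaccinate 60 per cent of a population over either 10 or 15 weeks and compare total infections in a crude compartmental model. Every number here (population size, contact rate, transmission probability, recovery time) is an assumed stand-in for illustration; the lab’s agent-based model is far more detailed.

```python
# Toy comparison of a 10-week versus 15-week vaccination rollout.
POPULATION = 100_000      # scaled-down stand-in for the GTA's five million
TARGET_COVERAGE = 0.60
CONTACTS_PER_DAY = 10     # assumed average daily contacts
P_TRANSMIT = 0.03         # assumed per-contact transmission probability
INFECTIOUS_DAYS = 5       # assumed average length of the infectious period


def run_scenario(rollout_weeks: int, days: int = 150) -> float:
    susceptible = float(POPULATION - 50)
    infectious = 50.0      # assumed initial cases
    vaccinated = 0.0
    daily_doses = POPULATION * TARGET_COVERAGE / (rollout_weeks * 7)
    total_infections = infectious

    for _ in range(days):
        # vaccinate susceptible people until the 60 per cent target is reached
        doses = min(daily_doses, POPULATION * TARGET_COVERAGE - vaccinated, susceptible)
        vaccinated += doses
        susceptible -= doses

        # crude homogeneous-mixing infection step
        p_infect = 1 - (1 - P_TRANSMIT * infectious / POPULATION) ** CONTACTS_PER_DAY
        new_cases = susceptible * p_infect
        recoveries = infectious / INFECTIOUS_DAYS

        susceptible -= new_cases
        infectious += new_cases - recoveries
        total_infections += new_cases

    return total_infections


if __name__ == "__main__":
    print("10-week rollout:", round(run_scenario(10)))
    print("15-week rollout:", round(run_scenario(15)))
```

Even this crude sketch points the same way as the article’s finding: the faster rollout ends with fewer total infections, because fewer susceptible people are left exposed while the disease is still circulating.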
Do the researchers have to be talented at using computers?
“You have to know how to program and code. There’s definitely room for growth in making supercomputing more accessible. I wonder why there isn’t more of a push to bring more people into this level of computing. It doesn’t have to be as easy as using an iPad, but there’s too much of a gap now. So that should be the next big step — enabling researchers to become more adept at actually using high performance computing.”