Thinking like a forecaster
What are the benefits for humanitarians?
Previously this blog looked at using climate forecasts to allocate funds before a crisis hits, using the example of a recent cyclone in India (Hudhud). However, anticipation is not limited to ‘Hudhud’-type situations; it applies to a much wider range of humanitarian crises, and advances in forecasting techniques may help us get better at it.
Within the range of humanitarian crisis typologies, cyclones such as Hudhud are the exception rather than the rule. Advances in weather modelling mean that NGOs can receive alerts three or even six days before large cyclones or tropical storms hit, allowing time to prepare. Similar advanced weather models now exist for droughts and, to a lesser extent, flooding. However, many other crises we deal with, such as conflict, displacement and disease outbreaks (or any combination of these with climatic factors), are messy and do not lend themselves to scientific models that can predict their expected course. But this doesn’t mean it’s not possible to be anticipatory in these crises.
Despite the lack of scientific models, we humanitarian actors are forecasting or making predictions on the trajectory of crises all the time.
Making resource allocation decisions on the anticipated course of a crisis, rather than the current situation, is central to how we respond. Firstly, it’s widely accepted that the best way to respond to a crisis is to prevent it from happening in the first place. For example, last month the Start Fund Allocations Committee (a group of 12 humanitarian directors who respond to locally driven requests for rapid response funds) allocated £125,000 to the cholera response in northern Nigeria. This little-known emergency received funds not because of the current caseload, but because of local warnings that cholera could spread into the Lake Chad Basin, with catastrophic consequences. In this situation it was not a sophisticated scientific model prompting an anticipatory response but local warnings, combined with years of experience of cholera responses, that convinced the committee of the urgent need.
Secondly, the humanitarian machine is a heavy beast, so decisions taken now will hit the ground in a few days at the earliest, or a few months at the latest. To remain relevant, decisions need to be anticipatory. For example, in a situation of mass displacement caused by conflict, programme managers are constantly asking questions such as: Where will people move to, and why? How likely is it that the current volume of movement will continue, increase or decrease? How long do we anticipate they will stay? In the balance are questions about the drivers that forced the movement in the first place, security in the areas to which people have been displaced, access to resources and many other factors. Answers to these predictive questions greatly affect the quantity and type of assistance given.
What is concerning is that studies tell us that humans are naturally not very good at forecasting!
Studies tell us that ‘experts’ on complex subjects such as geopolitics and economics have proved very bad at making predictions, even in their fields of specialism. In one famous project, Professor Philip Tetlock collected over 27,500 forecasts from experts on defined questions and waited to see whether they turned out right or wrong. The study, carried out over 18 years, concluded that “Chimps randomly throwing darts at the possible outcomes would have done almost as well as the experts” (Stevens 2013).
Would humanitarians have fared any better?
Professor Tetlock still believes that people can be good forecasters, but only under the right conditions. There is now a whole field of study on expert judgement, which outlines the biases that can make people bad forecasters and the conditions that promote good forecasting.
How to be good at forecasting
Research initiatives such as the Good Judgement Project have identified several basic ways we can improve our ability to forecast. The project runs as a kind of forecasting tournament, with thousands of diverse volunteers challenged to answer meaty questions of geo-strategic importance (such as whether Greece will default, or whether there will be a military strike on Iran). The findings were summarised in a recent article by Tim Harford, ‘How to see into the future’, with recommendations as follows:
- Basic forecast training: The Good Judgement Project claims that just 20 minutes of training on how to put a probability on a forecast, and how to correct for well-known biases, helped volunteers greatly improve their performance. Apparently “even experienced geopolitical seers tend to have expertise in a subject, such as Europe’s economies or Chinese foreign policy, rather than training in the task of forecasting itself”.
- Teamwork: Assembling good forecasters into teams who are able to discuss and argue produces better predictions.
- Open-minded approach: People who are genuinely seeking to predict the future (rather than promoting a specific agenda or view of the world) make better forecasters. They actively look to prove themselves wrong, and will adjust their forecasts on the basis of new information.
- Feedback: Forecasters need to be able to keep score, and improve their predictions based on past errors.
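What “keeping score” means in practice: forecasting tournaments of this kind typically grade probability forecasts with the Brier score, the squared error between the probability given and the 0/1 outcome. Below is a minimal illustrative sketch in Python; the sample track record is invented for illustration, not taken from the project.

```python
def brier_score(forecast_prob, outcome):
    """Squared error between a probability forecast and the 0/1 outcome.

    0.0 is a perfect forecast, 1.0 is maximally wrong, and a
    'know-nothing' 50% forecast always scores 0.25.
    """
    return (forecast_prob - outcome) ** 2

# A hypothetical track record: (probability given, what actually happened)
track_record = [
    (0.8, 1),  # said 80% likely; the event occurred
    (0.3, 0),  # said 30% likely; the event did not occur
    (0.9, 0),  # an overconfident miss
]

# A forecaster's skill is their average score across many questions;
# lower is better.
mean_score = sum(brier_score(p, o) for p, o in track_record) / len(track_record)
print(round(mean_score, 3))
```

Tracking this average over time is what lets a forecaster (or a team) see whether they are actually improving, rather than just remembering their hits and forgetting their misses.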
So according to this study, it comes down to “the right talents or the right tactics”: people who use these tactics improve over time and far outstrip chance.
What does this mean for us? As a starting point, it suggests that the ‘expert’ analyst, security manager or even Country Director sitting alone, reviewing information and producing strategy scenarios, is perhaps not always the best model. Successful approaches to strategy planning or resource allocation may require us to acknowledge the role of forecasting and prediction in our processes more consciously, and to make efforts to improve it: working together as teams, keeping track of our successes and failures, and better understanding our strengths and blind spots. Minor adjustments to process in this way could help us better achieve our goal of channelling assistance to where and when it is most needed.
What do you think? Please get in touch by twitter @emilymontier or email firstname.lastname@example.org