Start Fund: Evaluation of Crisis Anticipation

In 2019, we commissioned our first evaluation of crisis anticipation at the Start Network. We were keen to reflect on our risk-taking, identify the hazards where we needed to invest in improving our forecasting skill, and learn how to better measure the quality of anticipation alert notes submitted to the Start Fund. A key element of this was to look back across anticipation alerts and see where forecasted emergencies had happened as expected, and where and how they had differed.

The evaluation examined 14 anticipatory projects from 13 different forecasted crises and concluded that half of those crises had not occurred. This prompted a wider review of all projects with available data to determine whether their forecasts had been correct. To do this, we used information submitted by implementing agencies once their projects had finished. We looked at data from 37 projects, implemented across 24 different forecasted emergencies. Thirty-six percent of forecasted emergencies took place as predicted or with a more significant impact, meaning 64% either did not occur or occurred with less intensity.

While the Start Network regards a few ‘false alarms’ as characteristic of a healthily risk-taking humanitarian system, the number of near misses seemed high. Looking into the data, we identified three key points that will inform our approach moving forward.
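As a purely illustrative sketch (not the actual Start Fund dataset or methodology), the tallying behind these percentages can be thought of as labelling each forecasted emergency by how it compared with the forecast and computing the share in each group:

```python
from collections import Counter

# Hypothetical outcome labels, one per forecasted emergency. The real review
# drew such classifications from end-of-project reports submitted by
# implementing agencies; the values below are made up for illustration only.
outcomes = [
    "as_predicted", "worse_than_predicted",
    "less_intense", "less_intense", "did_not_occur",
]

counts = Counter(outcomes)
total = len(outcomes)

# Emergencies that took place as predicted or with a more significant impact.
hits = counts["as_predicted"] + counts["worse_than_predicted"]
# Emergencies that did not occur or occurred with less intensity ("near misses").
near_misses = counts["less_intense"] + counts["did_not_occur"]

print(f"As predicted or worse:        {hits / total:.0%}")
print(f"Did not occur / less intense: {near_misses / total:.0%}")
```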

This report was produced by Integrated Risk Management Associates.