Using big data to turn global catastrophic risks into opportunities

Big data is currently transforming both the public and private sectors by increasing efficiency, transparency and productivity whilst also promoting sustainability. As the ability to utilise intelligent data analytics distinguishes today’s winners, data is fast becoming the oil of the 21st century. Organisations and countries that manage to harness this new commodity will secure sustainable economic growth, just as access to cheap fossil fuel resources conferred an advantage in the past.

The proliferation of mobile technology, wireless sensors, social media and the Internet of Things provides a means of monitoring socio-economic activity, consumption of resources, transactions, human mobility and environmental change. Recent advances in data science now make it possible to cope with the technical challenges of collecting, managing and developing actionable insights from big data. Much of the exciting research has focused on the three V’s that define big data (volume, velocity and variety), with the volume of data growing at roughly 40% per year (Figure 1). The sheer size and complexity of the data being created by internet devices (Figure 2) implies a need to move beyond simple linear models and embrace sophisticated modelling approaches. Many organisations sit on a treasure chest of data, which when combined with external data offers enormous potential.
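To put that growth rate in context, a quick calculation shows that compound growth at 40% per year implies the volume of data roughly doubles every two years:

```python
import math

annual_growth = 0.40  # 40% per year, as cited above

# Compound growth: (1 + g)^t = 2  =>  t = ln(2) / ln(1 + g)
doubling_time = math.log(2) / math.log(1 + annual_growth)
print(f"Doubling time: {doubling_time:.1f} years")  # → Doubling time: 2.1 years
```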

Measuring and monitoring the UN’s sustainable development goals will require better processes to utilise big data. The UN Statistical Commission has established a global working group to provide strategic vision, direction and coordination of a global programme on Big Data for official statistics. There are numerous challenges ahead that will require multidisciplinary teams to process raw data, extract insights and produce dashboards to enable intelligent decision-making. Fortunately, this revolution has already started in the insurance sector.
Figure 1: Amount of big data created each year.

There are many contenders when it comes to identifying the most threatening global catastrophic risks. Over the centuries, epidemics, earthquakes, floods and windstorms have competed for the position of deadliest disaster. Those with the highest death tolls include the Black Death of 1348 that wiped out up to 60% of Europe’s population and the Spanish Influenza of 1918 that killed between 40 and 100 million people. The costliest catastrophe, with estimated economic losses now exceeding $235 billion, is the earthquake and tsunami that hit Tōhoku, Japan in 2011, resulting in meltdowns at the Fukushima nuclear power plant.  

Reinsurance organisations quantify and compare catastrophic risks in terms of potential financial losses. Since 1987, when AIR Worldwide released the first catastrophe model, reinsurers have benefited from the scientific rigour of catastrophe models to assess risk. The financial losses associated with a particular peril are simulated by combining the hazard, exposure and vulnerability. While impact is clearly important, the frequency of catastrophic events must also be calculated to determine how to develop adequate risk management systems. Big data comprising historical events, crowd-sourced data and computer-simulated output forms the ingredients of a CAT model. As the science matures and practitioners and academics seek to cooperate, the growing need for a collaborative platform has been met by the Oasis Loss Modelling Framework.
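As a toy illustration of how these ingredients combine, the sketch below computes the loss from a single simulated event for a hypothetical three-building portfolio. The vulnerability curve and every figure in it are invented for illustration, not taken from any real CAT model:

```python
def vulnerability(wind_speed_ms: float) -> float:
    """Illustrative damage ratio (0-1) as a function of peak gust speed.
    Real vulnerability curves are fitted to claims and engineering data."""
    if wind_speed_ms < 20:
        return 0.0
    return min(1.0, ((wind_speed_ms - 20) / 60) ** 2)

# Exposure: a hypothetical portfolio of insured buildings, with the
# hazard (wind speed) that the simulated event produces at each site.
portfolio = [
    {"value": 350_000, "wind_speed_ms": 45},
    {"value": 500_000, "wind_speed_ms": 62},
    {"value": 275_000, "wind_speed_ms": 30},
]

# Ground-up event loss = exposure value x damage ratio, summed over sites.
event_loss = sum(b["value"] * vulnerability(b["wind_speed_ms"]) for b in portfolio)
print(f"Simulated event loss: ${event_loss:,.0f}")
```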

There are many opportunities to use big data to improve the assessment and management of global catastrophic risks. At present, risk assessment is largely a backward-looking exercise in which a catalogue of historical extreme events forms the basis of the analysis. In many cases, an assumption is made that the risk has not changed during the historical period. This approach is defensible if the hazard, exposure and vulnerability are not changing over time. In reality, all three can vary, and both data and advanced modelling techniques are required to understand the complex interactions.
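The stationarity assumption can be made concrete with a small sketch: estimate a constant annual event rate from a historical catalogue. The ten-year event counts below are hypothetical, and if hazard or exposure is in fact trending over time, this single rate will misstate future risk:

```python
import math

# Hypothetical catalogue: damaging events observed in each of ten years
events_per_year = [1, 0, 2, 0, 1, 1, 0, 3, 0, 1]

# Stationarity assumption: one constant (Poisson) annual rate
annual_rate = sum(events_per_year) / len(events_per_year)

# Under that assumption, the chance of at least one event next year
p_at_least_one = 1 - math.exp(-annual_rate)
print(f"Rate: {annual_rate:.2f} events/yr, P(>=1 event) = {p_at_least_one:.2f}")
```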

Emerging risks, such as terrorism, lack a historical catalogue, so forward-looking predictive models are required. Natural disasters such as windstorms and floods are affected by climate change, and overreliance on the past may underestimate future risk. Satellites and drones are helping to collect data to better understand exposure and vulnerability. Crowd-sourcing can also be used effectively to encourage people to build resilience to disasters and develop disaster risk management strategies.

Scientific models allow insurers to evaluate the risk associated with natural disasters. The probability of exceeding a specific financial loss is calculated using advanced quantitative modelling. At the core of this risk modelling is the need to determine the relationship between a particular measure of the hazard, such as wind speed or rainfall, and the resulting financial losses. Catastrophe models involve the computationally intense process of using geographical information systems (GIS) to describe the spatial variation of exposure and vulnerability for a particular portfolio of buildings. By running numerous simulations of extreme events that vary in time and space, the catastrophe model assesses the chances of experiencing losses of different magnitudes. These models can be broken down into modules describing the hazard, exposure, vulnerability and financial components. The development of these modules relies on access to a skilled team of scientists, engineers, statisticians and actuaries.
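The exceedance-probability calculation described above can be sketched in miniature: simulate many years of losses and count how often each loss threshold is exceeded. The event rate and severity distribution below are purely illustrative stand-ins for the hazard, exposure and vulnerability modules of a real model:

```python
import random

random.seed(0)
RATE = 0.9  # mean number of events per year (illustrative)

def simulate_annual_loss() -> float:
    """Toy annual loss: event times from a Poisson process (exponential
    inter-arrival times), severities from a lognormal distribution."""
    loss = 0.0
    t = random.expovariate(RATE)  # time of first event (years)
    while t < 1.0:
        loss += random.lognormvariate(11, 1)  # event loss in dollars
        t += random.expovariate(RATE)
    return loss

years = [simulate_annual_loss() for _ in range(100_000)]

# Exceedance probability: the fraction of simulated years whose total
# loss exceeds a given threshold.
for threshold in (100_000, 500_000, 1_000_000):
    ep = sum(loss > threshold for loss in years) / len(years)
    print(f"P(annual loss > ${threshold:,}) = {ep:.3f}")
```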


Opportunities arise at the interface of novel data, advanced modelling and a willingness to innovate business practices. The transition to using quantitative models to automate decision-making, remove inefficiencies and prioritise resources is already taking place in many organisations.

Big data is providing the ability to offer weather insurance for farmers. Data from weather stations or satellites can be used to construct an index that tracks the losses that have arisen due to extreme weather events. With the availability of low-cost wireless sensors and higher resolution information, the accuracy and feasibility of this innovative type of insurance is improving. 
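A parametric payout of this kind can be sketched in a few lines: the payout depends only on the observed weather index, not on a loss adjuster's assessment of damage. The trigger, exit and limit values below are hypothetical:

```python
def index_payout(seasonal_rainfall_mm: float,
                 trigger_mm: float = 300.0,
                 exit_mm: float = 150.0,
                 limit: float = 10_000.0) -> float:
    """Toy drought cover: the payout rises linearly as seasonal rainfall
    falls from the trigger down to the exit level, where the full limit
    is paid. All parameters are illustrative, not from a real product."""
    if seasonal_rainfall_mm >= trigger_mm:
        return 0.0
    if seasonal_rainfall_mm <= exit_mm:
        return limit
    shortfall = (trigger_mm - seasonal_rainfall_mm) / (trigger_mm - exit_mm)
    return limit * shortfall

print(index_payout(350))  # adequate rain → 0.0
print(index_payout(225))  # halfway between trigger and exit → 5000.0
print(index_payout(100))  # severe drought → 10000.0
```

Because payment is triggered by the index alone, claims can be settled quickly and cheaply, which is what makes this type of cover feasible for smallholder farmers.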

Many countries are now using public-private partnerships as a means of structuring national catastrophe programs to protect against natural disasters. New Zealand’s Earthquake Commission (EQC) provided primary natural disaster insurance that protected the owners of residential properties from the 2010 and 2011 Christchurch earthquakes. Early warning systems are another innovation and these rely on timely access to information – social media is playing an important role in communicating alerts. 

Figure 2: Number of internet devices being used each year.

Great opportunities exist for the private sector to use big data to monitor business activities and interactions with customers. This data reveals what works and what does not, and is helping to increase efficiency in many sectors. Success relies on being profitable while also managing risk when making decisions; big data is helping to provide actionable insights for both.

Data is also becoming a valuable source of intelligence about the preparedness of firms to cope with shocks that might arise from regulation, technology and climate change. Pension funds consume such information in order to make long-term decisions about companies. Novel datasets and surveys are available to assess the true value of firms and to better understand how their activities are likely to be aligned with future opportunities, in an effort to strengthen resilience. Key decisions in the face of future uncertainty can be supported by data, and those who understand how to utilise big data are more likely to prosper.

Risk reduction strategies tend to be reactive, as it is easier to justify allocating resources in the aftermath of a disaster. As the tenure of politicians and business leaders is relatively short, they rarely have the stamina to support long-term strategies that will not reap a reward until the next disaster. Furthermore, many responses involve incremental solutions that fail to grasp long-term opportunities.

Talk of strengthening resilience is a growing trend that is replacing less optimistic discussions about risk management. Resilience implies more than risk reduction and can be viewed as the capacity to adapt, recover and transform in response to adverse events. There is an important role here for big data to encourage proactive, transformative solutions as opposed to incremental changes. New sources of data and innovative decision support tools could identify strategic actions and allow companies to be rewarded for transforming early. Performance metrics could help investors identify the companies that are already transforming and positioning themselves to make the most of future opportunities; such companies deserve to be rewarded now for their foresight.


