The re/insurance industry has evolved to the forefront of risk quantification, driven by growing needs and pressure to disclose climate-related financial risks, report Environmental, Social and Governance (ESG) ratings, and support the transition to a net zero economy. The need to understand disasters and the impact of climate change has never been greater.
The use of Earth Observation (EO) in the re/insurance industry, and particularly in catastrophe modelling, is not new. The need for analytical tools and datasets became evident in the aftermath of the natural catastrophe events of the late 1980s and early 1990s, which hit re/insurers hard and caused some companies to fail. It became clear that these infrequent perils, which are difficult to capture through statistical modelling alone for lack of sufficient historical claims data, had led insurers to underestimate their costs.
Disciplines such as hydrology, atmospheric physics, seismology, Geographical Information Science (GIS), and engineering entered the industry to quantify the frequency and severity of loss from natural hazards. EO and emerging spatial data handling technologies played an important role in the analyses of different perils, and became fundamental to creating better diversified insurance portfolios. For example, accumulations of insured exposure at risk could be quickly identified and related to potential sources of natural hazard, be it a flood plain or fault line.
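The kind of accumulation check described above can be sketched in a few lines of Python. The portfolio, coordinates, and radius below are all invented for illustration; a production system would use proper GIS tooling and projected geometries rather than a simple great-circle distance.

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def accumulation_within(policies, hazard_lat, hazard_lon, radius_km):
    """Total sum insured for policies within radius_km of a hazard source."""
    return sum(
        p["sum_insured"]
        for p in policies
        if haversine_km(p["lat"], p["lon"], hazard_lat, hazard_lon) <= radius_km
    )

# Hypothetical portfolio: location and sum insured per policy
portfolio = [
    {"lat": 51.50, "lon": -0.12, "sum_insured": 2_000_000},
    {"lat": 51.51, "lon": -0.10, "sum_insured": 1_500_000},
    {"lat": 53.48, "lon": -2.24, "sum_insured": 3_000_000},  # far from the source
]

# Accumulated exposure within 5 km of a hypothetical flood-plain centroid
print(accumulation_within(portfolio, 51.505, -0.11, 5.0))  # 3500000
```

The same pattern generalizes from a point source to a polygon such as a flood plain or a fault-line buffer, which is where GIS spatial joins come in.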
Over time, natural catastrophe models have become increasingly sophisticated, by incorporating a set of artificial events simulated over tens of thousands of years to represent the full spectrum of possible events beyond those observed in history. Supported by increased computational power and highly detailed and quality data, it is now possible to model perils at national or even international scale, capturing correlations and peril interdependencies, such as flooding across catchments or hurricane-induced rainfall.
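The idea of a stochastic catalogue can be illustrated with a deliberately simplified sketch: sample the number of events each simulated year from a Poisson process, then draw a severity for each event. The annual rate and the lognormal severity distribution below are hypothetical placeholders, not calibrated values, and real catastrophe models are far more elaborate.

```python
import random

def simulate_catalogue(annual_rate, n_years, severity_sampler, seed=42):
    """Build a toy stochastic event catalogue: per simulated year, draw a
    Poisson event count (via exponential inter-arrival times), then a
    severity for each event."""
    rng = random.Random(seed)
    catalogue = []
    for year in range(n_years):
        t = rng.expovariate(annual_rate)
        n = 0
        while t < 1.0:  # count arrivals within the year
            n += 1
            t += rng.expovariate(annual_rate)
        for _ in range(n):
            catalogue.append((year, severity_sampler(rng)))
    return catalogue

# Hypothetical peril: ~0.3 events per year, heavy-tailed severities
events = simulate_catalogue(0.3, 10_000, lambda rng: rng.lognormvariate(15, 1))
annual_freq = len(events) / 10_000
print(f"simulated annual frequency: {annual_freq:.3f}")
```

Over tens of thousands of simulated years, the empirical event frequency converges on the assumed rate, and the tail of the severity draws populates the extreme scenarios absent from the short historical record.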
From exposure to loss
Understanding exposure is the starting point and includes assessing geospatial asset and exposure data, their availability, and the quality requirements for modelling. In re/insurance, data typically consists of policyholder information, sums insured, location information and, to some degree, additional attributes describing the types of assets. The level of granularity in location information is important to increase the accuracy of modelled results with respect to the associated hazard intensity. Additional risk attributes, such as occupancy type, year built, and height, are important for applying the right damage ratios between the exposure and the hazard.
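The link from hazard intensity to loss via a damage ratio can be sketched as a simple depth-damage (vulnerability) curve. The curve points below are purely illustrative, not calibrated values; real vulnerability functions are derived from claims data and engineering studies and vary by construction, occupancy, and region.

```python
def damage_ratio(flood_depth_m, occupancy):
    """Hypothetical depth-damage curve: map hazard intensity (flood depth
    in metres) to a damage ratio, differentiated by occupancy type."""
    # Illustrative piecewise-linear curves; values are invented
    curves = {
        "residential": [(0.0, 0.0), (0.5, 0.15), (1.0, 0.30), (2.0, 0.55), (4.0, 0.85)],
        "commercial":  [(0.0, 0.0), (0.5, 0.10), (1.0, 0.25), (2.0, 0.45), (4.0, 0.75)],
    }
    points = curves[occupancy]
    if flood_depth_m <= points[0][0]:
        return points[0][1]
    for (d0, r0), (d1, r1) in zip(points, points[1:]):
        if flood_depth_m <= d1:
            # Linear interpolation between adjacent curve points
            return r0 + (r1 - r0) * (flood_depth_m - d0) / (d1 - d0)
    return points[-1][1]  # cap at the deepest curve point

def modelled_loss(sum_insured, flood_depth_m, occupancy):
    """Ground-up loss = damage ratio x sum insured."""
    return damage_ratio(flood_depth_m, occupancy) * sum_insured

print(modelled_loss(1_000_000, 1.0, "residential"))  # 300000.0
```

This is why attributes such as occupancy and year built matter: they select which curve applies, and the wrong curve can shift the modelled loss substantially.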
Exposure information can be enriched or distributed to finer granularity through a variety of sources — from governmental and international authoritative datasets (such as mapping agencies, national statistical agencies or key infrastructure ministries); crowdsourced data, such as OpenStreetMap; EO and other remotely captured data (for example, CORINE land use land cover data, population density, nightlight imagery, etc.); to geodemographic and buildings datasets, such as the Global Exposure Database (GED), as well as agricultural/environmental and ecological data providers.
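Distributing exposure to finer granularity often means disaggregating an aggregate sum insured across grid cells in proportion to a proxy such as population density or nightlight intensity. A minimal sketch of that weighting step, with invented cell names and weights:

```python
def disaggregate(total_sum_insured, weights):
    """Distribute an aggregate exposure across finer cells in proportion
    to a proxy weight (e.g. population density or nightlight intensity)."""
    total_w = sum(weights.values())
    return {cell: total_sum_insured * w / total_w for cell, w in weights.items()}

# Hypothetical proxy weights for three grid cells
cells = {"cell_a": 120.0, "cell_b": 60.0, "cell_c": 20.0}
print(disaggregate(1_000_000, cells))  # cell_a gets 60% of the total
```

The quality of the result depends entirely on how well the chosen proxy correlates with where the insured assets actually sit, which is why several of the datasets listed above are often combined.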
EO datasets and derivative products are also essential in producing hazard data, whether for creating hazard maps and scenarios (Figure 1), building stochastic catalogues of simulated extreme events, or validating models. For example, precipitation modelling draws on EO data from the real-time TRMM (Tropical Rainfall Measuring Mission); rainfall-runoff modelling on soil and land surface data such as that from MODIS (Moderate Resolution Imaging Spectroradiometer); and hydraulic modelling on terrain data such as SRTM (Shuttle Radar Topography Mission).
From a loss perspective, once an event occurs, real-time data is drawn from a variety of sources. These can include gauge observations, but also a combination of EO-derived data from a variety of sensors (for example, Synthetic Aperture Radar (SAR) combined with high-resolution imagery swathes at low altitude), as well as digital mapping, mobile and drone technology, to gather information remotely about an emergency situation.
In re/insurance this information is essential for many time-sensitive activities: to support the allocation of reserves, triage loss adjustment and accelerate and validate claims payments; to enable exposure reporting during event monitoring; to provide policyholder support and preparedness; and to mitigate losses with pre-emptive plans and procedures, as well as pre-event predictions and monitoring.
Integration is the new frontier
With a host of applications and growing demand for more data, the ability to interpret such data sources, integrate them into existing workflows, and scale this to nationwide insurance portfolios is crucial. Some insurance companies already have sophisticated tools that enable underwriters, exposure managers and modellers to interrogate and model data, inform the pricing of policies, monitor accumulations and measure against the company’s risk appetite. The ability to integrate seamlessly and consume data easily is a priority. In other cases, however, third-party providers are still needed to deliver the required services.
Overall, it boils down to data: the need to have information at your fingertips to deliver business insights, provide added value and drive decision-making. That need grows by the day, especially for high-resolution, real-time access to data with increasing coverage and frequency, and the ability to task remote sensors over geographical areas as and when needed. However, challenges remain around prohibitive costs, data processing capabilities and accessibility.
Whilst adoption of the latest satellite technology in the re/insurance market has been slow, owing to a lack of expertise as well as prohibitive costs and licence terms, the long EO record stretching back to the 1970s, albeit with varying levels of resolution, combined with technological advancements, has enabled the sophisticated modelling solutions we see today. Looking ahead, key requirements will be the provision of high-quality global datasets and input files to support global, correlated modelling of perils. We will also need to focus on simpler products that can be easily integrated and customized to end-user needs.
About the author: Tina Thomson is the Global Head of Research at Gallagher Re and leads the analytical service offering to a wide range of re/insurance clients across the West and South regions of Europe, Middle East, and Africa. Tina holds a PhD in Geomatic Engineering from University College London and retains close links with the academic community in the UK through her involvement as a Fellow at the Remote Sensing and Photogrammetry Society (RSPSoc) and the Royal Geographical Society (RGS-IBG).
Suggested Further Reading
Bettini, G, Gioli, G, Felli, R. (2020) Clouded skies: How digital technologies could reshape “Loss and Damage” from climate change. WIREs Clim Change. https://doi.org/10.1002/wcc.650
Inkpen, R, Gauci, R, Gibson, A. (2021) The values of open data. Area. https://doi.org/10.1111/area.12682