Although cyber risk premiums have grown considerably in recent years and loss ratios compare favourably with other product lines, sustainable growth of the cyber insurance market should not be taken for granted. Daniel Hofmann, Steve Wilson and Rachel Anne Carter of The Geneva Association identify three prerequisites that must be met to ensure sustainability.
Expanding the boundaries of insurability and making new risks manageable is not new for the insurance industry. Over centuries, insurers have developed products and services that reflect the changes in the risk landscape. Cyber risk is nevertheless taking us into uncharted territory.
Exposure bases are hard to define and measure, and they are constantly changing. Historical claims data are scarce and not considered representative of future vulnerabilities. Threats are constantly evolving; they can spread widely and rapidly, and a series of consecutive large events is plausible. Moreover, a high degree of interconnectivity may result in potentially boundless impacts.
What is required
To make cyber risk insurable, then, three fundamental prerequisites must be met.
- First, there needs to be sufficient resilience at the source of risk. If homeowners did not lock their homes, theft would not be insurable. The first steps in addressing any risk are to assess, measure and manage it. Residual risks (ie, those that cannot be contained at the source) can then be mitigated through insurance
- Second, insurers must make an acceptable return on capital. This requires disciplined and effective underwriting
- Third, the available capital must withstand shocks from accumulation events and still provide adequate compensation to insureds afterwards; for cyber, this means absorbing accumulation risk, the root of many concerns about the insurability of cyber risk
Improvements in underwriting capabilities
At the core of all underwriting is the need to know the exposure, which is a measure of risk. In property classes, exposures are readily measurable and stable over time, but in the digital economy, exposures are neither readily measurable nor stable.
With the difficulty of measuring exposure comes the further challenge of assessing claims costs. Historical claims data are sparse, and threats change rapidly: they spread and replicate across the globe and, unlike natural catastrophes, can endlessly adapt and recur with alarming frequency.
So even a credible volume of historical claims data would have questionable predictive power. To address these challenges, insurers have developed several approaches.
- To improve exposure measures, data protocols are emerging that combine basic company information with digital risk indicators, such as patching frequency and backup procedures. In 2016, Lloyd’s established a schema for cyber exposure data that provided a much-needed standard for the main features of input data in cyber risk tools and the attributes to be considered when evaluating cyber risk.
- Advanced data analytics allow the analysis of distinctive cyber risk characteristics. Service providers have developed digital risk assessment tools that provide risk scores and benchmark an insured's standards against its peers. While these tools show promise, they will undoubtedly take more time to mature.
- Leading underwriting organisations are also implementing proactive approaches to assess the likely rate of change in future developments. It is now accepted practice for insurers to draw on a range of inputs, such as research publications, specialist modelling firms, and cyber security companies. Accumulation modellers may also conduct their own research and discuss trends with in-house cyber security experts. In larger insurance companies, the risk engineering function is evolving to include cyber and technology skills.
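To make the idea of digital risk indicators concrete, here is a deliberately naive scoring heuristic. The indicator names (patching cadence, offsite backups, exposed ports) echo those mentioned above, but the weights and thresholds are invented for illustration and do not come from any vendor tool or market standard:

```python
# Naive, illustrative cyber risk score built from basic digital indicators.
# Higher score = higher assessed risk. All weights are arbitrary assumptions.
def risk_score(days_since_last_patch: int,
               has_offsite_backups: bool,
               open_ports: int) -> float:
    score = 0.0
    # Patching cadence: penalise up to 50 points as the gap approaches 30 days.
    score += min(days_since_last_patch / 30.0, 1.0) * 50
    # Backup procedures: flat 30-point penalty if no offsite backups exist.
    score += 0 if has_offsite_backups else 30
    # Exposed attack surface: up to 20 points, saturating at 100 open ports.
    score += min(open_ports / 100.0, 1.0) * 20
    return round(score, 1)

print(risk_score(days_since_last_patch=45, has_offsite_backups=False, open_ports=12))
```

A real tool would draw on far richer external scan data, but the structure — observable indicators combined into a comparable score — is the same.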
The common thread to these developments is the shift from an essentially ‘physical’ world to a ‘digital’ world. The future underwriting profile may include a deeper understanding of data sciences and a much greater familiarity with the technologies at the source of the underlying risk.
Towards more sophisticated modelling
Currently, insurers rely on pragmatic - but solid - methodologies that assess proportions of total limits at risk against the currently known major scenarios of data breaches, cloud outage, widespread malware, and disruption to critical infrastructure. These deterministic, scenario-driven methods, with expert judgement applied, provide a working solution while more sophisticated and insightful models are being developed. Progress is being made, but there is a divergence of views.
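A minimal sketch of such a deterministic, scenario-driven accumulation estimate follows. The portfolio, scenario names and damage ratios are hypothetical assumptions chosen for illustration, not market figures; the expert-judgement step corresponds to setting the assumed proportion of each limit at risk per scenario:

```python
# Deterministic accumulation model: for each major cyber scenario,
# estimated loss = sum over exposed policies of
#   (policy limit) x (assumed proportion of limit at risk).
# All figures below are hypothetical.

# Portfolio: (policy id, limit in USD, scenarios the insured is exposed to).
portfolio = [
    ("P1", 10_000_000, {"cloud_outage", "widespread_malware"}),
    ("P2",  5_000_000, {"data_breach"}),
    ("P3", 20_000_000, {"cloud_outage", "critical_infrastructure"}),
]

# Expert-judgement damage ratios: assumed share of the limit lost per scenario.
damage_ratio = {
    "data_breach": 0.30,
    "cloud_outage": 0.25,
    "widespread_malware": 0.40,
    "critical_infrastructure": 0.50,
}

def scenario_losses(portfolio, damage_ratio):
    """Estimated accumulated loss per scenario across the whole portfolio."""
    losses = {s: 0.0 for s in damage_ratio}
    for _pid, limit, exposures in portfolio:
        for s in exposures:
            losses[s] += limit * damage_ratio[s]
    return losses

for scenario, loss in scenario_losses(portfolio, damage_ratio).items():
    print(f"{scenario}: {loss:,.0f}")
```

The output is a per-scenario accumulation figure that can be compared against available capital, which is essentially what the 'proportion of total limits at risk' approach delivers.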
- For the bears, the challenges are highly significant and will take a decade or more to overcome. Should this view prevail, capital providers may be unwilling to provide funds at the levels needed to support expected market growth.
- The bulls, however, believe that advances in technologies will provide the capabilities to understand and measure these new technological risks. In this view, data, far from being scarce, are abundant; the question is only how to extract, capture and utilise them. Techniques are emerging that harness the computational power of today's data processing and analytical tools and so shorten the learning curve for cyber risk, with maturity perhaps around five years away.
Both views have their proponents, and both have compelling arguments. Progress may depend on insurers resolving three major challenges.
The first challenge is defining a 'footprint' at all, let alone measuring the exposure within it. Supply chains have become increasingly digitalised and, with the range of cloud-based services extending further along the value chain, aggregations 'in the cloud' lie both within and across industries.
And the internet of things creates connections that reach into the homes of hundreds of millions of individuals. These connecting threads and digital ‘monocultures’ create an exposure base that is largely opaque, lacks hard boundaries and enables threats to permeate across sectors and countries.
The second challenge relates to the scarcity of extreme events. Modellers of natural catastrophes have addressed the ‘data scarcity’ for many natural perils by reference to relevant sciences, such as meteorology and seismology. However, for cyber modelling, there is no analogous hazard science to draw on.
The connected world
The third challenge is created by the high level of interconnectivity. Cloud service providers now connect many commercial organisations that would otherwise have little or no dependency. Further, commercial entities use common software or basic hardware. These monocultures create connecting threads both across and within industry sectors and present unprecedented challenges for risk assessment.
Despite these challenges, however, seemingly intractable problems have become more tractable in recent years.
One approach uses publicly accessible digital information to identify connections between firms and their cloud providers. With a detailed digital map, it is then possible to assess the impact on a business of a specific cloud outage or failure, a significant step in understanding the risks associated with cloud interconnections.
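The mapping idea can be sketched as a simple dependency graph of firms and cloud providers, where an outage's footprint is the set of firms whose mapped providers fail. All firm and provider names below are made up for illustration; real digital maps are built from observed network and DNS data at far greater resolution:

```python
# Hypothetical digital map: which cloud providers each firm depends on.
dependencies = {
    "firm_a": {"cloud_x", "cloud_y"},
    "firm_b": {"cloud_x"},
    "firm_c": {"cloud_z"},
}

def outage_footprint(dependencies, failed_provider):
    """Return the set of firms affected if a single provider suffers an outage."""
    return {firm for firm, providers in dependencies.items()
            if failed_provider in providers}

print(sorted(outage_footprint(dependencies, "cloud_x")))  # firms exposed to cloud_x
```

Even this toy version shows why such maps matter: firm_a and firm_b have no direct relationship, yet a single provider failure hits both, which is exactly the accumulation an insurer needs to see.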
Cyber catastrophe models are progressing beyond the pragmatic ‘stacking of limits’ and they are starting to have many of the qualities of their natural catastrophe counterparts. That said, much more needs to be done for accumulation models to reach standards comparable with natural catastrophe modelling.
All stakeholders must step up
The insurance industry can offer only a partial remedy. Other stakeholders must play their part too. Given the fluid stage of developments, it would nevertheless be premature to make firm recommendations. Prudence suggests refraining from making irreversible decisions, especially when a market is demonstrating high levels of innovation. Policymakers should endeavour to use the market as a discovery mechanism and expect best practices to be adopted quickly by competitors and new market entrants.
Mr Daniel Hofmann is senior adviser insurance economics, Mr Steve Wilson is senior adviser cyber, and Ms Rachel Anne Carter is director cyber at The Geneva Association.