Concerns are growing about the aggregation of cyber risks after several high-profile events in 2021 such as the SolarWinds breach, the Kaseya ransomware attack, and the Microsoft Exchange Server zero-day vulnerabilities.
Some risk aggregation events revolve around conventional metrics like industry type. For example, when Colonial Pipeline, operator of the largest fuel pipeline in the US, was forced to temporarily shut down its operations after falling victim to a ransomware attack, insurers knew there would be subsequent business interruption at other businesses that depended on the pipeline.
The more common attacks these days, according to Phil Edmundson, CEO and founder of Corvus Insurance, are hacks against cyber infrastructure that crosses over many types of businesses – for example, the Microsoft Exchange Server zero-day vulnerabilities and the Palo Alto Networks VPN vulnerability. These attacks often expose a single point of failure in a particular type of software or utility that is found in many different organizations.
While cyber risk aggregation events are extremely concerning, especially in this digital-first era, commercial insurers are starting to understand that they can manage their exposure by learning from how the industry has approached other types of aggregation risk.
“Just as the industry was very challenged to develop more sophisticated models to manage windstorm in Florida after a series of hurricanes in the 1990s, or to become more sophisticated in managing terrorism risk after 9/11, I think the industry is waking up to an opportunity to better manage cyber risk aggregation in the same manner,” said Edmundson.
“By that, I mean insurers will have to continue identifying aggregations of traditional measurements, like industry type, and I think the P&C insurance industry is well equipped to do that already. But there are a host of other cyber metrics that conventional insurers don’t measure. Because of that, they are unable to quantify their aggregation of risk and to assess whether or not their book of business is indicative of the broader market or if it might have some unknown aggregation.”
Insurtech Corvus builds smart commercial insurance products using data-driven underwriting, under the premise that the quality of the data you collect dictates the quality of your insights.
“Through our Corvus scan, we can measure things like the activity and use of VPN providers,” said Edmundson. “When we see a risk arising out of a vulnerability in a [network, software, VPN, cloud provider, etc.], we know immediately what percentage of our book of business is exposed to that vulnerability. But conventional insurers either don’t collect that data, or the only place it’s collected is on a PDF application and it’s not sitting in a database. That means they have no way of knowing or comparing their book of business to some industry standard.
“If [a cybersecurity company] supplies 10% of the overall economy with VPN services, how does that insurer know whether their book of business is 80% exposed or 2% exposed? Most conventional insurers are unable to manage these types of risk aggregations. In addition, when a vulnerability does become public in what we call zero-day events, because those insurers don’t know which policyholders have that vulnerability or that cyber metric, they’re unable to work with their policyholders to improve their risk management posture.”
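The bookkeeping Edmundson describes becomes simple once scan results sit in a database rather than on a PDF application. As a minimal sketch (the field names, book data, and 10% market-share figure are illustrative assumptions, not Corvus's actual schema), an insurer could compute its own book's exposure to a given vendor and compare it to that vendor's share of the wider economy:

```python
from dataclasses import dataclass

@dataclass
class Policyholder:
    name: str
    vpn_vendor: str  # hypothetical field populated by a security scan

def exposure_share(book: list[Policyholder], vendor: str) -> float:
    """Fraction of the book of business that uses the given vendor."""
    if not book:
        return 0.0
    exposed = sum(1 for p in book if p.vpn_vendor == vendor)
    return exposed / len(book)

# Illustrative book: three of five policyholders use "AcmeVPN"
book = [
    Policyholder("A", "AcmeVPN"),
    Policyholder("B", "OtherVPN"),
    Policyholder("C", "AcmeVPN"),
    Policyholder("D", "AcmeVPN"),
    Policyholder("E", "OtherVPN"),
]
share = exposure_share(book, "AcmeVPN")
market_share = 0.10  # assumed share of the overall economy served by the vendor
print(f"Book exposure: {share:.0%} vs market: {market_share:.0%}")
```

The comparison at the end is the point of the exercise: a book 60% exposed to a vendor that serves 10% of the economy carries an aggregation the broader market does not.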
Edmundson used the example of the Palo Alto Networks VPN vulnerability in 2021. He said that when Corvus became aware of the vulnerability, it could identify that only a small number of its policyholders (181, to be exact) used the VPN software. Because Corvus requires all policyholders to share a name and contact email, it was able to quickly and efficiently inform the exposed policyholders of the vulnerability and how to fix it.
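That workflow, filtering the book for the vulnerable product and then contacting the affected policyholders, can be sketched in a few lines. The record fields and example data below are hypothetical, not a description of Corvus's systems:

```python
def find_exposed(policyholders: list[dict], vulnerable_product: str) -> list[dict]:
    """Return policyholders whose scan detected the vulnerable product."""
    return [p for p in policyholders if vulnerable_product in p["detected_software"]]

def build_alerts(exposed: list[dict], advisory: str) -> list[tuple[str, str]]:
    """Pair each policyholder's contact email with the remediation advisory."""
    return [(p["contact_email"], advisory) for p in exposed]

policyholders = [
    {"name": "Acme Co", "contact_email": "it@acme.example",
     "detected_software": {"VendorVPN 9.0"}},
    {"name": "Beta LLC", "contact_email": "ops@beta.example",
     "detected_software": {"OtherVPN 2.1"}},
]
exposed = find_exposed(policyholders, "VendorVPN 9.0")
alerts = build_alerts(exposed, "Patch VendorVPN to the fixed release; see vendor advisory.")
print(alerts)  # only Acme Co receives an alert
```

The prerequisite, as Edmundson notes, is that both the scan results and the contact details are already in a queryable database when the zero-day lands.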
“More broadly, this same type of application exists in some of the biggest cyber hurricane nightmares you can think of,” the CEO told Insurance Business. “Attacks against cloud service providers come to mind. Most folks understand that Amazon Web Services (AWS) is a big cloud service provider, but in fact, AWS is really a host of different geographies, and they have different data centres around the world. We collect data on every one of our policyholders that tells us which of those data centres they’re using, so we can measure aggregation by the regional distribution of cloud service providers. We are then able to manage our book of business and better address problems as and when they take place.”
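Measuring aggregation by the regional distribution of cloud providers follows the same pattern: tally which data-centre regions each policyholder uses and flag regions where too much of the book is concentrated. A hedged sketch, with invented region names and an arbitrary 50% concentration threshold:

```python
from collections import Counter

def region_concentration(book: list[dict]) -> dict[str, float]:
    """Share of the book hosted in each cloud region (a policyholder may use several)."""
    counts = Counter(region for p in book for region in p["cloud_regions"])
    total = len(book)
    return {region: n / total for region, n in counts.items()}

book = [
    {"name": "A", "cloud_regions": {"us-east-1"}},
    {"name": "B", "cloud_regions": {"us-east-1", "eu-west-1"}},
    {"name": "C", "cloud_regions": {"eu-west-1"}},
    {"name": "D", "cloud_regions": {"us-east-1"}},
]
conc = region_concentration(book)
# Flag any region hosting more than half the book of business
hot_spots = {region for region, share in conc.items() if share > 0.5}
print(hot_spots)
```

An outage or attack against one of the flagged regions is the "cyber hurricane" scenario: the insurer knows in advance what portion of its book would be hit.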
One of the challenges insurers face in understanding and managing cyber risk is that the risk keeps evolving, with new threats emerging all the time. Edmundson noted that it helps to have a very active cybersecurity team that can examine what types of data should be collected and analyzed.
“Another tool that’s really important is cyber risk modelling,” he added. “We work with several risk modelling companies to develop and measure more of these cyber risk aggregations together. There will always be new categories of vulnerabilities as IT infrastructure changes, and as defences change, and as the attackers move their strategies to new areas. It’s very dynamic, and you absolutely need to have an internal cybersecurity team in order to keep up with it all.
“The problem is that most traditional insurers don’t collect this data. It’s not just as easy as using a third-party scan. Many conventional insurers will buy scan reports from a third party, but that’s not good enough because that information is static. It’s a one-time event, and it doesn’t go into a database that can be manipulated and used in a core application rating model, or used in machine learning, or to provide alerts to the policyholder six months into the policy year. For that reason, I think traditional insurers have more challenges around cyber risk aggregation than cyber-focused insurtechs that are collecting this data.”