Enterprise cyber threat remediation needs to improve in several key areas, according to an analysis of common remediation strategies.
Common enterprise cyber threat remediation strategies are about as effective as random chance, a study has revealed.
Some simple rule-based strategies perform no better than rolling dice, according to a follow-up report from predictive cyber risk firm Kenna Security and research firm the Cyentia Institute.
Comparing various enterprise cyber risk remediation strategies against a “random” approach, the study found that their efficiency rate was essentially the same as chance, at around 23%.
The study was based on the analysis of five years of historical vulnerability data, comprising millions of data points compiled from more than 15 sources. A total of 94,597 Common Vulnerabilities and Exposures (CVE) entries from Mitre were also used in the research.
The key areas organisations need to improve, the report said, include reducing the time it takes to assess whether a newly published vulnerability is relevant, improving their ability to assess and mitigate the risk posed by any given vulnerability, and using predictive models to reduce risk proactively, efficiently and effectively.
A key finding of the study was that most current approaches to prioritising and fixing vulnerabilities are roughly as effective as, or even less effective than, addressing vulnerabilities at random.
Researchers compared 15 different remediation strategies against a strategy of fixing vulnerabilities at random to provide a point of reference that illustrates the effectiveness of each strategy. More than half of the strategies were no more effective than chance.
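Coverage and efficiency, as used here, can be read as recall- and precision-style measures: coverage is the share of vulnerabilities that end up exploited which a strategy actually fixes, and efficiency is the share of fixed vulnerabilities that turn out to be exploited. The following is a minimal Python sketch of that kind of comparison, not the report's actual methodology; the CVE records, the was_exploited flag and the remediation budget are invented stand-ins for the scan and exploit-intelligence data the study draws on.

```python
import random

random.seed(1)

# Hypothetical CVE records: in practice these would come from vulnerability
# scans joined with exploit intelligence; here they are random stand-ins.
cves = [
    {"id": f"CVE-2017-{n:04d}",
     "cvss": round(random.uniform(1.0, 10.0), 1),
     "was_exploited": random.random() < 0.02}  # roughly 2% see real attacks
    for n in range(10_000)
]

def evaluate(fixed, universe):
    """Coverage: share of exploited CVEs that were fixed.
    Efficiency: share of fixed CVEs that turned out to be exploited."""
    exploited = sum(c["was_exploited"] for c in universe)
    hits = sum(c["was_exploited"] for c in fixed)
    return hits / max(exploited, 1), hits / max(len(fixed), 1)

budget = 2_000  # suppose the team can only remediate 2,000 CVEs this cycle

# Rule-based strategy: remediate the highest-CVSS vulnerabilities first.
by_cvss = sorted(cves, key=lambda c: c["cvss"], reverse=True)[:budget]

# Baseline: remediate the same number of CVEs chosen at random.
at_random = random.sample(cves, budget)

for name, fixed in [("CVSS-ranked", by_cvss), ("random", at_random)]:
    coverage, efficiency = evaluate(fixed, cves)
    print(f"{name:>11}: coverage={coverage:.0%}  efficiency={efficiency:.0%}")
```

In this toy data the exploited flag is independent of the CVSS score, so both strategies land near the 2% base rate; real vulnerability data has correlations this sample lacks, which is exactly what the study's comparison sets out to quantify.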
The study also revealed that the volume and velocity of vulnerabilities are increasing rapidly. In 2017, the study showed, businesses had to decide how to address an average of 40 new vulnerabilities every day, with the number of new CVE entries for the year more than double the 2016 figure.
Predict attacks and act fast
The study showed that most reported vulnerabilities are not used by hackers, which underlines the need for businesses to focus on the vulnerabilities that pose the greatest risk.
Of the thousands of new vulnerabilities published every year, the report said, exploits are never developed for the vast majority (77%), and fewer than 2% are actively used in an attack.
According to the report, speed must be a priority. Most exploits are published in the first months after a vulnerability is disclosed, with 50% appearing within two weeks. This means businesses realistically have only 10 working days to find and fix the riskiest vulnerabilities.
“Effective remediation depends on quickly determining which vulnerabilities warrant action and which of those have highest priority, but prioritisation remains one of the biggest challenges in vulnerability management,” said Karim Toubba, CEO of Kenna Security.
“Businesses can no longer afford to react to cyber threats, as the research shows that most common vulnerability remediation strategies are about as effective as rolling dice. But there is hope. A predictive model based on cutting-edge data science is more efficient, requires less effort and provides better coverage of an enterprise’s attack surface,” he said.
The researchers found that machine learning systems based on a predictive model perform two to eight times more efficiently, with equivalent or better coverage of vulnerabilities when compared against the 15 other strategies assessed in the research.
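The article does not describe Kenna's model, but the general shape of a predictive approach, namely training a classifier on historical exploitation data, scoring new CVEs by their estimated probability of being exploited, and spending the remediation budget on the highest scores, can be sketched with off-the-shelf tooling. Everything below (the features, the synthetic labels, the logistic regression) is illustrative, not the vendor's method.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in features for historical CVEs: CVSS base score, whether
# public exploit code exists, and a count of references. A real model would
# draw on far richer sources (the study compiled data from more than 15).
n = 20_000
X = np.column_stack([
    rng.uniform(1, 10, n),   # CVSS base score
    rng.integers(0, 2, n),   # public exploit code available?
    rng.poisson(3, n),       # number of reference sources
])
# Synthetic label: exploitation in the wild is rare and loosely tied to the
# features above. This is invented purely so the example runs end to end.
logit = 0.3 * X[:, 0] + 2.0 * X[:, 1] + 0.1 * X[:, 2] - 8.0
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# Rank held-out CVEs by predicted probability of exploitation and
# "remediate" only the top 10%.
scores = model.predict_proba(X_test)[:, 1]
budget = len(scores) // 10
top = np.argsort(scores)[::-1][:budget]

coverage = y_test[top].sum() / max(y_test.sum(), 1)  # exploited CVEs caught
efficiency = y_test[top].mean()                      # fixes that mattered
print(f"coverage={coverage:.0%}  efficiency={efficiency:.0%}")
```

Because the synthetic label is generated from the same features the model sees, the numbers it prints flatter the approach by construction; the takeaway is the workflow, and the fact that the ranking is judged with the same coverage and efficiency metrics applied to the rule-based strategies.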
Jon Oltsik, senior principal analyst at Enterprise Strategy Group, said: “In the past, we used analogue tuning to define which systems were considered mission-critical, but this didn’t provide a level of useful granularity. Fast-forward to 2018, and risk-based intelligent vulnerability management platforms can now consume terabytes of configuration data, asset data, vulnerability data and threat intelligence to create a fine-grained analysis of which systems really need immediate patching against current threats.
“Now these systems are moving beyond real-time assessments by forecasting weaponisation and risk well before an attack is possible. This proactive approach can provide insight and help organisations anticipate attacker behaviour,” he said.
The report recommends that organisations begin by measuring the efficiency and coverage of their remediation strategy and assess what policy tweaks would increase those metrics.
“From there, the value of predictive models will become not just apparent, but measurable,” the report said, adding that predictive models can and do enable businesses to adopt a proactive strategy for vulnerability remediation that delivers the most efficient use of their people, tools, time and, ultimately, money to address the threats that pose the greatest risk.
This article originally appeared on ComputerWeekly.