The rise of remote work has widened the battlefield against cyber threats. Faced with a growing attack surface, businesses must plan for the unpredictable and guard against costly disruptions. In this context, the detection rate becomes an essential performance criterion for cybersecurity solutions. How does this rate influence protection against cyberattacks, and what financial repercussions can it have for companies?
According to Gartner, 2022 was marked by “the increase in the attack surface,” a fundamental trend that has not weakened since. Businesses are expanding their digital footprint to adopt new technologies, migrate to the cloud, or simplify work models. In 2023, for example, nearly 55% of the French population was teleworking at least one day per week. In such a connected world, however, companies are exposed to growing cyber threats, with every technological advance bringing its share of risks.
To capitalize on innovation in a financially viable way, these risks must be anticipated. However, computer threats evolve according to variables that are often complex to evaluate. Therefore, the effectiveness of the cybersecurity solutions available to deal with them is also difficult to measure. To help with this, the “detection rate” of security solutions is one of the key indicators that has emerged in recent years. But what exactly does this rate mean, and how does it translate into a company’s broader financial landscape?
Demystifying detection rates
The detection rate of a security solution provides a quantifiable measure of its ability to detect and respond to various cyberattacks. These rates are typically assigned by independent testing laboratories that impartially evaluate a solution’s performance. If a security solution has a 95% detection rate, it detected and potentially neutralized 95% of cyber threats during its testing phase. However, this also means a residual risk of 5% remains, which companies must pay attention to.
This 5% “exposure” seems small at first, but the financial impact can be considerable. By consolidating data from reference sources such as IBM’s 2023 Cost of a Data Breach report, the cost of residual risk becomes clearer.
Measuring risk exposure
Take the example of phishing. Many analysts agree that 90% of successful data breaches begin with a spear phishing attack, a targeted campaign in which the attacker impersonates a trusted person or department of a company and uses personal details or context to make a deceptive message more convincing. Unlike phishing, which casts a wide net to trap as many victims as possible, spear phishing addresses a chosen target directly with a personalized lure.
Now, consider the case of a company facing 1,258 phishing attempts every week. Assuming a proven attack frequency of 16%, that represents 201 potential breaches. According to IBM, the average cost of a successful attack reaches $4.76 million. With a probable click rate of 18% for trained employees and 35% for untrained ones, the financial consequences of the residual risk are considerable.
It is possible to calculate the probable cost of residual risk using the following formulas:
>> Residual risk cost per violation: Average cost per violation * Residual risk
>> Phishing events per week: (Attacks per week * Attack frequency) * Residual risk
>> Expected clicks per week: Phishing events per week * Click probability (by awareness level)
>> Residual risk cost per week: Residual risk cost per violation * Expected clicks per week
Applying these calculations to the scenario described above reveals a striking difference in the weekly cost of residual risk depending on whether the residual risk is 5% or 10% (i.e., a detection rate of 95% or 90%): $431,000 versus $1.72 million, an additional cost of $1.3 million in risk for just 5 more percentage points.
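The calculation above can be sketched in a few lines of code. This is a minimal illustration using the article’s scenario figures; the function and variable names are my own, not part of any standard tool.

```python
# Sketch of the weekly residual-risk cost calculation described above.
# All input figures come from the article's scenario; names are illustrative.

def weekly_residual_risk_cost(
    avg_cost_per_breach: float,  # average cost of a successful attack (IBM: $4.76M)
    attacks_per_week: float,     # phishing attempts per week (1,258)
    attack_frequency: float,     # share of attempts that are proven attacks (16%)
    click_probability: float,    # click rate by awareness level (18% trained, 35% not)
    residual_risk: float,        # 1 - detection rate (e.g., 5% or 10%)
) -> float:
    # Residual risk cost per violation
    cost_per_violation = avg_cost_per_breach * residual_risk
    # Phishing events slipping through per week
    events_per_week = attacks_per_week * attack_frequency * residual_risk
    # Expected clicks per week (trained workforce)
    expected_clicks = events_per_week * click_probability
    # Residual risk cost per week
    return cost_per_violation * expected_clicks

for rr in (0.05, 0.10):
    cost = weekly_residual_risk_cost(4_760_000, 1258, 0.16, 0.18, rr)
    print(f"Residual risk {rr:.0%}: ${cost:,.0f} per week")
```

Running this reproduces the article’s figures: roughly $431,000 per week at 5% residual risk versus about $1.72 million at 10%.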
The importance of detection rates
Given the financial stakes linked to this risk, companies have every interest in considering detection rates when selecting cybersecurity solutions or partners. As with any financial investment decision, they must measure their exposure; in other words, they must determine the likelihood of failure of their cybersecurity solutions, the costs involved, and whether those costs can be controlled.
The problem is that the key role of detection rates remains poorly understood, and no regulation currently forces suppliers to be transparent about the detection rates of their solutions. However, companies are always free to ask the question and to listen carefully to the answer.
Beyond detection rates
Although the detection rate is a key indicator of the effectiveness of a cybersecurity solution, managing cybersecurity risks is a complex and multifaceted initiative. It requires close communication between stakeholders (employees, partners, banks, insurance, and governments).
Other measures recommended to consolidate a company's cyber defense include:
- Adopting a zero-trust architecture.
- Using Managed Security Service Providers (MSSPs).
- Training employees.
Well-trained employees can be a formidable first line of defense, able to recognize a phishing email when it arrives.
Cybersecurity can be compared to a chess game in which the detection rate is certainly a variable, but only one piece on the board. A truly comprehensive cyber risk management strategy is required so that businesses can protect their digital assets and ensure their financial stability in the face of ever-evolving cyber threats.