Questionable Interpretation of Cybersecurity's Hidden Labor Cost
Report Claims a 2,000-Employee Organization Spends $16 Million Annually on Incident Triaging
The de facto standard for cybersecurity has always been detect and respond: detect a threat and respond to it, either by blocking its entry or clearing its presence. A huge security industry has evolved over the last two decades based on this model, and most businesses have invested vast sums in implementing the approach. It can be described as 'detect-to-protect'.
In recent years a completely different cybersecurity paradigm has emerged: isolation. Rather than detect threats, simply isolate applications from them. This is achieved by running the app in a safe container where malware can do no harm. If an application is infected, the container and the malware are abandoned, and a clean version of the application is loaded into a fresh container. There is no need to spend time and money on threat detection, since the malware can do no harm. This is the isolation model.
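As a purely illustrative sketch (the class and method names below are hypothetical, not any vendor's actual API), the disposable-container lifecycle can be expressed in a few lines of Python:

```python
# Illustrative sketch of the isolation ("disposable container") model.
# All names are hypothetical; this is not any vendor's actual API.

class IsolatedApp:
    def __init__(self, app_image):
        self.app_image = app_image      # known-clean application image
        self.container = self._fresh()  # run the app inside a container

    def _fresh(self):
        # Load a clean copy of the application into a new container.
        return {"image": self.app_image, "compromised": False}

    def on_compromise_detected(self):
        # No cleanup, forensics or reimaging: discard the container,
        # malware and all, and reload a clean copy of the application.
        self.container = self._fresh()

app = IsolatedApp("browser-v1.0")
app.container["compromised"] = True  # malware executes inside the container
app.on_compromise_detected()         # container abandoned, clean copy loaded
assert app.container["compromised"] is False
```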
The difficulty for vendors of isolation technology is that potential customers are already heavily invested in the detect paradigm. Getting them to switch to isolation is tantamount to asking them to write off their existing investment as money wasted.
Bromium, one of the earliest and leading isolation companies, has chosen to demonstrate the continuing manpower cost of operating a detect-to-protect model, together with the cost of the detection technology that supports it -- both of which, it argues, are unnecessary.
Bromium commissioned independent market research firm Vanson Bourne to survey 500 CISOs (200 in the U.S., 200 in the UK, and 100 in Germany) in order to understand and demonstrate the operational cost of detect-to-protect. All the surveyed CISOs are employed by firms with between 1,000 and 5,000 employees, allowing the research to quote figures based on an average organization of 2,000 employees.
The bottom line of this research (PDF) is that a company with 2,000 employees spends $16.7 million every year on detect-to-protect. No comparable figure is given for an isolation model, but the reader is left to assume it would be considerably less.
The total is arrived at by combining the labor costs of threat triaging, computer rebuilds and emergency patching with a technology cost of nearly $350,000. The implication is that it is not so worrying to abandon $350,000 for a saving of $16 million -- and indeed, that would be true if the manpower figures were valid. But they are questionable.
All costs in the report are based on figures returned by the survey respondents. For example, according to the report, "Our research showed that enterprises issue emergency patches five times per month on average, with each fix taking 13 hours to deploy. That’s 780 hours a year, which—multiplied by the $39.24 average hourly rate for a cybersecurity professional—incurs costs of $30,607 per year."
But since these are emergency patches, we can add an estimated $19,900 in overtime and/or contractor costs: a total of $50,507 every year that could be all but eliminated by switching to an isolation model.
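The arithmetic is easy to reproduce (note that the $19,900 overtime uplift is this article's own estimate, not a figure from the report):

```python
# Reproduce the emergency-patching cost from the report's survey figures.
HOURLY_RATE = 39.24  # average hourly rate for a cybersecurity professional

patches_per_month = 5
hours_per_patch = 13
hours_per_year = patches_per_month * hours_per_patch * 12  # 780 hours

base_cost = hours_per_year * HOURLY_RATE  # report's $30,607
overtime_uplift = 19_900                  # this article's estimated overtime/contractor premium
print(round(base_cost), round(base_cost + overtime_uplift))  # 30607 50507
```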
The cost of computer rebuilds comes from the cost of rebuilding compromised computers that detect-to-protect has failed to protect. "On average," says the report, "organizations rebuild 51 devices every month, with each taking four hours to rebuild—equating to 2,448 hours each year. When multiplied by the average hourly wage of a cybersecurity professional, $39.24, that’s an average cost of $96,059 per year."
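Again the survey figures check out, and combining the rebuild and patching labor with the roughly $350,000 technology spend cited earlier gives a sense of the total non-triage saving:

```python
# Reproduce the device-rebuild cost, then total the non-triage items.
HOURLY_RATE = 39.24

rebuilds_per_month = 51
hours_per_rebuild = 4
rebuild_hours_per_year = rebuilds_per_month * hours_per_rebuild * 12  # 2,448 hours
device_rebuilds = rebuild_hours_per_year * HOURLY_RATE  # $96,059.52; the report quotes $96,059

emergency_patching = 50_507  # report's $30,607 plus this article's estimated overtime uplift
detect_technology = 350_000  # approximate annual technology cost cited earlier

print(round(device_rebuilds))                                           # 96060
print(round(device_rebuilds + emergency_patching + detect_technology))  # 496567
```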
All these costs would seem to be realistic for a detect-to-protect model. The implication is that a switch to the isolation model would save nearly $500,000 per year to offset the cost of isolation. But the report goes much further, suggesting that an organization with 2,000 employees can also save much of a colossal $16 million every year because its security team will no longer need to triage incidents.
How? "Well," claims the report, "on average SOC teams triage 796 alerts per week, taking an average of 10 hours per alert—that’s 413,920 hours across the year. When you consider that the average hourly rate for a cybersecurity professional is $39.24, that’s an annual average cost of more than $16 million each year."
The math works. But an alternative way of looking at these figures is that 7,960 hours of triaging per week would require more than 47 employees doing nothing but triaging, 24 hours a day, seven days a week. Frankly, I doubt that any company with 2,000 employees does anywhere near this amount of triaging. It is, I suggest, misleading to state bluntly (as the report does): "Organizations spend $16 million per year triaging alerts."
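Both the report's headline figure and the staffing it implies fall out of the same few lines of arithmetic:

```python
# Reproduce the report's triage cost, then sanity-check the staffing it implies.
HOURLY_RATE = 39.24

alerts_per_week = 796
hours_per_alert = 10
hours_per_week = alerts_per_week * hours_per_alert  # 7,960 hours of triage per week
hours_per_year = hours_per_week * 52                # 413,920 hours

print(round(hours_per_year * HOURLY_RATE))  # 16242221 -- "more than $16 million"

# 7,960 hours of triage per week implies this many people working
# around the clock (24 * 7 = 168 hours/week) on nothing but triage:
print(hours_per_week / (24 * 7))  # 47.38...
```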
“Application isolation provides the last line of defense in the new security stack and is the only way to tame the spiraling labor costs that result from detection-based solutions,” says Gregory Webb, CEO at Bromium. “Application isolation allows malware to fully execute, because the application is hardware isolated, so the threat has nowhere to go and nothing to steal. This eliminates reimaging and rebuilds, as machines do not get owned. It also significantly reduces false positives, as SOC teams are only alerted to real threats. Emergency patching is not needed, as the applications are already protected in an isolated container. Triage time is drastically reduced because SOC teams can analyze the full kill chain.”
All of this is perfectly valid -- except for the $16 million annual detect-to-protect triaging claim. SecurityWeek has invited Bromium to comment on our concerns, and will update this article with any response.