http://www.govinfosecurity.com/dhss-huge-cybersecurity-skills-shortage-a-6080
By Eric Chabrow, Gov Info Security, September 20, 2013

More than one in five mission-critical cybersecurity-related jobs at a key Department of Homeland Security unit are vacant, the Government Accountability Office says. That's a finding buried in a GAO report on how DHS could improve how it tracks recruiting costs. DHS's National Protection and Programs Directorate's Office of Cybersecurity and Communications, which houses much of the department's cybersecurity personnel, had a vacancy rate of 22 percent as of June, according to a new GAO report, DHS Recruiting and Hiring.

Why so many vacancies? DHS officials tell the GAO that they face some challenges because of the length of time to conduct security checks needed to grant clearances, low pay compared with private-sector positions and lack of clearly-defined skill sets for these positions. Each job in the federal government falls into an occupational series classification. Cybersecurity personnel are spread throughout a number of occupational series, with most categorized within the information technology series. […]
Do we need a new way to measure risk?
Absolutely! The old Risk = Threat x Vulnerability x Cost equation is a sound methodology for measuring risk: it takes a common-sense approach to tying value to the likelihood that the value could be impacted. I'm not suggesting the whole thing be tossed out entirely, but isn't there a more practical way to measure risk? This article explains why the old equation is dead in practice and what we might consider as a more viable way to quantify a risk score.
Top 10 Reasons the old equation is dead
1. Corporations often do not know about all the assets they own.
2. Configuration Management Databases (CMDBs) often miss systems that are powered off, sit behind firewalls, or are hidden by other security controls.
3. CMDBs cannot quantify the value of a system.
4. Security engineers cannot manually quantify the value of a system; they do not have the spare cycles it would take to perform the assessment accurately.
5. Auditors do not have enough information from all of the system, application, and database owners to assess value.
6. Applications often use similar infrastructures for storage, network or processing.
7. To arrive at a cost value, one must account for every imaginable scenario, which is almost impossible to do.
8. How do we come up with an accurate value of a system?
9. How do we account for different data types, since they impact the "value" or "cost"?
10. It’s much easier for asset owners to determine the data type that flows through or is stored on a system or infrastructure component than it is to estimate the cost of a breach or the value of an information asset.
My risk equation proposal
New Formula: Risk = Threat x Vulnerability x Data Classification
Proposed Data Classification Values:
Classified = 5
Internal = 2
Public = 1
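To make the proposed formula concrete, here is a minimal scoring sketch. The classification weights come from the table above; the threat and vulnerability scores (on a hypothetical 0-10 scale) are illustrative numbers, not values from any real feed or tool.

```python
# Sketch of the proposed formula: Risk = Threat x Vulnerability x Data Classification.
# Classification weights are the proposed values from the article; threat and
# vulnerability inputs are hypothetical 0-10 scores for illustration only.

DATA_CLASSIFICATION = {"classified": 5, "internal": 2, "public": 1}

def risk_score(threat: float, vulnerability: float, classification: str) -> float:
    """Weight a finding by the classification of the data the system handles."""
    return threat * vulnerability * DATA_CLASSIFICATION[classification.lower()]

# Two systems with identical threat/vulnerability profiles rank very differently
# once the classification of their data is factored in:
print(risk_score(7, 8, "classified"))  # 280
print(risk_score(7, 8, "public"))      # 56
```

Note that no cost estimate appears anywhere: the asset owner only has to answer "what kind of data lives here?", which is the whole point of the substitution.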
Weighting vulnerability management by data classification, rather than by some magical cost we bake up, would better target our remediation resources and timelines at the appropriate places. I could even see a tie-in to DLP technologies, so that the data classification is automatically pushed into the vulnerability management tool rather than manually defined by infrastructure, system, or application owners. Often none of these individuals know how valuable their information components are, so security engineers plug arbitrary values into the asset-value fields of their vulnerability management platforms to signify the importance of systems. Feel free to comment.
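The DLP tie-in could be sketched roughly as follows. Everything here is hypothetical: the function names and the idea that the DLP tool reports a list of observed data types per asset are assumptions for illustration, not the API of any real product. The rule is simply that the most sensitive data type seen on an asset drives the classification weight fed into the vulnerability management tool.

```python
# Hypothetical sketch of automating data classification from DLP scan results.
# A DLP tool is assumed to report which data types it observed on an asset;
# the highest-sensitivity type observed becomes the asset's classification.
# All names here are invented for illustration.

SENSITIVITY = {"classified": 5, "internal": 2, "public": 1}

def classification_from_dlp(data_types_seen: list) -> str:
    """Pick the most sensitive classification among the data types DLP observed."""
    if not data_types_seen:
        return "public"  # assumed default when DLP finds nothing sensitive
    return max(data_types_seen, key=lambda t: SENSITIVITY[t])

# Example: a file server where a DLP scan found both public docs and internal memos.
dlp_findings = ["public", "internal", "public"]
print(classification_from_dlp(dlp_findings))  # internal
```

In a real deployment the resulting label would be written into the vulnerability management platform's asset record on each DLP scan, removing the manual step entirely.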