Do we need a new way to measure risk?
Absolutely! The old Risk = Threat x Vulnerability x Cost equation was a common-sense attempt to tie value to the likelihood that the value could be impacted, and I'm not suggesting the whole thing be tossed out entirely. But isn't there a more practical way to measure risk? This article explains why the old equation is dead and what we might consider as a more viable way to quantify our risk score.
Top 10 Reasons the old equation is dead
1. Corporations often do not know about all the assets they own.
2. Configuration Management Databases (CMDB) often miss systems that are powered off, sitting behind firewalls, or hidden by other security controls.
3. Configuration Management Databases (CMDB) cannot quantify the value of a system.
4. Security Engineers cannot quantify a system's value manually, as they do not have the spare cycles it would take to perform the assessment accurately.
5. Auditors do not have enough information from all the system, application, and database owners to assess the value.
6. Applications often share infrastructure for storage, network, or processing, so cost cannot cleanly be attributed to any one application.
7. To arrive at a cost value, one must enumerate every imaginable loss scenario, which is almost impossible to predict.
8. How do we come up with an accurate value of a system?
9. How do we account for different data types, since they impact both "value" and "cost"?
10. It’s much easier for asset owners to determine the data type that flows through or is stored on a system or infrastructure component than it is to estimate the cost of a breach or the value of an information asset.
My risk equation proposal
New Formula: Risk = Threat x Vulnerability x Data Classification
Proposed Data Classification Values:
Classified = 5
Internal = 2
Public = 1
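The proposed formula can be sketched in a few lines of code. This is a minimal illustration, assuming threat and vulnerability are each rated on a numeric scale (say 1-10, CVSS-style); the article does not fix those scales, so the ranges and the function name are my own.

```python
# Proposed data-classification weights from the article.
CLASSIFICATION_WEIGHTS = {
    "classified": 5,
    "internal": 2,
    "public": 1,
}

def risk_score(threat: float, vulnerability: float, classification: str) -> float:
    """Risk = Threat x Vulnerability x Data Classification weight."""
    weight = CLASSIFICATION_WEIGHTS[classification.lower()]
    return threat * vulnerability * weight

# The same finding scores five times higher on a system holding
# classified data than on one holding only public data.
print(risk_score(7, 8, "classified"))  # 280
print(risk_score(7, 8, "public"))      # 56
```

The point of the weighting is exactly that last comparison: identical threat and vulnerability ratings, very different priorities.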
Looking at vulnerability management through a data classification weighting, rather than some magical cost we bake up, would better target our remediation resources and timelines at the appropriate places. I could even see a tie-in to DLP technologies, so that the data classification is automatically fed into the vulnerability management tool rather than manually defined by infrastructure, system, or application owners. Often none of these individuals know how valuable their information components are, so security engineers plug arbitrary values into the asset value fields of their vulnerability management platforms to signify the importance of systems…feel free to comment.
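To make the prioritization idea concrete, here is a hedged sketch of sorting a remediation queue by the proposed score. The findings list and hostnames are invented for illustration, and the classification field stands in for whatever an automated DLP feed would supply; no real tool's API is shown.

```python
# Illustrative findings; in practice these would come from a
# vulnerability scanner, with classification supplied by a DLP feed.
findings = [
    {"host": "web01",  "threat": 6, "vulnerability": 9, "classification": "public"},
    {"host": "db01",   "threat": 6, "vulnerability": 9, "classification": "classified"},
    {"host": "wiki01", "threat": 4, "vulnerability": 5, "classification": "internal"},
]

WEIGHTS = {"classified": 5, "internal": 2, "public": 1}

def score(finding):
    # Risk = Threat x Vulnerability x Data Classification weight
    return finding["threat"] * finding["vulnerability"] * WEIGHTS[finding["classification"]]

# Same finding on db01 and web01, but db01 holds classified data,
# so it rises to the top of the remediation queue.
queue = sorted(findings, key=score, reverse=True)
print([f["host"] for f in queue])  # ['db01', 'web01', 'wiki01']
```

Because the weight comes from the data classification rather than a guessed dollar figure, the ordering changes only when the data sensitivity changes, which is something asset owners can actually answer.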