Leadership Laboratory

Audit and Governance

This series includes essays on security audit and governance; tone at the top is a crucial aspect of leadership. Our primary repository for audit information, however, is the SANS audit blog.

Other Related Articles in Audit and Governance


Qualitative vs. Quantitative Risk Assessment


Stephen Sims

The field of risk assessment and risk management is becoming increasingly complex as we navigate the terrain of Operations, Audit, Compliance, Budgeting and the many other facets of business. In this battle we often find ourselves justifying every component used to assign a proper risk rating to the many business units within our organizations. Organizations are rightfully demanding data-driven results. Gone are the days in which a security professional could stay hidden inside a lab focusing on the latest 0-day exploits. We have all been exposed to the light and must take on a greater responsibility to protect our customers, employees and critical data.

It’s interesting that when you look at the primary reasoning behind the Domain Name System (DNS), you learn that people prefer to access websites by name rather than by IP address. For example, it is much easier to remember www.hotmail.com than 64.4.32.7. What’s my point, you ask? In the world of risk assessment, you will quickly find that business units prefer numbers over names. Not just any numbers: repeatable, internally consistent numbers. With information security, basing a final risk rating simply on numbers does not often result in the best analysis; combining multiple elements gets us much closer to an accurate understanding of our threat level. Taking a multi-dimensional view of risk assessment, however, tends to introduce a level of complexity when assessing relative risk. This brings us to the concepts of quantitative and qualitative measures.

Quantitative risk assessment comes into play when we have the ability to map a dollar amount to a specific risk. For example, let’s say there are 1,000 records of confidential patient data at a medical center residing on a single database. This database is accessed directly by a web server which resides in a semi-trusted or DMZ environment. A compromise of the method in which the web server communicates with the database could result in the exposure of all 1,000 records of patient data. Let us also say that during a Business Impact Analysis (BIA) it was determined that the replacement cost for each record is $30. This cost includes contacting the patient to inform them of the incident, changing the patient’s account numbers and printing new health cards. We can easily determine that the maximum quantitative loss associated with a full compromise of that system is $30,000. Well, that wasn’t too bad, was it? Unfortunately, there is much more to consider. But does quantitative risk always have to map to money? No! When you look at the 20 Critical Controls carefully, you note that 15 of the controls can be fully automated. Internally consistent, repeatable numbers can be generated and used to build a dashboard or report for business unit managers.
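
As a minimal sketch of that arithmetic (the record count and per-record cost are simply the figures from the example above), the maximum quantitative loss is a single multiplication:

    # Quantitative exposure: records at risk multiplied by replacement cost per record.
    # Figures come from the example above; substitute your own BIA data.
    records_at_risk = 1_000
    cost_per_record = 30                       # dollars, from the Business Impact Analysis
    max_quantitative_loss = records_at_risk * cost_per_record
    print(f"Maximum quantitative loss: ${max_quantitative_loss:,}")   # $30,000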

Qualitative risk assessment rears its head in a different form. Let us introduce some additional factors and threat vectors into our example above. We now find out that the database that once held only 1,000 records is going to hold a range of 10,000 to 500,000 records. You also learn that multiple groups within the organization will be accessing and modifying the database daily and that control of the system will fall under the Operations group. We’ve made only a couple of changes and already the complexity has greatly increased. You suddenly hear a knock on your office door... you glance through your peephole and realize it’s the auditors. After your valiant attempt to hide fails, they make their way into your office and inform you that the data on said database is neither encrypted in transit to the web server nor encrypted at rest on the database. They also inform you that the company has ninety days to document and remediate the issue, as the system is not in compliance with the Health Insurance Portability and Accountability Act (HIPAA). We now have additional risk elements to add into our equation for our final assessment.

We must now understand any inherent vulnerabilities that exist on the system or application. For the sake of time, let us say that a code review was performed on the Internet-facing web application which speaks to our database in question. During this code review it was discovered that the application is vulnerable to an attack known as SQL Injection. This is an attack in which a malicious user appends additional SQL syntax to a statement in the hope of gaining access to data or systems they do not have permission to access. If the application does not properly filter out inappropriate user input, it may be susceptible to this form of attack.
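
To make the vulnerability concrete, here is a hypothetical sketch using Python’s built-in sqlite3 module and an invented patients table; it contrasts concatenating user input into a statement with binding it as a parameter:

    import sqlite3

    # Hypothetical, in-memory example for illustration only.
    conn = sqlite3.connect(":memory:")
    cur = conn.cursor()
    cur.execute("CREATE TABLE patients (id INTEGER, last_name TEXT)")
    cur.execute("INSERT INTO patients VALUES (1, 'Smith'), (2, 'Jones')")

    user_input = "' OR '1'='1"                 # attacker-supplied value

    # Vulnerable: the input is concatenated straight into the SQL statement, so the
    # injected quote and OR clause become part of the query logic and match every row.
    query = "SELECT * FROM patients WHERE last_name = '" + user_input + "'"
    print(cur.execute(query).fetchall())       # [(1, 'Smith'), (2, 'Jones')]

    # Safer: the value is bound as a parameter and is never interpreted as SQL.
    print(cur.execute("SELECT * FROM patients WHERE last_name = ?",
                      (user_input,)).fetchall())   # []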

Now that we’ve discovered a vulnerability that exists on our system, we must determine not only the cost associated with a compromise, but also the likelihood of discovery, the difficulty of execution and a couple of other factors we will discuss. Let us start with the cost associated with a compromise. Since there may be up to 500K records stored on the database, we must consider the worst-case scenario: 500K records x $30 per record = $15 million. We now see that there is a significant quantitative value associated with the risk. The problem with going by this formula alone is that it is only one-dimensional. Just because we store $100 million in a bank vault does not mean a criminal could easily steal it. This brings us back to our qualitative risk rating. We must have a way to assign a risk level to a vulnerability that takes those other factors into consideration. You will often find that many organizations have three to five qualitative risk levels. To keep our example simple, let us use three levels: Low, Medium and High.
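
One way to picture how the dollar figure and these qualitative factors might roll up into one of the three levels is a simple scoring sketch; the factor names, weights and thresholds below are invented for illustration rather than taken from any standard:

    # Hypothetical scoring sketch: combine dollar exposure with qualitative factors.
    # Factor names, weights, and thresholds are illustrative only.
    def qualitative_rating(exposure_dollars, easy_to_discover, easy_to_execute,
                           likely_to_be_detected):
        score = 0
        score += 2 if exposure_dollars >= 1_000_000 else 1
        score += 2 if easy_to_discover else 1
        score += 2 if easy_to_execute else 1
        score += 2 if not likely_to_be_detected else 1
        if score >= 7:
            return "High"
        if score >= 5:
            return "Medium"
        return "Low"

    # Worst case from the example: 500K records at $30 each, an easy-to-find,
    # easy-to-execute SQL Injection flaw that logging is unlikely to catch.
    print(qualitative_rating(500_000 * 30, True, True, likely_to_be_detected=False))   # High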

Where do we get our quantitative and qualitative data? Many security professionals focus on information from penetration tests and vulnerability scanners. This is good information, but it is not the complete picture. There are three primary views. The outside view is information gathered outside of the operating system, usually from the network. The inside view is information taken directly from the operating systems; we need agents that run on the operating system, such as the McAfee HIPS product, Bit9 or Secunia PSI, to provide this information to our SIEM or console. Finally, there is information about the activities of the users of our systems: where they go, what they do, what they click on. One promising approach here is internal testing with simulated phishing and other similar exercises.
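
As a rough sketch of how findings from those three views might be gathered per asset before scoring (all field names and values below are invented for illustration):

    # Hypothetical sketch: collect the outside, inside, and user-activity views for
    # one asset in a single record before scoring. Field names and values are invented.
    asset_view = {
        "asset": "patient-db-01",
        "outside": {"scanner_findings": 3, "open_services": ["https", "mssql"]},
        "inside":  {"missing_patches": 5, "hips_agent_installed": True},
        "user":    {"phish_click_rate": 0.12, "policy_violations": 2},
    }

    # A simple roll-up across the three views yields a repeatable per-asset number
    # that can feed a dashboard or business-unit report.
    exposure_score = (asset_view["outside"]["scanner_findings"]
                      + asset_view["inside"]["missing_patches"]
                      + round(asset_view["user"]["phish_click_rate"] * 10))
    print(exposure_score)                      # 9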

Let's pull this together into an example:

So far we have learned the following about this environment:
  • The database could contain a range of 10K to 500K records.
  • Records are valued at $30 each.
  • Data is not encrypted in transit or at rest.
  • Multiple business units access and modify the data.
  • Systems are maintained by the Operations group.
  • We have an audit requirement to document the encryption issue and apply mitigating controls.
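
As a minimal sketch, these facts can be captured in a single structured record so the scoring steps that follow are repeatable (field names are invented for illustration):

    # Hypothetical sketch: record the assessment inputs in one place so later
    # scoring steps are repeatable. Field names are invented for illustration.
    risk_inputs = {
        "asset": "patient-db-01",
        "record_count_range": (10_000, 500_000),
        "cost_per_record": 30,                 # dollars, from the BIA
        "encrypted_in_transit": False,
        "encrypted_at_rest": False,
        "accessed_by": "multiple business units",
        "maintained_by": "Operations",
        "audit_finding": "HIPAA encryption gap; document and remediate within 90 days",
    }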

Let us introduce one final piece into our risk assessment. Reputation risk is the impact on earnings and investor or consumer confidence that results from negative publicity about the business. In our situation, the most likely cause is an unauthorized disclosure of customer data due to a system or network compromise. The negative impact of such an event could easily surpass the monetary loss calculated in our quantitative risk assessment. This piece of risk assessment must always be taken into account for a comprehensive rating. Oftentimes you will find yourself contacting the legal department to understand the weight a particular compromise may have on the reputation of the company.

At this point we have introduced a myriad of elements into our risk assessment. Given the simplicity of our outsider threat vector through SQL Injection, the fact that this form of attack is not often detected by system logs and Intrusion Detection tools, the reputation risk associated with going public with 500K compromised records, and the likelihood that this attack will be repeated once discovered, we can easily assign a qualitative risk level of "High." We now have a quantitative risk assessment value of $15 million and a qualitative risk level of "High."

At this point Senior Management has the option to accept the risk rating that has been assigned, or options can be explored that may help lower the risk rating. Let’s go back to the CISSP and SANS Security Essentials courses for a moment and look at the formula for calculating a Single Loss Expectancy (SLE). You take the value of the asset; in our case, up to 500K records at $30 each, or $15 million. You then apply the Exposure Factor, the percentage of the asset’s value expected to be lost in a single incident; a full compromise means an Exposure Factor of 100 percent. Multiplying the Asset Value by the Exposure Factor brings us back to $15 million. This is our SLE for this scenario. We must then calculate the Annualized Loss Expectancy (ALE) to estimate our exposure per year. To calculate the ALE we multiply the SLE by the Annual Rate of Occurrence (ARO), the number of times per year this incident is likely to occur. The problem is that our database system has just been promoted to its new role and we do not have a good historical perspective on this type of threat. It is reasonably safe to say that if the cost of introducing controls to mitigate this type of attack is only a small fraction of the overall financial loss associated with a full compromise, we can feel comfortable making suggestions to address the threat.
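
A worked version of that arithmetic, with the ARO value explicitly flagged as an assumption since we lack historical data, might look like this:

    # SLE/ALE sketch using the worst-case figures from the example.
    asset_value = 500_000 * 30                 # 500K records at $30 each = $15,000,000
    exposure_factor = 1.0                      # full compromise of the data set
    sle = asset_value * exposure_factor
    print(f"SLE: ${sle:,.0f}")                 # SLE: $15,000,000

    # Assumption: one such incident per decade. With no history for this system,
    # any ARO is an estimate and should be revisited as data accumulates.
    annual_rate_of_occurrence = 0.1
    ale = sle * annual_rate_of_occurrence
    print(f"ALE: ${ale:,.0f}")                 # ALE: $1,500,000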

After submitting our recommendation for risk mitigation, the corporation decides to invest in outsourcing the creation of customized Intrusion Detection signatures to alert on any traffic which poses a threat to our database in question. Host-based Intrusion Prevention Software (HIPS) will be installed on both the web application server and the database server. A separate project will launch to look for ways of using column-level encryption on the database and encryption in transit. The corporation would also like to invest in correcting the code review findings, with a realistic deliverable date of six months from now. At this point we may feel comfortable enough to reduce the inherent risk rating from "High" down to "Medium." Perhaps some penetration testing should be performed to determine whether the new IDS and HIPS tools are properly configured and set up to block an attack. We can also feel confident that if the code is properly corrected within the assigned time-frame, the residual risk rating will drop to a threat level of "Low."

The most interesting part of risk assessment is that each and every circumstance you encounter will require its own customized criteria to properly determine a rating. In our example above we would still have to consider the insider threat vector associated with the lack of encryption in transit and at rest on the database server. Educating each group or individual on the many factors needed to properly assess a vulnerability will result in a much greater level of efficiency and repeatability down the line. Using a multi-dimensional approach that includes the areas mentioned in this article will certainly increase the validity and accuracy of your risk assessments.

Stephen Sims, GSE, CISSP, CISA, can be reached at stephen.sims@deadlisting.com.