Friday, May 28, 2010

Predicting vulnerability

Great paper by Thomas Zimmermann and colleagues at Microsoft on developing techniques/metrics to predict the number of vulnerabilities in a given binary. There hasn't been a lot of productive research to date in this area: predicting the number of remaining vulnerabilities, minimizing them, or estimating the time/complexity required to exploit them. This is critically important for software developers attempting to remove/reduce vulnerabilities, defenders deploying potentially highly vulnerable software (hello Adobe products!), and vulnerability researchers trying to optimize where to spend their energy and predict time to discovery.

From their abstract:
Many factors are believed to increase the vulnerability of software system; for example, the more widely deployed or popular is a software system the more likely it is to be attacked. Early identification of defects has been a widely investigated topic in software engineering research. Early identification of software vulnerabilities can help mitigate these attacks to a large degree by focusing better security verification efforts in these components. Predicting vulnerabilities is complicated by the fact that vulnerabilities are, most often, few in number and introduce significant bias by creating a sparse dataset in the population. As a result, vulnerability prediction can be thought of as the proverbial “searching for a needle in a haystack.” In this paper, we present a large-scale empirical study on Windows Vista, where we empirically evaluate the efficacy of classical metrics like complexity, churn, coverage, dependency measures, and organizational structure of the company to predict vulnerabilities and assess how well these software measures correlate with vulnerabilities. We observed in our experiments that classical software measures predict vulnerabilities with a high precision but low recall values. The actual dependencies, however, predict vulnerabilities with a lower precision but substantially higher recall.
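To make the precision/recall result concrete, here is a minimal sketch (not the authors' actual methodology) of training a classifier on classical per-binary measures and scoring it. The feature names, data, and labels below are made up purely for illustration:

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Hypothetical per-binary measures: cyclomatic complexity, code churn, test coverage
rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.poisson(20, n),    # complexity
    rng.poisson(50, n),    # churn (changed lines)
    rng.uniform(0, 1, n),  # coverage
])
# Vulnerable binaries are rare, so the labels are sparse ("needle in a haystack")
y = (rng.uniform(0, 1, n) < 0.05).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pred = clf.predict(X_test)
print("precision:", precision_score(y_test, pred, zero_division=0))
print("recall:   ", recall_score(y_test, pred, zero_division=0))

The interesting part of the paper is which features go into X: the classical metrics gave high precision but low recall, while the actual dependency data flipped that trade-off.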

Wednesday, May 19, 2010

Vulnerability Market Numbers

Great idea out of unsecurityresearch to run an anonymous survey of vulnerability researchers about their experiences with the groups advertising vulnerability purchasing programs and with direct buyers (anyone who buys vulnerabilities but doesn't maintain an advertised program). I'll summarize some of the interesting results below.

iDefense bought the most bugs in total and was the slowest to pay. ZDI was second on both of those counts and the slowest to make an offer, but ranked highest in "trustworthiness" and as the preferred buyer to sell to. SecuriTeam was second in preference and ranked highest in "friendliness," with iDefense finishing last there.

I also did an analysis of the numbers by importing the totals into Excel. I discarded any data with fewer than two quantified samples (numbers over the quantified limit weren't included). Below I've included charts for client-side, server-side, and aggregate prices, as well as the percentage of purchases that were "high value" (i.e., exceeded the survey threshold).
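For anyone who wants to reproduce the aggregation outside of Excel, here is a minimal sketch of the filtering, averaging, and high-value percentage described above. The buyer names, prices, and threshold are made up, not the actual survey data:

# Drop buyers with fewer than two quantified samples, then compute the average
# price and the share of "high value" sales per buyer (all numbers hypothetical).
HIGH_VALUE_THRESHOLD = 10000  # assumed survey threshold, for illustration only

reported_prices = {
    "BuyerA": [2500, 4000, 12000],
    "BuyerB": [7500],            # only one quantified sample -> discarded
    "BuyerC": [1000, 3000, 5000, 15000],
}

for buyer, prices in reported_prices.items():
    if len(prices) < 2:
        continue
    avg = sum(prices) / len(prices)
    high = sum(1 for p in prices if p >= HIGH_VALUE_THRESHOLD) / len(prices)
    print(f"{buyer}: avg ${avg:,.0f}, {high:.0%} high value")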

Note that there are lots of opportunities for improvement in the numbers. First, some buyers buy more specialized bugs (i.e., for products with limited market share). These bugs go for less money due to the smaller market impact and drive that buyer's numbers downward: the buyer appears to pay less "on average" when in fact it might pay normal or above-average prices for the same bugs that others buy. Ideally one would shop the same bug to multiple vendors, compare offers, and do this for multiple bug classes to get a much better comparison.

Second, since the researchers reported the data anonymously and the survey was advertised to a limited group, there are opportunities for bias there as well. Respondents could be reporting only their more interesting bugs, or could disproportionately represent a specialty (Oracle products, for example) that would skew the results.

From looking at the data it appears likely that the insufficient number of samples is biasing the "Direct" numbers too low. Removing a single low sample moved the average Direct price up to ~$9,400, putting Direct in first place. Given the percentage of Direct sales that exceed the "high value" threshold, I would argue that Direct sales are probably the highest on average, but we don't have enough data to show exactly how much higher.
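A quick illustration of how sensitive a small-sample average is to one low data point (the figures below are made up, not the actual Direct samples):

# A single low sample in a small data set pulls the average down substantially.
direct = [500, 8000, 10500, 12000, 14500]   # hypothetical Direct offers
print("mean with low sample:   ", sum(direct) / len(direct))
trimmed = [p for p in direct if p != min(direct)]
print("mean without low sample:", sum(trimmed) / len(trimmed))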

For all the (legitimate) complaints out of the NoMoreFreeBugs community and others, it's great to see the market reacting and creating both financial incentives and market information for sellers and buyers. The survey is available here. I would encourage anyone interested in this area to pass the survey on to any researchers you know, as a larger statistical sample will significantly improve the quality of the data available. Let me know if you want the numbers or more information about the analysis.

Also, for further research on the topic, including vendors, a good briefing by Pedram, and some papers and other material, see my post from September.

Friday, May 14, 2010

Vehicle hacking

Researchers from UC San Diego and the University of Washington have demonstrated the ability to compromise a modern automobile and assert control over critical functions of the vehicle using the government-mandated On-Board Diagnostics (OBD-II) port, which "is under the dash in virtually all modern vehicles and provides direct and standard access to internal automotive networks."

They implemented a CAN bus analyzer called CarShark and then fuzzed the protocols to find additional attack vectors, with significant results.
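For a rough idea of what fuzzing a CAN bus looks like, here is a generic sketch (not CarShark and not the authors' code) that sprays random payloads at random arbitration IDs. It assumes the python-can package, a Linux SocketCAN interface named "can0", and a bench harness rather than a live vehicle:

import random
import can

# Open the (assumed) SocketCAN interface
bus = can.interface.Bus(channel="can0", bustype="socketcan")

for _ in range(1000):
    # Random 11-bit standard ID with a random 8-byte payload
    msg = can.Message(
        arbitration_id=random.randrange(0x800),
        data=bytes(random.randrange(256) for _ in range(8)),
        is_extended_id=False,
    )
    bus.send(msg)
    print(f"sent id=0x{msg.arbitration_id:03X} data={msg.data.hex()}")

In practice you would pair this with a logger watching for state changes on the vehicle (or bench ECU) to tie a given frame to an observed effect.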

The paper:

Experimental Security Analysis of a Modern Automobile (Or here)
K. Koscher, A. Czeskis, F. Roesner, S. Patel, T. Kohno, S. Checkoway, D. McCoy, B. Kantor, D. Anderson, H. Shacham, S. Savage. The IEEE Symposium on Security and Privacy, Oakland, CA, May 16-19, 2010.

They also posted a Frequently Asked Questions page about the paper.

While they used the OBD-II port (implying physical access was required), there are numerous wireless interfaces (such as CANRF) on board for entertainment, remote diagnostics, and other features that connect to the internal network. These include Bluetooth, Wi-Fi, custom RF interfaces such as those for tire-pressure monitoring, GM's OnStar system, and one-way interfaces such as radio (particularly HD Radio) and Sirius, among others.

Some web posts on the topic (from Stephen Northcutt):