Wednesday, March 25, 2009

CERIAS Security Conference - Purdue

I attended sessions at the CERIAS security conference at Purdue today, and participated as a panelist in a discussion of the recent report "Unsecured Economies", performed by Dr. Karthik Kannan, Dr. Jackie Rees and Dr. Eugene Spafford, and funded by McAfee. (The report can be accessed through this request page: http://resources.mcafee.com/content/NAUnsecuredEconomiesReport).

The report is based on a study of 1000 senior IT decision makers across 800 companies, the distribution of their IP and data assets around the world, and the IP and data theft they have experienced in the last year. The numbers are rather staggering: $4.6M in AVERAGE losses per company in a single recent year.

Some interesting questions were asked during the panel, including "how were the values of the losses assessed?". Indeed, "how we count" here is a tricky question. At Arxan, while we could look, for example, at the direct cost of any piracy of our software ("whoops, that would/could have been a customer, so that's a loss of $x of income/revenue"), the larger costs lie in how such pirated s/w could be misused to compromise the company's value proposition, and in the resulting longer-term damage to the company valuation.

My opening statement, boiled down, amounted to the following: enterprises today utilize vastly distributed computing elements, with no well-defined perimeter, each of which maintains and/or processes company data and/or IP. Perimeter defenses are ineffective, and even when in place around concentrations of computing elements, they are too easily compromised through direct and indirect attack. Therefore, our security model must directly address the security of the fundamental data, the enterprise applications that process that data, and the keys that enable the legitimate usage of the data and applications.

This is where Arxan plays, and it represents Arxan's core vision. And the reality is that it's a journey and a quest, by both us and our customers, because it's an ongoing battle with the criminals who are always seeking to overcome our latest and greatest solutions and defenses.

A few other notes on the conference. Dr. Ron Ritchey of Booz Allen (also an adjunct professor at George Mason, teaching a course in secure software development) gave the keynote this morning, and focused on the question of how security flaws do or do not scale with the size and/or complexity of the code base. He had some fascinating data from the operating system world showing the find rate of security issues in OSes, particularly various Microsoft OSes, over time.

At one level (to me anyway) it's "obvious" that security issues scale with size and complexity. The questions are a bit more subtle than that: can security issues be taken out of a given code base over time, and can complexity management be applied to the continued development of, or addition to, that code base to keep aggregate security issues "constant" (or even on a downward-sloping trend line)? The most obvious driver, it seems to me, is the s/w lifecycle practices utilized in the enhancement/maintenance process itself. Additionally, usage levels are a critical factor in the resulting metrics. For example, Dr. Ritchey shared data showing nicely downtrending security find rates for NT starting around year 5 or 6 of deployment...but mightn't this be primarily a function of constantly decreasing usage vs. newer Windows versions, rather than any indication that NT was better or is being maintained "better"?

The other interesting data was on Vista as it compared with XP and other older MS OSes. The find rate curve for Vista for the first two years is dramatically sharper than it was for XP (which in turn was higher than for the previous version); in fact the increase in slope was, to me, rather alarming. There were only two data points, but the trend is clearly in the wrong direction by a long shot, and this for an OS where increased security was a primary business objective (or so I understood). Of course the code size and complexity level of Vista vs. XP is much larger/higher, so..."to be expected", but that's not a good answer for us the users, nor for the software industry in general, is it?
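The slope comparison here is just a trend-line fit over yearly find counts, and it's worth seeing why two data points make for a shaky one. The numbers below are purely hypothetical placeholders (the talk's actual data wasn't published with these notes); the sketch only shows the mechanics of fitting and comparing the slopes:

```python
# Sketch: comparing early-life vulnerability find-rate slopes for two OS
# releases. All find counts below are hypothetical, for illustration only.

def trend_slope(years, finds):
    """Least-squares slope: year-over-year change in finds per year."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(finds) / n
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(years, finds))
    sxx = sum((x - mean_x) ** 2 for x in years)
    return sxy / sxx

# Hypothetical finds per year since release (NOT the keynote's data).
xp_slope = trend_slope([1, 2, 3, 4], [30, 45, 55, 60])
vista_slope = trend_slope([1, 2], [60, 110])  # only two points, as in the talk

print(f"XP trend: +{xp_slope:.1f} finds/year")
print(f"Vista trend: +{vista_slope:.1f} finds/year")
# With only two points, the Vista "fit" is simply the line through them --
# one reason any conclusion drawn from so short a series stays tentative.
```

The two-point case makes the caveat concrete: the fitted slope is exact by construction, so it carries no information about whether the trend will hold in year three.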

Ciao for now,

-Kevin