Today Secretary of State Clinton went after China for its network censorship:
http://www.cbsnews.com/stories/2010/01/21/ap/tech/main6123918.shtml
However, as I see it, the issue of real significance here isn't China's censorship. The real significance lies in the reported "attacks" on Google and other "unnamed" companies. I'm not referring to illegal access to mail accounts. I'm referring to the explicit theft of intellectual property in the form of source code:
http://www.wired.com/threatlevel/2010/01/google-hack-attack/
In China, the coupling between the government and leading companies in different industries is extremely strong. It can be hard to distinguish where a company stops and the government begins when it comes to such industry players as Baidu, Huawei, and China Telecom.
It is reasonable to suspect, and to investigate, whether the aggressive theft of source code from US companies is being actively supported, or even led, by the Chinese government. At the very least, it appears that the Chinese government tolerates such operations and the reuse of the stolen software by private industry.
In an age when information and intellectual property are the coin of the realm, does government-sanctioned intellectual property theft constitute not just a crime, but verge on an act of war?
These kinds of acts should be investigated thoroughly by the government. Regardless of ultimate responsibility, we need a strong, overt response from the US government. The message, backed by strong actions, must be clear: this kind of attack will not be tolerated and will be prosecuted.
A specific US response needs to include a product watch program to monitor for the use of stolen software, followed by vigorous prosecution of the illegal use of stolen technology through available legal, diplomatic, and trade channels. Reused source code will leave significant bodies of unique, identifiable binary code in the products that incorporate it. This is an area where private industry has far too little power to fight back effectively, though it could play a key role in the monitoring program.
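As a rough illustration of how the monitoring side could work, the sketch below is purely hypothetical, not a description of any existing watch program: it fingerprints a reference binary by hashing fixed-size chunks and counts how many of those chunks reappear in a suspect product binary. Real binary similarity analysis has to cope with recompilation, optimization, and deliberate obfuscation, so a sketch like this only catches byte-for-byte reuse, but the underlying idea of matching identifiable code fragments is the same.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Paths;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;
import java.util.HashSet;
import java.util.Set;

// Hypothetical sketch: flag possible reuse of code from a reference binary
// inside a suspect product binary by comparing hashes of fixed-size chunks.
// A real scanner would use sliding/rolling windows and similarity measures
// that tolerate recompilation; this only catches exact, aligned reuse.
public class BinaryReuseScan {
    private static final int CHUNK = 64; // bytes per fingerprinted chunk

    private static Set<String> fingerprints(byte[] data) throws NoSuchAlgorithmException {
        MessageDigest sha = MessageDigest.getInstance("SHA-256");
        Set<String> prints = new HashSet<>();
        for (int i = 0; i + CHUNK <= data.length; i += CHUNK) {
            sha.update(data, i, CHUNK);
            prints.add(Base64.getEncoder().encodeToString(sha.digest())); // digest() resets the hasher
        }
        return prints;
    }

    public static void main(String[] args) throws IOException, NoSuchAlgorithmException {
        byte[] reference = Files.readAllBytes(Paths.get(args[0])); // binary built from the original source
        byte[] suspect   = Files.readAllBytes(Paths.get(args[1])); // product under review
        Set<String> refPrints = fingerprints(reference);
        long matches = fingerprints(suspect).stream().filter(refPrints::contains).count();
        System.out.println("Matching " + CHUNK + "-byte chunks: " + matches);
    }
}
```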
I acknowledge private industry's accountability for failing to prevent such theft. We in the software industry can and must make deeper investments in the security systems around our core asset of value, our source code. DLP technologies, encryption technologies, strong multi-factor authentication for source access, and other solutions are available.
China's censorship is an important issue. That some group from China is actively stealing US company technology out from under our nose is an extremely important issue as well, and needs equal attention and even more governmental action.
At Arxan, we provide technologies to help protect software intellectual property through protection of the binary code with what we call "guards". We provide this technology in both military/classified forms to the DoD and DoD contractors, and in commercial form to commercial customers. However, to protect the source code of software from theft through systemic security holes, different measures are needed. Stronger source code security measures need to be deployed by private industry. The US government must speak out and lead in efforts to identify and prosecute those responsible and those who attempt to take advantage of such theft.
Thursday, January 21, 2010
Monday, January 11, 2010
Secure Software Marketplaces
The news today of a trojan'd application for Android phones (http://www.sophos.com/blogs/gc/g/2010/01/11/banking-malware-android-marketplace) is a fascinating, potentially very significant, and not altogether unexpected development in the smartphone wars.
Simply put, if the consumer marketplace develops a grounded fear of the software available for Android phones, the predictions of Android phone growth may prove vastly inflated.
Whether we like it or not (and some don't, preferring a phone-browser-centric world), ubiquitous phone apps are the "killer app" for smart phones, at least for the moment. This single spot of bad news for Android can quickly become a huge differentiator for Apple, with its controlled iTunes store of safe apps for the iPhone. Similarly, it points to an interesting opportunity in the business ecology: who is going to offer a vetted app store for Android phones, with appropriate software security reviews on the inbound side and guarantees on the outbound side? Without such a market service, I suspect that hackers will quickly ruin the unregulated marketplace for Android apps.
Secure 'droid app store anyone? Anyone?
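As a sketch of what the inbound review side of such a vetted store might include, the code below is purely illustrative (the trusted-publisher list is an invented assumption): it verifies the signatures inside an Android package, which is structurally a signed JAR, and checks that the signer is a known publisher. A real vetting pipeline would go much further, with permission review and static and dynamic analysis of app behavior.

```java
import java.io.IOException;
import java.io.InputStream;
import java.security.cert.Certificate;
import java.security.cert.X509Certificate;
import java.util.Collections;
import java.util.Enumeration;
import java.util.Set;
import java.util.jar.JarEntry;
import java.util.jar.JarFile;

// Illustrative sketch only: check that every content entry in an APK
// (structurally a signed JAR) verifies, and that the signer is on a
// trusted-publisher list. A real vetting pipeline would add permission
// review and static/dynamic analysis of app behavior.
public class ApkSignatureCheck {
    // Hypothetical whitelist of publisher certificate subjects (RFC 2253 form).
    private static final Set<String> TRUSTED_SUBJECTS =
            Collections.singleton("CN=Example Publisher,O=Example Corp");

    public static boolean isVetted(String apkPath) throws IOException {
        try (JarFile apk = new JarFile(apkPath, true)) { // true => verify signatures while reading
            byte[] buf = new byte[8192];
            Enumeration<JarEntry> entries = apk.entries();
            while (entries.hasMoreElements()) {
                JarEntry entry = entries.nextElement();
                if (entry.isDirectory() || entry.getName().startsWith("META-INF/")) {
                    continue; // directories and signature metadata are not themselves signed
                }
                // An entry must be fully read before its certificates become available.
                try (InputStream in = apk.getInputStream(entry)) {
                    while (in.read(buf) != -1) { /* reading triggers verification */ }
                }
                Certificate[] certs = entry.getCertificates();
                if (certs == null || certs.length == 0) {
                    return false; // unsigned content
                }
                String signer = ((X509Certificate) certs[0]).getSubjectX500Principal().getName();
                if (!TRUSTED_SUBJECTS.contains(signer)) {
                    return false; // signed, but not by a publisher we recognize
                }
            }
            return true;
        } catch (SecurityException tamperedOrBadSignature) {
            return false; // a signature failed to verify
        }
    }
}
```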
Monday, November 9, 2009
Security in the Cloud
Cloud computing is one of the "big new things" in commercial computing today. The promises of cloud computing are broad and deep: lowered capital costs, lowered operational costs, ease of scale, broad accessibility, high availability, and more.
And then there's security. It's the usual follow-on question after hearing about all the benefits: "Yes, great, and... what about security?"
The simple truth is that cloud computing carries with it each and every security risk that already exists in your commercial computing environment, and unfortunately adds significantly increased risks on top of them.
Why is this so? Simply because, at the highest level, there is little structural change in shifting elements of your computing infrastructure from "here" (inside your corporate data center) to "there" (inside an external vendor's corporate data center). The same security controls you needed (and in many cases didn't have) are needed in your cloud provider's environment (and in many cases they don't have them), and the same fundamental attack vectors and risks are present.
As we drill down into the details, however, it becomes clear that the situation is worse than this, for two fundamental reasons: one is shared infrastructure, the second is a general loss of control. Let's look at each of these.
The foundation of the cost-benefit premise of cloud computing rests on the leverage achieved through a shared computing infrastructure, with the cost benefits of scale and higher average utilization. But shared with whom? That's risk #1; you don't know who, and you can't control who. "Other companies, other users." Shared at what level? At all levels: shared storage, shared networking, shared routers, shared firewalls, right on down to operating your applications on the same physical hardware being used by other cloud clients (though always in a separate virtual machine instance).
So what's the risk of that? The risk is the ease of access to your data and application software. By definition, an environment where "others" are running their software and maintaining their data in the same physical environment that you are running your software and maintaining your data creates very substantial incremental security risk, because environmental access is the first step in any and every IP and data theft attack. If I'm "in" the general computing environment, and I can run arbitrary application software, I've got a launching pad for attacks on local data and applications.
Another element of shared infrastructure in cloud computing is the extension of the insider risk. Many of your own insiders will still have cloud environment access similar to the access they had when you were running inside your own data center. However, you've now added a whole new class of insiders: the cloud provider's employees! And unlike your own insider threats, where you can take active steps to reduce risk, with the cloud provider you have no controls and no influence. Relative to these unknown people, your applications and data might as well be considered "fully available," with all that that implies.
The second general area of risk is a loss of controls. This loss of control is across the board, starting at the level of physical access: when you operated in your own data center, you controlled physical access; with a cloud provider you don't. Logical access is no different; which people (administrators or otherwise) can access your databases and your applications? You have vague assurances from the cloud provider, but you have no direct control whatsoever.
This control issue extends out to more subtle yet extremely significant areas. Take the example of web application security risks. These are the most pernicious security risks in computing today, with SQL injection attacks alone (just one of many types of web application attacks) resulting in the theft of millions of credit card numbers. The most recent attempt to harden web applications is the deployment of so-called web application firewalls: networking appliances that monitor network traffic looking for evidence of a web application attack. These devices require a great deal of customization of their monitoring practices, effectively "tuning" the firewall to the specifics of the applications and operations being protected. Can such a solution be applied in your shift to a cloud computing environment? Generally no, due to the difficulty of assuring that the application firewall is "in the right place" relative to what is now a highly mobile set of applications within the large cloud infrastructure, and the need for your firewall rules to apply to your applications' data flows and your applications' data flows only.
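One defense that does travel with the application, wherever the cloud provider happens to run it, is simply refusing to build SQL from user input. The sketch below is a generic JDBC illustration (the table and column names are invented for the example): a parameterized query binds attacker-supplied input as data rather than as SQL text, without depending on a correctly positioned network appliance.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

// Illustration: a parameterized query binds user input as data, so a value
// like "' OR '1'='1" cannot alter the SQL statement. Table and column names
// here are invented for the example.
public class AccountLookup {
    public static String findEmail(Connection db, String userId) throws SQLException {
        String sql = "SELECT email FROM accounts WHERE user_id = ?";
        try (PreparedStatement stmt = db.prepareStatement(sql)) {
            stmt.setString(1, userId);               // bound as data, never parsed as SQL
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("email") : null;
            }
        }
    }
}
```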
Control issues cut right through all traditional required practices in commercial computing. Backup? Of course the cloud vendor provides backup! Can you test that it's actually occurring and the data is recoverable? There have already been major examples of commercial cloud providers losing customer data. It's a risk, and it's driven by your loss of control when shifting your computing practices to an external provider, and those risks are exacerbated by the shared infrastructure nature of that environment.
All of this said, cloud computing is here and it's expanding its footprint dramatically across the commercial computing landscape. Cost savings attract commercial usage like light attracts moths. The issues cited here are going to get incrementally addressed over time, as part of high-value cloud solutions.
The better news is that some fundamental solution technology exists today. The essence of security protection in a cloud environment is to take advantage of what you do control to implement security mechanisms to the level required by your business. The two critical control points are, simply put, your applications and your data.
Data security solutions have been increasingly developed and deployed over the last ten years, and these solutions can generally be coupled directly into the cloud hosting environment. Any computing solution migrating to the cloud must seriously consider the addition of such security technologies.
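A concrete example of using what you still control: encrypt data in your own environment, under keys you hold, before it ever reaches the provider. The sketch below is illustrative only (key management, the hard part in practice, is deliberately omitted); it uses standard AES-GCM so that what sits in the provider's storage is opaque both to the provider's staff and to any co-tenant who reaches the underlying infrastructure.

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

// Illustrative sketch: encrypt data under a key you control before handing
// it to a cloud storage service. Key storage and rotation are omitted.
public class ClientSideEncryption {
    private static final int GCM_TAG_BITS = 128;
    private static final int IV_BYTES = 12;

    public static byte[] encrypt(SecretKey key, byte[] plaintext) throws Exception {
        byte[] iv = new byte[IV_BYTES];
        new SecureRandom().nextBytes(iv);
        Cipher cipher = Cipher.getInstance("AES/GCM/NoPadding");
        cipher.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(GCM_TAG_BITS, iv));
        byte[] ciphertext = cipher.doFinal(plaintext);
        // Prepend the IV so the record is self-contained for later decryption.
        byte[] out = new byte[iv.length + ciphertext.length];
        System.arraycopy(iv, 0, out, 0, iv.length);
        System.arraycopy(ciphertext, 0, out, iv.length, ciphertext.length);
        return out;
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator gen = KeyGenerator.getInstance("AES");
        gen.init(256);
        SecretKey key = gen.generateKey(); // in practice, held in your own key store
        byte[] record = encrypt(key, "customer data".getBytes(StandardCharsets.UTF_8));
        System.out.println("Encrypted record length: " + record.length);
        // The encrypted record, not the plaintext, is what gets uploaded to the provider.
    }
}
```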
Application internal security solutions are a relatively new technology area. This kind of technology derives from military grade technology utilized to protect critical military technology assets from reverse engineering and tampering. This technology is now available for and being applied to commercial software.
Application internal security technology puts security functions directly into the application software. These functions start with obscuring the code flow, the instruction sequencing, and even the unencrypted presence of critical blocks of code, in order to protect against reverse engineering and, through reversing, the identification of high-value components and of critical points for effective tampering. They extend to dynamic monitoring of code correctness, covering both the actual instructions and the code's dynamic behavior. And such security units can, internally within the application, monitor data flows to detect and respond to evidence of web application attacks.
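As a greatly simplified illustration of the dynamic monitoring idea (my own sketch, not a description of Arxan's guard technology, which works on binary code and is considerably more involved), the snippet below has an application hash its own class bytes at runtime and compare the result against a digest recorded at build time; a mismatch means the code has been modified since it shipped.

```java
import java.io.InputStream;
import java.security.MessageDigest;

// Greatly simplified illustration of a tamper-detection "guard": hash this
// class's own bytecode at runtime and compare it against a digest recorded
// at build time. Not any vendor's actual product; real guards work on binary
// code, protect each other, and take defensive action rather than reporting.
public class IntegrityGuard {
    // Hypothetical value: the SHA-256 of IntegrityGuard.class captured at build time.
    private static final String EXPECTED_DIGEST = "<recorded-at-build-time>";

    public static boolean selfCheck() {
        try (InputStream in = IntegrityGuard.class.getResourceAsStream("/IntegrityGuard.class")) {
            MessageDigest sha = MessageDigest.getInstance("SHA-256");
            byte[] buf = new byte[8192];
            int n;
            while ((n = in.read(buf)) != -1) {
                sha.update(buf, 0, n);
            }
            StringBuilder hex = new StringBuilder();
            for (byte b : sha.digest()) {
                hex.append(String.format("%02x", b));
            }
            return hex.toString().equals(EXPECTED_DIGEST);
        } catch (Exception e) {
            return false; // treat any failure to verify as suspicious
        }
    }

    public static void main(String[] args) {
        if (!selfCheck()) {
            System.err.println("Integrity check failed: code may have been tampered with.");
        }
    }
}
```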
The tremendous benefit of application internal security technology is its complete independence from location considerations. An internally secured application carries its security properties with it, wherever it goes: in your data center, on your employees' laptops and cellphones, or in an external provider's cloud computing environment. Such technology is immune to network topology changes, and protects the application in both private and shared infrastructures.
Cloud computing is still in its infancy, and it's reasonable to say that it is one of several fundamental change agents transforming our information world at a faster rate than ever before. While cloud computing has dramatic benefits and is highly attractive as a computing environment solution, it must be approached extremely cautiously from a security perspective. The shared nature of the cloud and the loss of controls that occurs when utilizing the cloud dramatically increase your security risk footprint. The best and most immediately available technologies for dealing with these two factors are application internal security technologies and data security technologies.
Tuesday, October 20, 2009
The Democratization of Software
It's a strange new software world!
For those of us old enough to remember things like mainframes (my first ever computer programs ran on an IBM 360 model E22 at a local community college!), minicomputers (DEC PDPs, HP 1000s, Data General Novas, etc.), and then the world-changing arrival of the "PC" in 1983, the world of software was generally a "dark art". Very, very few people knew what software was, and the population of those who actually wrote software was even smaller.
I personally learned my programming chops first in that same community college's computer center, writing COBOL code to schedule the lazy counselors into appointments with students (a brilliant idea of a new school VP administrator promoted out of the computer center, who knew that since the kid doing the programming was the son of a member of the board of trustees, they couldn't effectively fight it!). Then it was on to writing assembly code for a PDP-11/35 running a customized version of RT-11, to drive and test a custom data acquisition board built by a small shop (Acroamatics) for the Navy. Then on to kernel-level operating system development at HP, working again in assembly language on kernel code for the HP 1000 and the RTE (IV, VI, and A) operating systems.
In those days, the late '70s and the '80s, software was generally incomprehensible to the masses. Literally. People just had no clue. By the early '90s that was changing pretty fast; people "knew about" software, but for the most part in the same way they "knew about" automobile engines. That is, they knew software was there, was important, and "made the computer go", but not much more.
This started changing in a major way with the development of the web and web site programming, starting with HTML (arguably not a programming language, but let's not quibble). Suddenly a lot of "non-technical" people (non-computer-scientists) were "programming". And as the ability to link actual run-time software into web pages (PHP, Perl, JavaScript, etc.) became prevalent, this same group advanced into what is definitely the world of writing procedural software.
Now we have the iPhone and an open development environment for it. We are witnessing another huge shift in the breadth of activity in the creation of software, driven by this new ubiquitous platform. The opportunity to sell a few hundred thousand copies of a cool little application for a buck apiece suddenly brings the opportunity of "software for profit" right into the mainstream...and the mainstream is responding. We are seeing an explosion of a new cottage industry right before our eyes. I don't know the actual number of downloads of the Objective-C development environment for the iPhone, but I'm certain it is staggering. The volume of applications available for the iPhone from this cottage industry is certainly staggering, and considering what a small percentage of the actual development activity out there it represents, we have to acknowledge that a seismic expansion of software development is underway.
Again, here's the point: for the FIRST time ever, we are experiencing a "grand conjunction" of a widely popular platform with broad computing and I/O capabilities, a freely available development environment, an effective channel with strong demand pull, and a worldwide population that, through web programming, already has some awareness, skill, and inclination. And voilà...an instant, massive cottage software industry.
What are the longer term impacts of these "force vectors" going to be? I have several projections.
First, in the world of personal computing devices (which is how I think of the iPhone, by the way; the "phone" part I consider merely one of its many I/O features), a free and open development platform is going to be a must. A single company can't compete against the forces of "solution" innovation and availability that Apple has shown can be unleashed.
Second, this "democratization" of software development isn't going to stop. SW skills are expanding across the population at an unprecedented rate, and that growth is going to continue and even accelerate. What exactly the impact of that will be is hard to predict, but I do believe as the world increasingly is driven by and supported by software, this is an enabler for the world's economy.
Third, the world of software cracking (finding technological ways to run commercial software for free or at a black-market price) is going to continue to be a huge technology area and force in the industry. You can't discuss iPhone apps too long with friends and colleagues before hearing about the ability to "unlock" all the apps available "for free". There is a dark side to this democratization, a black market side. The technology race to fight those black market forces is just getting going in this particular market. Of course my company, Arxan Technologies, has been working for years with more serious users of such technologies, namely the US Department of Defense. These technologies are becoming more prevalent in the mass market consumer software space, helping to protect the product software that your son, your sister, and maybe even YOU wrote and published yourself!
Monday, August 3, 2009
Code protection is critical in a web 2.0 world!
Neil MacDonald of Gartner blogged on the differences between byte code and binary code analysis:
http://blogs.gartner.com/neil_macdonald/2009/07/24/byte-code-analysis-is-not-the-same-as-binary-analysis/
His points are important at a deeper level as they relate to the risk of reverse engineering and tampering. Specifically, byte code (.NET and Java) is almost trivially reverse engineered, and fairly easily tampered with using available tools...unless active steps are taken to address the risk.
Byte code representations of programs contain sufficient information to allow a complete inverse compilation back to source code. To address this problem, use of a .NET or Java obfuscator is necessary. The best-in-class obfuscators can perform a host of transformations, with minimal to no impact on performance, that raise very large hurdles for the would-be thief. The transformations include general code encryption, code restructuring to create complexity that is not understood by inverse compilers (and is difficult for human analysis as well), string encryption so that variable and static data names become unintelligible, deletion of metadata that describes program attributes, and even insertion of code for dynamic detection of evidence of tampering.
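To make one of these transformations concrete, here is a hand-written sketch of the string encryption idea (my own illustration, not the output of any particular obfuscator): the build step replaces a sensitive literal with an encoded constant, and a small injected decoder reconstructs the value at runtime, so the plaintext never appears in the byte code where a decompiler or a simple strings dump would expose it.

```java
import java.nio.charset.StandardCharsets;

// Hand-written illustration of the string-encryption transformation an
// obfuscator might apply; real tools generate stronger decoders and do the
// encoding automatically at build time. In an obfuscated build, only the
// encoded constant and the decoder would ship.
public class StringEncryptionDemo {
    private static final byte KEY = 0x5A; // hypothetical per-build key

    // Build-time step (shown here for completeness).
    static byte[] encode(String s) {
        byte[] b = s.getBytes(StandardCharsets.UTF_8);
        for (int i = 0; i < b.length; i++) b[i] ^= KEY;
        return b;
    }

    // Runtime decoder the obfuscator would inject into the protected program.
    static String decode(byte[] encoded) {
        byte[] b = encoded.clone();
        for (int i = 0; i < b.length; i++) b[i] ^= KEY;
        return new String(b, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] encodedSecret = encode("license.key"); // would be a precomputed constant in a real build
        System.out.println("Recovered at runtime: " + decode(encodedSecret));
    }
}
```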
This kind of code protection becomes paramount in a Web 2.0 world where significant application components are being deployed to and executed by customers. Additionally, this kind of code protection is critical in a highly mobile world where applications and data are frequently on the move with employees.