News today that OCR vendor ABBYY has joined the long list of companies that have inadvertently left their databases or storage open to the Internet and suffered a data breach as a result.
While not the case for ABBYY, poor use of Amazon's S3 is often the root of the problem. This is, in part, by design: S3 deliberately permits public access for legitimate uses like static web hosting. The challenge is that there are many other S3 use cases where public access is neither required nor desirable. So prevalent is the issue that there are scripts to find open buckets.
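Those scanning scripts generally boil down to one check: does a bucket's ACL grant access to Amazon's global AllUsers or AuthenticatedUsers groups? Here is a minimal sketch of that check, assuming the grant structure that boto3's `get_bucket_acl` returns (the helper name `public_grants` is my own, not an AWS API):

```python
# Group URIs that make a grant world-readable (AllUsers) or open to any
# AWS account holder (AuthenticatedUsers) - both count as "public".
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def public_grants(acl):
    """Return the grants in a boto3-style bucket ACL that expose it publicly."""
    return [
        grant for grant in acl.get("Grants", [])
        if grant.get("Grantee", {}).get("Type") == "Group"
        and grant["Grantee"].get("URI") in PUBLIC_GROUPS
    ]
```

Feed it the `get_bucket_acl` response for each bucket in your account and anything it returns deserves a second look.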
So what to do? Clearly we should be in the habit of regularly running tools like Trusted Advisor that will identify these issues after the fact. But, as managers, engineers and architects we need to face up to the fact that building in the cloud isn’t the same as in the datacenter.
When it’s on premises, we get into the habit of thinking security is someone else’s job: “The DBAs will take care of that” or “I don’t really understand TCP/IP, but I figure the network guys will”. In the cloud, it’s all on you: everything is defined by configuration, and if you don’t understand those configurations you’ll leave things open.
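And those configurations are small enough to reason about directly. S3's Block Public Access feature, for instance, is just four boolean settings (the setting names below match AWS's real PublicAccessBlockConfiguration; the `missing_protections` helper is a hypothetical sketch):

```python
# The four settings in an S3 PublicAccessBlockConfiguration; all four
# should be True on any bucket that has no business being public.
REQUIRED = (
    "BlockPublicAcls",
    "IgnorePublicAcls",
    "BlockPublicPolicy",
    "RestrictPublicBuckets",
)

def missing_protections(config):
    """Return the public-access-block settings that are absent or disabled."""
    return [name for name in REQUIRED if not config.get(name, False)]
```

Run a check like this over every bucket's configuration and a non-empty result is your cue to ask why, rather than discovering the answer in a breach notification.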
The same freedom you enjoy (“Hey, I don’t have to wait for a server to be installed!”) can also be your downfall. Time to stop the silo thinking and take accountability for the whole application.