Regardless of industry, enterprises are universally tasked with protecting sensitive information – and every enterprise has data-at-rest that needs protection somewhere within its environment.
- Compliance and regulation drive some of the need for data-at-rest protection
- Risks from potential data breaches of consumer or customer information drive other requirements
- The need to protect sensitive intellectual property may be primary for other organizations
- Contractual requirements from customers might also drive the need to protect information
These needs acquire an additional level of complication when organizations are driven to take advantage of the business flexibility and cost efficiency of the rapidly growing set of services available under the label of “cloud”. And we’re increasingly going to be placing our workloads there – one estimate is that by 2017 two-thirds of all workloads will be processed in the cloud and 1.4 zettabytes of data will be flowing over global networks.
Admittedly, the problem of protecting your data once it leaves your premises for a cloud environment isn’t suited to a one-size-fits-all approach – environment types and vendor capabilities vary too widely. Cloud environments range from SaaS implementations, where you strictly consume services and have little say in how the security of infrastructure is implemented, to PaaS and IaaS environments, where some capabilities and responsibilities rest with your cloud provider and others rest with you as the consumer of the cloud resource.
In the list below, you’ll find that some points apply to all environments, while others depend on how much is under your control – SaaS being the most limited and IaaS the most under your control. Regardless of the circumstance, if you apply these points to protect your data-at-rest in the cloud, you’ll meet or exceed many compliance requirements for protecting data-at-rest, and greatly reduce the attack surface available to those wishing to steal or otherwise compromise your information.
These points also shouldn’t break your budget. At Vormetric, we’ve seen customers do a simple risk analysis: estimate the likelihood of the organization becoming a target, factor in the chance of being penetrated by an attack, and build a financial model for the potential exposure from non-compliance or loss of protected information. In most cases, customers found that a shift in their IT spending priorities solved the funding problem.
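To make that concrete, here’s a minimal sketch of that kind of back-of-the-envelope model. All figures are hypothetical placeholders – substitute your own estimates:

```python
# Simple expected-loss model of the kind described above.
# Every figure here is a hypothetical placeholder, not real customer data.

p_targeted = 0.60             # assumed likelihood your organization is targeted this year
p_breach_if_targeted = 0.25   # assumed chance a targeted attack succeeds
exposure_usd = 5_000_000      # assumed exposure: fines, remediation, lost business

expected_annual_loss = p_targeted * p_breach_if_targeted * exposure_usd
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")  # $750,000 here

# If the data-at-rest controls below cost less than this figure,
# the shift in IT spending priorities largely funds itself.
```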
- Have a minimum level of protection you apply to all cloud implementations

If this point doesn’t make sense to you immediately, consider these questions – some of them will apply to you:
- At the rate your organization creates data (and globally, data is growing at an awesome rate that escalates every year), do you really know where all of your sensitive and legally protected data is located?
- When was the last time you checked?
- When was the last time you found a business unit using a cloud service or offering without your knowledge?
- Are you absolutely sure that application developers aren’t using old production data (i.e. sensitive, protected information) in your development and test environments?
- Are you using scalable infrastructure solutions behind your production applications that spin up additional capacity on demand? And where does that capacity exist?
- How about your DR scenarios? Are they in the cloud? And is the data there protected?
The fact is that in any sizeable enterprise, it is nearly impossible to know accurately where all of your sensitive information is located. On premises, you have some additional peace of mind because of the protections around your perimeter (although perimeter defenses are less effective now than at any time in the past), but cloud environments amplify the problem.
It is also true that not all of the protections below are appropriate for all workloads – some require that you know you have sensitive data to protect. But that doesn’t change the approach: choose a base, minimum level of data-at-rest protection that you always apply, and extended protection for clearly sensitive information like healthcare records, credit card information, personnel records, and financial data.
How can you make this work? Become your organization’s “cloud broker” – make sure business units have access to the resources they’ll need without going behind your back (assuming you are part of an IT organization), and then rigorously enforce standards for protecting data. Typical methods include standard builds for IaaS environments that include required data security, resource gateways that protect information in SaaS environments, and other data security infrastructure for every cloud resource you certify and make available. Do it quickly, before your data gets away.
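As one illustration of rigorously enforcing those standards, here’s a hedged sketch of a certification gate you might run before approving an IaaS image – the manifest format and component names are hypothetical:

```python
# Hypothetical certification gate: refuse to approve an IaaS image unless its
# build manifest includes every required data-security component.
REQUIRED_COMPONENTS = {"disk-encryption-agent", "key-management-client", "access-audit-logger"}

def is_certifiable(manifest: dict) -> bool:
    """Return True only if the image includes all required data-security components."""
    missing = REQUIRED_COMPONENTS - set(manifest.get("components", []))
    if missing:
        print(f"Refusing to certify {manifest['name']}: missing {sorted(missing)}")
        return False
    return True

# A build missing the audit logger and key-management client fails the gate:
is_certifiable({"name": "web-tier-v3", "components": ["disk-encryption-agent"]})
```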
- Get it there safely

For many organizations, a “virtual private cloud” or another solution that includes a VPN to create an encrypted tunnel between your local network and your cloud resource will answer this need. But for more ad hoc usage, encrypt the data before you move it to the cloud environment so that it is protected in transit. Cloud gateways to storage solutions like Dropbox and Post-it meet some additional needs, but you’ll need an approach that ensures data-at-rest from your environment that makes its way to the cloud is protected for each scenario that you support.
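For the ad hoc case, here’s a minimal sketch of encrypting a file locally before it ever leaves your premises, using Python’s third-party cryptography package (pip install cryptography); the file name is just an example:

```python
# Encrypt locally, then upload only the ciphertext; the provider never sees
# the plaintext or the key.
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # keep this key on-premises, never alongside the file
fernet = Fernet(key)

with open("customer-records.csv", "rb") as f:      # example file name
    ciphertext = fernet.encrypt(f.read())

with open("customer-records.csv.enc", "wb") as f:  # this is what goes to the cloud
    f.write(ciphertext)
```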
- Encrypt data-at-rest

This might sound much like the last point, but it isn’t. Encryption is your front-line defense for defending data-at-rest. It limits access to those with the right keys – locking out anyone who doesn’t have them. It also meets a bevy of compliance requirements, removes any worry about retiring disks, and voids the threat of physical compromise of the cloud environment (even if someone walks away with the drive that holds your data at the cloud provider, they won’t see a thing). Encryption by itself is not the whole answer – management of keys, policies for data access, and the other items covered below are also required. Once you’ve addressed those items in combination with encryption, the attack surface available to “bad guys” is radically reduced, and the amount of data lost in an incident can be minimized.

There is one threat, though, that encryption by itself won’t address: in 20 years or so, computing power – at the NSA and elsewhere – will likely have grown to the point where breaking into data protected with present-day encryption standards is relatively easy. One thought to help bring this home: in the mid-’90s I was a product manager for Solaris at Sun Microsystems, and 64-bit encryption was what we included with the OS distribution. Today, even a desktop computer with the right tools has a good chance of decrypting data protected with that standard in a reasonable amount of time.
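Some rough arithmetic makes the point about key length; the guess rate below is a deliberately generous assumption for a well-funded attacker, purely for illustration:

```python
# Back-of-the-envelope: time to exhaust a keyspace at an assumed guess rate.
GUESSES_PER_SECOND = 1e12          # assumed rate for a large cracking cluster
SECONDS_PER_YEAR = 3600 * 24 * 365

def years_to_exhaust(key_bits: int) -> float:
    return (2 ** key_bits) / GUESSES_PER_SECOND / SECONDS_PER_YEAR

print(f"64-bit keyspace:  {years_to_exhaust(64):.2f} years")   # about 0.58 years
print(f"256-bit keyspace: {years_to_exhaust(256):.1e} years")  # ~3.7e57 years
```

At that assumed rate, a 64-bit keyspace falls in months, while a 256-bit keyspace remains far beyond reach – which is why today’s standards buy time, but aging standards don’t.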
- Keep your own keys

Encryption key management is the next part of the story. Keep and manage your own keys, and do it off the cloud provider’s premises (see the sketch after this list). The combination of encryption plus keeping your own keys protects you from:
- A third-party compromise of the cloud provider’s environment
- The cloud provider having access to your information – even cloud administrators or compromised cloud admin accounts can’t see your data
- Legal access to your data without your knowledge – a cloud provider that manages your keys could be compelled to give them up, without telling you, if a local jurisdiction requires it
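Here’s the sketch mentioned above: a minimal illustration of the envelope-encryption pattern behind keeping your own keys, again using the cryptography package, with all names illustrative:

```python
# "Keep your own keys": encrypt each object with a per-object data key, then
# wrap that data key with a master key that never leaves your premises.
from cryptography.fernet import Fernet

master_key = Fernet.generate_key()  # in practice: held in your on-prem key manager or HSM
master = Fernet(master_key)

data_key = Fernet.generate_key()    # per-object key, used to encrypt the cloud payload
ciphertext = Fernet(data_key).encrypt(b"sensitive payload")

# Store the *wrapped* data key next to the ciphertext in the cloud; only a
# holder of the on-premises master key can unwrap it.
wrapped_data_key = master.encrypt(data_key)

# Retrieval: unwrap on-premises, then decrypt.
plaintext = Fernet(master.decrypt(wrapped_data_key)).decrypt(ciphertext)
assert plaintext == b"sensitive payload"
```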
- Tie data access policies to your Directory Services Infrastructure and groups

Access control to encrypted data from within your organization is the next point – again, this meets many compliance and regulatory requirements. Tying access to your directory services infrastructure connects it neatly to your on-boarding and off-boarding procedures: if you’ve implemented these correctly, new employees get access to what they need automatically, and retired accounts are immediately denied access to sensitive information. Data access policies need a separate management chain and infrastructure that “syncs” to your directory services implementation. Policies can then define which groups and applications have access to which data, under what circumstances, and at what times.
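As a hedged sketch of what that tie-in can look like, the snippet below resolves a user’s directory groups via the third-party ldap3 package before allowing access; the hostname, DNs, group names, and policy mapping are all hypothetical, and error handling is omitted:

```python
# Check a user's AD/LDAP group membership before granting access to a data class.
from ldap3 import Server, Connection

# Hypothetical policy: which directory groups may read which classes of data.
POLICY = {"cn=finance-analysts,ou=groups,dc=example,dc=com": {"financial-data"}}

def can_read(username: str, data_class: str) -> bool:
    conn = Connection(Server("ldap.example.com"),
                      user="cn=svc-datasec,dc=example,dc=com",
                      password="...",   # placeholder service credential
                      auto_bind=True)
    conn.search("dc=example,dc=com",
                f"(&(objectClass=user)(sAMAccountName={username}))",
                attributes=["memberOf"])
    if not conn.entries:
        return False                    # unknown or off-boarded accounts get nothing
    groups = conn.entries[0].memberOf.values
    return any(data_class in POLICY.get(g, set()) for g in groups)
```

Because access is resolved from the directory at decision time, disabling an account during off-boarding revokes its data access with no extra step.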
- Monitor who, what, where, when and how data is accessed

From within databases and applications, as well as at the OS/file-system level, monitor and collect information on access to sensitive information. Watch that data for unauthorized access attempts that can tell you an attack on your data is underway – and alert where appropriate.
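A minimal sketch of that monitor-and-alert loop follows; the record fields are illustrative, and in production you’d stream real audit events to your SIEM rather than scanning a list:

```python
# Flag denied access attempts against sensitive locations.
SENSITIVE_PREFIXES = ("/data/cardholder/", "/data/hr/")   # example sensitive paths

events = [  # illustrative access records; real ones come from DB/OS audit logs
    {"user": "app_svc",   "path": "/data/cardholder/batch.dat", "result": "allowed"},
    {"user": "tmp_admin", "path": "/data/hr/payroll.db",        "result": "denied"},
]

for e in events:
    if e["path"].startswith(SENSITIVE_PREFIXES) and e["result"] == "denied":
        print(f"ALERT: unauthorized access attempt by {e['user']} on {e['path']}")
```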
- Analyze monitored data access patterns for anomalies

An account authorized for access to sensitive data that starts behaving in an unusual way may well indicate an attack underway. As an example, an account that normally accesses data “lightly” during the week, but is suddenly pulling down large amounts of data on a Sunday, should throw an alert every time.
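That Sunday scenario translates directly into a simple statistical check; the history and threshold below are illustrative assumptions:

```python
# Compare today's read volume for an account against its baseline for the same
# weekday; alert if it exceeds the mean by more than three standard deviations.
from statistics import mean, stdev

sunday_history = [1_200, 900, 1_500, 1_100, 1_300]  # hypothetical past Sunday reads (KB)
today = 250_000                                      # today's reads (KB)

baseline, spread = mean(sunday_history), stdev(sunday_history)
if today > baseline + 3 * spread:
    print(f"ALERT: account read {today:,} KB vs. a ~{baseline:,.0f} KB Sunday baseline")
```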
- Have mitigation plans in place for when a threat is detected

There’s been much talk about automatically denying potentially compromised accounts access to sensitive data, but few organizations have been willing to implement this internally. When moving to the cloud, though, with the heightened risk of outside compromise and access, it should be strongly considered. At the least, a plan needs to be in place to respond to unauthorized access attempts and to anomalous access patterns. It should be well thought out, and based on the risk profile appropriate for the data and for your organization.
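One way to make such a plan concrete is a response table agreed on in advance, tiered by the data’s risk profile; the tiers and actions below are purely illustrative:

```python
# Pre-agreed responses by data risk tier: auto-deny only where the data
# justifies it, otherwise fall back to human review.
RESPONSES = {"high": "revoke_access", "medium": "require_mfa", "low": "notify_analyst"}

def respond(account: str, data_tier: str) -> str:
    action = RESPONSES.get(data_tier, "notify_analyst")
    print(f"{account}: executing '{action}' per the plan for {data_tier}-risk data")
    return action

respond("tmp_admin", "high")   # a compromised account touching high-risk data
```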
- Protect your encryption management and access policy setting environment with the same monitoring you use for your data
As the cloud imparts additional risk, even to your security management environment, make sure that the security administrative accounts that set policy for access to data are monitored as well. An unusual change in policy access patterns by an admin may indicate the beginning of an attack that starts with compromised security administrative credentials. Or you may want to alert “always” when a security admin grants “root” access to protected data stored in the cloud – a vector that could allow an attacker to compromise your data without having to take it out through your perimeter.
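As a sketch of that “alert always” rule, assuming hypothetical policy-change event fields:

```python
# Fire unconditionally whenever a security admin grants root access to
# protected cloud data; field names are illustrative.
def check_policy_change(event: dict) -> None:
    if event.get("actor_role") == "security_admin" and event.get("grants_root_access"):
        print(f"ALERT (always): {event['actor']} granted root access to {event['target']}")

check_policy_change({"actor": "sec_admin_2", "actor_role": "security_admin",
                     "grants_root_access": True, "target": "/cloud/pci-vault"})
```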