In the article below, Kevin Fogarty picks up cloud where he left it, with a colourful discussion of security challenges. It starts in the very early days of public cloud, when IT had legitimate concerns about visibility into cloud operations and access, and ends with some pretty compelling stats on questionable behaviour in the shadowy world of shadow IT, where IT has again lost control – this time to business users riding solo in the public corral. In both instances (and in the case of public cloud providers who optimize for resource efficiencies), the risk to enterprises is clear; however, Fogarty offers guidance on an alternative to continued risk that can help businesses protect themselves quickly – encryption. If you have ever wondered about different kinds of encryption, or about new encryption services that can secure data both at rest and in motion – the norm in the mobile/cloud era – read on! (ed.)
Corporate IT departments avoided the public cloud for years because they were worried they wouldn’t be able to see corporate data in the cloud, let alone protect it.
They were right, mostly. Cloud computing was new enough that the legions of vendors who had built vast libraries of software – software completely inexplicable even to most in IT – had not had time to develop versions of their tools for the cloud. Nothing would let a corporate NOC crew see how many virtual machines were active in a distant cloud, what kinds of workloads they were running, how close to capacity the physical servers were, what data was present, or whether the cloud held any data that didn't also exist inside the firewall (where it wouldn't be lost). Nor was there anything to provide visibility into active details: the names of employees who had accessed the data, when they'd done it and what they had done, where they'd logged in from, what device they'd used, whether their noses were itchy at the time, and what the temperature of the processors on the physical servers holding the virtual servers was at the moment users scratched that itch.
Despite going to work every day to fire up banks of monitors so sophisticated they would have made the NASA ground controllers who sent Apollo 17 to the moon and got it back again weep with bitter envy, a corporate network operations crew circa 2009 might be able to confirm that Amazon or Azure was online if they were willing to send a ping or read the piles of marketing spam in their email.
If they wanted to know if an employee had added a CRM app to the IT inventory by signing up for Salesforce – or traded a hot sales leads-list for a pile of unhip MP3s by using Dropbox to transfer music to the office and work to the house – the all-seeing gurus of the network operations center (NOC) crew had a better chance of finding out by hanging around the lunchroom hoping someone would mention it than by trying to detect the stream of alien data moving into the corporate infrastructure.
Five years later, 81 percent of end users admit to using unapproved cloud services for important work purposes, and the average corporation has 579 cloud-based applications poking through its firewall every work day, though the CIOs in charge think they have fewer than 10. Of all the apps used surreptitiously by employees of large companies today, 88.7 percent are not secure enough to be considered an “enterprise” rather than a “consumer” app, according to enterprise cloud monitoring and security firm Netskope.
On the bright side, IT people (who use cloud apps in the office at least as often as end users do) seem now to have committed to supporting end users’ use of the cloud, though they continue to be stumped about how to do it.
The shiny new, heavily clustered, entirely virtualized data centres that house public-cloud services – which are configured to provide on-demand self-service, rapid changes in capacity, measured resource allocation and service levels – often have as top priority the pooling and conservation of computing resources to reduce the need to buy more and more servers, space, power, cooling and real estate.
In contrast, enterprise data centres tend to be carefully and statically configured to deliver required levels of performance for especially critical apps, and sprinkled with a mix of next-generation technology, a larger pile of legacy systems, and the legacy-system gurus who mind them like aging tigers: stable enough when everything is normal, but prone to sudden and catastrophic violence when startled or disturbed.
The security and monitoring apps that have evolved into a highly efficient way to run enterprise data centres are rarely well suited for cloud environments. As a result, many cloud providers limit the monitoring and management systems a customer can bring in – partly to avoid having customer tech clash with the cloud crowd's own performance-tuning gear, and partly to keep customers who can't track how quickly workloads and customer demesnes are shifted during load balancing from accidentally managing, and crashing, another customer's server.
The biggest barrier to comprehensive hybrid cloud security, however, is a lack of preparation.
Forty-four percent of corporate data stored in the cloud isn’t managed or controlled at all by IT, according to a survey of 1,800 IT pros conducted by Ponemon Institute and sponsored by SafeNet.
Not only does IT not manage that data; IT isn’t even sure who should be responsible for keeping track of it. Only 38 percent of IT organizations have decided what person or department is primarily responsible for cloud security, according to the survey.
More than two-thirds of respondents said it's harder to manage user identities in the cloud, thanks to the mix of sites and services and to users' tendency to log in from anywhere without touching a server or other resource inside the corporate firewall that could help authenticate and track them.
Sixty-two percent of respondents said their companies allow people from third-party companies to access data in the cloud, confusing the authentication picture even more.
So, rather than try to build a brand-new security infrastructure overnight – or make up in a day for five years of ignoring the cloud – many IT organizations are taking the one obviously most advantageous step: encrypting, or planning to encrypt, all the data going from the enterprise to the cloud.
It makes sense from an efficiency standpoint. If putting data in the cloud removes IT's ability to track and control how long the data lives there, when or whether it's backed up, who can see it, or which third-party users can access it, the safest thing to do is encrypt everything that goes into the cloud.
“The solution to government surveillance is to encrypt everything,” Google chairman Eric Schmidt told Bloomberg in the wake of the Edward Snowden scandal last November.
“Encryption works,” Snowden himself said in an ask-me-anything session with the UK’s Guardian newspaper. “Properly implemented strong crypto systems are one of the few things that you can rely on.”
Snowden was talking about encrypting messages in email, not files posted on a cloud service, but the principle is the same. A company that encrypts every file stored on any of its servers or sent over its network may still have files swiped from a Dropbox account or intercepted on the way to a Salesforce server. But if the encryption is strong enough – and the business patient enough, because strong encryption takes a few extra seconds every time a file is stored, which will tempt users to try to avoid the process – the file will probably remain secure even if it’s stolen.
Unfortunately, encryption isn’t the be-all, end-all of online data impermeability. Encryption based on algorithms that are too weak, such as DES, the security standard for the US government until it was shown to be breakable in 1998, can leave a company with a false sense of security and looming disaster on its hard drives.
Using an encryption algorithm at least as strong as the current standard, AES-256, gives control over data back to the owner, reduces or eliminates the risk of allowing data to be stored in a country other than the one in which it originated, and gives the data owner the option of "destroying" the data by shredding the encryption key that would allow it to be read, according to a 2012 analysis by former Homeland Security honchos Richard Falkenrath and Paul Rosenzweig, both of whom now work at The Chertoff Group security advisory firm.
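The "shred the key" option is easy to see in code. The sketch below is a toy illustration only – a deliberately simple hash-based stream cipher, not AES-256, which is what a real deployment would use via a vetted cryptography library – but it shows why data encrypted under a strong key becomes effectively destroyed the moment the key is:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + block counter.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)          # fresh per message
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    ks = keystream(key, nonce, len(ct))
    return bytes(c ^ k for c, k in zip(ct, ks))

key = secrets.token_bytes(32)                # 256 bits of key material
blob = encrypt(key, b"customer record")
assert decrypt(key, blob) == b"customer record"
# "Crypto-shredding": discard the key and the ciphertext in the cloud
# is unrecoverable, no matter how many copies the provider holds.
key = None
```

The same logic is what lets a data owner "destroy" cloud-resident data without ever touching the provider's disks: delete the one key held at home, and every replica of the ciphertext becomes noise.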
The worm in the apple, however, is the question of who gets to hold the encryption keys. Most cloud vendors provide tools with which users can encrypt data stored in the cloud, but the cloud provider keeps the master key, so the encrypted data is secret only as long as the cloud provider decides not to reveal it.
That may not be as much of an issue for large companies that rent substantial chunks of cloud at Rackspace or other companies that aim at the business market.
At Google, Dropbox or other consumer-oriented services, however, the promise that the service provider won't reveal the contents of a private file "does not prevent [the cloud provider's] own use of the data to improve search results or deliver ads," Falkenrath and Rosenzweig wrote. "Of course, this kind of access to the data has huge value to some cloud providers and they believe that data access in exchange for providing below-cost cloud services is a fair trade."
For enterprises, it's not even close to fair. A SaaS provider, cloud platform company or other cloud vendor that keeps encryption keys to data belonging to another company may not open and read the files itself, but it does become a target for hackers, corporate spies, or law enforcement agencies with court orders that could force it to hand over the keys to an entire company's encrypted data, according to the Cloud Security Alliance.
The Patriot Act, for example, allows law enforcement agencies to demand information under conditions that would normally be forbidden by Constitutional language forbidding unreasonable search and seizure. The number of demands grows every year, but since the volume of information actually turned over varies according to how effectively the service provider resists, potential customers should check a cloud provider’s pliability before signing up, according to the Electronic Frontier Foundation.
Canadian provinces, such as BC, tried to counter the Patriot Act in 2004 by passing the Freedom of Information and Protection of Privacy Act, which requires that the personal data stored by government bodies remain within the province, where it is not under the jurisdiction of snoopier agencies of the US government. Other provinces, such as Nova Scotia, have taken a like stance, while there are additional data residency requirements for different categories of data, such as healthcare, in different Canadian jurisdictions.
The EU has similar laws, which tempt some companies to store potentially sensitive data outside the borders of the US. The EU’s privacy protections are more comprehensive, long-standing and more widely tested in court. Both sets of rules protect only personal data, however, not a company’s intellectual property.
Storing encryption keys in a secure data centre – where there is no risk that a service provider could corrupt or surrender them, and where they can be protected by disaster recovery/business availability or backup plans – may be a good alternative to relying on legislation.
“Holding the keys in the data centre ensures maximum security and availability,” according to the Cloud Security Alliance.
But rather than rely on the cloud provider's good will, it's possible to encrypt data before it ever reaches the cloud, using a file-system or full-disk encryption product that encrypts data before it's written to disk (if the data is staying put) or before it's sent to the cloud.
File- or disk-based encryption, or even encryption built into the hypervisors of virtual machines, may catch all the data inside the firewall, but could miss the streams going up to the cloud from user laptops, smartphones and other devices.
One way to address that is with a gateway or cloud-security service provider to which all users must connect before touching any cloud service. The security proxy applies security rules or policies, then makes a direct connection with the targeted cloud using a set of specially coded APIs designed to keep performance high.
One such company is CipherCloud, which offers a gateway that encrypts data using an AES 256-bit algorithm and allows encrypted data to be searched without being decrypted first. It aims for faster performance with additions like a virtual cache that helps users offload encryption tasks more quickly, and it does all this through a gateway that acts as a reverse proxy: receiving requests from client machines, applying security policies, and then forwarding the requests or uploads – already encrypted – to the cloud.
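The reverse-proxy pattern itself is simple enough to sketch. Everything below is hypothetical – the destination names, policy table and helper functions are illustrative stand-ins, not CipherCloud's actual product or API – but it shows the sequence such a gateway enforces: check policy first, encrypt second, and only then forward the request upstream:

```python
# Hypothetical policy table; a real gateway would load this from admin config.
BLOCKED_DESTINATIONS = {"unapproved-share.example"}

def gateway_forward(destination: str, payload: bytes, encrypt, forward):
    """Reverse-proxy step: apply policy, encrypt the payload, forward upstream."""
    if destination in BLOCKED_DESTINATIONS:
        raise PermissionError(f"policy blocks uploads to {destination}")
    # The cloud service only ever sees ciphertext.
    return forward(destination, encrypt(payload))

# Stand-ins for the real encryption engine and upstream connection:
sent = []
toy_encrypt = lambda data: bytes(b ^ 0x5A for b in data)       # placeholder cipher
toy_forward = lambda dest, data: sent.append((dest, data)) or "202 Accepted"

status = gateway_forward("crm.example", b"lead list", toy_encrypt, toy_forward)
```

Because every client is forced through this choke point, the gateway also becomes a natural place to log who uploaded what, and when – exactly the visibility the NOC crews of 2009 lacked.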
Alternative to key-based encryption
Most encryption schemes use an algorithm that takes a key – a string of numbers that becomes part of the calculation during encryption and is then removed and held separately. Without the key, the data can't be decrypted.
In 2005 another approach, tokenization, was developed: a randomly generated value replaces a sensitive portion of the data outright, and the mapping between the token and the original value is stored separately.

The primary difference is that key-encrypted data still carries a mathematical connection to the unencrypted data – anyone holding the key can reverse the calculation – while a random token has no mathematical relationship at all to the value it replaces.
In tokenization, the sensitive data is removed altogether and is usually kept by the organization that tokenized it. Tokenization replaces only the sensitive fields in a larger packet of data, rather than encrypting all of it. In tokenizing a chunk of data containing credit-card numbers, for example, each card number would be removed and replaced with a random string of nonsense numbers; the data stays unreadable until the overall data set is returned and each token is converted back using a "look-up table" that maps tokens to the original values.
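A minimal sketch of that look-up table approach, assuming a simple in-memory vault (a production system would persist the table securely and enforce access controls on it):

```python
import secrets

class TokenVault:
    """Replaces sensitive values with random tokens; keeps the mapping at home."""

    def __init__(self):
        self._lookup = {}    # token -> original value (the "look-up table")
        self._reverse = {}   # original value -> token, so repeats reuse one token

    def tokenize(self, value: str) -> str:
        if value in self._reverse:
            return self._reverse[value]
        token = secrets.token_hex(8)   # random: no mathematical tie to the value
        self._lookup[token] = value
        self._reverse[value] = token
        return token

    def detokenize(self, token: str) -> str:
        return self._lookup[token]

vault = TokenVault()
record = {"name": "A. Customer", "card": "4111111111111111"}
safe = dict(record, card=vault.tokenize(record["card"]))   # card number removed
restored = dict(safe, card=vault.detokenize(safe["card"])) # readable again at home
assert restored == record
```

The `safe` record can travel to the cloud; only the organization holding the vault can ever turn the token back into a card number, which is why tokenization requires long-term storage of the table for as long as the tokens circulate.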
CipherCloud, which uses key-based encryption and isn’t fond of tokenization, wrote in a white paper called Information Protection Overview that tokenization is good for the most stringent security requirements because the data is actually removed and kept at home, but requires long-term storage to keep track of the tokens.
Key-based encryption, on the other hand, encrypts the data in a single pass through the server, after which the owner has to worry about nothing but losing the key.
Many tokenization methods are certified using the National Institute of Standards and Technology’s Federal Information Processing Standards guideline FIPS 140-2. CipherCloud’s key encryption scheme is also FIPS 140-2 certified.
Neither method is clearly superior, though tokenization is required by US federal regulations in some specific instances.
Regardless of the method of encryption – or of the importance a cloud provider places on its own ability to decrypt files stored on its systems – the ability to encrypt data so that only the owner can retrieve it is a fundamental requirement for any organization that needs to keep its data secure according to its own standards rather than those of a cloud provider, government agency or other entity.
Encryption is not, in itself, security, in the cloud or anywhere else. It is, however, a large step in that direction, and one that would minimize the impact of major data breaches by ensuring data thieves are able to take only data they can’t read or decrypt.
Encryption – whether provided by an on-premise product or by a cloud-based service or gateway – is just the beginning for IT, which has to start ramping up its security performance in the public cloud.
Right now, 44 percent of corporate data stored in the public cloud is not managed by corporate IT, or, most likely, by anyone else, according to a report posted in October by Ponemon Institute and sponsored by security software developer SafeNet.
Only 39 percent of respondents said they use encryption or tokenization to secure data stored in the cloud; 62 percent said they share data or cloud space with third-party companies – making tampering even more likely, and keeping the potential losses of any breach high by ensuring that whatever is stolen can be read and easily used by the thieves.
The complexity of public clouds and legacy-ridden corporate IT infrastructures is too great to expect either one to bring itself up to an acceptable level of security outside the firewall within weeks or months.
Encryption offers a tremendous amount of protection for the amount of trouble it takes – an equation that looks even happier when cloud security services can be called upon to fill gaps IT can’t.
Survey figures showing IT’s long-term passive-aggressive effort to ignore the cloud until it went away didn’t even slow the growth of cloud. IT departments are now doing the right thing by starting to take responsibility for wrapping up scattered, disconnected cloud networks created by the whims of users into coherent, reliable resources for the whole company.
They need to do the right thing a little more quickly, though. It’s not acceptable from a technologist’s standpoint to leave all that corporate data sitting exposed on cloud-provider shelves all over the country without even a screen of encryption to provide a little modesty for any secrets they might contain.
It’s not acceptable from an end-user’s point of view or from that of a business-unit manager or, you can be sure, the victims or the judges presiding over the lawsuits that will follow inevitably after attacks like the one that cost Target $236 million after it lost 110 million customer financial records, or the one that netted 56 million cards from Home Depot.
Those weren't successful hacks against a cloud service, of course. But these two at Amazon sure were. You can quibble about the iCloud naked-celebrity picture thing. But people are starting to ask who's responsible for stopping the breaches, or at least slowing them down. Responsible, knowledgeable people are starting to ask just how much data breaches are going to cost – especially since there's a good chance a solid strike on a vulnerable cloud customer or two could get much more expensive, much more quickly, than "ordinary" ginormous, embarrassing, expensive data breaches. Many of those happened, you'll remember, because a few people weren't paying enough attention to something they really should have taken care of, figuring it was someone else's job and that probably nothing big would happen even if they didn't fix a big, obvious problem.
Worked out so well for Target, didn’t it?