A few years back, raising the issue of data privacy was viewed in many IT discussions as a traitorous act. Dazzled by the economic potential in new sources and analytic platforms for Big Data, neither vendors nor technology users appeared very willing to consider the privacy challenge that was rearing its ugly head. In the article below, Kevin Fogarty confronts this issue head on, establishing first the growing use of public cloud, discussing next the impact of impending EU regulation, and circling back with a description of public cloud’s security ‘multiplier effect’. Cloud is essentially insecure, he concludes, but also a horse that has left the gate. The question, then, becomes: how do we reduce risk? Fogarty’s answer is new technology solutions that provide protection wherever the data travels (solutions the vendor community is rushing to supply). Another response is executive prioritization of this issue — beyond the IT department — that hopefully will be stimulated by the kind of cloud risk review presented here. (ed.)
After years of trying to stick mainly to private cloud due to worries about data privacy and security in the public cloud, enterprises are finally coming out in public in large numbers, according to recent studies.
The use of both private and public cloud services has been growing steadily for the past several years, but far faster for public cloud among small- and mid-sized businesses, to which it promises access to world-class IT for local-dealer rental prices. An August study from SMB market-analysis firm Emergent Research — sponsored and publicized by SMB accounting-software cloud provider Intuit — found that 37 percent of SMBs described public-cloud services as a critical part of their IT infrastructure now, and that 78 percent expect to be in the same position by 2020.
A long list of other surveys suggests that the preference of large companies to install cloud-management and virtualization software in their own data centres has kept the share of enterprises using cloud services for significant parts of their business-process or IT infrastructures somewhere between 20 percent (Technology Business Research, July 2014) and 33 percent (TechPro Research, June 2014).
But the word from those selling cloud services is that enterprises really are starting to move aggressively out into the public cloud.
When polled by the association in October, members of the Telecommunications Industry Association (TIA) reported that SMBs made up 41.3 percent of the US-based cloud computing market during 2013, but that enterprises were beginning to take over by increasing their spending more than 61 percent over the next three years, from their current level of US$40 billion to US$64.5 billion in 2017.
They also said that overall spending on cloud computing — by both businesses and consumers — would rise 57 percent, to a total of US$107 billion by 2017, while spending on data-centre construction would increase only 26 percent, to a total of US$29.7 billion in 2017.
TIA officials credited this cloud growth to the increasing volumes of data and Big Data analytics used by large companies to identify new market areas or previously unidentified patterns in the behavior of customers, to increasing machine-to-machine communications traffic from the rapidly growing Internet of Things, and to the cost-cutting US Cloud First Policy pushing federal agencies toward cloud services.
“Data is fueling unprecedented growth and technological change, and it’s clear from our report that this trend is not ending any time soon,” TIA President Grant Seiffert is quoted as saying in the report.
Moving toward public-cloud services to help cope with increasing volumes of genuinely or potentially valuable data, however, multiplies the potential for disaster from the risk of data loss, theft or corruption that kept most enterprises out of the public cloud in the first place.
And it’s not only the volume of data in the cloud that causes risk to increase.
The European Union is in the process of finalizing the first major update of its notoriously strict data privacy laws since 1995, with changes designed to give individual residents of EU countries more control over data about them (the “right to be forgotten”), to require that companies be more transparent in the way they collect and handle data on customers, and to tighten reporting requirements and penalties for failing to report significant security breaches that result in data being lost or stolen.
Specifics of the new EU Data Protection Regulation probably won’t become final until the end of the year, and likely won’t go into effect until 2017, but currently include fines for major violations of either five percent of a company’s total revenue or 100 million euros, whichever is greater.
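As a rough illustration of how the draft penalty formula would scale (the five-percent figure and the 100-million-euro floor are the draft numbers cited above; the revenue figures below are purely hypothetical):

```python
def draft_eu_fine(annual_revenue_eur: float) -> float:
    """Draft EU Data Protection Regulation penalty for a major violation:
    five percent of total revenue or 100 million euros, whichever is greater."""
    return max(0.05 * annual_revenue_eur, 100_000_000)

# Hypothetical examples: the flat floor dominates for smaller firms;
# the percentage takes over once revenue passes 2 billion euros.
print(draft_eu_fine(500_000_000))     # floor applies: 100 million euros
print(draft_eu_fine(10_000_000_000))  # 5% applies: 500 million euros
```

The "whichever is greater" clause means even a company with modest EU revenue faces the full 100-million-euro exposure for a major violation.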
A survey Trend Micro conducted in April among 850 senior-level IT managers in several EU countries showed wild variation in awareness of the impending changes, even among EU-based executives, and an almost total lack of preparation.
Half of IT managers in the UK were completely unaware of the impending changes; 85 percent said their organizations would have difficulty complying and a quarter said it wasn’t realistic to think that they could.
“This affects every organization, regardless of size. If a company processes data then it needs to be aware,” a Trend Micro release quoted VP of Security Research Rik Ferguson as saying. “This is not just an IT issue; duty to comply falls to everyone from the receptionist right up to the CEO.”
The regulations cover data about residents of EU countries held or used by any company with a presence in the EU, even if the company is based elsewhere or has only a limited EU presence, and regardless of where the data is actually stored.
A US- or Canada-based company with subsidiaries or customer offices in Germany and the UK, for example, would be accountable for all data on EU residents.
Among other things, the new regulations could require any company whose business in the EU is greater than a certain size to conduct regular privacy impact assessments (PIAs) of its data and data-handling processes to gauge how well it is protecting personally identifiable information (PII) on its EU customers.
The new rules are actually designed to simplify restrictions on transfers of data, even within a single company, when that data moves physically from an EU country to one outside the EU. Right now, those transfers require that the third-country business entity (one based in the US or Canada that may have only a sales office and no data-centre facilities in Europe, for example) have its data-protection processes examined and approved as “adequate.” The requirements for adequacy vary between countries, however, as does the volume of red tape required to make the transfer. Even when they are all finalized and put into effect, the new rules are unlikely to eliminate the differences in process between one EU country and another, though the data-privacy stakes will be higher in all EU countries, according to an analysis of the impending changes by the Silicon-Valley-based IT Law Group, which specializes in information privacy and security and has offices in the US, Canada and 35 other countries.
It’s not just British or EU IT organizations that are unprepared, however.
The increasing use of public cloud services by enterprise IT organizations — which has required some relaxation in IT’s traditional iron-fisted refusal to allow anyone to touch or see sensitive customer or financial data under any but the most strictly secure and regulated conditions — may actually make many companies less prepared for tighter data-privacy restrictions than they already were.
A study released in September by the Cloud Security Alliance, for example, showed that 62 percent of IT and security professionals polled said users at their companies upload content to 10 or fewer cloud-based applications; 75 percent said users upload to fewer than 20 apps. Forty-nine percent said that a quarter or more of the content being uploaded was sensitive information; 48 percent said that less than five percent of that sensitive information had been shared with unauthorized people outside the company.
The reality is quite different, according to a July report from Netskope, a cloud-security company that bases its quarterly Cloud Report on analysis of billions of individual user actions at its client companies rather than on surveys of IT staff. Employees in the average organization use 508 cloud-based applications, 88.4 percent of which are not secure enough to be considered “enterprise ready.” And every file uploaded to a cloud app or storage service is shared an average of three times, almost never within applications whose security controls can limit whether files are shared inside or outside the company.
A similar study from cloud-security monitoring company SkyHigh Networks showed that 72 percent of IT security managers still do not know how widely SaaS or other cloud services are being used inside their own companies, and that, even when they do know an app is in use and try to block it, 72 percent fail in five of every six blocking attempts.
Even in EU countries already accustomed to national data-privacy regulations, corporate use of public-cloud services rose 23 percent between Q1 and Q3 of 2014, but only 9.5 percent of the services being used were secure enough to meet SkyHigh’s enterprise-ready standard for service- and data-security attributes, legal controls and user or device authentication.
The result is that a rapidly increasing percentage of the average corporation’s data is being fed into an insecure, often unrecognized IT infrastructure via what used to be known as “shadow” or “rogue” IT projects, but which are now simply subscription-based service connections that slide through corporate firewalls and data-protection schemes, often without even being noted.
Cloud-based storage, according to an April report from Ponemon Institute, can serve as a multiplier on the cost of a potential data breach, which Ponemon estimated at about US$201 per lost record, meaning the loss of 100,000 records would cost roughly US$20.1 million.
Because of the volume of uncontrolled data, the number of unmonitored services and interconnections among shared files that could turn one breach into many, simply using the cloud for sensitive or even significant data could triple the scope and cost of any security breach, according to the Ponemon report.
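The arithmetic behind those estimates can be sketched as follows (the US$201-per-record figure and the up-to-triple cloud multiplier are the Ponemon numbers cited above; the record counts are hypothetical):

```python
COST_PER_RECORD_USD = 201  # Ponemon Institute's per-record breach-cost estimate
CLOUD_MULTIPLIER = 3       # Ponemon's estimate of how far cloud use could inflate scope and cost

def breach_cost(records_lost: int, in_cloud: bool = False) -> int:
    """Estimated total cost of a breach, optionally tripled by the cloud multiplier."""
    base = records_lost * COST_PER_RECORD_USD
    return base * CLOUD_MULTIPLIER if in_cloud else base

print(breach_cost(100_000))                 # 20,100,000 -- the ~US$20 million figure
print(breach_cost(100_000, in_cloud=True))  # 60,300,000 if cloud triples the scope
```

The same 100,000-record incident, in other words, moves from a roughly US$20 million event to a US$60 million one once uncontrolled cloud sharing turns a single breach into several.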
The question, then, becomes not how big a risk the cloud has become, or even whether corporations should try to reduce their use of cloud to reduce their exposure. The inability of IT to even see many cloud connections, and the widely demonstrated ability of business units to contract their own cloud services without IT’s approval, takes across-the-board bans off the table.
The question is what to do about the risk — how to limit the sensitive data that passes out of the control of IT; how to extend IT’s control to make sure data doesn’t escape; and how to verify whether a breach has actually happened or not in the first place.
Choices of vendor and of technology for securing data in the external cloud — using either of the two major approaches, global encryption and web-based services that track, control and apply security policies to all of a client organization’s cloud connections — are expanding in both capability and market size as the need becomes more obvious and both cloud providers and third parties begin to offer more complete services.
(More discussion of those providers and technologies to come in an InsightaaS follow-up piece.)
There are no easy, comprehensive or certain solutions, unfortunately. The security issues surrounding the cloud are numerous, varied and adaptable.
The single message to corporate IT departments, especially in what is likely to be a very short run-up to the final version of Europe’s data-privacy regulation update and the 100-million-euro penalty for violating it, is that the way to protect data that travels widely is to make sure there is protection that can travel along with it — which usually means encryption, plus limits on where data travels, to whom, and across what national or technological barriers.
The only way to accomplish that, so far at least, is to encrypt data before it is sent into the cloud, so it never travels naked and unprotected — using encryption that cannot be undone except by users authorized to open and use it.