Understanding the cloud security conundrum: What is the answer?
Name an online cloud storage service provider and you’ll have no trouble at all finding reports of a breach. Recently, the press has frequently reported incidents of companies leaking data through cloud misconfigurations, as well as headlines about user inexperience leading to weak security in cloud deployments.
But user error, complexity and misconfiguration surrounding cloud storage resources are only the tip of the iceberg when it comes to common cloud security issues – the reality is that cloud security is a multi-faceted issue involving not only technology, but also how companies look to approach cloud transition.
To truly understand and implement effective cloud security, we must first understand the nature of the threats and the value of what’s stored, the significance of the cloud migration and transition processes, and the most effective ways of protecting our assets.
Understanding data: A priceless asset
So why are we seeing so many headlines involving data leaking from misconfigured clouds, particularly AWS S3 buckets? Firstly, it’s not fair to say that the main problem is AWS S3 buckets. It’s probably fair to say that they are the most frequently targeted, and that this is likely because they are used mainly by high-end enterprise clients that hold extremely valuable data.
Today, data is often considered more valuable than intellectual property or physical infrastructure because it has value in so many ways, especially when the wrong people manage to get hold of it. Think about it: data can be misused for direct financial gain, privilege escalation and identity impersonation. It also has great “resale” value on dark-web markets, and, because it is typically extremely valuable to its owner, it makes a powerful subject for ransom. We’re also seeing incidents of data being used for political and ideological crime.
Understanding the cloud transition
Complexity and misconfiguration of online storage resources are definitely to blame when it comes to the data leaks we’ve been hearing about, but companies need to dig deeper to see if there are more fundamental issues.
Frequently the problem starts with companies approaching the cloud transition as just another extension of their datacenter. In an ideal world that seamless extension is the most compelling selling point of a cloud solution, but companies should also thoroughly re-think their internal procedures before taking this significant plunge. For example, have you reviewed how your DevOps resources should be created differently for cloud deployment, and is your QA team set up to test the multitude of new factors, like bucket configuration, that are potential points where serious leaks could happen?
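As a sketch of what such a QA check might look like, the snippet below inspects a parsed S3 bucket policy and fails if any statement grants access to everyone. The policy document here is a hypothetical example; a real pipeline would fetch the live policy (e.g. via the AWS API) before running this kind of gate.

```python
import json

def has_public_statement(policy: dict) -> bool:
    """Return True if any Allow statement grants access to everyone."""
    for stmt in policy.get("Statement", []):
        if stmt.get("Effect") != "Allow":
            continue
        principal = stmt.get("Principal")
        # A principal of "*" (or {"AWS": "*"}) means anyone on the internet.
        if principal == "*" or (isinstance(principal, dict)
                                and principal.get("AWS") == "*"):
            return True
    return False

# Hypothetical example: a policy that accidentally makes objects world-readable.
leaky_policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {"Effect": "Allow",
     "Principal": "*",
     "Action": "s3:GetObject",
     "Resource": "arn:aws:s3:::example-bucket/*"}
  ]
}
""")

print(has_public_statement(leaky_policy))  # True: this bucket should fail QA
```

A check like this is deliberately simple; it doesn’t replace a full policy analyzer, but it catches the single most common mistake behind the headlines.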
While the cloud brings fantastic flexibility, it’s important to recognize that the transition period is also an opportunity to take advantage of all the tools available for running secure software deployments, and to flag functionality that may be underused or not used at all.
Here are a few things to consider during the cloud transition stage:
- Can you implement a stricter separation of duties within the enterprise? For example, the security department – not developers – should set up the permissions on any public bucket. Beyond this, a third entity should approve them before publishing
- Are you creating standard configurations and images? The cloud is really good at repeating, and if the base is sound the results will be sound as well – or at least they will all fail predictably and in the same way
- Are you using adaptive internal processes, depending on the criticality of the data stored?
- Are you using AI and machine learning to evaluate the criticality of data stored on publicly available data stores? Critical data may have been stored there unknowingly, or as a supposedly temporary upload, for example
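On the last point, a full machine-learning classifier is beyond the scope of this article, but even a simple pattern-based scan over stored objects illustrates the idea. This sketch uses a few hypothetical regular expressions to flag content that looks like credentials or personal data and therefore should never sit in a public store:

```python
import re

# Hypothetical patterns for data that should never sit in a public bucket.
SENSITIVE_PATTERNS = {
    "aws_access_key": re.compile(r"\bAKIA[0-9A-Z]{16}\b"),
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "ssn_like": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
}

def criticality(text: str) -> set:
    """Return the set of sensitive-data categories found in the text."""
    return {name for name, pattern in SENSITIVE_PATTERNS.items()
            if pattern.search(text)}

sample = "contact: alice@example.com key: AKIAABCDEFGHIJKLMNOP"
print(sorted(criticality(sample)))  # ['aws_access_key', 'email_address']
```

A real deployment would combine this kind of signal with context (who uploaded the object, how long it has been there) to decide whether the “temporary upload” explanation holds up.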
Understanding cloud service providers
Cloud service providers are doing a really good job of providing security services and better ways to secure data than most existing on-premise implementations. The reason is that security and data privacy are always at the top of the list of perceived roadblocks to cloud adoption. In addition, the cloud’s unprecedented automation capabilities can be used to step up the security of any infrastructure to the next level of maturity.
On the other hand, competition for a place in the cloud market drives service providers into a frenzy of launching new services in order to gain a better position. This unfortunately brings complexity, which is the top issue they need to address.
Tools like Zelkova, which applies automated reasoning to security controls, and Tiros, which can help define the exposure footprint, are more than welcome, but they have to be implemented in the simplest possible way so as not to create additional complexity. It all comes down to the final implementation and how complex processes are presented to the end user.
Understanding the past and the future
Online storage and archival was one of the first widely adopted cloud services, long before the invention of the ‘aaS’ acronyms, and it has come a long way. It’s apparent that cloud service providers are doing what they can to address security issues, and as long as they make every effort to address complexity, a move to the cloud will remain a comparatively safe option next to on-premise storage solutions.
And while most businesses understand the value of what they seek to store in the cloud, it’s high time they put the cloud transition stage under the microscope, addressing development, operations, QA and security roles in order to run the most secure deployments possible.
Interested in hearing industry leaders discussing subjects like this and sharing their experiences and use-cases? Attend the Cyber Security & Cloud Expo World Series with upcoming events in Silicon Valley, London and Amsterdam to learn more.