Even as enterprises have become more confident of running their business apps on the cloud in recent years, the spectre of cyber attacks is never far away from any discussion about the technology’s adoption.
In a study that Barracuda Networks put out earlier this year, respondents using the cloud said that 40 per cent of their infrastructure was hosted on the public cloud, and that this had benefited their businesses.
However, 91 per cent of the 1,300 IT leaders polled were concerned over its use, with 54 per cent citing cyber attacks as their main worry.
In Asia-Pacific, 56 per cent of respondents said their organisations had been targeted by cyber attacks, according to Barracuda Networks, which provides cloud-enabled cyber security and data protection.
James Forbes-May, the company’s vice president of APAC sales, said that customers have to be more aware of the threats themselves and take steps to mitigate the risks.
It’s not enough to simply look to public cloud providers, which have hardened their defences over the years to ward off attacks, he told Techgoondu in this month’s Q&A.
Often, he argued, it is customers who have let hackers in, by inadvertently leaving a loophole unpatched or accidentally leaking information in the open. “Truth is, the cloud is secure, if you make it secure.”
NOTE: Responses have been edited for brevity and house style.
Q: In your report, it is said that more than half of Asia-Pacific respondents have faced cyber attacks. What type of attacks are the most common in the region?
A: Across the board, when it comes to applications hosted on the public cloud, we see Web application attacks, such as SQL injection and XSS (cross-site scripting), being very common. This is because, in the cloud, the Web front-end is the biggest attack surface available to exploit.
Web applications are easier to break into, in many cases, because of various factors – attacks are easily automated, people use outdated components with known vulnerabilities, and patching of vulnerabilities is not done in time. These factors, coupled with the fact that most Web applications are public facing, make them a huge target for malicious actors.
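To illustrate the SQL injection point, here is a minimal sketch in Python, using the standard-library sqlite3 module and a hypothetical users table (both the table and the payload string are illustrative, not taken from the report). The unsafe version splices user input straight into the query string; the parameterised version lets the driver treat the input as data.

```python
import sqlite3

# Hypothetical example: a lookup against a small local "users" table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"  # a classic injection payload

# Vulnerable: the input is spliced into the SQL string, so the
# injected OR clause matches every row in the table.
unsafe = conn.execute(
    f"SELECT email FROM users WHERE name = '{user_input}'"
).fetchall()

# Safer: a parameterised query passes the payload as a literal value,
# so no rows match.
safe = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice@example.com',)] -- injection succeeded
print(safe)    # [] -- payload treated as data, not SQL
```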
A second major issue that we are seeing is that customers leave a lot of data lying around, wide open to the world, in Amazon S3 buckets and databases. Over the last few months, people have been discovering huge amounts of sensitive data left open to the world in S3, leading to some very embarrassed companies.
The problem with S3 is that the customer needs to specifically lock down access to only the relevant people. What typically happens, though, is that they find setting up these permissions to be time consuming.
So they decide to open it up to everyone for a day or so, intending to lock it down afterwards. Then they forget, leading to breaches.
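One way to avoid that “open it up and forget” pattern is to enforce the lock-down in code rather than by hand. Below is a minimal sketch using boto3, AWS’s Python SDK; the bucket name is a placeholder, and the caller needs the relevant S3 permissions on the account.

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "example-company-data"  # placeholder bucket name

# Turn on all four "Block Public Access" settings for the bucket, so
# public ACLs and public bucket policies are both rejected and ignored.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Read the configuration back to confirm the settings took effect.
resp = s3.get_public_access_block(Bucket=BUCKET)
print(resp["PublicAccessBlockConfiguration"])
```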
In the case of databases, we’ve seen a bunch of companies that have left their MongoDB installations open to the wide world. These were discovered by ransomware attackers and encrypted en masse.
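As a rough way to check for that kind of exposure, the sketch below uses pymongo against a hypothetical host address: if the server answers a database listing without any credentials, it is effectively open to anyone who can reach the port.

```python
from pymongo import MongoClient
from pymongo.errors import OperationFailure, ServerSelectionTimeoutError

HOST = "mongodb://db.example.internal:27017"  # placeholder address

try:
    # Connect without credentials and ask for the list of databases.
    client = MongoClient(HOST, serverSelectionTimeoutMS=3000)
    names = client.list_database_names()
    print("Open instance! Databases visible without auth:", names)
except OperationFailure:
    print("Good: the server rejected the unauthenticated request.")
except ServerSelectionTimeoutError:
    print("Could not reach the server at all.")
```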
Q: It appears that security is still a chief concern for many enterprises going on the cloud. Why haven’t recent efforts at addressing this, like stricter cloud service provider standards, allayed these worries?
A: Security should be a primary concern for anyone going to the public cloud. Cloud providers have been very clear about two things from the beginning.
First, they are responsible for the security of the infrastructure – and they have delivered on this. Second, the customer is responsible for everything they deploy on top of this infrastructure. This is called the shared security responsibility model.
If you look at Amazon Web Services (AWS), they always state that “Security is Job Zero” for everyone, including customers who use their platform. Microsoft has articulated the division of responsibilities very nicely as well.
It is on customers to understand and use this model properly. We keep seeing customers who believe that the cloud is either very secure or not secure at all. Truth is, the cloud is secure, if you make it secure.
It is like every other deployment, with some additional security built in at the platform level. Customers should look at the recommendations from cloud providers, such as the Well-Architected Framework from AWS.
At this time, I can’t remember any incident in the last year where any of the IaaS (Infrastructure as a Service) providers have been at fault for a cloud data breach. But when it comes to customer errors, I run out of fingers very quickly!
Q: How well are enterprises coping with new cyber threats that are morphing and adapting all the time to new defences?
A: Enterprises are moving towards systems and processes that can cope with advanced cyber threats. This is happening at a slower pace than expected, but it is happening.
When it comes to securing Web applications, for instance, companies are understanding that a proper Web application firewall, like the Barracuda Web Application Firewall, is important, since it provides defence against zero-day threats and more.
Logging and visibility are also a critical part of cyber defence, yet many companies still do not treat them as important enough.
Companies need to spend more time setting up proper visibility into their networks, with anomaly reporting tuned to pick up all types of intrusion. This helps with issues like misconfiguration, where an attacker may gain access in spite of protective measures.
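As one concrete example of that visibility point, here is a minimal sketch with boto3 that switches on server access logging for an S3 bucket. The bucket names are placeholders, and it assumes the log bucket already permits the S3 log delivery service to write to it; the resulting per-request logs are the sort of raw material anomaly reporting can then be tuned against.

```python
import boto3

s3 = boto3.client("s3")

DATA_BUCKET = "example-company-data"  # placeholder: the bucket to watch
LOG_BUCKET = "example-company-logs"   # placeholder: where access logs land

# Turn on server access logging for the data bucket, writing one log
# record per request into the log bucket under the given prefix.
s3.put_bucket_logging(
    Bucket=DATA_BUCKET,
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": LOG_BUCKET,
            "TargetPrefix": f"access-logs/{DATA_BUCKET}/",
        }
    },
)

# Confirm logging is now enabled for the data bucket.
print(s3.get_bucket_logging(Bucket=DATA_BUCKET))
```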