Whenever a cyber attack occurs, the finger is usually pointed at a single compromised computer or a careless user unwittingly opening up their machine to hackers.
Less is said of a big problem facing all organisations today: the complexity of IT systems. It is the underlying reason why hackers are let through and data is lost, sometimes without the victims even knowing.
In the SingHealth hack last month, the blame has been placed on a front-end workstation that was breached by hackers. From there, they wormed their way deeper into the system to steal the personal data of more than 1.5 million patients.
Why would the machine be left exposed in the first place? Was it not part of a coherent defence against such intrusions, given the sensitivity of data records that the healthcare provider held?
The answers will come from ongoing investigations. Hopefully, they will also shed light on just how complex the systems at SingHealth were.
A defender has to keep the house in order all the time; an attacker only has to find a single loophole once.
Today, that is a near-impossible task for administrators of complex systems made up of multiple layers of IT equipment, software and processes. The complexity has become too difficult to cut through.
It has not helped that, in recent years, users across large enterprises have been given self-service access to cloud resources whenever the need arises.
Spin up a disk to store some data. Fire up a virtual machine to test out a project. What happens when the project is over? Or when people leave their jobs? Too often, the resources are forgotten and left online as a potential loophole to be exploited later.
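Catching such leftovers comes down to routine housekeeping. The sketch below, which assumes an AWS account and the boto3 SDK purely for illustration (other clouds offer equivalent APIs), lists disk volumes that are attached to nothing, often the remnants of a long-finished project.

```python
# A minimal housekeeping sketch, assuming an AWS account and the boto3
# SDK purely for illustration; other clouds offer equivalent APIs.
import boto3

ec2 = boto3.client("ec2")

# Volumes in the "available" state are attached to no machine,
# a common sign of a forgotten resource left online.
orphaned = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]

for vol in orphaned:
    print(f"Unattached volume {vol['VolumeId']}: "
          f"{vol['Size']} GiB, created {vol['CreateTime']}")
```

Run regularly, a report like this gives administrators a fighting chance of retiring forgotten resources before an attacker finds them.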
So today, there’s a clear move away from the “move fast, break things” philosophy that was the hallmark of startups like Facebook years ago, said Mike Palmer, chief product officer at data backup vendor Veritas.
Years of access to cheap storage have meant that corporations are storing lots of data they never analyse but which can still be stolen by hackers, he told Techgoondu in an interview last week.
While some organisations have used the migration to the cloud as an opportunity to streamline their systems, many already have years of “dark data” and forgotten systems still stuck somewhere online.
In the devastating attack on Sony Pictures in 2014, the company suffered from the complexity of running 30 data centres and scores of servers and devices it could not keep track of. It won’t be the last such victim.
The challenge will get tougher in the years ahead, as the number of connected devices increases, thanks to the Internet of Things (IoT).
Already, alarm bells have been ringing for years over the lack of standards, which makes these devices hard to secure at scale. Yet the problem persists because additional security features add cost.
Another issue is the lack of updates for IoT devices over time. Smart door locks are only beginning to receive patches for vulnerabilities, and many connected sensors today are still not plugged into a gateway that at least checks that they have not been spoofed or hacked.
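Such a gateway check need not be elaborate. The sketch below is a minimal illustration using per-device shared keys and HMACs; the device IDs, keys and message format are hypothetical. It rejects any reading that does not carry a valid authentication tag.

```python
# A minimal sketch of a gateway rejecting spoofed sensor messages using
# per-device shared keys and HMACs. Device IDs, keys and the message
# format are hypothetical; real keys would be provisioned securely.
import hmac
import hashlib

DEVICE_KEYS = {
    "sensor-01": b"secret-key-for-sensor-01",
}

def is_authentic(device_id: str, payload: bytes, tag: bytes) -> bool:
    """Accept a reading only if its HMAC tag matches the device's key."""
    key = DEVICE_KEYS.get(device_id)
    if key is None:
        return False  # unknown device: drop the message
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    # Constant-time comparison avoids leaking the tag via timing.
    return hmac.compare_digest(expected, tag)

msg = b'{"temp": 21.5}'
good_tag = hmac.new(DEVICE_KEYS["sensor-01"], msg, hashlib.sha256).digest()
print(is_authentic("sensor-01", msg, good_tag))       # True
print(is_authentic("sensor-01", msg, b"forged-tag"))  # False
```

A real deployment would also need timestamps or nonces to stop replayed messages, but even this much raises the bar for spoofing.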
Add these devices to an expanding IT system and it is clear that the headaches for IT administrators will grow. The same issue – complexity – will get worse.
There are calls today to simplify everything or, for starters, to bake security into one’s apps from the outset. Those are sound ideas, but there are always legacy issues to deal with.
One of these is the sheer number of security solutions that enterprises have bought, often dozens per enterprise, which means they receive too much information and are unable to act on it.
The biggest problem is that these solutions don’t integrate with one another, said Alessandro Perilli, the general manager for cloud management strategy at Red Hat.
Companies have to make their security solutions programmable and automated to respond to potential threats, instead of trying to manage them manually, he argued.
When attackers start to use artificial intelligence (AI) to increase the speed and magnitude of attacks in the years ahead, the inefficient setups today will not be able to cope, he added. “Even if you have 100 people, it’s useless.”
Indeed, automation has been talked up as a way to make IT systems flexible enough to cope with the risk, without locking down everything so that legitimate users are also denied the tools to do their work.
Can a smart cyber defence solution gauge whether an attack warrants a partial lockdown, or mitigate the issue without always alerting a human operator and overwhelming them with information?
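One shape such a solution could take is a tiered response policy, sketched below with hypothetical severity scores, thresholds and actions: quiet mitigation at low severity, a partial lockdown in the middle, and a human paged only when the evidence is strong.

```python
# A hypothetical sketch of a tiered, automated response policy.
# Severity scores, thresholds and actions are illustrative only.
from dataclasses import dataclass

@dataclass
class Alert:
    source_ip: str
    severity: float  # 0.0 (likely noise) to 1.0 (confirmed breach)

def respond(alert: Alert) -> str:
    if alert.severity < 0.3:
        return f"log only: {alert.source_ip}"            # no action needed
    if alert.severity < 0.7:
        return f"rate-limit {alert.source_ip}"           # quiet mitigation
    if alert.severity < 0.9:
        return f"isolate segment of {alert.source_ip}"   # partial lockdown
    return f"page on-call analyst about {alert.source_ip}"  # human steps in

print(respond(Alert("203.0.113.7", 0.5)))  # rate-limit 203.0.113.7
```

The point is not the specific thresholds but that most alerts never reach a human, leaving the operator free to handle the few that matter.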
Soon after the SingHealth hack, the Singapore authorities ordered the PCs used by some doctors to be cut off from the Internet. In a way, that denies these doctors the resources to perform their jobs well.
The analogy is that of a distributed denial of service (DDoS) attack. When cyber security experts decide to drop all traffic to their servers to prevent an attack aimed at overwhelming their equipment, can they claim a victory? After all, they have denied their own users access as well.
The advice from A10 Networks, which sells network security solutions, is to mitigate a DDoS attack by starting with the least invasive response, then escalating to something more aggressive when needed.
“It should be a dynamic strategy that says ‘I care about my users’,” said Don Shin, a senior product marketing manager at the company. “The dirty little secret here is collateral damage.”
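In code, “least invasive first” can be expressed as an escalation ladder, as in the hypothetical sketch below; the stages and threshold are illustrative, not A10’s product logic. Mitigation climbs a rung only while traffic stays well above its baseline, so legitimate users are the last to feel any collateral damage.

```python
# A hypothetical escalation ladder for DDoS mitigation: climb one rung
# only while observed load stays well above baseline. The stages and
# threshold are illustrative only.
ESCALATION_LADDER = [
    "rate-limit suspicious sources",             # least invasive
    "challenge clients before serving them",
    "divert traffic through a scrubbing service",
    "drop all traffic to the targeted service",  # last resort
]

def next_step(step: int, load_ratio: float) -> int:
    """Escalate one rung only if load is still over 3x the baseline."""
    if load_ratio > 3.0 and step < len(ESCALATION_LADDER) - 1:
        return step + 1
    return step

step = -1  # no countermeasure applied yet
for load in (4.2, 5.1, 2.0):  # observed load vs. baseline, per interval
    step = next_step(step, load)
    action = ESCALATION_LADDER[step] if step >= 0 else "monitor only"
    print(f"load x{load}: {action}")
```

De-escalating as the attack subsides would follow the same logic in reverse, restoring full access as soon as it is safe.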
This is important because the answer to all this complexity is not to take the simplest, straightest path available. What is needed is a smarter solution that can learn and cope better. That is the grand challenge now in this decades-old battle with cyber attackers.