You spend upwards of eight hours a day doing everything in your power to keep your system secure. So when an individual from within your own organisation is the cause of an information security breach, it can be exceptionally frustrating.
The urge to come down on the person or team responsible can be overwhelming, especially when your neck is on the line. Accordingly, much of the rhetoric surrounding accidental insider threats follows the same general thread of blame:
Users are stupid/careless/outright negligent. To futureproof the organisation, we need to focus on education and improve IT awareness.
Nine times out of ten, however, the problem is not poor education but good intentions. In circumventing approved applications or working on unsanctioned tech, your users are only trying to do their jobs better.
Could recognising your users’ attitudes towards your current systems be key to stopping future leaks? And if so, could developing your systems around the needs of the users enable them to do their jobs more easily and more productively?
Identifying risky behaviour by walking a mile in your users’ shoes
Negligence occurs when people look for ways to avoid policies they feel impede their work.
The word is inescapable in this context, but it has also become a scapegoat, the first port of call for the security professional looking to define, manage, and take steps to prevent insider breaches.
It strongly implies thoughtlessness, when in reality the ‘negligent’ act usually occurs because an employee has found an alternative application or technology that they perceive as enabling them to do their job better.
CTO Archana Vemulapalli summed this up in an article for Government Technology last year when she said:
I don’t assume that anybody that has enrolled in any potential what you would call “shadow IT project” has done it because they just want to be defiant or they want to go do something on their own. I think there is a true business need.
In this context, there is nothing thoughtless about the actions that lead to accidental insider threats. ‘Much of the unsanctioned tech … arises not out of ignorance or spite for the foundational IT rules, but rather as the result of unmet business needs.’
Whilst the above scenario goes against best practice as well as the organisation’s security policy, it’s crucial that you consider why it has happened in the first place, and what you can learn from it in order to simultaneously reduce the risk of unintentional insider threats and improve operational capability.
Building an IT system around the users
According to a recent Ponemon report, unintentional employee negligence severely diminishes the productivity of the IT department, with IT security officers reportedly spending an average of almost three hours each day dealing with the security risks caused by employee mistakes and negligence.
Most employees recognise the importance of compliance and have a general awareness of security risks. They know their workarounds can be risky, but they take that risk anyway because they feel that the current processes or technology in place fails to meet their needs.
For instance, an employee might knowingly circumvent the company’s data security policy by using a consumer-grade file hosting service to take work home and meet a pressing deadline.
The flaw in the approach across the IT sector is to assume that the current systems are acceptable simply because they function. I would contend that if we approached the issue from the perspective of ‘actually, our users prefer these devices’, or ‘our users aren’t getting on with this particular application’, then our approach to what we provide (especially from a security standpoint) would be very different.
Failure to address user attitudes puts you at risk of accidental insider threats
We need to stop thinking of users as the problem, and start thinking about how IT is outfitted to meet their daily requirements. On the whole, your people aren’t wilfully setting out to compromise your system, and they certainly aren’t stupid. They want to do their job better.
Of course, organisations need to put systems in place to identify instances of insider data breaches or leaks in as close to real time as possible. We’d hazard a guess that, like most, your IT department is understaffed and overworked. This is where the value of modern technologies, powered by machine learning, becomes most apparent. Statistical analysis that removes excessive or repetitive work from your desk is key. Tools like user and entity behaviour analytics (UEBA) provide threat intelligence by tracking user behaviour, while data loss prevention (DLP) software can take a lot of weight off your shoulders by automatically enforcing corporate policies.
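To make the idea of statistical baselining concrete, here is a deliberately minimal sketch, not taken from any particular UEBA product, of the kind of check such tools perform: compare a new observation (say, a user’s daily file-transfer count) against that user’s historical baseline and flag values that sit far outside the norm. The function name, the z-score threshold, and the sample data are all illustrative assumptions.

```python
from statistics import mean, stdev

def is_anomalous(baseline, new_value, threshold=3.0):
    """Flag `new_value` if it deviates from the user's historical
    baseline by more than `threshold` standard deviations.

    `baseline` is a list of past daily counts for one user, e.g. the
    number of files they transferred each day. This is a toy z-score
    check, not a production detection algorithm.
    """
    mu = mean(baseline)
    sigma = stdev(baseline)
    if sigma == 0:
        # A perfectly flat history: any change at all is unusual.
        return new_value != mu
    return abs(new_value - mu) / sigma > threshold

# A user who normally moves around 10 files a day:
history = [9, 11, 10, 12, 8, 10, 11]
print(is_anomalous(history, 300))  # a sudden bulk transfer is flagged
print(is_anomalous(history, 12))   # an ordinary day is not
```

Real UEBA tooling layers far richer models (peer-group comparison, time-of-day patterns, multi-signal scoring) on top of this basic idea, but the principle is the same: let the statistics surface the handful of events worth a human analyst’s attention.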
With insider threat detection systems in place, you should have considerably more time to, say, sit down with your users and find out exactly how they are working with your IT. As conversations go, it’s a pretty good day when you can reduce the risk of breaches, improve operational efficiencies, get your users on side—and still leave the office by half five.