Grant Thornton – Insider Threat Q&A
We spoke to Derek and Rohan to capture some of their thoughts. To hear more, you can register to listen to our webinar Getting Inside Insider Threat.

Q: As insider threat grows at an alarming rate, it continues to be one of the most challenging problems for organizations to solve. Do you think that’s because simply defining insider threat can be problematic?

A: I do think it can be, because insider threat is not just one thing. The way that Grant Thornton defines it can be broken down in two different ways: one, what type of user is causing the issue, and two, what happens as a result and what the impact is.

Looking at the types of user causing the issue, you can have the malicious insider, bent on theft, fraud or espionage; the negligent insider, who simply makes a mistake resulting in the accidental disclosure of information; and finally the compromised user, who has been hacked or phished, resulting in data loss that is actually the work of someone else.
The end result also varies: the typical cost incurred from a malicious insider is $768k per incident, but these incidents make up only 14% of all insider incidents. Far less impactful in terms of per-incident financial loss is the negligent user, who on average “only” costs an organization $371k – but these users account for 62% of all incidents, so their cumulative impact may be greater.
The types of behaviour also vary for each of these users, which makes them hard for employers to monitor. The activity could be as simple as sending an email with an attachment – something we all do every day – or as obvious as the mass upload of sensitive data to a file-sharing site.
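The cumulative-impact point can be checked with simple arithmetic on the figures quoted in this answer (a back-of-envelope sketch using only those averages, not new data):

```python
# Back-of-envelope check of the cumulative-impact claim, using only
# the average figures quoted above (illustrative, not new data).
COST_MALICIOUS = 768_000   # avg cost per malicious-insider incident ($)
COST_NEGLIGENT = 371_000   # avg cost per negligent-insider incident ($)
PER_100_MALICIOUS = 14     # malicious incidents per 100 insider incidents
PER_100_NEGLIGENT = 62     # negligent incidents per 100 insider incidents

# Total expected cost contribution per 100 insider incidents of each type
total_malicious = PER_100_MALICIOUS * COST_MALICIOUS   # $10,752,000
total_negligent = PER_100_NEGLIGENT * COST_NEGLIGENT   # $23,002,000

print(f"malicious: ${total_malicious:,}  negligent: ${total_negligent:,}")
```

On these numbers, negligent insiders cost roughly twice as much in aggregate as malicious ones, despite the far lower cost per incident.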
Q: What are some of the impacts that you see hitting businesses as a result of insider threat?

A: Insiders are just people, and every person is different. Correspondingly, the impact of insider threat varies greatly. It may be just an embarrassing error of having sent the wrong file externally, although of course, under regulations such as CCPA or GDPR, these breaches could result in large fines.

It could also mean significant loss of intellectual property, or losing critical information to a competitor.

The length of time an insider threat takes to come to light is also important. On average, organizations take 77 days to contain an insider threat incident, and the longer it takes, the more it costs. If it takes 90 days to find and contain, on average that’s costing an organization $13.7 million; for those found and managed within a month, that cost comes down to $7.1 million. (Research from the Ponemon Threat Detection study 2020.)

At the heart of it all, however, is a loss of trust. Once an insider threat comes to light – and it will, eventually – the end customer of any business loses trust in that business’s ability to retain and protect critical, and in many cases personal, information.
Q: Why are financial services companies more at risk?

A: Insider threat actors are, for the most part, driven by financial gain, and so for those of us working in the financial services industry, we’re going to be a target. We deal with huge volumes of valuable, sensitive information, which is why the financial services industry is most at risk of insider threat. We’re not alone, however – life sciences, retail and telecom firms are also targeted.
Q: How have you helped organizations to address some of these challenges? Do you think you have uncovered some best practices?

A: Most organizations adopt a one-dimensional approach, typically relying on technology alone, to reduce insider threat incidents, only to find that the problems persist and continue to grow. It is crucial for organizations to establish a “human-centric” and holistic approach to overcome this challenge. However, developing a strong and scalable insider threat program with a human-centric approach is not easy.
You can’t take a broad-brush approach to an insider threat program: you need to look at the people within your organization, assess where the highest risks are, and take steps accordingly.

This is why we find our partnership with Forcepoint so productive, as you truly understand and lead on the human-centric approach: we need to look at the human behavior behind the threats. For example, people who are not performing well, or who are in HR or business disputes, are more likely to feel aggrieved and attempt to siphon off or even sell data. In a completely different situation, power users (or privileged users) may be most at risk of becoming compromised or making a mistake, simply because they have the most access to the most sensitive and valuable data.
At Grant Thornton we aim to focus on four strategic pillars: governance, process, people and technology.
Q: If you could give us some key considerations and steps to develop an insider threat program, what would they be?
A: Starting with governance, right up front you need sponsorship for any insider threat program from top executives, and it needs to be guided by, and have input from, a wide range of departments.

Then you need your process in place: to protect your data, first you need to know what you’ve got and where it is! Then you can begin, as we mentioned above, to assess your high-risk areas and the protection you’d like to give them.

Insider threat is about protecting people, so the next step is to work with your people. It’s got to be transparent and proportionate, and your people need to be educated on what you’re doing and why.

Finally – and this really is the last step – it’s about technology. Which solutions do you need? What works for your environment? Will it integrate with your existing technology? Independent consultants will be invaluable in this step.
This post was first published on the Forcepoint website by Homayun Yaqub.