
Security Awareness Is For Life, Not Just For Compliance

Like the old adage about puppies and Christmas, educating users about security threats, highlighting their vulnerabilities and countering the resulting risks is a long-term commitment. That’s an easy statement to agree with, but a hard one to turn into practical, effective action.

This is part 1 of a two-parter. Here I offer my perspective on some of the long-standing problems and challenges of educating staff about security, and in part 2 I’ll take a look at what’s going on in the service and system market to try to improve things.

As a departure from my regular posts, there are no references out to stats and articles on people risk. That’s because I see surveys and papers contradicting each other every week (maybe do a quick web search for the Verizon DBIR and go from there). This week human beings seem to be beating tech exploits as the prime cause of data breaches, but someone will tell me that’s changed shortly. Instead I’m basing this on the fact that no technology works (or gets misconfigured, broken or bypassed) without a person making a decision and acting upon it.

We do security awareness…honest!

If you work in a regulated industry (or supply goods and services to firms that are), have you ever found yourself casting around for some security awareness related evidence for an RFI or compliance questionnaire? I’ll help jog your memory:

Q1: Are your staff contractually required to ensure they use IT systems securely and protect the Confidentiality, Integrity and Availability of company and customer data?

A: YES
Evidence: Template contract contents on confidentiality and non-disclosure.

Q2: Are your staff made aware on commencing employment of their security responsibilities?

A: YES
Evidence: Staff handbook and/or statistics showing you ticked the “Computer Based Training (CBT) complete” or “Standard induction including the bit on security complete” boxes.

Q3: Do staff receive security awareness training throughout their period of employment?

A: YES
Evidence: The security CBT package details, the policy bit that says users have to do the CBT every year and the statistics showing % of staff who have completed it.

Did I miss anything? Probably not.

How much has all this reduced incidents born of carelessness, ignorance, annoying control workarounds, or criminal intent? No idea? That’s not surprising. Insider risk is something we all like to talk and worry about, but very few of us benchmark and act upon it. Well, it was very few; it’s now a few more, because budgets have been loosened by data breach related media frenzies.

Bloody users

The subtitle refers to a sentiment I’ve heard from a few network administrators over the years. To use it in a sentence: ‘The network would be fine if it weren’t for the bloody users’. Awareness stuff is for ‘them’, not for technical, security, HR, risk or senior staff.

That, if you’ll excuse my frankness, is stupid. Which user groups pose the biggest inherent risk because of the kind and quantity of data and access they use? In fact, if we’re talking workaround-related incidents, the C-suite can be prime suspects. I wrote a post entitled “Are you a C-suite security vulnerability” because I know how many times security is swerved to help senior staff get their job done.

You may well enforce CBT completion for 100% of staff, but because it’s ‘one size fits all’ it really is a waste of time for many (guess how many entertain themselves by trying to hack a top score, or skip straight to the end). And you can bet your bottom dollar a chunk of senior managers have their PAs do it, or have an exception made by the security team.

This perception problem also hampers engagement and support. It can’t be “We must do user security awareness”. Instead we should work towards “We must tackle OUR insecure behaviour”. That adjustment in thinking and focus underpins the culture change needed to get something worthwhile done.

You what?!

We mostly don’t make sense. That’s a pretty sweeping statement, but most of the security professionals I know are pretty poor at tailoring their conversations for different audiences. Even the ones who think they do are still communicating at a level (or three) more technical than makes sense to the average network user. That’s not to say users can’t grasp complex concepts (most do), but we’re like boiled frogs. We’ve osmosed our TLAs (three-letter acronyms) and the layered context for information over many years, and peeling that back to basics is tough. If your totally non-techie Mum/sibling doesn’t get it, neither (I’d argue) do you.

Then we talk about dumbing things down – what a great message to send, and indicative of the disdain some can direct at non-techies. Heck, I got called a ‘Muggle’ the other day by a coder who didn’t think much of my understanding of technical security. He was right, as it happens. I’d got the wrong end of a stick, but two hours later I got it, whereas he’ll probably never get (or care) how counterproductive that kind of reaction is.

Ridicule…what a fabulous way to encourage people to learn…it didn’t work at school (who here has a flashback to a time a teacher humiliated them to make a point, but doesn’t remember the point?), so why should it now?

Informing vs habit forming

What’s tougher still is intuiting what will mean enough to an audience to form a lasting memory. Then helping them to take the next crucial step… ensuring that memorised information will pop up and influence established insecure habits when it matters.

Ask any ex or current smoker how hard that is. All the higher thought processes say “it’ll kill you”, but they still light up. The urge that drives that action is far more visceral (like hunger or fear) than anything reasoned and rationalised. The training to overcome it is therefore more basic too: part Pavlovian (unpick the ‘have a pint = smoke’, ‘have a meal = smoke’, ‘have an argument = smoke’ action and reaction and replace it with something else) and part finding mental hooks strong enough to kick in before the quick, dirty and easy action is taken.

Pulling it back to something closer to tech:

Does the average user ever worry about the permissions requested by apps? Going a step further, have they ever worried about how secure those apps are once installed? At some level I’d expect the answer to be yes, but how often has that stopped them installing (or made them uninstall) an app ‘needed’ to get their hotel/train booked, keep the kids quiet, open a doc in an inconvenient format, or get music downloaded from a proprietary service? The same happens with security. Good intentions struggle to outweigh convenience, cost and other types of temptation.

It’ll never work

Some high-profile bodies in the industry say educating users is a waste of time. Why? Because the sophistication of attacks will always fool a greater or lesser subset of staff, and if a subset falls victim, the attacker’s job is done.

Phishing is a case in point (in fact too often the ONLY case in point, because it lends itself to measurement). A skilled social engineer running a spear phishing (individually targeted) campaign will probably get the target to bite 80-95% of the time. Hard to believe? It shouldn’t be. These folk find out about your role, your colleagues, your hobbies, your family…basically whatever it takes to convince you they are kosher.

Talking to the head of security at one retail company, I learned that one of their biggest people risks was calls to the C-suite saying the CFO needed a money transfer authorised quick smart. A credible current deal is referenced, transfer details are exchanged and off goes the cash to somewhere it shouldn’t. That sounds really unlikely, doesn’t it? But it isn’t, and it works. Would we get to see it in stats? Unlikely (would you want that shared about your top bosses?).

It’s all pointless then, isn’t it?

NO! Folk in our trade can be buggers for saying something is pointless unless it’s 100% effective. There are marketing, education and training techniques that can dramatically decrease the chance that people will be fooled by iffy computer, phone, email and face-to-face interactions. They can also reduce the risk of a WTF moment when doing the right thing is a tad tougher than the alternative.

The practical and tech details of the next effective scam will always get ahead of the technical means to spot, curtail and stop them. Users will bridge the gap until the monitoring and defensive solutions catch up, so we need to equip and motivate them to pause for thought and (like your network kit, if you’re doing it right) default to deny.
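For the technically minded, here’s a minimal, purely illustrative sketch of that default-to-deny mindset in code. The addresses, function name and out-of-band check are my own made-up examples, not a recommended implementation: the point is simply that unless a request matches an explicit allow rule and has been verified through a separate channel, the answer is no first and the questions come after.

```python
# Purely illustrative sketch of "default to deny" for a sensitive request,
# whether it arrives by email, phone or in person. The addresses and the
# out-of-band verification flag are hypothetical examples, not a real process.
ALLOWED_REQUESTERS = {"cfo@ourfirm.example", "payments@ourfirm.example"}

def act_on_transfer_request(sender: str, verified_out_of_band: bool) -> bool:
    """Act only if the request matches an explicit allow rule AND has been
    confirmed via a separate, known-good channel; everything else fails closed."""
    return sender in ALLOWED_REQUESTERS and verified_out_of_band

# A spoofed "urgent transfer" mail from a look-alike address is refused by default.
print(act_on_transfer_request("cf0@ourfirm.example", False))  # False
print(act_on_transfer_request("cfo@ourfirm.example", True))   # True
```

The code itself isn’t the point; the decision logic is what we want people to internalise.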

It won’t prevent all exploits, but a well-targeted, effectively delivered tech/education approach can reduce your human attack surface and make you a darn sight less attractive as a victim.

Where then does one start?

Being very, very honest, I would suggest binning what you do now. Not because it’s all bad, but because it’s almost certainly generic and loathed by staff. How much of yourself would you invest in learning something that’s part bloody obvious, part irrelevant and generally feels like a waste of valuable time? Whether that’s fair to the content or not, it’s what happens, and once it has, you’ve lost them.

A whizzy new logo, a ‘funky’ character introducing it, or making it available on smartphones ain’t going to magic up the kind of engagement needed for stuff to really sink in.

What might work?

  • Tailoring education to match professional and personal psychological hooks
  • Weaving it into work life
  • Implementing it creatively
  • Continual delivery
  • Entertaining to embed it

In other words, equipping and encouraging people to Think TWICE (yes, that is Celine, and no, I’m not going to break into song…but you can, because I’ve written some cybersecurity karaoke lyrics for your musical pleasure) when confronted with a choice between secure and insecure behaviour. And for that we need to consult some experts. Being a security professional doesn’t automatically mean you get psychology and marketing, but I’m willing to bet your company employs a load of people who do…


So that’s part 1; part 2 coming soon.

6 replies

  1. Perfectly summarised Sarah. At the moment Security “Awareness” is driven by our audit/compliance function…need I say more!!!?? They “measure” who completes the training annually – that, apparently, is good enough for auditors and 3rd party security assessments. As you know, this in turn gives the execs of the company a very false sense of security, which makes my job unnecessarily more difficult. I could go on but I won’t 🙂

    Very much looking forward to part 2!!!

    -markc


    • Thanks for the generous review, Mark, and as you saw, I feel your pain. Part 2 will definitely come. Right now I’m a tad snowed under with paying-the-bills type activity (writing doesn’t pay for much of that right now – surprise, surprise 🙂 ), but it’s still on the list.


    • I follow your tweets and read your articles often, Pete. Love that presentation. It’s applying the same smarts we pride ourselves on technically to the bits of the network we can’t directly control. So vital. One of my fave parts was the guy who suggested getting marketing experts in to redefine light as a way to turn the light off :-). I wasn’t half so original… my least traditional suggestion was getting my 6’5″ hubby to pick one of the kids up so they could smear it with something… they’re world beaters at smearing stuff with stuff, so playing to family strengths. Will put aside more time to look round isecom.org. Thanks for taking the time to comment!


      • Thanks Sarah! Totally agree with the smearing! Kids are awesome at being sticky 🙂 If this is a topic that interests you, send me an e-mail (pete at isecom dot org) and we can collaborate on SALT. It needs a manager…

