The Metrics

What does a mass murderer have to do with information security metrics?

By Eh’den (Uri) Biber, CISM/CISSP/CISA/CRISC, member of the NeuroLeadership Institute.

A few days ago, on the 13th of December 2011, Belgians were shocked to discover that in Liège a gunman had killed five people and injured scores more.

For anyone who doesn’t know where Belgium or Liège is, I’ve enclosed a map below. If you can’t even identify Belgium on the small map, I suggest you search for it – it might be small in size, but it compensates with its beer and chocolates. Liège is located about 60 miles (90 km) east of the capital, Brussels.

Below you can find some photographs from the horrifying incident.

Profile of a killer

This is the killer, a man by the name of Nordine Amrani, of Moroccan origin. In 2008 he was sentenced to 58 months in prison after a cannabis-growing factory with 2,800 plants was discovered at his home, along with enough weapons for a typical Arnold Schwarzenegger movie (including a LAW rocket launcher, an AK-47 assault rifle, a sniper rifle, a K31 rifle, a FAL assault rifle and hundreds of cartridges). He was paroled in October 2010 by a sentencing court, and after receiving psychological guidance he moved into an apartment with his wife. It all ended last week.

Here’s a quote from Belgian VRT:

Two days after the carnage Nordine Amrani inflicted on the Place Saint-Lambert in Liège, forensic psychiatrists have spoken of the difficulty to predict repeat offending in such cases. It has been established that a prison psychiatric report drawn up about the attacker in May of last year stated that there was no big risk that Nordine Amrani would commit any serious crimes in the future… Leuven forensic scientist Rudy Verelst says that it is difficult to predict whether or not a convicted criminal will return to his old habits: “We can’t say accurately whether anybody will become a repeat offender. There is a graduation which we can draw up using several instruments. We look at the subject’s life and his earlier offences. We can make a judgement but it’s not watertight.”

During his evaluation the psychologists who tested him said he was not dangerous to society, and even though the prison authorities gave a negative recommendation, the sentencing court sided with the mind experts. The “mind experts” were wrong.

What does this have to do with information security? Well, everything.

On the 7th of December I participated in an online RSA webinar called “Metrics are Bunk!?: The Zombie Apocalypse, Baseball, and Security Metrics”. The webinar was led by Alex Hutton and Josh Corman, and showed the importance of metrics in everyday life, and how they make a difference versus our own biased view of reality. The session was fascinating, and at the end of it I asked Alex and Josh what kind of metrics we can use to identify human risks. The answer I received was that we can use different metrics to try to estimate the risk that a specific person will cause an information security breach.

The small, minor, insignificant problem is that the metrics currently used by information security experts are as effective as the metrics used to determine that Nordine Amrani was sane.

The way most “experts”, including people in the information security field, approach the subject of evaluating and predicting human behaviour is mostly driven by the field of science they practice. In this post I will give some examples of the problems this brings – problems that are as true for convicted prisoners as they are for end users in small, medium and large organizations.

Do not judge me

Let’s start with the penal system – the justice system that tries to preserve the essence of the society we live in from what we fear could become a human zombie-land. We punish people for their behaviour when they do something we believe is bad – each culture according to its own rules. If you had lived in South Africa during the days of Apartheid, you would have been punished severely for transgressing the racial laws of the country. If you live in Saudi Arabia, you will be punished if you are a woman driving without a male escort. If you are gay in a Muslim country like Iran and have had gay sex, you face the risk of being hanged.

The problem with all those approaches is that they are based on behavioural models we developed as civilizations and religions WAY before we had any knowledge of the way our brain and body work. The more we understand the complexity of our brain and how much it defines people’s perception, the more interesting ethical questions arise about our penal system. You cannot “change” gay people to stop being gay, especially not by trying to scare them with the threat that if they have gay sex they will die. Preventing women from having equal rights because thousands of years ago their main role in society was to produce children while we men fought endless wars does not correspond to the changes in society, the progress of technology, and the advantages women have over men in some cognitive tasks. And thinking that because you’re white you’re better than a black person is, well… simply sad.

The same bad assumptions follow us into our corporate rules. The deterrence approach, which defines consequences for employees’ activities, might look good on paper for auditors, but it does not provide any real means of handling the real problem: it cannot be considered a control mechanism. You can’t scare people away from connecting with each other by telling them it’s bad, you can’t prevent people from connecting to your network via unauthorized devices just because you said it’s wrong, and you cannot tell people that being nice to other people might cost them their jobs.

False commitments

Shlomo Benartzi, a professor and Co-Chair of the Behavioral Decision-Making Group at UCLA, recently stated in an interview that our brain has a very strange paradox when it comes to thinking about the future: “Humans are pretty bad at self-control in the present, but we have no problem delaying gratification when the reward will only arrive in the future. One study asked people to choose what they would eat a week from that day – a banana or chocolate – and to write the answer down on a piece of paper. Some chose the chocolate, but most people chose the banana. A week later the same group was invited back, told that their answer papers had accidentally been thrown away, and asked to eat what they had chosen. All of them took the chocolate. In the long term we have these fantasies that we will act responsibly, but in the present we all act like children.” (Calcalist, 2011)

This is one of the reasons why information security training fails. As recent data from the Corporate Executive Board shows, although 61% of organizations track user completion of training as their primary measure of success, only 7% say there is a demonstrable link between training and sustained behaviour improvement.

EVEN if your organization did a wonderful job with its information security awareness training (and most organizations have a very poor program – see my ranting in all my previous posts, lol), people might say they will behave responsibly, and they will believe it, but in reality they will probably act totally differently, because at that point in time it is what their brain feels they must do.

The approach of trying to scare them does not work either: if anything, it will trigger a strong emotional response and raise their fear levels, and when fear is strong, a person’s ability to think rationally is gone, because at that stage the brain is mainly being run by our emotional systems.

We don’t know Jack

Adding to the complexity of identifying who is more susceptible to a human manipulation attack by looking at “known patterns of behaviour”, people tend to totally ignore the fact that any one of us might drastically alter our susceptibility through a change in our biochemistry. And I’m not only talking about the obvious: alcohol, drugs, hormones. Stress can have a huge impact on our ability to handle reality, yet we don’t even know how to quantify it.

Take, for example, a person suffering from severe stress, which can cause the release of epinephrine (adrenaline), which might be converted in the body to adrenochrome, which in turn might be converted to adrenolutin. The last two, adrenochrome and adrenolutin, are hallucinogens – meaning they cause disorders of perception (disturbances of colour and shape vision), thought disorder, altered social responses and paranoia of the type often seen in schizophrenia, but no vivid visual hallucinations of the LSD type (“The Neurotoxicity of Glutamate, Dopamine, Iron and Reactive Oxygen Species: Functional Interrelationships in Health and Disease: A Review – Discussion”, John Smythies, 1999).

Such a person can then go on and start shooting other people, or act in a very unpredictable manner when it comes to information security – but since he does not suffer from vivid visual hallucinations, his perception of reality gives him no clue that his brain is now interpreting reality differently than it did before.

Because the transformation of adrenaline to adrenochrome is a result of oxidation, if you treat that person with niacin (vitamin B3) and change his diet, his brain will calm down. Since we can change a person’s perception of the world and his social behaviour simply by changing his diet and the stress levels he operates under, should we judge him based on the fact that he is now OK, or based on the fact that when he acted, his mind was hijacked by a chemical imbalance? We don’t know, and even worse, most of us don’t even consider such a possibility when we define our metrics.

I believe the current metrics used to assess a person’s susceptibility to human manipulation are useless. No one has come forward and presented a “live metrics” model for information security readiness, because no one thinks of monitoring employees’ biochemical state. No one is thinking of monitoring users’ biometric signals, and until we do, we are relying on statistics for our security. So even if you try to identify users who might behave riskily, any of your employees (including yourself) might become one due to bad diet, light conditions, food and noise levels – and chances are they (or you) will not even be aware of it.
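Just to illustrate what such a “live metrics” model might even look like, here is a purely speculative sketch: every signal, weight, scale and threshold below is a made-up assumption for the sake of the example, not an established or validated measure.

```python
# Purely hypothetical sketch of a "live" susceptibility score built from
# physiological and environmental readings. All signals and weights are
# invented assumptions - no such validated model exists.

def clamp01(x):
    """Clamp a value to the range [0, 1]."""
    return min(max(x, 0.0), 1.0)

def susceptibility_score(heart_rate_bpm, hours_slept, noise_db):
    """Toy score in [0, 1]; higher means (hypothetically) more susceptible."""
    stress  = clamp01((heart_rate_bpm - 60) / 60)  # 60-120 bpm mapped to 0..1
    fatigue = clamp01((8 - hours_slept) / 8)       # 8h sleep -> 0, 0h -> 1
    noise   = clamp01((noise_db - 40) / 40)        # 40-80 dB mapped to 0..1
    return 0.5 * stress + 0.3 * fatigue + 0.2 * noise  # arbitrary weights

score = susceptibility_score(heart_rate_bpm=100, hours_slept=5, noise_db=70)
print(f"susceptibility: {score:.2f}")  # prints 0.60
```

The point is not the formula – it’s that nobody is even collecting these inputs, so no such score exists in practice, and our “metrics” ignore the very factors that can flip anyone into the risky category.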

Brain scam

Now what about our brains? Can we find neurological metrics to identify people who have a higher risk of committing an information security offence, whether intentional or unintentional?

This brings us to another problem – we have no idea whatsoever, at least from an academic standpoint.

Not only has very little academic research been done in the last 20 years on the subject of information security awareness, I think we need to consider most of it not really useful. Many of the published papers were based on surveys; some research was done on “real” people, but even so I am not aware of any studies with a control group, nor of even one study that tried to check for subconscious, body-oriented signals. To my knowledge, no one has ever put candidates inside an MRI machine, presented them with information security questions and observed which parts of their brain showed increased activity (if you know of any academic research on the subject, please direct me to it – I’m very interested). Why does it matter which parts of the brain are active? Because if you don’t know which parts of the brain people use when reacting to a possible information security threat, how can you even begin to identify which people pose a higher risk?

I don’t even dare dream of research that checks for more complex correlations between body state and mental state, such as: does caffeine increase or decrease a person’s ability to be aware and responsible when it comes to information security? What is the best wall colour to use in order to increase information security awareness and response levels? Those are just two examples of environmental parameters that impact cognitive abilities, but no one seems to have gone down that path yet in our field. Why? Probably because most of the people who work in information security were never trained in neuroscience, so those ideas seem bizarre to them, to say the least.

A brave new world 2.0?

As you have seen, we face grave difficulties in trying to define usable metrics for predicting human behaviour. If we want to reach a higher level of information security readiness we not only need to change our users’ behaviour, but also teach them ways to overcome what they are not aware of – and the second is a really hard problem to solve. As this deserves a much longer post of its own, I will leave you with a thought:

So the next time someone asks you for metrics for humans, you can simply quote William Shakespeare: “The fool doth think he is wise, but the wise man knows himself to be a fool.” [As You Like It]

The more people are aware of their lack of understanding, the wiser they will be, and the more secure they will become.

© All rights reserved 2011.


7 thoughts on “The Metrics”

  1. Here is a link to that webinar recording

    Metrics are Bunk!?: The Zombie Apocalypse, Baseball, and Security Metrics

    Now that I’ve read this post, I realize we didn’t even come close to answering the intent of your text-based question.

    P.S. I’ve asked the securitymetrics list about MRIs during security questions.

    • Hi Joshua
      First of all, let me say again that the presentation you and Alex gave was truly awesome – if you haven’t seen it yet, it’s a must.

      Just to explain how scary Alex Pentland’s work is, I’ve updated the blog with a link to his latest presentation on the subject of Influence Networks. It’s a video of a presentation he gave on the 9th of November, and in it he talks about his latest research.

      Around the 43rd minute he makes the following statement, which to me is the most mind-blowing human metrics claim I’ve ever heard:

      It turns out we can do even better than this. Since then, what we’ve done is put the two things together – the fact that we can predict when you’re likely to buy, and that we can recruit – we can plot things that cause recruitment – and you can actually write down equations, which gives you analytic models for ad campaigns and behavioral campaigns. That’s the first time you’ve ever heard that, OK? People always say “if you spend this much money we will give you that much yield”, but we can actually tell you with 95% confidence, and we’ve done this on stock purchases and app downloads, so believe that it’s real. Pretty interesting – and scary, because it also works for political views.

      This is why I asked the question about monitoring people – because from the look of it, we’re heading fast toward a very monitored future. What do you think?

      Thanks for the reply

  2. Hey. As I tweeted, I liked your post, but cringed (violently) at the sight of that pie chart. Given what it was trying to communicate, I would think just an ordered list (with some magnitude description) would have been a better approach. Charts like that help foster the image of our profession being slightly more advanced than cavemen drawings and traveling dowsers.

    • Hi
      First of all, let me say that when I saw the pie chart for the first time I also sounded like Barbara Walters when Herman Cain told her he wanted to be Secretary of Defense. (WHAT?????)

      But then I thought to myself – isn’t it a great way to show how bad the current metrics are?

      So wonderful of you for getting it 🙂


    • Hi Bob again
      I was thinking that maybe I should have given the translation of the chart (as I understand it), to show how funny it is.

      61.2% of organizations think their awareness program was successful based on the percentage of users who completed their information security awareness training (probably defining a percentage of the user population that is considered OK).
      12.2% of organizations think their awareness program was successful because they see a reduction when comparing current vs. past numbers of security incidents caused by user error.
      6.1% of organizations think their awareness program was successful because users who completed awareness training seemed, when surveyed, to understand information security better.
      4.1% of organizations think their awareness program was successful because the percentage of users who completed at least one information security awareness session seems to them like a good figure.
      2.0% of organizations think their awareness program was successful because they checked it by actively targeting users and seeing whether they responded responsibly (was it every user? a sample? we don’t know…).
      14.3% of organizations think their awareness program was successful because… we don’t really know. I’m fairly certain those organizations don’t know either…

  3. Such an insightful observation. I had never thought of this in that way, but as I read it, it is so, so true. As you have shed light on the area of neuroscience and human behaviour in relation to infosec, I believe people will start to explore that area. Great post.
