Wednesday, September 30, 2009

Too much trust is not a good thing

In any organization, you won't disagree that we need some level of trust in order to have a healthy working environment. Project managers trust developers to meet deliverables and to develop according to the specification. System and network administrators are trusted not only to keep the infrastructure functional but also to safeguard it from outsiders. Hospital employees are trusted not to misuse patient records. Bank employees are trusted not to misuse or illegally modify financial records. Yet this very trust can become a negative factor. I found an interesting report which explains three traps. The report is the result of a 2004 workshop of 25 researchers from various disciplines, who came together to build a system dynamics model in order to better understand insider threats and attacks.

The following diagram shows a simplified version of the full system dynamics model. A '+' indicates a proportional relationship and a '-' indicates an inversely proportional relationship.

Detection trap:
Have you ever wondered why, most of the time, an organization that comes under attack turns out to have under-invested in security controls, or to have no security controls at all? The detection loop extracted from the above diagram is a good explanation of this observation.

When the organization's perceived risk increases, the management is willing to invest in detection measures (in the hope that the perceived risk will drop). With better detection mechanisms, it is likely that more insider attacks, and attempts at such attacks, will be detected. When the number of cases goes up, it is natural to perceive that the organization is under higher risk. Notice that this loop feeds back on itself. At the same time, the inverse is also true! At some point, the organization may perceive that its level of risk is low (due to better education, better controls in place, better management, etc.). This motivates the management to invest less in detection capabilities. With few measures in place to catch wrongdoing, it is likely that not many cases are caught. The organization may then perceive even less risk, since so few cases are detected. Notice that this loop feeds back as well. Hence the detection trap.
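The reinforcing nature of this loop can be sketched as a toy simulation. The coefficients below are illustrative assumptions of mine, not values from the report; the only point is that the same '+' links amplify the loop in whichever direction it starts moving.

```python
# Toy simulation of the detection trap (the `gain` values are
# illustrative assumptions, not values from the report).

def run_detection_loop(perceived_risk, gain, steps=5):
    """One pass through the loop: perceived risk drives detection
    investment, investment drives detected cases, and detected
    cases feed back into perceived risk (all '+' links)."""
    history = [perceived_risk]
    for _ in range(steps):
        investment = perceived_risk          # '+': risk -> detection investment
        detected_cases = gain * investment   # '+': investment -> detected cases
        perceived_risk = detected_cases      # '+': cases -> perceived risk
        history.append(perceived_risk)
    return history

# The same loop spirals up or down depending on its effective gain:
escalating = run_detection_loop(1.0, gain=1.5)  # risk keeps climbing
eroding = run_detection_loop(1.0, gain=0.5)     # risk and investment wither away
```

Whether the loop escalates or erodes depends only on its overall gain, which is why organizations tend to end up at one extreme or the other rather than in the middle.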

Trust trap:
Sometimes, well-intentioned measures by the management may themselves lead to attacks. The following diagram shows how this unfolds with the level of trust the management places in its own employees.

When the management has higher trust in its employees, it may decide that extensive security controls to monitor them are unnecessary, in the belief that hardly any employee will turn into an enemy of the company. With fewer detection capabilities, it is natural that only a few attacks are detected while many go unnoticed. With fewer reported attacks, managerial trust climbs even higher. This loop also feeds back, and hence creates the trust trap. Why does it play out this way? One possible reason: as the loop feeds back, the employees' perceived risk of getting caught keeps falling.
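As a rough numeric sketch of the trust trap (the formulas and constants below are my own illustrative assumptions, not part of the report's model), trust crowding out monitoring produces a one-way ratchet:

```python
# Toy simulation of the trust trap (formulas and constants are
# illustrative assumptions, not from the report's model).

def run_trust_loop(trust, steps=6):
    """'-' link: more trust means less monitoring; '+' link: less
    monitoring surfaces fewer cases; '-' link: fewer detected cases
    push trust even higher. The loop ratchets trust upward."""
    history = [round(trust, 2)]
    for _ in range(steps):
        monitoring = max(0.0, 10.0 - trust)  # trust crowds out monitoring
        detections = 0.1 * monitoring        # weak monitoring catches little
        trust += 1.0 - detections            # few cases reported -> trust grows
        history.append(round(trust, 2))
    return history

trajectory = run_trust_loop(5.0)  # trust only ever climbs; monitoring decays
```

Nothing in the loop ever pushes trust back down, which is exactly why it is a trap: the absence of detected cases is read as evidence that monitoring is unnecessary.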

Unobserved emboldening:
While those two pitfalls continue to feed back, the following shows how the perception of risk by the employees/insiders changes and can then lead to full-blown attacks.

When an insider attempts to do something wrong and it goes unnoticed, their perceived risk of doing it falls. Hence, they tend to probe more. Notice that this loop also feeds back, lowering the perceived risk with each iteration. (This scenario holds in other situations too. When a person does something that is not acceptable to society and it goes unnoticed, that person may be tempted into even bigger crimes. It does not always need to be a crime; the intention could be innocuous. For example, a person may speed for fun. If that person is never pulled over, they may be tempted to go even faster.) When the perceived risk drops below a certain threshold, the insider may carry out the actual attack.
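The probing loop and its threshold can be captured in a few lines. The decay factor and threshold here are hypothetical numbers chosen for illustration; the report does not quantify them.

```python
# Toy model of unobserved emboldening (the decay factor and
# threshold are hypothetical values chosen for illustration).

def embolden(perceived_risk, threshold=0.2, decay=0.7, max_probes=50):
    """Each unnoticed probe multiplies the insider's perceived risk
    by `decay`; once it drops below `threshold`, the insider is
    emboldened enough to carry out the actual attack."""
    probes = 0
    while perceived_risk >= threshold and probes < max_probes:
        probes += 1                # another probe goes unnoticed...
        perceived_risk *= decay    # ...so getting caught feels less likely
    return probes, perceived_risk < threshold

probes, attacked = embolden(1.0)  # a handful of unnoticed probes, then the attack
```

In this sketch any detected probe would simply stop the decay, which mirrors the point of the section: visible enforcement keeps the perceived risk above the threshold.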

It should be noted that not all insiders act like this; in fact, only a minority do. (Security controls exist to protect against a few bad actors while making sure the good majority is not negatively affected by these measures.) This happens only when things go wrong, when things don't work out the way the employees want - for example, no recognition for work, no bonus or salary increase, low pay, or the possibility of being laid off. In any case, in order to have a healthy and safe working environment, the management needs to show a certain level of trust while keeping the level of risk (as perceived by insiders) at an acceptable level (e.g. through training, legal prosecution of wrongdoers, security controls, etc.).

Ref: Preliminary System Dynamics Maps of the Insider Cyber-threat Problem, 2004.
