The following diagram shows a simplified version of the full system dynamics model. '+' indicates a proportional relationship and '-' indicates an inversely proportional relationship.

Detection trap:
Have you ever wondered why, most of the time, an organization that comes under attack is usually under-invested in security controls, or has no security controls at all? The detection trap extracted from the above diagram is a good explanation of this observation.

Trust trap:
Sometimes, well-intentioned measures from management may themselves lead to attacks. The following diagram shows how this unfolds with the level of trust management places in its own employees.

When management places higher trust in its employees, it may decide that it does not need extensive security controls to monitor them, in the belief that hardly any employee will turn into an enemy of the company. With less detection capability, it is natural that only a few attacks are detected while many go unnoticed. With fewer reported attacks, managerial trust rises even higher. This loop also feeds back on itself and hence creates the trust trap. Why does it happen like this? One possible reason: as the loop feeds back, the employees' perceived risk of getting caught falls.
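To make the loop concrete, here is a minimal simulation sketch of the trust trap. The relationships are assumed to be simple linear ones and the variable names and constants are purely illustrative; they are not taken from the referenced model.

def simulate_trust_trap(steps=10, underlying_attack_rate=0.2):
    # Management's trust in employees on a 0..1 scale (illustrative starting point).
    trust = 0.5
    for step in range(1, steps + 1):
        # Higher trust -> less investment in monitoring, so lower detection capability.
        detection_capability = 1.0 - trust
        # Only a fraction of the underlying attacks/probes are actually noticed.
        detected = underlying_attack_rate * detection_capability
        # Few detected attacks -> trust rises further (the reinforcing loop).
        trust = min(1.0, trust + 0.05 * (1.0 - detected))
        print(f"step {step}: detected={detected:.3f}, trust={trust:.3f}")

simulate_trust_trap()

Running the sketch shows trust drifting upward while detected attacks drift toward zero, even though the underlying attack rate never changes: that is the trap.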
Unobserved emboldening:
While those two pitfalls continue to feed back, the following shows how the insiders' perception of risk changes and then leads to full-blown attacks.

When an insider attempts to do something wrong and it goes unnoticed, their perceived risk of doing it falls. Hence, they tend to do more probing. Notice that this loop also feeds back, lowering the perceived risk with each iteration. (This scenario holds in other situations too. When a person does something society does not accept and it goes unnoticed, that person may move on to even bigger offences. It does not always need to be a crime; the intention could be innocuous. For example, a person may speed for fun, and if they are never stopped by the police, they may be tempted to go even faster.) When the perceived risk drops below a certain threshold, the insider may carry out the actual attack.
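A similarly hedged sketch of this emboldening loop, again with made-up numbers: each unnoticed probe lowers the insider's perceived risk, a detected probe raises it sharply, and an attack is attempted once the perceived risk drops below a threshold.

import random

def simulate_emboldening(detection_probability=0.1, attack_threshold=0.2, seed=1):
    rng = random.Random(seed)
    perceived_risk = 0.9   # illustrative starting perception of getting caught
    probes = 0
    while perceived_risk >= attack_threshold:
        probes += 1
        if rng.random() < detection_probability:
            # A detected probe sharply raises the perceived risk again.
            perceived_risk = min(1.0, perceived_risk + 0.5)
        else:
            # An unnoticed probe emboldens the insider (reinforcing loop).
            perceived_risk *= 0.8
        print(f"probe {probes}: perceived risk = {perceived_risk:.2f}")
    print(f"attack attempted after {probes} probes")

simulate_emboldening()

The threshold here simply stands in for the point at which the insider judges the payoff to outweigh the remaining risk of getting caught.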
It should be noted that not all insiders act like this; in fact, only a minority do. (Security controls are there to protect against a few bad actors while making sure the good majority is not negatively affected by these measures.) This happens only when things go wrong, when things don't work out the way employees want: for example, no recognition for work, no bonus or salary increase, lower pay, or the possibility of being laid off. In any case, in order to have a healthy and safe working environment, management needs to show a certain level of trust while keeping the level of risk (as perceived by insiders) at an acceptable level (e.g. through training, legal prosecution of wrongdoers, security controls, etc.).
Ref: Preliminary System Dynamics Maps of the Insider Cyber-threat Problem, 2004.