(The last) Talk #3 for the day: if I rated this talk on a scale of 1 to 10, with 10 being the best, I'd give it all ten points without holding anything back! It goes with the line "best things come last" (actually I am rather short of words to express it; maybe there's a better line, but this is how we say it in our mother tongue: "hoda hoda sellum eliwena jameta" :-)
The presenter was Dr. Charles P. Pfleeger, a consultant, speaker, educator, and author on computer and information system security. The talk was part of the CERIAS seminar series.
He talked about six dumb ideas in computer security. These are additions to what Marcus J. Ranum had to say about the $subject. Here is a summary of what I remember from his talk.
#1 We'll do security later.
The point is that you cannot retrofit security; you have to think about security up front.
#2 We'll do privacy later.
The idea is that we should follow fair information practices: you have to tell users what you are going to do with their data.
#3 Encryption is a cure-all
Encryption is overrated. You need to think about the periods when the data is in the plain. For example, even with end-to-end encryption, where the data is never in the clear in transit, the data is still in the clear at the start of the transaction and at the end of the transaction. Other issues, less severe than the one I mentioned above, are key management and implementation/algorithm weaknesses.
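To make that point concrete, here is a toy sketch of my own (the XOR "cipher" below is NOT real cryptography, just a stand-in): no matter how strong the cipher, the plaintext exists in the clear in memory at both endpoints, before encryption and after decryption.

```python
# Toy illustration (NOT real cryptography): even with "end-to-end"
# encryption, the data is in the clear at both ends of the transaction.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Symmetric toy cipher: the same call encrypts and decrypts.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = b"shared-secret"

# Sender side: the plaintext is in the clear *before* encryption.
plaintext = b"card=4111-1111"
ciphertext = xor_cipher(plaintext, key)  # protected only while in transit

# Receiver side: the plaintext is in the clear again *after* decryption.
recovered = xor_cipher(ciphertext, key)
assert recovered == plaintext  # back in the clear at the endpoint
```

An attacker who compromises either endpoint (keylogger, memory scraper, malicious insider) never has to touch the ciphertext at all, which is why encryption alone is not a cure-all.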
#4 You have either perfect security or nothing
Putting it in his own words, providing security is not like walking a tightrope, where you are either on the rope or you are not. Security is a continuum. It is not practical, and not even necessary, to provide more security than required. You have to quantify the risks you face and decide how much risk you are willing to accept. In other words, we need to keep the security requirements sensible.
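One way to picture "quantify the risk" is annualized loss expectancy. This is my own illustration, not the speaker's, and every number in it is a made-up assumption:

```python
# A minimal sketch of risk quantification (my illustration, not the
# speaker's): annualized loss expectancy (ALE) = incident rate * impact.
# All figures below are invented assumptions.

def ale(annual_rate: float, loss_per_incident: float) -> float:
    """Expected yearly loss from one threat."""
    return annual_rate * loss_per_incident

# A control is worth deploying only if the risk it removes exceeds its cost.
risk_before = ale(annual_rate=0.5, loss_per_incident=200_000)  # 100,000
risk_after = ale(annual_rate=0.1, loss_per_incident=200_000)   #  20,000
control_cost = 30_000

worth_it = (risk_before - risk_after) > control_cost
print(worth_it)  # True: removes 80,000 of yearly risk for 30,000
```

The point is exactly the continuum idea: if the residual risk is already below what you are willing to accept, spending more on security is unnecessary.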
#5 Separation is Unnecessary
The idea is that controlled sharing requires separation. Just as you have to draw a line between spectators and players, you have to think about different levels of security. This idea is not new; in fact, it was introduced in operating systems back in the 1970s. We fell back into old habits in the 1980s, and now we are returning to the good old principles.
#6 It's easy - we can do security ourselves
This idea is not yet very clear to me; I will have to read some of the references he cited to really understand it. The gist, however, is that program complexity inhibits security: given the complexity of the code, enforcing security is more difficult than you may think.
Some post-talk thoughts:
Most of the papers I have seen in the security area try to come up with security solutions (complete security solutions, if you will) without considering some of the misconceptions mentioned above (for example, risk vs. gain analysis).
Another thing I ponder is whether we, in the security field, actually put enough weight on understanding who is going to use our solutions, how differently they may interpret the solutions we provide, and so on.
That's it from me about the talk!
I've just looked at the World Cup Super 8 cricket match score between England and Bangladesh. England has managed to limp past Bangladesh. Had the Tigers (the Bangladesh team) put a few more runs on the board, they could have caused the third famous upset of the World Cup (after the first and second). Tomorrow we, the Lions (the Sri Lankan team), take on the Black Caps (the New Zealand team). I think our team is in good shape to seal the victory.
Now it's time to get back to other fun stuff. I need to put some finishing touches on the operating systems paging lab, which is due tomorrow at 11:59:59 pm ;-), and get on the fast track of preparing for the two quals I am taking in two weeks' time. I also need to prepare a report for the independent study I am doing. I will have to wait to reveal more about that work until the paper gets accepted, so stay tuned!
There are a bunch of papers related to the $subject. Maybe I'll put out a list with my interpretations when I get some time.