About a month late, I got to see this news item about a survey concluding that people are finally getting used to DRM.
Among other things, it says that:
The overall messages from these studies are: higher-priced DRM-free downloads resonate with a percentage of consumers but not a very large one; ...
and specifically that:
... the EMR/Olswang study found that only 43% would prefer “paying a little extra” for DRM-free tracks; and the In-Stat study found that only 19% would be willing to pay 30% more for a DRM-free track, as opposed to 29% who would not (44% said that it depends on other factors).
So, on the face of it, it seems as if people are starting not to care much whether their content is DRM-crippled; at least that's what the article implies. It also compares these statistics to those of a survey done years ago that presumably reflected more hostility toward DRM.
However, before I got the chance to be amazed at the outcome, I bumped into a seemingly unrelated observation about that same survey...
A couple of nights ago I drove back from some family event and got pulled over by a cop. Okay, I agree that this by itself is not worth a blog post. The cop asked me to open the window, looked at me, asked me where I was coming from and where I was going, and sent me on my way, without even bothering to carry out the standard papers check. The entire event took no longer than two minutes.
What took more than two minutes was my discussion with my wife about whether or not this sort of “examination” is worth anything. She believes it is probably a waste of taxpayers' money to stop people just to ask them how they're doing. I happen to think that not only is this not a waste of money, it is probably one of the most effective uses of this money; at least of the money that is devoted to security.
Here is a question that was raised in a discussion forum, along with my response to it. I figured it was interesting enough to post here.
Why not just deploy an Enterprise Rights Management solution instead of using various encryption tools to prevent data leaks?
The “encryption tools” function according to simple, well understood, and more-or-less enforceable security models. Their assumptions are well understood and, most importantly, match the environments they run on. They solve a simple problem, and solve it effectively.
Rights management solutions have complex security models and run in environments that do not always satisfy their assumptions. They aim to provide complex functionality, but they often (always?) fail to deliver due to their over-complexity and unrealistic assumptions.
If your security needs can be met by the simple functional model of the “encryption tools”, then you will prefer to enjoy the assurance and the reasonable robustness they provide, which is the most desirable feature after all.
The Department of Homeland Security (DHS) wants to hold the root master keys of DNSSEC. This would allow it to fake DNS responses at will. Read all about it at:
Homeland Security grabs for net's master keys
Department of Homeland Security wants master key for DNS
It caused quite a lot of fuss. I agree with the political feeling of discomfort, but I somehow cannot understand the threat that some people attribute to this.
This depends on who you ask. Some people think that the more secure a system is, the better; with no exceptions. This school of thought is often attributed to product vendors. This approach helps them believe (and thus convince others) that their product is a great buy, regardless of the situation. This approach is also common among information security newbies, who believe that an additional requirement or mechanism can only make you more resistant, not less, and thus is always worth adding. The fancier of these guys call it an additional “layer”, so that they sound more confident.
I guess my tone so far gives away that I disagree. Making a system or a network more secure is sometimes worthwhile, and sometimes it is not.
For a while now, IT security professionals have been warning against the impact of Personal Digital Assistants (PDAs) on corporate security. A PDA can be lost or stolen, leading to undesired disclosure of the information on it. The emergence of micro-drives gives these tiny devices gigabytes of storage. Due to the high storage capacity of the PDA, and the reduced file formats it uses (resulting in smaller files), a modern PDA can easily store the entire document repository of its owner. This repository may contain masses of sensitive corporate information in a physical package that is way too easy to lose or have stolen. This poses a real threat to organizations, as also pointed out by Bruce Schneier in an essay called “Risks of Losing Portable Devices”.
Information security officers are not unaware of the risk and attempt to find solutions. The most immediate solution that comes to mind is password-protecting the PDA. Since such mechanisms can be hacked, encryption is put to use, enciphering all or some of the PDA databases using a key that is entered by the user. This method carries notable inconvenience for the user, who is forced to enter the key every time he looks up a phone number, an e-mail address, or a meeting time. It is clumsy, but it solves the problem. However, does it solve all problems?
No; at least not for everyone, in my opinion.
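As an aside, the encrypt-with-a-user-entered-key approach described above can be sketched in a few lines. This is a minimal illustration only, not any vendor's actual scheme: the PBKDF2 key stretching, the salt, the iteration count, and the toy XOR stream cipher are all my assumptions here; a real product would use a vetted cipher such as AES.

```python
import hashlib
import hmac

def derive_key(passphrase: str, salt: bytes, iterations: int = 100_000) -> bytes:
    """Stretch a user-entered passphrase into a 32-byte key (PBKDF2-HMAC-SHA256)."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, iterations)

def xor_keystream_encrypt(key: bytes, data: bytes) -> bytes:
    # Toy stream cipher for illustration: build a keystream by MACing a
    # counter under the key, then XOR it with the data. Running the same
    # function on the ciphertext decrypts it.
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream.extend(hmac.new(key, counter.to_bytes(8, "big"), hashlib.sha256).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

salt = b"example-salt"                  # in practice, a per-device random salt
key = derive_key("user passphrase", salt)
record = b"Alice Example, +1-555-0100"  # a contact entry in the PDA database
ciphertext = xor_keystream_encrypt(key, record)
assert xor_keystream_encrypt(key, ciphertext) == record  # round-trip decrypts
```

The point of the key-stretching step is that the key lives only in the user's head: anyone who grabs the device without the passphrase gets ciphertext, which is exactly the property the password-only lock fails to provide.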
It is already obvious that security is hard to do right. Bruce Schneier has written a good essay called Why Cryptography Is Harder Than It Looks. This essay refers to cryptography, but it touches on the subject as a whole. It is still not always clear, however, where the hard core of security analysis work lies, and how exactly it differs from QA and from other system engineering domains.
I would like to take a shot at explaining the fundamental difference between assuring functionality and assuring security, and at pinpointing the toughest part of security analysis.