Adam quoted some interesting thinking about infosec incentives. I'm not sure it's that simple, though. Gordon and Loeb say that you shouldn't spend more than 37% of the expected loss.
However, at last summer’s WEIS (Workshop on the Economics of Information Security), Jan Willemson published a paper, “On the Gordon & Loeb Model for Information Security Investment.” In it, Willemson directly challenges the 37% number.
Here’s Willemson’s abstract:
In this paper we discuss a simple and general model for evaluating optimal investment level in information security proposed by Gordon and Loeb. The authors leave an open question, whether there exists some universal upper limit for the level of optimal security investments compared to the total cost of the protected information set. They also conjecture that if such a level exists, it could be 1/e ~= 36.8%. In this paper, we disprove this conjecture by constructing an example where the required investment level of up to 50% can be necessary. By relaxing the original requirements of Gordon and Loeb just a little bit, we are also able to show that within their general framework examples achieving levels arbitrarily close to 100% exist.
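The Gordon & Loeb result is easy to illustrate numerically. Here's a minimal sketch using their "class I" breach-probability function; the vulnerability, loss, and productivity parameters are made-up illustrative values, not figures from the paper:

```python
# Sketch of the Gordon & Loeb model with their class-I breach-probability
# function s(z, v) = v / (alpha*z + 1)^beta. All parameter values below
# are illustrative assumptions, not taken from the paper or the post.

def breach_prob(z, v, alpha, beta):
    """Probability of breach after investing z (class-I G&L function)."""
    return v / (alpha * z + 1) ** beta

def optimal_investment(v, loss, alpha, beta, step=100):
    """Grid-search the investment z that maximizes expected net benefit."""
    best_z, best_net = 0, float("-inf")
    z = 0
    max_z = v * loss  # no point spending more than the expected loss itself
    while z <= max_z:
        # benefit: reduction in expected loss; cost: the investment z
        net = (v - breach_prob(z, v, alpha, beta)) * loss - z
        if net > best_net:
            best_z, best_net = z, net
        z += step
    return best_z

v, loss = 0.6, 1_000_000    # vulnerability and loss-if-breached (assumed)
alpha, beta = 1e-5, 1.0     # productivity of security spending (assumed)
z_star = optimal_investment(v, loss, alpha, beta)
expected_loss = v * loss
print(f"optimal spend: ${z_star:,.0f} "
      f"({z_star / expected_loss:.1%} of the ${expected_loss:,.0f} expected loss)")
```

With these particular parameters the optimal spend comes out around 24% of the expected loss, under the 1/e bound; Willemson's point is that other admissible functions push that ratio to 50% and, with slightly relaxed assumptions, arbitrarily close to 100%.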
So that's the first problem: it may behoove one to spend more than 37%.
The next problem I see is the whole nature of an expected loss. How do I know what to expect? I'm a cynic, so let's use some math. If there is a 2% chance that any given employee will lose a laptop, a 40% chance that a lost laptop has personal data on it, and I have 10,000 employees, then I expect 200 employees to lose laptops, and 80 of those losses will cause me a problem. That's bad. From there, it's a simple matter to take the Ponemon $182/name number, multiply it by the number of names, and I have a dollar figure.
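The arithmetic above, as a back-of-the-envelope script. The employee count and the two probabilities come from the text; `names_per_laptop` is a made-up figure, since the number of records per laptop is left open:

```python
# Back-of-the-envelope expected-loss arithmetic. Counts and rates are from
# the post; names_per_laptop is hypothetical -- substitute your own figure.

employees = 10_000
p_lose_laptop = 0.02       # chance an employee loses a laptop
p_personal_data = 0.40     # chance a lost laptop holds personal data
cost_per_name = 182        # Ponemon's per-record breach cost
names_per_laptop = 5_000   # hypothetical records per breached laptop

lost = employees * p_lose_laptop      # expected laptops lost
breaches = lost * p_personal_data     # expected laptops that cause a problem
expected_cost = breaches * names_per_laptop * cost_per_name
print(f"{lost:.0f} lost laptops, {breaches:.0f} breaches, "
      f"expected cost ${expected_cost:,.0f}")
```

Even with a modest guess for records per laptop, the expected cost runs well into eight figures, which is the point: the dollar figure swings wildly on inputs you can only estimate.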
To me, the right way to solve this problem is to put some sort of disk encryption on those laptops. Just (heh, just) deploy that and Alice is your auntie. No incentive plan needed.
As a last problem, do I really want to deal with an incentive plan? Incentive plans have evil senses of humor. The people affected by them will inevitably act not on what is good for the company, but on what affects their incentive plan. If we assume 100 people in the security department, and they come to my conclusion (encrypt those laptops), they will see $100 in their own pocket for every $1 they save on the software. If they buy software that is cheaper, but less reliable, it can cost the company far more than the plan ever saved.
Even better for them would be to ban all dangerous data on laptops. We’ve all worked where there were asinine, dictatorial decrees on security. Decrees are cheap. They are, however, not good for the company because the company wants people to be able to work flexibly.
It gets worse, though. Here’s another suggestion:
Here’s the kicker: If there is a breach, the costs come out of the bonus pool first. This would be a bummer, but it would also give you first hand data for budgeting ;).
It also creates an incentive to ignore breaches. If you're an admin looking over logs at a major university and you think you see a breach, but aren't sure, what do you do? Very likely, you hope it isn't a breach and don't investigate further. And how are you going to feel when the bonus you were counting on sublimates because Bob over there finds a breach two weeks before the end of the year? Thanks, Bob. Couldn't you have at least waited until January?
Creating a system where the security team is looking not at security but at how little it spends is not good for security, nor is it good for the company. An encouraging trend in security is that we're starting to think about how good security can be liberating. Security that liberates people is a cost on the security end, but a benefit somewhere else. It might even have indirect benefits, like lowering turnover and making it easier to hire good people.
It is also not good when the incentives reward bureaucratic stiffness and See-No-Evil behavior, and punish people for the conscientious behavior of their co-workers.
Always, always beware when you set up incentives. People will act according to the incentive. (If they didn’t, it wouldn’t be an incentive.) The incentive distracts from the goal. If the incentive points in the direction of the goal, it might be a reasonable approximation of the goal, but it is not the goal. From here we get unintended consequences.