We've been discussing Outrage Considered Useful. Alex remarked in a comment:
The term "Outrage" suggests that risk cannot or should not be discussed in a rational manner.
What I think Sandman is getting at is that often risk isn't discussed in a rational manner, because managers' (and security people's) egos, fears, ambitions, etc. get in the way. In a perfect Platonic world perhaps things wouldn't be that way, but in this one, people don't operate by reason alone, even when they think they are doing so.
Outrage x Hazard may be a means to express risk within the context of the organization, but I like probability of loss event x probable magnitude of loss better for quantitative analysis.
Indeed, quantitative analysis is good. However, once you've got that analysis, you still have to sell it to management. And there's the rub: that last part requires dealing with emotion.
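The "probability of loss event x probable magnitude of loss" calculation Alex prefers can be sketched in a few lines. This is a minimal illustration, not a real risk model; the function name and all the dollar figures and probabilities below are invented for the example.

```python
def expected_annual_loss(event_probability: float, loss_magnitude: float) -> float:
    """Expected loss per year: probability of the loss event times its magnitude."""
    return event_probability * loss_magnitude

# Hypothetical scenarios: a frequent small loss vs. a rare large one.
laptop_theft = expected_annual_loss(0.30, 5_000)         # 30% chance/yr, $5k per incident
database_breach = expected_annual_loss(0.02, 2_000_000)  # 2% chance/yr, $2M per incident

print(f"Laptop theft:    ${laptop_theft:,.0f}/yr")       # $1,500/yr
print(f"Database breach: ${database_breach:,.0f}/yr")    # $40,000/yr
```

The numbers are the easy part; as the next paragraph argues, the hard part is getting management to feel what a $40,000-a-year expected loss actually means.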
The average security technical person probably isn't an expert at dealing with emotions; if they were, they might not have gone into that line of work in the first place. Even the thought of doing so leads some of them to think they're being asked to lie. No: they're being asked to link the quantitative analysis to its effects, and to the emotional meaning of those effects. Hence the need for people who can act as bridges and do that translation.
Meanwhile, here's an example of a CIO who did use quantitative measures to answer a risk question management had made prominent:
For years, members of the military brass have been warning that soldiers' blogs could pose a security threat by leaking sensitive wartime information. But a series of online audits, conducted by the Army, suggests that official Defense Department websites post far more potentially harmful information than blogs do.
Unfortunately, management didn't like the answer, and the strength of the quantitative result only made it more imperative for management to bury it.
On the other hand, sometimes there's too little outrage, and nothing gets done because people think the risk "can't happen here", or it's not solvable in the time available, or it's too big for us to deal with:
Dan Rather Reports presents conclusive evidence of the failure of touch screen voting machines across the country. The episode, "The Trouble with Touch Screens", is an entire hour devoted to new information on this story. From scientists involved in testing the equipment, to manufacturers in third world countries who shipped these defective voting machines to the United States, Dan Rather Reports presents new information showing that these defective machines may have altered the outcome of multiple elections.
The Trouble with Touch Screens, Dan Rather Reports, HDNet, August 2007
Whenever I post about this I get either silence or excuses, such as "we can't fix it by the next election". Look, folks, we know this problem is fixable: India already did it years ago. And it's not as if we haven't known about this for a long time. I've been harping on it since 2006, as has Gunnar, and who knows how long Bruce Schneier and Ron Rivest and the other people who have been seriously examining the problem have been on about it!
Consider the risk, here. Subversion of multiple elections in the most powerful country on earth. Is this not a threat to its own populace as well as the rest of the world? We could use some outrage here!
There's a difference between FUD and outrage. When we know there really is a risk, that it has been exploited multiple times, that it's very likely to be exploited again, and that the probable magnitude of loss is very large, maybe there should be fear, because there's not much uncertainty and doubt. Even better, maybe there should be outrage. Then maybe something would get done.