It’s become common for people thinking about security economics to call for liability around security failures. The idea is that software creators who ship insecure products could be held liable, because they’re well positioned to address the problems.
I don’t think this is a trouble-free idea; there are lots of complexities. As one example, would open source vendors be liable? Fyodor, who writes and gives away nmap? RedHat.com? What about Apple, when they include a third-party package, say bind or bzip, both of which were covered in their latest security update? Including such third-party software allows Apple to provide basic functionality at lower cost.
Now, the UK Information Commissioner has proposed that doctors who lose laptops with patient data could be subject to a £5,000 fine.
Mr Thomas said: “If a doctor, or hospital [employee] leaves a laptop containing patients’ records in his car and it is stolen, it is hard to see that is anything but gross negligence.”
The commission can currently issue enforcement notices but these “do not impose any element of punishment for wrongdoing”. But Lord Lyell of Markyate, a former Attorney-General, said it would be disproportionate to criminalise doctors for losing a laptop.
Mr Thomas said the intention was not to prosecute for a single incident, but that for gross negligence there was “a need to have some deterrent in place”. He said anyone holding personal data should know the basics of “encryption” to protect that material. (“Doctors may be prosecuted if their laptops are stolen,” Times Online, UK)
I’m with Lord Lyell here, and think there’s a great deal of specific thinking to be done before we impose more liability for software flaws. Software creators, including Mozilla, know how hard it is to ship bug-free software, so I suspect my employer thinks similar things.
Possibly related: “Government ignores Personal Medical Security.”