Are Security Bugs an Unfair Liability?

Every time I hear about a new way to steal files from a system, I wonder why such a security bug exists in the first place. As end users, we carry most of the liability for bug-ridden software. Most manufacturers disclaim that liability by having us accept some type of end-user license agreement. Such agreements prevent end users from seeking damages when a manufacturer's product proves to be faulty. So when someone uses a programming flaw (such as the nasty hole recently discovered in Netscape Communicator) to steal files from our systems, it's our fault because we used the software. I'll never understand how that makes sense.

If Dodge sells me a new Viper sports car with a steering defect that causes the car to go out of control and crash, I can sue Dodge and win, as long as I can prove its product was defective. I fail to understand how computer hardware and software are any different. Why should we carry the liability and responsibility for a defective computer product? I can't think of any other market where manufacturers enjoy such protection against shoddy workmanship. Perhaps it's time we stopped forcing users to accept liability for products they can't control.

Why should we push so hard for digital signature laws and the rapid migration of businesses into e-commerce when the developers of the foundation for those efforts aren't liable for defects in workmanship? I realize no manufacturer can prove its products are bug free, and we all know there will be bugs anyway, but why must that be our burden? This question is especially important now that computer hardware and software are mission-critical to so many people and businesses all over the world.

Most hardware and software manufacturers get their products debugged for free by the world's countless independent hackers, and those hackers rarely get significant thanks from vendors in return for their hard work. Instead, corporate-sponsored lobbyists push for tough federal laws that would prohibit reverse engineering, which would make most forms of software hacking illegal. So in the future, a hacker who finds a serious security problem in the code of some widely adopted OS could go to prison simply for reporting the flaw to the manufacturer: Reporting it would amount to admitting that the hacker reverse engineered the product. But if we can't reverse engineer product code, we can't protect ourselves. We'll be forced to trust the manufacturer.

I don't think we cyber-citizens have our priorities straight. As I mentioned, we seem to be systematically migrating the entire global economy onto the Internet. We're even migrating governments onto the Internet at a blistering pace. Before we get too carried away, maybe we should reconsider who should carry the liability for products that can destroy otherwise healthy businesses and institutions. Until next time, have a great week.
