Reason magazine is a great source of libertarian thought, but it’s also the home of the Volokh Conspiracy blog which, while also libertarian-leaning, nevertheless includes some decidedly non-libertarian opinions. Case in point: this post by Stewart Baker of the Cyberlaw Podcast, explaining how the EARN IT Act could be used to attack so-called “end-to-end” encryption.
The central change made by the bill is this: It would allow civil suits against companies that recklessly distribute child pornography. It would do this by taking away a piece of their immunity from liability when transmitting their users’ communications under Section 230 of the Communications Decency Act (CDA).
Right away that makes me nervous, because anything that chips away at the protections of Section 230 can’t be good for the internet. Baker’s characterization of Section 230 doesn’t help me trust him:
Originally added to the CDA as part of a legislative bargain, Section 230 was one of the best deals tech lobbyists ever struck. The CDA was a widely popular effort to restrict internet distribution of pornography to minors. Tech companies couldn’t stop the legislation but feared being held liable for what their users did online, so they agreed not to fight the CDA in exchange for Section 230. The next year, the law’s measures to protect children online were ruled unconstitutional, leaving the industry protections in Section 230 as more or less the only operative provision in the entire CDA.
Section 230 doesn’t just protect big tech. It protects everybody. I feel safe having comments on my blog because Section 230 says I’m not liable if my readers post defamatory statements about celebrities or threats against public figures. Note that this does not mean that defamation and threats are immune from legal action, it just ensures that the legal action is directed at the correct target: The person who actually created the problem content.
This brings us to the EARN IT Act, which would treat child pornography more or less the way FOSTA treated sex trafficking content—by lifting the immunity of companies that don’t take reasonable measures to prevent its distribution. […] But now that FOSTA has shown the way, it’s no surprise that a similar attack on child pornography has gained traction.
Indeed it’s not. Everyone who opposed FOSTA warned that it was only the first step in gutting Section 230.
[…] EARN IT imposes civil liability on companies that distribute child pornography “recklessly.” Victims—presumably the individuals whose images are being circulated—could sue online platforms that recklessly ignored the exchange of child pornography on their services. By itself, that rule is hard to quarrel with. Recklessness requires more than simple negligence. A party is reckless if he deliberately ignores a harm that he can and should prevent.
For anyone who has defended a tort case, being reassured that the jury has to find your client acted recklessly is cold comfort. […]
To address that fear, EARN IT offers a safe harbor to companies that follow best practices in addressing child pornography. […] the bill creates a commission to spell out the safe harbor best practices. […] To protect the government’s interests, the attorney general is given authority to review and modify, with reasons, the best practices endorsed by the group. Companies that certify compliance with the best practices cannot be sued or prosecuted under federal child pornography laws. […]
To see what this has to do with encryption, just imagine that you are the CEO of a large internet service thinking of rolling out end-to-end encryption to your users. This feature provides additional security for users, and it makes your product more competitive in the market. But you know it can also be used to hide child pornography distribution networks.
This is an argument that proves too much. Sure, an encrypted channel can be used to hide a child pornography distribution network, but that’s because encrypted channels can be used to hide messages about absolutely anything — child pornography, ransom demands, securities fraud, conspiracies to nuke a U.S. city, collusion with foreign governments to affect the election, prosecutors instructing cops to hide exculpatory evidence, the timetable for a military coup — you name it, encryption can hide messages about it.
You can make similar arguments about a lot of things. Just think of all the evil stuff people can hide in cardboard boxes! Any given box could contain a bomb, or ebolavirus, or plutonium! And just imagine all the things people can do behind locked doors! Gosh, every locked door could be hiding a terrorist planning to send pipe bombs in cardboard boxes packed with Ebola! Lions and tigers and bears, oh my! Won’t someone think of the children?!?
Arguing that people might use privacy to do evil things is an argument against all privacy. Then again, Baker worked for both the NSA and DHS, and he wrote a book that apparently warns about the “privacy and business lobbies that guard the status quo” against “a hardheaded recognition that privacy must sometimes yield to security.” So maybe he sees that as more of a feature than a bug.
After the change, your company will no longer be able to thwart the use of your service to trade in child pornography, because it will no longer have visibility into the material users share with one another. So if you implement end-to-end encryption, there’s a risk that, in future litigation, a jury will find that you deliberately ignored the risk to exploited children—that you acted recklessly about the harm, to use the language of the law.
In other words, EARN IT will require companies that offer end-to-end encryption to weigh the consequences of that decision for the victims of child sexual abuse. And it may require them to pay for the suffering their new feature enables.
The word “enables” does a lot of heavy lifting in that sentence. Again, it’s a case of proving too much. If end-to-end encryption “enables” child porn, do trucks enable smuggling? Do gloves enable theft because they protect burglars from leaving fingerprints? Do telephones enable con artists because they can use them to call victims? And God only knows how many crimes were “enabled” by the invention of the car. Did anyone sue Ford Motors for the crimes of Bonnie and Clyde?
You can play this stupid game with anything, but it’s still a stupid game. Holding a product’s manufacturers liable for how criminal third parties misuse the product is garbage policy.
I don’t doubt that this will make the decision to offer end-to-end encryption harder. But how is that different from imposing liability on automakers whose gas tanks explode in rear-end collisions? […] That makes the decision to offer risky gas tanks harder, but no one weeps for the automaker. Imposing liability puts the costs and benefits of the product in the same place, making it more likely that companies will act responsibly when they introduce new features.
Does Baker really not understand this? In his hypothetical, the companies making end-to-end encryption available aren’t the ones making the child pornography, they aren’t the ones distributing the child pornography, and they aren’t the ones consuming the child pornography. The reason it makes sense to sue automakers whose gas tanks explode in rear-end collisions is because they made the gas tanks. We don’t sue third parties that weren’t involved in the decision. We don’t sue the railroads and trucking companies that transported the cars to the dealerships.
There is nothing radical about EARN IT’s proposal, except perhaps for the protections that the law still offers to internet companies. Most tort defendants don’t get judged on a recklessness standard. Juries can award damages if the defendant was negligent, a much lower bar. Indeed, in many contexts, the standard is strict liability: If your company is best able to minimize the harm caused by your product, or to bear its cost, the courts will make you the insurer of last resort for all the harm your product causes, whether you are negligent or not. Compared to these commonplace rules, Section 230 remains a remarkably good deal, with or without EARN IT.
This is not a new concept. Similar principles protect plenty of other businesses from being held liable for misuse of their product by bad actors. You can’t sue Verizon when your boss uses his cell phone to sexually harass you, you can’t sue the post office when someone sends you a letter bomb, you can’t sue Louisville Slugger when someone beats you with a baseball bat.
(Well, you can sue, but you almost certainly won’t win.)
[…] EARN IT doesn’t impose liability on companies that fail to follow the commission’s best practices. They are free to ignore the recommendations of the commission and the attorney general, which are after all just a safe harbor, and they will still have two ways to avoid liability. First, they can still claim a safe harbor as long as they adopt “reasonable measures” to prevent child sex exploitation.
Why should they have to? Before the internet, plenty of pedophiles shared printed photos through the mail, and nobody sued USPS for “enabling” that. This is little more than fear-mongering over new(-ish) technology.
The risk of liability isn’t likely to kill encryption or end internet security. More likely, it will encourage companies to choose designs that minimize the harm that encryption can cause to exploited kids. Instead of making loud public arguments about the impossibility of squaring strong encryption with public safety, their executives will have to quietly ask their engineers to minimize the harm to children while still providing good security to customers […]
That’s another bullshit argument. The whole point of end-to-end encryption is that nobody but the sender and the receiver can learn anything useful about the contents of messages. There’s no way the company carrying the encrypted traffic can “minimize the harm” if they can’t even read the messages. And if they can tell which messages are harmful, then they aren’t actually providing end-to-end encryption. Either Baker doesn’t understand this, or he doesn’t think his readers will.
That, of course, is exactly how a modern compensatory tort system is supposed to work. Such systems have produced safer designs for cars and lawn mowers and airplanes without bankrupting their makers.
Only because you haven’t asked them to do something literally impossible.
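To make the impossibility concrete, here’s a toy sketch (a one-time pad built from XOR, not any real messaging protocol, just an illustration of the principle): once a message is encrypted end-to-end, the carrier relaying it holds only opaque bytes, so there is simply no content for it to scan.

```python
import secrets

def encrypt(plaintext: bytes, key: bytes) -> bytes:
    """XOR the plaintext with a same-length secret key (a toy one-time pad)."""
    return bytes(p ^ k for p, k in zip(plaintext, key))

decrypt = encrypt  # XOR is its own inverse

# Alice and Bob share a secret key; the carrier in the middle never has it.
message = b"meet me at noon"
key = secrets.token_bytes(len(message))
ciphertext = encrypt(message, key)

# Everything the carrier can inspect is this ciphertext, which is
# indistinguishable from random noise. There is nothing to "minimize
# harm" against, because there is nothing readable to inspect.
carrier_sees = ciphertext
assert carrier_sees != message

# Only a keyholder recovers the plaintext.
assert decrypt(ciphertext, key) == message
```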
Look, I’m sure tech companies are against this legislation because they want to make money, but that isn’t the only thing at stake here. Legislation like this would undermine the adoption of secure messaging systems. Government security and law enforcement agencies have been scheming to do that for decades, and the EARN IT Act sounds like just another attempt to break privacy using whatever convenient boogeyman seems to be on hand this time.
The very idea of panicking over end-to-end encryption is absurd, because encryption has been end-to-end for as long as we’ve had encryption: Somebody has a message to send, and they don’t want it read by anyone else along the way, so they encrypt it before they send it, and the intended recipient decrypts it to read it. That’s how Julius Caesar did it, and that’s how it’s still done today.
When you visit your bank’s website, everything you type is encrypted in your browser and decrypted at the bank’s data center, and everything your bank sends you is encrypted in their data center and decrypted by your browser. You and your bank are the endpoints for the encrypted streams. It’s end-to-end encryption. This keeps your financial data a secret between you and your bank, and the entire secure web works that way. That includes my blog. If you’re reading this at Windypundit, you’re using end-to-end encryption.
The world of email and messaging systems is an unusual exception to end-to-end encryption for various technical and historical reasons. Much of our email technology was invented before the desire for security was well understood, and no single solution has emerged as a standard for encryption. Even when inventing communication technologies with security in mind, scenarios such as internal messages, organization-to-organization messages, organization-to-individual messages, and person-to-person messages all present different challenges for key management that invite incompatible solutions. Mailing lists can mix all of these together, and if you want automatic spam protection, rule-based sorting, or full-text searching, your software has to have access to the plaintext. In essence, we mostly don’t use end-to-end encryption because we prefer the convenience that comes with handling information in the clear.
Nevertheless, there are still situations where end-to-end encryption is desirable. Plenty of people are already doing end-to-end encryption using tools like PGP, Enigmail, and Mailvelope. These are pieces of software you can install on your computer to help you encrypt your email messages.
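Under the hood, tools like these depend on key agreement: two parties derive a shared secret without ever transmitting it. A toy Diffie–Hellman exchange shows the idea (the prime here is deliberately tiny by cryptographic standards and purely illustrative; this is not how PGP’s internals actually look, just the mathematical pattern such tools build on):

```python
import secrets

# Public parameters that everyone, including eavesdroppers, can see.
# 2**127 - 1 is a Mersenne prime; real systems use much larger,
# standardized groups, so treat these values as toys.
p = 2**127 - 1
g = 5

# Each side picks a private exponent and publishes only g**x mod p.
a = secrets.randbelow(p - 2) + 2   # Alice's secret, never sent
b = secrets.randbelow(p - 2) + 2   # Bob's secret, never sent
A = pow(g, a, p)                   # Alice sends this in the clear
B = pow(g, b, p)                   # Bob sends this in the clear

# Both sides arrive at the same shared secret. An eavesdropper who
# sees only p, g, A, and B cannot feasibly compute it.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

Once both ends hold the shared secret, they can use it to encrypt traffic that no intermediary, including the service carrying the messages, can read.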
In fact, that’s all end-to-end encryption is: Software on your computer (or your phone or tablet or whatever) that you can use to secure your messages. Baker says big tech wants to roll out end-to-end encryption to be “more competitive,” but he doesn’t bother to discuss why it would be more competitive. End-to-end encryption is not some sort of nefarious scheme by big tech. It’s “more competitive” because people want it.
We want end-to-end encryption because we want privacy. Privacy from hackers, privacy from big tech companies that carry our data for us, and privacy from government bootlickers who try to sell us false promises of safety.