How The EU’s Proposed New ‘Privacy’ Rules Will Be A Tool For Massive Censorship
We recently wrote about some concerns with the new General Data Protection Regulation (GDPR) being set up in Europe, which will replace the existing Data Protection Directive. The law is driven by people with good intentions: looking to better protect the privacy of European citizens. Privacy protection is an important concept, but the current plans are so focused on it that they give very little regard to the unintended consequences of the way the rules have been set up. As we wrote in our last post, Daphne Keller at Stanford’s Center for Internet and Society is writing a series of blog posts raising concerns about how the new rules clash with basic concepts of free speech. She has now written one about the immensely troubling setup of the “notice and takedown” rules included in the GDPR. For years, we’ve been concerned by problematic notice-and-takedown procedures: we’ve seen the DMCA frequently abused to stifle speech rather than to address genuine copyright claims. But, for some reason, people often leap straight to “notice and takedown” solutions for any kind of content they don’t like, and the drafters of the GDPR are no different.
Except, it’s worse. Whoever drafted the notice-and-takedown portion of the GDPR actually made the process worse than the notice-and-takedown rules found elsewhere. Here’s the GDPR process, as explained by Keller:
- An individual submits a removal request, and perhaps communicates further with the intermediary to clarify what she is asking for.
- In most cases, prior to assessing the request’s legal validity, the intermediary temporarily suspends or “restricts” the content so it is no longer publicly available.
- The intermediary reviews the legal claim made by the requester to decide if it is valid. For difficult questions, the intermediary may be allowed to consult with the user who posted the content.
- For valid claims, the intermediary proceeds to fully erase the content. (Or probably, in the case of search engines, de-link it following guidelines of the Costeja “Right to Be Forgotten” ruling.) For invalid claims, the intermediary is supposed to bring the content out of “restriction” and reinstate it to public view — though it’s not clear what happens if it doesn’t bother to do so.
- The intermediary informs the requester of the outcome, and communicates the removal request to any “downstream” recipients who got the same data from the controller.
- If the intermediary has additional contact details or identifying information about the user who posted the now-removed content, it may have to disclose them to the individual who asked for the removal, subject to possible but unclearly drafted exceptions. (Council draft, Art. 14a)
- In most cases, the accused publisher receives no notice that her content has been removed, and no opportunity to object. The GDPR text does not spell out this prohibition, but does nothing to change the legal basis for the Article 29 Working Party’s conclusions on this point.
If you don’t see how that process is likely to lead to widespread abuse and the censorship of perfectly legal speech, you haven’t been paying much attention to the internet over the last decade-plus. To be fair, you can understand why the drafters think this process makes sense. They’re thinking solely about truly problematic and embarrassing information. If, say, your personal medical records have been posted online, it makes sense to have a way to get that information removed as quickly as possible. But given how frequently people use these processes in the copyright context to take down content they simply “don’t like” (and how often people admit they do so because it’s the only way to get such content removed), you know this one will be massively abused for issues that have nothing to do with privacy protection.
Once again, regulators seem to focus solely on solving for the “worst case” scenario, with little thought to how the rules will be applied in far more common cases, and what that means for free speech and society.