Online content regulation and human rights: what should responsible private actors look like?
Associate Prof. Sophie Stalla-Bourdillon, University of Southampton, UK
Director of ILAWS and iCLIC
@SophieStallaB
There is this great (French) saying…
With great power comes great responsibility
This is exactly our slogan for regulating online platforms! EC
With great power comes great responsibility
What the hell does responsibility really mean?
I know! I know! EC
So what should responsible actors look like?
A Tale of 2 Sticks
You shall! I. The Hard Stick
The 2016 proposal for a new Directive on Copyright in the Digital Single Market
Art. 13 + Recitals 38 & 39 of the proposal
Online platforms that store and give access to user-generated content:
Information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users shall take appropriate measures, e.g. implement content recognition technology
So what? X
Are they not simply asking platforms to automatically detect potentially infringing content? X
What’s wrong with that? X
Well…
1
Content Recognition Technology (CRT) is meant to be used for:
(automatic) Detection + (automatic) Removal
Why?
Because CRT is described as a measure to prevent the availability of © works
However
CRT is not (yet) able to assess context*
*“The Commission supports further research and innovative approaches going beyond the state of the art with the objective of improving the accuracy of technical means to identify illegal content…” EC 2017
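To make the concern concrete, here is a minimal, purely hypothetical sketch of an automated “detect + remove” pipeline of the kind at issue: an upload is fingerprinted, matched against a reference database and, on a match, blocked outright. All names (REFERENCE_DB, fingerprint, moderate_upload) are illustrative assumptions, not any real system’s API, and real CRT relies on perceptual fingerprints rather than exact hashes. The point is what the matching step never looks at: context.

```python
# Hypothetical sketch of an automated "detect + remove" CRT pipeline.
# Names and data are illustrative only; real systems use perceptual
# (audio/video) fingerprints, not exact hashes.

import hashlib

# Reference database of fingerprints supplied by rightholders (assumed).
REFERENCE_DB = {
    hashlib.sha256(b"protected-song.mp3 contents").hexdigest(),
    hashlib.sha256(b"protected-film.mp4 contents").hexdigest(),
}

def fingerprint(content: bytes) -> str:
    """Stand-in for a content fingerprinting function (illustrative only)."""
    return hashlib.sha256(content).hexdigest()

def moderate_upload(content: bytes) -> str:
    """Automatic detection + automatic removal in a single step.

    Note what is *not* here: no check for quotation, parody, criticism,
    public-domain status, or a licence held by the uploader. The match
    alone triggers removal; context never enters the decision.
    """
    if fingerprint(content) in REFERENCE_DB:
        return "REMOVED"          # blocked before anyone sees it
    return "PUBLISHED"

# A lawful use (e.g. a short quotation inside a review) that happens to
# match a reference file is removed exactly like a verbatim infringement.
print(moderate_upload(b"protected-song.mp3 contents"))   # -> REMOVED
print(moderate_upload(b"original home video contents"))  # -> PUBLISHED
```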
Back in 2012
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
That injunction could potentially undermine freedom of information, since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications. Indeed, it is not contested that the reply to the question whether a transmission is lawful also depends on the application of statutory exceptions to copyright which vary from one Member State to another. In addition, in some Member States certain works fall within the public domain or may be posted online free of charge by the authors concerned! CJEU
It is true
In Article 13 of the proposal
“Member States shall ensure that the service providers … put in place complaints and redress mechanisms that are available to users in case of disputes over the application of the measures referred to in paragraph 1.”
BUT
These mechanisms are meant to be applied after the removal of content!
Yet
Research shows that complaint mechanisms are not a robust safeguard against abuse
To sum up
CRT is not able to distinguish between legal and illegal content because context matters
Freedom of expression
2
If CRT is used systematically
There is an argument
That this amounts to general monitoring
Article 15 of the E-commerce Directive prohibits general monitoring
It is true
Rec. 47 E-commerce Directive
Monitoring obligations in a specific case are ok
BUT
With CRT
all users and all user activities are targeted
Back in 2012
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
The injunction imposed on the hosting service provider requiring it to install the contested filtering system would oblige it to actively monitor almost all the data relating to all of its service users in order to prevent any future infringement of intellectual-property rights. It follows that that injunction would require the hosting service provider to carry out general monitoring… CJEU
The injunction requiring installation of the contested filtering system would involve the identification, systematic analysis and processing of information connected with the profiles created on the social network by its users. The information connected with those profiles is protected personal data because, in principle, it allows those users to be identified! CJEU
Therefore
Violation of Art. 15 ECD
+
Right to data protection
3
CRT has a cost
Back in 2012
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
In the main proceedings, the injunction requiring the installation of the contested filtering system involves monitoring all or most of the information stored by the hosting service provider concerned, in the interests of those rightholders. Moreover, that monitoring has no limitation in time, is directed at all future infringements and is intended to protect not only existing works, but also works that have not yet been created at the time when the system is introduced. Accordingly, such an injunction would result in a serious infringement of the freedom of the hosting service provider to conduct its business! CJEU
Freedom to conduct one’s business
It's not exactly best practice for a law-maker to informally add to Art. 13: “trust us, CRT will be required only when such technologies exist and when they are cheap”
Remember
With great power comes great responsibility
Do they not really mean
With great power comes greater power?
Power to determine what is illegal
Power to monitor user activity*
*If you want complaint and redress mechanisms, which you should, reference files will need to be linked to users
Not to mention that not every platform can develop its own CRT
So the strongest are even more empowered!
PLEASE II. The Soft Stick
The Communication on tackling illegal content online 2017
Towards an enhanced responsibility for online platforms
I can have Art. 13 without Art. 13 and even more than Art. 13! EC
Being proactive does not mean being active! All platforms should be proactive!
Wait!!!!!!!!! X
What does proactive really mean? X
Easy peasy!
Let's assume there is a database of ©-protected works somewhere. The platform filters (i.e. restricts access to) all the matching files!
This is “great” because there is no Human Rights issue as the platform is the one deciding!*
*Obviously the Communication is not binding!
This is “great” because this means platforms do not need to put in place upload filters!
Platforms only need re-upload filters!
Which should mean [but this is not written] that this is monitoring in a specific case
Help! Sophie
I don’t understand what the difference between upload and re-upload filters is!?!?! Sophie
Isn’t it the case that both upload filters and re-upload filters cannot assess context? Sophie
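A very rough way to see why the distinction is hard to pin down (a minimal sketch under the same hypothetical fingerprinting assumption as above, not any real platform's implementation): an “upload filter” checks new uploads against a rightholder-supplied reference set, while a “re-upload filter” checks them against whatever the platform has already removed once. The matching logic, and its blindness to context, is exactly the same.

```python
# Hypothetical comparison of an "upload filter" and a "re-upload filter".
# Both reduce to the same context-blind lookup; only the reference set differs.

import hashlib

def fingerprint(content: bytes) -> str:
    """Stand-in for a perceptual fingerprinting function (illustrative only)."""
    return hashlib.sha256(content).hexdigest()

def upload_filter(content: bytes, rightholder_db: set) -> bool:
    """'Upload filter': block if the upload matches a rightholder-supplied fingerprint."""
    return fingerprint(content) in rightholder_db

def reupload_filter(content: bytes, previously_removed: set) -> bool:
    """'Re-upload filter': block if the upload matches content removed once already."""
    return fingerprint(content) in previously_removed

# Both decisions reduce to "is this fingerprint in a set?"; neither can tell
# a quotation, a parody or a licensed re-use apart from an infringement.
removed_once = {fingerprint(b"clip removed last week")}
print(reupload_filter(b"clip removed last week", removed_once))  # -> True (blocked again)
```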
Very naively…
Isn't that what Art. 13 is trying to do, I mean, to impose these “re-upload” filters? X
Remember
With great power comes great responsibility
Do they not really mean
With great power comes greater power?
Power to determine what is illegal
Power to monitor user activity
III. A call for a Wise Stick
With great power should come safeguards!
If CRT cannot properly determine what is illegal, CRT should NOT be systematically used for automatic detection + automatic removal
Processes should be put in place to ensure all interests at stake are taken into account before removal
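For illustration only, a minimal sketch of what such a process could look like at the platform end, under the same hypothetical fingerprinting assumption as above: detection still happens automatically, but a match only routes the upload to a review queue where the competing interests (the rightholder's, the uploader's freedom of expression, any applicable exception) can be weighed before anything is removed. All names are invented for the example.

```python
# Hypothetical sketch of "detection without automatic removal":
# a match is flagged for review instead of being taken down outright.

import hashlib
from collections import deque

REFERENCE_DB = {hashlib.sha256(b"protected work").hexdigest()}  # assumed rightholder data
review_queue = deque()  # items awaiting human / legal assessment

def fingerprint(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def moderate_upload(upload_id: str, content: bytes) -> str:
    """Detect, but do not automatically remove: matches go to review first."""
    if fingerprint(content) in REFERENCE_DB:
        review_queue.append(upload_id)      # interests weighed *before* removal
        return "PUBLISHED_PENDING_REVIEW"   # content stays available meanwhile
    return "PUBLISHED"

print(moderate_upload("upload-42", b"protected work"))  # -> PUBLISHED_PENDING_REVIEW
```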
Could these processes be judicial processes?
Judicial processes are slow…
Nota Bene
There is one consideration that is sometimes overlooked
While some platforms might have an interest in “cleaning” their systems as quickly as possible
Law enforcement has an interest in keeping allegedly illegal content online to investigate
Thank you!