1
Online content regulation and human rights: what should responsible private actors look like?
Associate Prof. Sophie Stalla-Bourdillon, University of Southampton, UK. Director of ILAWS and iCLIC. @SophieStallaB
2
There is this great (French) saying…
3
With great power comes great responsibility
4
This is exactly our slogan for regulating online platforms!
EC
5
With great power comes great responsibility
6
What the hell does responsibility really mean?
7
I know! I know! EC
8
So what should responsible actors look like?
9
A Tale of 2 Sticks
10
You shall! I. The Hard Stick
11
The 2016 proposal for a Directive on Copyright in the Digital Single Market
12
Art. 13 + Recitals 38 & 39 of the proposal
13
Online platforms that store and give access to user-generated content
Information society service providers that store and provide to the public access to large amounts of works or other subject-matter uploaded by their users shall take appropriate measures, e.g. implement content recognition technology
15
So what? X
16
Are they not simply asking platforms to automatically detect potentially infringing content?
X
17
What’s wrong with that? X
18
Well…
19
1
20
Content Recognition Technology (CRT) is meant to be used for:
21
(automatic) Detection + (automatic) Removal
22
Why?
23
Because CRT is described as a measure to prevent the availability of © works
24
However
25
CRT is not (yet) able to assess context
* “The Commission supports further research and innovative approaches going beyond the state of the art with the objective of improving the accuracy of technical means to identify illegal content…” EC 2017
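To make the point concrete, here is a minimal sketch of what that pipeline amounts to. The function and database names are hypothetical and the fingerprinting is reduced to a plain hash, but the shape is the same: detection is a lookup against rightholders' reference files, and removal follows automatically, with no step at which the context of the use (quotation, parody, public domain) could be taken into account.
```python
# Purely illustrative sketch, not any real platform's code: hypothetical
# function and database names. It shows the shape of "automatic detection +
# automatic removal" by CRT: the decision is driven entirely by a fingerprint
# match against rightholders' reference files, never by the context of the use.

import hashlib

# Hypothetical reference database: fingerprints supplied by rightholders.
REFERENCE_FINGERPRINTS = {
    "3b4c...": "Work A",   # placeholder values
    "9f1e...": "Work B",
}

def fingerprint(uploaded_file: bytes) -> str:
    """Hypothetical content fingerprint (here simply a hash of the file)."""
    return hashlib.sha256(uploaded_file).hexdigest()

def moderate_upload(uploaded_file: bytes) -> str:
    """Automatic detection + automatic removal, with no room for context."""
    if fingerprint(uploaded_file) in REFERENCE_FINGERPRINTS:
        # A match only says "this file contains Work A"; it says nothing about
        # whether the use is a quotation, a parody, or a public-domain copy.
        return "removed"
    return "published"
```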
26
Back in 2012
27
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
28
That injunction could potentially undermine freedom of information, since that system might not distinguish adequately between unlawful content and lawful content, with the result that its introduction could lead to the blocking of lawful communications. Indeed, it is not contested that the reply to the question whether a transmission is lawful also depends on the application of statutory exceptions to copyright which vary from one Member State to another. In addition, in some Member States certain works fall within the public domain or may be posted online free of charge by the authors concerned! CJEU
29
It is true
30
In Article 13 of the proposal
31
“Member States shall ensure that the service providers … put in place complaints and redress mechanisms that are available to users in case of disputes over the application of the measures referred to in paragraph 1.”
32
BUT
33
These mechanisms are meant to be applied after the removal of content!
34
Yet
35
Research shows that complaint mechanisms are not a robust safeguard against abuse
36
To sum up
37
CRT is not able to distinguish between legal and illegal content because context matters
38
Freedom of expression
39
2
40
If CRT is used systematically
41
There is an argument
42
That this amounts to general monitoring
43
Article 15 of the E-commerce Directive prohibits general monitoring obligations
44
It is true
45
Rec. 47 E-commerce Directive
46
Monitoring obligations in a specific case are ok
47
BUT
48
With CRT
49
All users and all user activities are targeted
50
Back in 2012
51
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
52
The injunction imposed on the hosting service provider requiring it to install the contested filtering system would oblige it to actively monitor almost all the data relating to all of its service users in order to prevent any future infringement of intellectual-property rights. It follows that that injunction would require the hosting service provider to carry out general monitoring… CJEU
53
The injunction requiring installation of the contested filtering system would involve the identification, systematic analysis and processing of information connected with the profiles created on the social network by its users. The information connected with those profiles is protected personal data because, in principle, it allows those users to be identified! CJEU
54
Therefore
55
Violation of Art. 15 ECD
56
+
57
Right to data protection
58
3
59
CRT has a cost
60
Back in 2012
61
CJEU Sabam/Netlog 2012 (CJEU Scarlet/Sabam 2011)
62
In the main proceedings, the injunction requiring the installation of the contested filtering system involves monitoring all or most of the information stored by the hosting service provider concerned, in the interests of those rightholders. Moreover, that monitoring has no limitation in time, is directed at all future infringements and is intended to protect not only existing works, but also works that have not yet been created at the time when the system is introduced. Accordingly, such an injunction would result in a serious infringement of the freedom of the hosting service provider to conduct its business! CJEU
63
Freedom to conduct one’s business
64
It’s not exactly best practice for a lawmaker to informally add to Art. 13: “trust us, CRT will be required only where such technologies exist and are cheap”
65
Remember
66
With great power comes great responsibility
67
Do they not really mean
68
With great power comes greater power?
69
Power to determine what is illegal
Power to monitor user activity
* If you want complaint and redress mechanisms, which you should, reference files will need to be linked to users
70
Not to mention that not everyone can develop their own CRT
71
So the strongest are even more empowered!
72
PLEASE II. The Soft Stick
73
The Communication on tackling illegal content online 2017
74
Towards an enhanced responsibility of online platforms
75
I can have Art. 13 without Art. 13 and even more than Art. 13!
EC
77
Being proactive does not mean being active!
All platforms should be proactive!
78
Wait!!!!!!!!! X
79
What does proactive really mean?
X
80
Easy peasy!
81
Let’s assume there is a database of © protected works somewhere
The platform filters (i.e. restricts access to) all the matching files!
82
This is “great” because there is no Human Rights issue as the platform is the one deciding!
* Obviously the Communication is not binding!
83
This is “great” because this means platforms do not need to put in place upload filters!
84
Platforms only need re-upload filters!
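A second minimal sketch, again with hypothetical names: a “re-upload filter” keeps the fingerprints of files the platform has already removed and blocks any new upload that matches. Mechanically it is the same context-blind lookup as an upload filter; only the reference set differs.
```python
# Purely illustrative sketch (hypothetical names): a "re-upload filter".
# The reference set is now the platform's own record of removed files,
# but the operation is still the same context-blind fingerprint lookup.

import hashlib

removed_fingerprints: set[str] = set()  # fingerprints of files already taken down

def record_removal(removed_file: bytes) -> None:
    """Remember a removed file so identical uploads can be blocked later."""
    removed_fingerprints.add(hashlib.sha256(removed_file).hexdigest())

def filter_reupload(uploaded_file: bytes) -> str:
    """Block any upload matching a previously removed file, whatever its context."""
    fp = hashlib.sha256(uploaded_file).hexdigest()
    return "blocked" if fp in removed_fingerprints else "published"
```
Whether the reference files come from rightholders or from the platform’s own takedown history, the filter never sees why the earlier removal happened or whether the new use is lawful.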
85
Which should mean [but this is not written] that this is monitoring in a specific case
86
Help! Sophie
87
I don’t understand what the difference between upload and re-upload filters is!?!?!
Sophie
88
Isn’t it the case that both upload filters and re-upload filters cannot assess context?
Sophie
89
Very naively…
90
Isn’t that what Art. 13 is trying to do, I mean, to impose these “re-upload” filters?
X
92
Remember
93
With great power comes great responsibility
94
Do they not really mean
95
With great power comes greater power?
96
Power to determine what is illegal
Power to monitor user activity
98
III. A Call for a Wise Stick
99
With great power should come safeguards!
100
If CRT cannot properly determine what is illegal, CRT should NOT be systematically used for automatic detection + automatic removal
101
Processes should be put in place to ensure all interests at stake are taken into account before removal
102
Could these processes be judicial processes?
103
Judicial processes are slow…
104
Nota Bene
105
There is one consideration that is sometimes overlooked
106
While some platforms might have an interest in “cleaning” their systems as quickly as possible
107
Law enforcement has an interest in keeping allegedly illegal content online to investigate
108
Thank you!