Service-centric policies – Update (NA3.2)
Authentication and Authorisation for Research and Collaboration
Uros Stevanovic, NA3.2 task lead, KIT – SCC
AARC project meeting, Amsterdam, November 2017
ToC
Split into two short (unrelated) topics:
- GDPR talk (~10 mins): news and updates; GDPR task force Utrecht meeting; WP29 opinions (data breach notification, guidance on automated decision-making)
- Community engagement for policy requirements (~10 mins): needs and requirements from communities regarding policies, hopefully non-standard requirements; interactive (hopefully)
GDPR task force
- Established by GÉANT (SURFnet); open to everyone
- Public meetings: first (constitutive) in Berlin, second in Utrecht
- Mailing list (open):
- Wiki, with documents:
Utrecht meeting (30.10.2017)
Participants: DeIC (Danish e-Infrastructure Cooperation), CERN, DFN, SURFnet, Jisc, GÉANT, KIT
- Discussion of the compatibility of GÉANT services with the GDPR
- Risk-level assessment based on the type of service provided (Jisc perspective)
- Discussion of the WP29 opinions on data breach notification and automated decision-making
Data breach notification (WP29 opinion)
Link:
- A breach of personal data must be disclosed to the competent national supervisory authority
- Now mandatory for all controllers (previously defined by national law)
- Benefits: information on how to proceed (including whether to inform affected users); a tool to enhance compliance with data protection
- Failure to report may result in a monetary sanction
- Plan in advance: risk assessment (e.g. within a DPIA), appropriate technical and security measures, an incident response plan
Data breach notification (WP29 opinion)
What is a personal data breach? "A breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, personal data transmitted, stored or otherwise processed."
Examples:
- Loss of a decryption key (used to encrypt personal data)
- Significant disruption of services (power failure, DDoS, ransomware) that may pose a significant risk to users, e.g. a hospital unable to perform normal operations because personal data is unavailable
- Being unable to send a newsletter for a few hours does NOT pose a significant risk to users
Data breach notification (WP29 opinion)
- Failure to disclose may result in a fine of up to 10M€ or 2% of global annual turnover (whichever is higher)
- Failure to implement adequate security measures is a separate infringement (with a potential separate fine)
- Disclose within 72 hours of the controller becoming aware of the breach; if later, an explanation is needed
- "Aware" means reasonable certainty; if unsure, it is better to report
- When a risk to users exists, the breach MUST be reported to the DPA
- When a high risk to users exists, the users MUST also be notified (the timeline depends on the nature of the risk)
- What to report (Art. 33(3)): what happened, DPO contact, consequences, measures taken
- Emphasis on incident detection and response
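The reporting rules above amount to a simple decision procedure. As an illustrative sketch (not legal advice) in Python, assuming hypothetical risk-level labels and function names:

```python
from datetime import datetime, timedelta

# Illustrative sketch of the WP29 breach-notification rules described above.
# The risk-level labels and function name are hypothetical, not GDPR terms.
DPA_DEADLINE = timedelta(hours=72)  # Art. 33: report within 72h of awareness


def notification_duties(risk_level: str, became_aware: datetime) -> dict:
    """Return who must be notified, and by when, under the rules above."""
    duties = {"notify_dpa": False, "notify_users": False, "dpa_deadline": None}
    if risk_level in ("risk", "high"):
        # Any risk to users: report to the supervisory authority (DPA)
        duties["notify_dpa"] = True
        duties["dpa_deadline"] = became_aware + DPA_DEADLINE
    if risk_level == "high":
        # High risk to users: the affected users must also be notified (Art. 34)
        duties["notify_users"] = True
    return duties


aware = datetime(2017, 11, 1, 9, 0)
print(notification_duties("high", aware))
```

The point of the sketch is the asymmetry in the rules: any risk triggers the 72-hour DPA report, while only high risk additionally triggers user notification.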
Automated decision making (WP29)
Link:
Concepts: general profiling; decision-making based on profiling; solely automated decision-making, including profiling
- Art. 22(1): "The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her"
- Therefore, according to WP29: "as a rule, there is a prohibition on fully automated individual decision-making, including profiling that has a legal or similarly significant effect", with exceptions (contract, law, explicit consent)
- Measures to safeguard data subjects' rights and freedoms
- The intention is to protect data subjects' rights and freedoms, prevent discrimination, etc.
- Monetary fine (up to 20M€)
Automated decision making (WP29)
Previous interpretation:
- A data subject, once an automated decision was applied, could insist on review by a human
- Supported by a WP paper and the UK Information Commissioner's 2017 opinion
- Automated decision-making could continue, with additional data-subject rights, safeguards, and controller obligations
Current interpretation is different:
- Automated decision-making is "as a rule" prohibited
- Introduces uncertainty (it is not clear what is now allowed)
- Action taken by A. Cormack (with some comments by the GDPR task force) to ask the WP29 for clarification
Possible examples of impact:
- Automated network defence mechanisms (are they now illegal?)
- Analytics conducted by universities (automated reading lists, etc.); review by tutors may introduce a breach of privacy
May have the opposite effect: increasing the use of consent, deterring the use of new technologies, increasing the risks for data subjects
Data Protection Impact Assessment (WP29 opinion)
Link:
- DPIA (Data Protection Impact Assessment): assessing necessity, proportionality, and risk management (in relation to users)
- Not defined in the GDPR per se; the concept is introduced in Art. 35(7) and rec. 84
- Mandatory only when processing is "likely to result in a high risk to the rights and freedoms of natural persons" (Art. 35(1))
- Monetary fines for non-compliance: not conducting a DPIA, conducting it incorrectly, or not consulting the competent supervisory authority (where necessary)
- Fine of up to 10M€ or 2% of total annual turnover (whichever is greater)
Data Protection Impact Assessment (WP29 opinion)
- A "risk" is a scenario describing an event and its consequences, estimated in terms of severity and likelihood; "risk management" is the coordinated activities to direct and control an organisation with regard to risk
- A DPIA may address a single processing operation, or multiple operations that are similar in purpose and risk
- When is a DPIA mandatory? When processing is "likely to result in a high risk". WP29 lists nine criteria (systematic monitoring, automated decision-making with legal or similar significant effect, sensitive data, data processed on a large scale, etc.); meeting two or more generally indicates high risk
- Should be carried out before processing, by the controller, DPO, or processors (the controller is responsible)
- The controller should consider publishing the DPIA (not mandatory)
- The supervisory authority should be consulted when the residual risks are high, i.e. where the identified risks cannot be sufficiently addressed by the data controller
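The "two or more of nine criteria" rule of thumb above is mechanical enough to sketch as a checklist. A minimal illustration in Python, with the criterion names paraphrased from the WP29 guidelines (the identifiers themselves are our own shorthand):

```python
# Illustrative checklist for the WP29 DPIA trigger described above.
# Criterion names are paraphrased shorthand, not official GDPR terminology.
WP29_CRITERIA = {
    "evaluation_or_scoring",
    "automated_decision_with_legal_effect",
    "systematic_monitoring",
    "sensitive_data",
    "large_scale_processing",
    "matching_or_combining_datasets",
    "vulnerable_data_subjects",
    "innovative_technology",
    "prevents_exercising_rights",
}


def dpia_likely_required(criteria_met: set) -> bool:
    """Meeting two or more of the nine criteria generally indicates high risk."""
    return len(criteria_met & WP29_CRITERIA) >= 2


# A service that systematically monitors users and processes sensitive data
# meets two criteria, so a DPIA is likely required.
print(dpia_likely_required({"systematic_monitoring", "sensitive_data"}))
```

This is only a screening heuristic: the WP29 guidance still expects a case-by-case judgement, and a controller may conduct a DPIA even when fewer than two criteria apply.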
Questions?
Community policies engagement
Engagement of communities with special requirements:
- Acceptable Use Policy
- Monitoring and logging policy
- Security policies
Possible requirements:
- Separate, per-project log aggregation and data processing (i.e. projects should be isolated from each other within a community)
- Separate data privacy policies (e.g. one for providing a service, plus separate per-project policies if needed)
- Policies for processing special data (medical data?)
Possible questions:
- Are you processing special data?
- Do you need increased anonymity for researchers?
- Do you need a per-project logging policy?
Questions (contd.)
Do you have special requirements for:
- Security policies
- Membership policies