Cybersecurity & Privacy challenges

Presentation on theme: "Cybersecurity & Privacy challenges"— Presentation transcript:

1 Cybersecurity & Privacy challenges
And what Communities can do

2 Introduction Bart van den Heuvel, CISM, Corporate Information Security Officer, Maastricht University. Ir. Anita Polderdijk-Rijntjes, CISSP, Security & Privacy Officer, Windesheim University of Applied Sciences, Zwolle.

3 Interaction Feel free to ask; feel free to tell.

4 Agenda Challenges: Open, but still safe enough; From prevention to reaction: know your users; Social media; Learning analytics; Global collaboration; Are we in control? Communities: Examples; Results.

5

6 Show of hands (1) We need open access, so we need to open up our firewall. Yes, in a way. But a modern firewall can detect traffic patterns and react to anomalies; we have to be aware of what's normal and what's not. Users cannot physically be in two places at the same time, but their devices can, at least "logically": a laptop on VPN, a tablet on WiFi and a smartphone on 4G. Still, we can recognize strange browsers, suspicious addresses, high traffic and connection rates, etc. With smart filters and thresholds we can (automatically) respond by warning users and administrators, or even by blocking traffic, e.g. using Splunk to detect criminal logins.
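The rule that a user cannot physically be in two places at once can be sketched as an "impossible travel" check. This is an illustrative toy, not how any specific product (such as Splunk) implements it; real tools correlate many more signals than coordinates and timestamps.

```python
from datetime import datetime
from math import radians, sin, cos, asin, sqrt

MAX_SPEED_KMH = 900  # roughly airliner speed; anything faster is suspicious

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 6371 * 2 * asin(sqrt(a))

def impossible_travel(login_a, login_b):
    """Flag two logins for the same account whose implied travel speed
    exceeds what is physically plausible."""
    dist = haversine_km(login_a["lat"], login_a["lon"],
                        login_b["lat"], login_b["lon"])
    hours = abs((login_b["time"] - login_a["time"]).total_seconds()) / 3600
    if hours == 0:
        return dist > 0  # two places at literally the same time
    return dist / hours > MAX_SPEED_KMH
```

A login from Amsterdam followed half an hour later by one from New York implies a speed of more than 11,000 km/h and would be flagged; a login from Utrecht an hour later would not.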

7 Show of hands (2) "This is quite a strong password indeed."
Yes it is! The longer the better. A passphrase of 38 lowercase characters is very safe.

8 Password strength: we recommend…
Most IT systems require "strong" passwords which users can't remember but which are easy to crack. Passphrases are easy to remember, nearly impossible to crack, and easy to type on a smartphone. Source: Stanford.edu
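The arithmetic behind the recommendation can be sketched as a back-of-the-envelope entropy calculation. Note the hedge: these figures assume randomly chosen characters; a passphrase drawn from natural language has less entropy per character, but length still dominates.

```python
from math import log2

def entropy_bits(length, alphabet_size):
    """Theoretical entropy of a random string: length * log2(alphabet size)."""
    return length * log2(alphabet_size)

# A 38-character random lowercase passphrase (26-letter alphabet):
passphrase_bits = entropy_bits(38, 26)   # roughly 179 bits
# A typical "strong" 8-character password (~94 printable characters):
password_bits = entropy_bits(8, 94)      # roughly 52 bits
```

Even under these generous assumptions for the short password, the long lowercase passphrase is stronger by well over 100 bits.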

Other recommendations
Use encryption and a PIN code to unlock devices (in combination with (remote) wiping). Implement Single Sign-On. Use 2-factor authentication where necessary: for highly sensitive data, always; for medium-sensitive data, context-based (location, time, device).
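The context-based rule above can be sketched as a small decision function. All names and the specific trust criteria (on-campus network, known device) are illustrative assumptions, not a product API or an institutional policy.

```python
def requires_second_factor(sensitivity, on_campus, known_device):
    """Decide whether a login needs a second factor, following the rule:
    highly sensitive data: always; medium-sensitive data: only outside a
    trusted context (here, sketched as on-campus + known device)."""
    if sensitivity == "high":
        return True
    if sensitivity == "medium":
        return not (on_campus and known_device)
    return False  # low sensitivity: password alone suffices
```

So a grade-administration login ("high") always prompts for a second factor, while reading course material ("medium") prompts only from an unfamiliar device or location.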

10 Show of hands (3) Executive managers should not trust personal assistants with their passwords. Yes, you have to trust your assistant. But there are technical solutions to give assistants the right access and functionality using their own account. It is not a matter of trust; it is a matter of being accountable for your own actions (and of setting the right example).

11

12 Show of hands (4) Use of social media and free webtools is acceptable to enhance educational processes. Social media: e.g. Facebook. Webtools: e.g. blogs, Dropbox, survey and data-gathering tools.

13 Show of hands (5) The Institution should not provide student data to (commercial) third parties.

14 Privacy challenge (1) Use of social media (e.g. Facebook)
and free webtools (e.g. blogs, Dropbox, survey/data-gathering tools). From the teacher's point of view: easy to use, so no support from the IT department is necessary; free, so no need to ask permission for funding.

15 Privacy risk Facebook users aren't consumers; they are the product!

16 General Data Protection Regulation (GDPR)
To address the privacy and compliance risks involved in the use of these kinds of tools, the GDPR (entered into force May 2016, applicable from May 2018) defines the following terms. The "Controller" determines the purposes and means of the processing of personal information, and is responsible for implementing appropriate technical and organizational measures to ensure, and to be able to demonstrate, that processing is performed in accordance with the GDPR. The "Processor" processes personal data on behalf of the Controller, and processing by a Processor shall be governed by a contract. In case of a personal data breach, the Controller is responsible for notification. Note that storage is processing: 'processing' means any operation or set of operations which is performed on personal data or on sets of personal data, whether or not by automated means, such as collection, recording, organisation, structuring, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, restriction, erasure or destruction.

17 Privacy risks: free webtools and social media
The Controller has no control over the personal data for which it is responsible. The student provides personal information to a processor that is not governed by a contract. And even if the student is asked for consent, what choice does he really have?

18

19 Show of hands (6) Insight into student performance provides sufficient information to organize adequate student guidance.

20 Show of hands (7) Monitoring students' use of the virtual learning environment provides valuable information for guiding students to better academic performance.

21 Privacy challenge (2) Learning analytics:
the collection and processing of student data to enhance educational processes. Ethical and legal issues: awareness, consent, ownership, control, the obligation to act, interventions, impacts on student behaviour, and transparency around algorithms and metrics. Consent is often difficult to implement in current systems.

22 Privacy risks: Learning analytics
Big data generally uses observed, derived or inferred rather than provided data; this has implications for privacy, as individuals may be unaware that the data is being collected and processed. Further concerns are the accuracy and the interpretation of the data. The learning analytics challenge is to discover what the most relevant data is for you to be collecting in order to make the right decisions. Provided data is consciously given by individuals, e.g. when filling in an online form. Observed data is recorded automatically, e.g. by cookies, sensors or facial recognition from CCTV pictures. Derived data is produced from other data, e.g. calculating customer profitability from the number of items purchased in a store and the number of visits. Inferred data is produced using analytics to find correlations between datasets in order to categorise or profile people, e.g. predicting future health outcomes. On accuracy: another problem is "enmeshed identities", where the data does not differentiate between an authenticated individual and a group; students working together on a device may unwittingly leave enmeshed fingerprints in their data. Meanwhile, when data collected against identifiers such as IP addresses or cookies is attributed to an individual, there is a danger that it does not actually relate to that person at all.

23 Learning analytics Lawfulness of processing rests primarily on consent, or
on necessity for the purposes of the legitimate interests pursued by the controller. In both cases the student has a choice: the ability to opt out of particular uses of his data, and consent must be well informed. Current systems, however, have often not been developed in conformance with privacy-by-design principles and do not support opt-out for individual students. See the SURF whitepapers "Learning Analytics onder de wet bescherming persoonsgegevens" (Learning analytics under the Dutch Personal Data Protection Act) and "Hoe data de kwaliteit van het hoger onderwijs kunnen verbeteren" (How data can improve the quality of higher education).

24 General Data Protection Regulation (GDPR)
Upon data gathering the subject should be informed about, among other things: contact information for the controller and the data protection officer; the purpose of and legal basis for the processing and, if applicable, the legitimate interests pursued; the retention period(s), i.e. the period for which the personal data will be stored or, if that is not possible, the criteria used to determine that period; the rights to access, rectification and erasure; the right to withdraw consent at any given time; the right to lodge a complaint with a supervisory authority; any transfer to a third country; whether the subject is obliged to provide the personal data and the possible consequences of failure to provide it; and the existence of automated decision-making, including profiling, with meaningful information about the logic involved, as well as the significance and the envisaged consequences for the data subject. Consent should be given freely and well informed (and can be withdrawn). When the legal basis is "necessity for the purposes of the legitimate interests pursued by the controller", describe those legitimate interests. The last four points serve to ensure fair and transparent processing; the last one is particularly important regarding learning analytics.

25 General Data Protection Regulation (GDPR)
Entered into force May 2016, applicable from May 2018. Key elements: the obligation to inform the data subject; privacy impact assessments and privacy by design; a Data Protection Officer; enforcing the rights of the data subject (among others, the right to be forgotten); being able to demonstrate compliance; notification of personal data breaches; and penalties of up to 20 million euro. Communities like SCIPR help each other with best practices, for instance a model for PIAs, a roadmap for implementing data breach notification, and a model agreement for processing.

26

27 Show of hands (8) Encryption of sensitive data in transit ensures confidentiality of messages. Not really. It helps, of course, but without adequate measures and procedures our data will still be at risk once it is at rest.

28 Show of hands (9) Dropbox encrypts your data at rest, so it's safe to use. Not really. It helps, of course, but Dropbox uses its own keys to encrypt your data, so Dropbox can easily decrypt it.
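The remedy is to encrypt client-side, so the provider only ever stores ciphertext and the key stays with you. The sketch below illustrates the principle with a stdlib-only one-time pad; it is a toy for the slide's point, not production crypto. In practice you would use a vetted library (for example the `cryptography` package's Fernet).

```python
import secrets

def xor_bytes(data, key):
    """XOR two equal-length byte strings (one-time-pad style)."""
    return bytes(a ^ b for a, b in zip(data, key))

message = b"exam results, do not leak"
key = secrets.token_bytes(len(message))  # the key never leaves your machine
ciphertext = xor_bytes(message, key)     # this is all the cloud provider sees
recovered = xor_bytes(ciphertext, key)   # only the key holder can decrypt
```

The provider holding `ciphertext` cannot read the message; only whoever holds `key` can, which is exactly the property Dropbox's server-side encryption does not give you.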

29 Obama is watching us! And so are our own secret services…
So encryption is important, but you have to implement it "all the way".

30 Collaboration services
Within SURF, the Dutch cooperative supporting IT in higher education: SURFconext and SURFteams (a federation of identity and service providers); SURFdrive (a "Dropbox" within our own community cloud); Filesender (a file-exchange service for files up to 1 terabyte, with encryption).

31

32 Show of hands (10) You can safely use a cloud service as long as it uses certified datacenters (e.g. ISO 27001). Not really. It helps, of course, but it is not only the datacenter that has to be certified: the service provider itself has to prove its reliability too. Within SURF, SURFmarket negotiates on behalf of its constituency, which is efficient for higher education institutions AND service providers.

33 Show of hands (11) If you want to be sure your organization is in control, certification is the way to go (e.g. ISO 27001). It is certainly a good idea, but in most cases not necessary.

34 Setting the standards It's all about risk.
What is your organization's risk appetite? Comply or explain. Within SURF: the Security Baseline Higher Education (based on ISO 27002); SURFaudit, a standards framework with a maturity model; EDUstandard ?; self-assessment with a benchmarking tool; peer review.

35

36 Communities: Examples
SCIPR: SURF Community for Information Security & PRivacy. Strategic/tactical: policies, best practices. SCIRT: SURF Community for Incident Response Teams. Tactical/operational: best practices, intel and tips (Traffic Light Protocol), incident coordination.

37 Communities: Results Both communities run a mailing list and an intranet.
SCIPR: Information Security Framework (e.g. Acceptable Use Policy); privacy templates and tools (e.g. PIA tool); CyberSave Yourself program; SURFaudit; educational sessions and trips; advisories (e.g. Windows XP).
SCIRT: training sessions; incident exercise; advisories (e.g. tooling).

38

39
