The EU’s data protection reform – a lost opportunity?

Originally published at EDRi-gram on 04-November-2015 https://edri.org/eu-data-protection-reform-lost-opportunity/

“Someone who knows things about us has some measure of control over us, and someone who knows everything about us has a lot of control over us. Surveillance facilitates control.”

– Bruce Schneier, cryptographer and security expert

When the European Union talks about modernising EU rules on data protection in the digital age, the most important challenge is unquestionably “big data”, and the most important challenge of big data is profiling.

Big data is not “more data” – big data is the massive merging of data to generate more data, more assumptions and more knowledge about you and me. If you are this age, went to this website and bought that product, big data will predict that you can be offered higher prices, or you shouldn’t be offered insurance, or you might vote in a particular way. Innocuous morsels of personal data interact and become pregnant, producing offspring that could be anything but harmless. The “I’ve nothing to hide” argument never made much sense, but it makes no sense at all in a world where you have no idea what guesses are being made on the basis of the data that you know about.
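To make the mechanism concrete, here is a minimal, purely hypothetical sketch (the attributes, thresholds and scoring rule are invented for illustration and do not come from any real profiling system) of how a few innocuous data points can be combined into a consequential inference:

# Hypothetical illustration: combining innocuous attributes into an inference.
profile = {
    "age": 34,                          # from a sign-up form
    "visited_price_comparison": False,  # from web tracking
    "bought_premium_headphones": True,  # from purchase history
}

def predicted_price_tier(p):
    # Toy scoring rule: guess willingness to pay from unrelated signals.
    score = 0
    if p["age"] < 40:
        score += 1
    if p["bought_premium_headphones"]:
        score += 2
    if p["visited_price_comparison"]:
        score -= 1  # treated as a price-sensitivity signal
    return "offer higher prices" if score >= 2 else "offer standard prices"

print(predicted_price_tier(profile))  # -> "offer higher prices"

None of the individual inputs looks sensitive, yet the output decides how the person is treated.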

Our devices are feeding information into large databases 24/7. Our mobile devices gather and send information about our movements as we walk around. Many of the apps installed on our phones demand unnecessary access to our contact lists. Our home smart meters will know when we get home after work and whether we have guests. Our search engine keeps records of our interests and fears. Facebook has successfully experimented with its power to make people happier or sadder, and even to make them more (or less) likely to vote. Professionals called “data brokers” collect and aggregate personal data from a wide range of sources to create detailed profiles of individuals, which are then sold to third parties, as sketched below.
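A rough sketch of the data-broker pattern just described, again with invented source names and fields: each record looks harmless on its own, but joined on a shared identifier they form a detailed profile.

# Hypothetical sketch of data-broker aggregation: separately collected records
# are merged on a common identifier into one detailed profile.
location_data = {"user-42": {"home_area": "postcode 1017", "evenings_away": 3}}
app_contacts = {"user-42": {"contacts_uploaded": 212}}
search_history = {"user-42": {"recent_searches": ["debt advice", "divorce lawyer"]}}

def build_profile(user_id, *sources):
    # Merge every source's record for this identifier into a single dictionary.
    merged = {}
    for source in sources:
        merged.update(source.get(user_id, {}))
    return merged

print(build_profile("user-42", location_data, app_contacts, search_history))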

So, how do the proposed new EU rules (the General Data Protection Regulation, GDPR) address this huge new challenge? Not very well. First, the article dealing with profiling (Article 20) was weak in the European Commission’s initial proposal, was diluted by the European Parliament and was eviscerated by the Council of the European Union. The current Council text says that data subjects, the individuals to whom the collected personal data relates, cannot object to the profiling itself, but only to “decisions based solely on automated processing, including profiling”. Therefore, if there is a profiling activity but no formal “decision” has been made, or if the automated processing and profiling is only part of the process and not the sole basis for the decision, there would be no specific right to object under EU data protection law.

Flanking protections, which could normally be relied upon even where the profiling and decision-making rules are weak, have also been diluted: data minimisation becomes “not excessive” data processing. Access and rectification become problematic when profilers can hide their algorithms behind “trade secrets” or rely on pseudonymisation. “Purpose limitation”, the principle that data must be collected for specified, explicit and legitimate purposes only, is undermined by watery compromises on what “compatible use” might be, while the need for the user’s consent can be bypassed by the open-ended “legitimate interest” loophole.
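The pseudonymisation point can be illustrated with another hypothetical sketch: replacing the direct identifier with a hash does not stop profiling on the remaining attributes, while it can make it harder for the data subject to find and correct “their” record.

# Hypothetical sketch: pseudonymisation swaps the identifier for a keyed hash,
# but the behavioural attributes used for profiling stay intact.
import hashlib

record = {"email": "alice@example.com", "searches": ["insurance", "diabetes"]}

def pseudonymise(rec, secret="server-side-secret"):
    # Replace the direct identifier with a truncated keyed hash.
    pseudo = dict(rec)
    pseudo["id"] = hashlib.sha256((secret + rec["email"]).encode()).hexdigest()[:12]
    del pseudo["email"]
    return pseudo

print(pseudonymise(record))
# Profiling on "searches" still works; without the secret, the person cannot
# easily show which pseudonymous record is theirs when exercising access rights.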

As if this did not water down the safeguards enough, profiling has been re-inserted into the list of exceptions for which Member States may restrict rights and obligations for purposes related to “national security”, “defence”, “public security” and, for fear that these provisions were not vague enough, “other important objectives of general public interest of the Union or of a Member State”. In practice, this allows national governments to circumvent EU data protection law and permit profiling whenever the purpose is allegedly linked to any of these ill-defined objectives.

A harmonised, modernised legal instrument for the EU is more necessary than ever. The GDPR needs to be future-proof and needs strong safeguards without loopholes. The current negotiating text of the GDPR looks set to fail its biggest test. If the ongoing negotiations between the European Parliament and the Council of the EU do not resolve these and other problems, we may be facing the loss of a fundamental right, as well as the loss of trust in, and take-up of, technologies based on big data. This should not worry EU citizens alone: the GDPR is crucial for global norm-setting in the field of data protection and privacy. We have one opportunity – we must do better than this.

Surveillance-based manipulation: How Facebook or Google could tilt elections (26.02.2015)
http://arstechnica.com/security/2015/02/surveillance-based-manipulation-how-facebook-or-google-could-tilt-elections/

Facebook reveals news feed experiment to control emotions (30.06.2014)
http://www.theguardian.com/technology/2014/jun/29/facebook-users-emotions-news-feeds

General Data Protection Regulation: Document pool (25.06.2015)
https://edri.org/gdpr-document-pool/

Obfuscation: how leaving a trail of confusion can beat online surveillance (24.10.2015)
http://www.theguardian.com/technology/2015/oct/24/obfuscation-users-guide-for-privacy-and-protest-online-surveillance

Our obsession with explaining past atrocities could destroy our free speech (22.10.2015)
http://www.telegraph.co.uk/news/uknews/law-and-order/11947492/Our-obsession-with-explaining-past-atrocities-could-destroy-our-free-speech.html

Smart Borders package: Unproportionate & unnecessary data collection

Originally published at EDRi-gram on 04-November-2015: https://edri.org/smart-borders-package-unproportionate-unnecessary-data-collection/
Photo by NEC Corporation of America with Creative Commons license. https://www.flickr.com/photos/neccorp/16250748818/

“The proposal is fear-driven and fear-triggering at the same time, placing emphasis on a putative need to protect the EU from those coming from outside.”

(Extract from EDRi’s response to the consultation)

In an attempt to overcome the failed 2013 proposal on the Smart Borders package, the European Commission launched a consultation to prepare a revised text, to which EDRi submitted its response on 29 October 2015. The new EU Entry/Exit System (EES) plans to extend biometric ID checks to all non-EU nationals entering or leaving the EU. Despite the numerous questions raised in relation to the 2013 proposal about its costs and its serious implications for civil liberties, the European Commission seems determined to give it another try.

The Smart Borders Package, which is aimed at improving the management of migratory flows, consists of three legislative proposals: (1) a Regulation establishing an EU Entry/Exit System (EES); (2) a Regulation establishing a Registered Traveller Programme (RTP); and (3) a Regulation amending the Schengen Borders Code to take into account the establishment of the EES and the RTP.

EDRi submitted the position that such a vast collection of sensitive personal data risks undermining the right to privacy of millions of people. Like any other restriction of fundamental rights, this measure needs to be guided, inter alia, by the necessity and proportionality test of Article 52(1) of the Charter of Fundamental Rights of the European Union. The new entry system could include biometric ID checks, including the collection of ten fingerprints and facial images. The Commission has yet to demonstrate clearly why these privacy-invasive measures are necessary, effective and proportionate, and whether the system could operate without some or all of them.

In our submission, we mentioned the need to learn from the case law of both the European Court of Human Rights (ECtHR) and the Court of Justice of the European Union (CJEU), and recalled that if an intrusive measure such as data retention were to be considered, the legislators would have the obligation to verify the “proportionality of the interference”. Therefore, no data retention mandates should be approved until a credible, independent test proving compliance with CJEU and ECtHR case law has been conducted. Beyond the European courts, the issue of biometric databases has also been the subject of debate in various Member States, for example before the French Constitutional Council.

Once the European Commission has analysed the responses, it will produce a legislative proposal. This proposal needs to take into account the concerns that were raised before and that are still under analysis by experts such as the EU Fundamental Rights Agency. As we have seen with the Safe Harbor agreement and the Data Retention Directive, legislation that is in clear violation of core EU norms can lead to violations of citizens’ rights that drag on for years, as well as to costs for companies, citizens and the European courts. The Commission and the European Parliament cannot fail again and drag us into years of litigation, nor can they leave it to the CJEU to fix the breaches of fundamental rights law that they wilfully or negligently foist on individuals. The EU needs to produce the right policies to achieve its goals, and to stop suggesting the dragnet collection of personal data as the solution to all European problems.

Response from EDRi to the Smart Borders Consultation (29.10.2015)
https://edri.org/files/smartborders/consultationresponse.pdf

EDRi-gram: France: Biometric ID database found unconstitutional (28.03.2012)
http://history.edri.org/edrigram/number10.6/french-biometric-database-unconstitutional

Biometric data in large EU IT systems in the areas of borders, visa and asylum – fundamental rights implications
http://fra.europa.eu/en/project/2014/biometric-data-large-eu-it-systems-areas-borders-visa-and-asylum-fundamental-rights

(Contribution by Diego Naranjo, EDRi)