
Monday, September 4, 2017

Why you should be concerned by the GDPR even if your company is not located in the EU


The European personal data protection directive of 24 October 1995 applied to data processing carried out by companies, i.e. data controllers, located within the European Union. Data processing carried out by controllers located outside the EU was generally not subject to the provisions of the directive as transposed into the national laws of the Member States. (1) With the development of technology and of online data services, many companies located outside the European Union, such as Google, Amazon, Facebook or Apple (the “GAFA”), collect and process data relating to Europeans and thus “escape” the European rules, even though data transfers to these American companies may be subject to the Privacy Shield principles.

Personal data is now at the core of the digital economy. It therefore became necessary to update the European personal data laws to take into account the technological developments that have occurred since the 1995 directive and to ensure a high and homogeneous level of protection of personal data. This was done with the General Data Protection Regulation (GDPR). This European text was adopted on 27 April 2016 after more than four years of intensive debates. It will become applicable on 25 May 2018. (2)

One of the purposes of the GDPR is to take into account cases where several data controllers and/or processors located in different regions of the world are involved in a processing operation, as well as cloud computing and big data services (with servers installed and data collected in several regions) and the activities carried out by the GAFA, so that the personal data of people living in Europe remain protected regardless of where the data controller is located in the world.

The scope of the regulation covers not only businesses established in the European Union but also non-EU companies targeting the European market. These non-EU companies are therefore subject to the GDPR and must bring themselves into compliance with the new rules.


1. The GDPR is applicable in Europe and beyond

The 1995 directive had to be transposed into the national law of each Member State. These national data protection laws, however, differed from one Member State to another, certain countries having opted for a strict transposition of the European directive whereas others chose a more liberal approach.

The GDPR will be directly enforceable throughout the European Union. Its provisions will apply almost identically in all the Member States, except for a few provisions which may differ slightly from one Member State to another. (3)

But where the directive had a moderate impact outside of the EU, the regulation will apply not only within the EU but will also produce extra-territorial effects beyond the EU borders. (4)

    1.1 Application within the European Union

The regulation shall apply to any processing of personal data in the context of the activities of an establishment of a controller or a processor in the Union, regardless of whether the processing itself takes place within the EU.

An establishment located in the EU implies the effective and real exercise of activity through “stable arrangements”. The establishment is not, however, subject to any particular legal form: it may be the headquarters, a subsidiary or even a branch of a company itself located outside the Union.

The processing may be carried out within or outside the EU. Under this provision, databases hosted via a cloud computing service can be governed by the GDPR, regardless of where in the world the servers are actually installed.

    1.2 Extra-territorial application

The regulation shall also apply to the processing of personal data of individuals located in the EU carried out by a controller or a processor not established in the Union, where the processing activities are related to the offering of goods (e.g. e-commerce) or services (e.g. mobile applications, cloud hosting services) to such data subjects, whether or not a payment is required.

To establish whether the data controller or the processor is actually targeting the European market by offering goods or services to persons located in the EU, one must gather a number of elements, such as the use of a European language or of a currency such as the euro, and the fact that the products or services can be delivered in Europe. The mere accessibility of the company's website in Europe, or an email address, is not sufficient to establish that the company targets the European market.

The processing of data of persons located in the Union by a company, controller or processor, not established in the Union is also subject to the GDPR when the purpose of such processing is to monitor the behaviour of these persons, insofar as such behaviour takes place in the EU. This provision mainly covers online profiling, “particularly in order to take decisions concerning her or him or for analysing or predicting her or his personal preferences, behaviours and attitudes.” (5)

One should also note that these provisions apply both to data controllers and to processors. Processors, too, must take all necessary measures to comply with the GDPR.

The GDPR is not limited to controllers and processors located in the European Union. Its geographical scope reaches beyond the EU borders whenever personal data of European data subjects are processed.


2. What are the consequences for non-European businesses?

Companies that have no establishment in the European territory but that target the EU for their commercial activities (see the criteria above), and that in doing so collect and process personal data of European subjects, will therefore have to comply with the GDPR by the 25 May 2018 deadline.

    2.1 The designation of a representative in the Union

Beyond the GDPR compliance work to be carried out, controllers and processors that have no establishment in the EU must designate a representative in the EU, “in writing”. (6)

This representative must be established in one of the Member States where the data subjects, whose personal data are processed in relation to the offering of goods or services to them or whose behaviour is monitored, are located. The representative, as the agent of the controller or processor, shall be the point of contact for the supervisory authority and for data subjects having questions about the processing. The controller and the processor shall, however, remain primarily liable for GDPR compliance and its due application.

It must be noted that the designation of a representative is not required where the processing:
    - is occasional,
    - does not include, on a large scale, processing of special categories of data as referred to in article 9(1) or processing of personal data relating to criminal convictions and offences referred to in article 10, and
    - is unlikely to require a privacy impact assessment (PIA) under article 35 of the GDPR.

Nor are non-European public authorities or bodies required to designate a representative.
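By way of illustration only, the decision logic summarised above could be sketched as follows in Python. The function and parameter names are assumptions made for the example, and an actual assessment requires a case-by-case legal analysis.

# Illustrative sketch (not legal advice) of the representative requirement for
# non-EU controllers/processors caught by article 3(2), with the article 27(2)
# exemption summarised above. All names are assumptions made for the example.

def representative_required(
    occasional_processing: bool,
    large_scale_special_categories: bool,   # art. 9(1) or art. 10 data, on a large scale
    likely_to_require_pia: bool,            # art. 35 impact assessment
    public_authority: bool,
) -> bool:
    """Return True if a representative in the Union must be designated."""
    if public_authority:
        return False
    exempt = (
        occasional_processing
        and not large_scale_special_categories
        and not likely_to_require_pia
    )
    return not exempt

# Hypothetical example: a non-EU retailer regularly selling to EU consumers
print(representative_required(False, False, False, False))  # True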

    2.2 The United Kingdom after Brexit

Once the United Kingdom is no longer a Member State, the European regulation will no longer apply to it. However, the UK government has declared that it intends to pass a new law, repealing the Data Protection Act 1998 currently in effect, so as to incorporate the GDPR into UK law.

The purpose of this Bill is to reassure businesses, after Brexit, about the ability to keep transferring personal data between the UK and the EU. In doing so, the UK wants to ensure that its data protection law will be considered by the European Commission as offering an adequate level of protection, allowing businesses to continue transferring personal data between the UK and the EU without restrictions. (7)

    2.3 GDPR compliance

The European regulation introduces several new principles and reinforces existing rights. These principles and rights must be integrated into the internal procedures of businesses processing personal data of Europeans, which can be a costly, burdensome and time-consuming process. They can be grouped into the rights of data subjects and the obligations of controllers and processors.

a) The rights of data subjects
    - The conditions for obtaining consent from the data subjects are reinforced (art. 7): the terms regarding consent must be drafted in clear and explicit language;
    - The right to be informed is modified towards more transparency and simplification (art. 12, 13 and 14);
    - Data portability (art. 20) permits data subjects to retrieve the data collected about them from the controller or to have it transferred to a new data controller;
    - For online services targeting children (i.e. children below 16, or 13 in certain Member States), the processing of children's data will be subject to the consent or authorisation of the person holding parental authority (art. 8).

b) The obligations of the controllers and processors
    - Automated processing and profiling techniques will be regulated (art. 22): such processing will be authorised under certain conditions and provided the data subject has given his or her consent;
    - Under the accountability principle, the controller must implement clear and accessible internal rules to guarantee and demonstrate compliance with the regulation (art. 5 and 24);
    - When developing new products or services, the controller must build personal data protection by default into the definition of the processing system and the processing itself (the “privacy by design” principle) (art. 5 and 25);
    - The GDPR imposes stronger data security rules. Security breaches must be notified by all controllers, regardless of their main activity (art. 5 and 32 to 34);
    - A data protection officer (DPO) must be appointed in all companies where the core activities of the controller or processor consist of processing operations which require monitoring of data subjects on a “large scale” or processing of special categories of data on a “large scale” (art. 37, 38 and 39).

Finally, the GDPR allows the supervisory authorities to impose more stringent sanctions (art. 83). Depending on the type of infringement, the supervisory authorities can impose administrative fines of up to 10 million euros or 2% of the total worldwide turnover of the company during the preceding financial year, whichever is higher, or up to 20 million euros or 4% of that turnover, whichever is higher.
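By way of illustration only, the short Python sketch below computes the fine ceiling under these two tiers; the function name, the "higher_tier" flag and the example turnover figure are assumptions made for the example, not terms of the regulation.

# Illustrative sketch only (not legal advice): the maximum administrative fine
# under the two tiers of article 83 GDPR, for a company whose worldwide
# turnover for the preceding financial year is known.

def gdpr_fine_ceiling(worldwide_turnover_eur: float, higher_tier: bool) -> float:
    """Return the upper limit of the fine, in euros.

    higher_tier=False -> up to EUR 10 million or 2% of turnover, whichever is higher
    higher_tier=True  -> up to EUR 20 million or 4% of turnover, whichever is higher
    """
    if higher_tier:
        return max(20_000_000, 0.04 * worldwide_turnover_eur)
    return max(10_000_000, 0.02 * worldwide_turnover_eur)

# Hypothetical example: a group with EUR 2 billion in worldwide turnover
print(gdpr_fine_ceiling(2_000_000_000, higher_tier=False))  # 40000000.0 (2% exceeds EUR 10 million)
print(gdpr_fine_ceiling(2_000_000_000, higher_tier=True))   # 80000000.0 (4% exceeds EUR 20 million)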
                                                                * * * * * * * * * * * *


(1) See article 4 “National law applicable” of Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data

(2) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

(3) For example, each Member State can choose the minimum age at which a child can give his/her consent, between 13 and 16 years (art. 8).

(4) See GDPR, recitals 22 to 24 and article 3 “Territorial scope”

(5) Recital 24

(6) GDPR, article 27

(7) “UK Government announces proposals for a new Data Protection Bill”, in Technology Law Dispatch, 16 August 2017

(8) For a more detailed analysis of the GDPR, see our previous articles on this matter: New European General Data Protection Regulation (GDPR): the compliance clock is ticking, How to prepare for GDPR compliance and be ready by May 2018



Bénédicte DELEPORTE
Avocat

Deleporte Wentz Avocat
www.dwavocat.com

September 2017

Friday, August 11, 2017

How to prepare for GDPR compliance and be ready by May 2018


The General Data Protection Regulation (GDPR) will come into effect in the European Union in less than a year from now, on 25th May 2018. (1) The GDPR is a thorough and complex reform of data privacy law, which means that companies have to get organised to be compliant and ready by May 2018.

There are many differences between the existing European data privacy legal system, based on the 1995 Data Protection directive, and the new GDPR. Whereas the 1995 directive had to be transposed into the legal system of each Member State, with national data protection laws that did not come into effect at the same time (France transposed the 1995 directive in 2004!) and with some differences between them, the GDPR will apply (almost) identically across the European Union from 25th May 2018.

The 1995 Directive had become outdated, both with regard to processing activities that did not exist at the time and with regard to the growing role of processors, especially those providing cloud computing services. The GDPR takes into account the evolution of technology and of data processing activities. It aims to reinforce the rights of individuals (data subjects) over their personal data, with clearer rules regarding consent to data collection and processing and more stringent obligations on data controllers and processors.

With the GDPR, companies will be subject to a new “accountability” regime. Accountability under the GDPR includes the implementation of new procedures such as: the privacy-by-design principle, which implies that data privacy must be built in at the design stage of a new product or service; data privacy impact assessments when a new processing operation is likely to result in a high risk to the rights of the data subjects; the obligation to maintain a record of processing activities listing the processing operations and procedures implemented; and the obligation to notify personal data breaches to the supervisory authority (following a security breach or a cyber attack, for example).
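As a purely illustrative sketch, and assuming a simple in-house format (the GDPR does not prescribe any particular schema), an entry in a record of processing activities could be represented along the following lines in Python. All field names and example values are assumptions made for the example.

# Minimal sketch of one entry in a record of processing activities (art. 30 GDPR).
# The field names are illustrative assumptions, not a prescribed schema.

from dataclasses import dataclass, field
from typing import List

@dataclass
class ProcessingActivityRecord:
    name: str                         # e.g. "Customer newsletter"
    purpose: str                      # why the data is processed
    categories_of_data: List[str]     # e.g. ["email address", "first name"]
    categories_of_subjects: List[str]
    recipients: List[str]             # internal teams, processors, third parties
    retention_period: str             # e.g. "3 years after last contact"
    security_measures: List[str] = field(default_factory=list)

record = ProcessingActivityRecord(
    name="Customer newsletter",
    purpose="Direct marketing to existing customers",
    categories_of_data=["email address", "first name"],
    categories_of_subjects=["customers"],
    recipients=["marketing team", "emailing service provider"],
    retention_period="3 years after last contact",
    security_measures=["access control", "encryption at rest"],
)
print(record.name, "-", record.purpose)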

The fines for breaching GDPR obligations will be much higher than before: depending on the nature of the breach, administrative fines may reach up to 10 million euros or 2% of the worldwide revenue of the company, or up to 20 million euros or 4% of that revenue, whichever is higher.

The data privacy agencies of the member-states, and the members of the Article 29 Working Party (representatives of the data privacy agencies of the member-states) are working actively to help companies get prepared for GDPR. For example, the French data privacy commission (CNIL) has published a plan to help companies get organised to prepare GDPR compliance. And the members of the Article 29 Working Party (WP29) have adopted guidelines providing more detailed information on the new principles of the GDPR.


1. The compliance plan recommended by the French data privacy commission


The French data privacy commission (CNIL) has published a plan to help companies work on GDPR compliance. (2) This plan comprises six steps, as follows:

    - Step 1: Appoint a “compliance pilot”
Given the complexity of implementing a GDPR compliance plan, an individual - or, depending on the size of the organisation, a dedicated task force - should be specifically appointed to drive this phase. This individual, who may be an existing or future data protection officer (DPO) or an external consultant, shall have several tasks, including informing, advising and consulting with the internal teams. He/she should also perform internal audits and play a key role in organising and coordinating the compliance tasks to be performed.

    - Step 2: Map out the processing activities
The compliance team should carry out an inventory of the data processing activities carried out by the company and record them. This will allow the compliance team to assess the practical impacts of the GDPR on the data processed by the company.

    - Step 3: Prioritise the tasks to be carried out
Based on the types of data processing activities, the team will then be able to identify the compliance tasks to be implemented. These tasks should be prioritised, taking into account the risks of the processing on the rights and freedoms of the data subjects.

    - Step 4: Manage risk
If the team has identified data processing activities that are likely to generate a high risk to the rights and freedoms of the data subjects, a data privacy impact assessment (DPIA or PIA) must be carried out for each such processing operation. Companies can use the PIA guidelines to help them implement these new procedures (see below).

    - Step 5: Develop or update your internal procedures
The company's internal procedures will have to be updated to ensure a high level of protection of personal data at all times, taking into account all the events which may occur during data processing (such as a data breach, managing correction or access requests, modification of the data collected, etc.).

    - Step 6: Document compliance
To be able to prove that the company complies with the GDPR, the necessary documents must be drafted and regularly updated. These documents include the company's internal procedures, data privacy impact assessments, internal audit reports, etc.


2. The Article 29 Working Party guidelines

The WP29 has published several support documents to help with GDPR compliance. The purpose of these documents is to clarify the new principles that must be implemented by the companies by May 2018. At the end of June 2017, the following guidelines were published:

    - Guidelines on Data Protection Impact Assessment (“DPIA” or “PIA”) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679
These guidelines provide details on the types of processing activities that should trigger a privacy impact assessment, the existing methods for carrying out a PIA, the rules governing the release of a PIA and/or its notification to the supervisory authority, and when the supervisory authority should be consulted in the case of a potentially risky processing operation. Typically, a PIA will include the following four features: i) a description of the proposed processing and its purpose; ii) an assessment of the necessity and proportionality of the processing; iii) an assessment of the risks to data subjects; and iv) the measures envisaged to address the risks and demonstrate compliance with the GDPR.

The data protection impact assessment principle is defined under article 35 of the GDPR. PIAs are one of the mechanisms of the accountability principle. By performing a PIA, data controllers comply with the GDPR and can demonstrate that appropriate measures have been taken to ensure compliance. Failure to carry out a required PIA is subject to an administrative fine of up to 10 million euros or 2% of the worldwide revenue of the company for the preceding year, whichever is higher.

    - Guidelines on Data Protection Officers (“DPOs”)
These guidelines provide details on how a Data protection officer should be appointed, as well as the role and responsibilities of the DPO.

Although this role is not new, the appointment of a DPO was not mandatory under the 1995 directive. To be compliant with the GDPR, certain companies, whether data controllers or processors, will have to appoint a DPO. The role and responsibilities of the DPO are described under articles 37 to 39 of the GDPR.

The DPO helps companies ensure GDPR compliance (for instance, by performing internal audits and acting as a liaison between the different internal departments and with the data subjects). However, DPOs are not liable in case of non-compliance with the GDPR: the data controller or the processor remains responsible for GDPR compliance and implementation.

    - Guidelines on the right to data portability
These guidelines define the data portability principle, identify the main aspects of this new right and the cases in which it applies, explain how the rules concerning data subjects apply to data portability, and describe how the data should be conveyed to the data subject or to a new data controller.

Data portability differs slightly from the right of access under the 1995 directive. It allows data subjects to receive the data they provided to the data controller in a structured and machine-readable format, and to transfer this data to a new data controller. The right to data portability will typically be used when a consumer switches service providers. It is defined under article 20 of the GDPR.
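As a purely illustrative sketch, the Python snippet below shows what an export of the data provided by a data subject might look like in a structured, machine-readable format. JSON is used here as one possible format (the GDPR does not mandate a specific one), and the file name and data fields are assumptions made for the example.

# Minimal sketch: export the data a subject provided, in a structured,
# commonly used, machine-readable format (JSON used here as an example).

import json

user_provided_data = {
    "account": {"email": "user@example.com", "display_name": "Jane"},
    "preferences": {"newsletter": True, "language": "fr"},
    "playlists": [
        {"name": "Road trip", "tracks": ["track-001", "track-042"]},
    ],
}

# Serialise to a portable file that the data subject (or a new controller) can reuse.
with open("data_portability_export.json", "w", encoding="utf-8") as f:
    json.dump(user_provided_data, f, ensure_ascii=False, indent=2)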

    - Guidelines for identifying a controller or processor’s lead supervisory authority
The GDPR sets up another new principle, the lead supervisory authority, to take into account transborder data processing.

These guidelines identify the supervisory authority competent for transborder processing, especially when the principal place of business of the data controller is different from its European headquarters, when several companies within a multinational group of companies are concerned or when there are several joint data controllers. The issue of data processors is also addressed by the guidelines.


     Other guidelines are being developed and should be published before the end of 2017. These include guidelines on certification, guidelines on data privacy breach notifications, guidelines on consent by the data subjects, and guidelines on profiling.
 
                                                                   * * * * * * * * * * * *

(1) Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)

(2) Available on the CNIL website (in French)

(3) The WP29 Guidelines are available on the CNIL website (in English) : Guidelines on Data Protection Impact Assessment (DPIA) and determining whether processing is “likely to result in a high risk” for the purposes of Regulation 2016/679 ; Guidelines on Data Protection Officers (“DPOs”) ; Guidelines on the right to data portability ; Guidelines for identifying a controller or processor’s lead supervisory authority



Bénédicte DELEPORTE
Avocat

Deleporte Wentz Avocat
www.dwavocat.com

August 2017

Tuesday, May 9, 2017

From science fiction to law: the European Parliament proposes a legal framework for robotics

On 16 February 2017, the European Parliament adopted a resolution which includes a series of recommendations to the European Commission regarding civil law rules on robotics. (1) With this document, the Parliament calls on the Commission to submit a proposal for a directive. These recommendations had been under preparation for two years, a time necessary to conduct a rich and thorough reflection on a multi-faceted matter which will deeply disrupt our society, industry and economy.

Robotics includes not only robots and artificial intelligence (“AI”), but also bots, drones and autonomous vehicles. This area raises ethical and legal questions which must be addressed now, at a supranational level, especially since robotics is already present in a number of industries, such as the automotive and electronics industries.

The resolution of the Parliament stresses the need to define an ethical framework for the development, programming and use of robots, to define a legal framework for robotics allowing a harmonised and legally secure development, and to define new legal liability principles for actions performed by smart robots.


1. An ethical framework based on Asimov’s laws of robotics

Good science fiction has often predicted the evolution of technology and society. Numerous technology tools in our daily environment are directly inspired by it, from the communication “gadgets” of the Star Trek saga (smartphones and connected things) to motion pictures such as Minority Report and Moneyball (predictive analytics) or 2001: A Space Odyssey and I, Robot (smart robots). (2)

Prior to these movies, Isaac Asimov, the famous 20th century science fiction writer, set down the three laws of robotics governing the relationship between man and robot:
    1. A robot may not injure a human being or, through inaction, allow a human being to come to harm;
    2. A robot must obey the orders given it by human beings except where such orders would conflict with the First Law;
    3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Laws. (3)

These laws inspired the members of the European Parliament to establish the foundation of their recommendations for a preliminary draft of European civil law on robotics, recalling “the intrinsically European and universal humanistic values that characterise Europe’s contribution to society”. These laws are directed primarily at the designers, producers and operators of robots.

Based on these principles, the European Parliament recommends developing a clear, precise and efficient ethical framework applicable to the design, development, production, use and modification of robots.

Robots must serve humanity, especially by performing repetitive, difficult or dangerous tasks. But robotics, through its social, medical and bioethical implications, also comes with societal risks for humans, including in the areas of liberty, safety, health, privacy and personal data protection, integrity and dignity.

This resolution takes a practical approach by integrating a Charter on robotics comprised of a Code of ethical conduct for robotics engineers, a Code for research ethics committees (REC), and licences for designers and for users.

The Code of ethical conduct for robotics engineers covers all R&D activities and recalls the strict obligation for researchers and designers to respect the dignity, privacy and safety of humans. This ethical framework should be based on the principles of beneficence (robots should act in the best interests of humans), non-maleficence (robots should not harm a human), autonomy (the capacity to make an informed, un-coerced decision about the terms of interaction with robots), and justice (fair distribution of the benefits associated with robotics; affordability of homecare and healthcare robots). The Code also sets out principles of respect for fundamental rights, precaution, transparency, safety, reversibility and privacy.

The Code for research ethics committees (REC) stresses the principle of independence to avoid conflicts of interest between the researchers and those reviewing the ethics protocol, and between the reviewers and the organisational governance structures. The Code also defines the role and constitution of a research ethics committee and monitoring rules.


2. The foundations of a legal framework: defining the notion of “robot” and supporting the development of cyber technology

The resolution also includes several recommendations aimed at setting the ground rules of a harmonised European legal framework adapted to robotics. Such legal rules must permit the cross-border use of robots (principle of mutual recognition), thereby avoiding fragmentation of the European market.

    - The notion of “smart robot”
The Parliament calls on the Commission to propose common definitions within the European Union regarding the notions of cyber physical systems, autonomous systems and autonomous and smart robots, and their sub-categories. A “smart robot” would include the following characteristics:
    - the acquisition of autonomy through sensors and/or by exchanging data with its environment (inter-connectivity);
    - self-learning capacity from experience and by interaction;
    - at least a minor physical support;
    - the capacity to adapt its behaviour and actions to its environment; and
    - absence of life in the biological sense.

A Community system of registration for certain “advanced” categories of robots could be created for purposes of traceability.

    - Intellectual property rights
The Parliament draws attention to the necessity to address the issue of intellectual property rights in robotics through a horizontal and technologically neutral approach applicable to the different sectors in which robotics could be used.

    - Right to privacy and personal data protection
Extending the right to privacy and personal data protection to the relationship between humans and robots is fundamental. Indeed, the robots used by individuals in a domestic environment (autonomous vehicles, domestic robots, care robots and medical robots) will collect and process personal data. These robots will usually be connected, making it easy to analyse and share the data collected.

The Community rules on the right to privacy as well as the provisions of the General Data Protection Regulation (GDPR), especially the rules regarding systems security, must be extended to robotics. However, such rules must be complemented, where necessary, to take into account the specificities of robotics.

    - Standardisation, safety and security
The development of robotics requires the creation of technical standards that must be harmonised internationally, to avoid dividing up the European market and to foster a high level of product safety and consumer protection. Communication between robots will also require the adoption of open and interoperable standards.

To avoid the fragmentation of the European market, testing, certification and market approval in a Member State should be recognised in the rest of the EU.

    - Education and employment
The growing use of robots will bring about a new industrial and societal revolution. Even though its actual impact on employment is not yet fully known, less skilled jobs and labour-intensive industries will be the most severely affected, and automation will require greater flexibility of skills. For that matter, the Parliament calls on the Commission to monitor medium- and long-term job trends resulting from the increased use of robots, and to support education in digital skills so as to align the job market with demand.

Finally, the Parliament recommends the creation of a designated EU Agency for Robotics and Artificial Intelligence to provide technical, ethical and regulatory expertise at the Community and national levels.


3. The issue of legal liability: can an autonomous robot be considered as a person responsible for its actions?

An autonomous robot (having the ability to adapt and learn) can make decisions and implement them independently, which means that its behaviour includes a level of unpredictability. Such autonomy is, however, merely technical. Moreover, the more autonomous a robot is, the less it can be considered a simple tool controlled by a human (manufacturer, operator, owner). A specific status - the electronic person - could therefore be created for autonomous robots.

The current legal liability rules are not adapted to autonomous robots, which cannot be held liable in case of damage caused to a third party. Under the current state of the law, humans are liable, i.e. the manufacturer (product liability), or the operator, owner or user of the robot (liability for damages).

The Parliament calls on the Commission to review liability laws to determine the regime best adapted to this matter, i.e. either a regime of strict liability (requiring proof of the damage, of the defect in the robot and of the causal link between the defect and the damage), or a liability regime based on risk management (the ability to manage risk and its consequences).

The liability of the parties involved should be proportional to the level of instructions given to the robot and its degree of autonomy (the greater the robot’s autonomy, the greater the responsibility of its trainer). In parallel, a specific insurance system for robots should be created.

    In conclusion, this resolution by the European Parliament manages to provide practical orientations on a very complex matter, especially since we do not yet know the full extent of the impact of robotics on our society. The document provides a good overview of the issues raised by robotics and outlines the main lines of a legal framework intended to secure the development of robotics and of its multiple uses. It lays the necessary ethical foundations and tries to contain fears related to the consequences of an uncontrolled development of AI. The ball is now in the European Commission's court to propose a directive within a reasonable timeframe, so that Europe is not overtaken by the rapid evolution of robotics.


                                                              * * * * * * * * * * * *

(1) “European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics” (2015/2103(INL))

(2) These movies are mostly adapted from books: Minority Report (by Philip K. Dick, published in 1956!); Moneyball (The Art of Winning an Unfair Game, by Michael Lewis, published in 2003); I, Robot (by Eando Binder, published in 1939 and re-written by Isaac Asimov in 1950)

(3) Asimov’s three laws of robotics appear in “Runaround”, published in 1942.




Bénédicte DELEPORTE
Avocat

Deleporte Wentz Avocat
www.dwavocat.com

May 2017