
Enhancing Safeguards for US Signals Intelligence Activities

Posted by fgilbert on October 13th, 2022

President Biden's October 7, 2022 Executive Order on Enhancing Safeguards for US Signals Intelligence Activities: Towards an Updated EU-US Privacy Shield Framework

When the Court of Justice of the European Union issued its decision in Data Protection Commissioner v. Facebook Ireland and Schrems in July 2020 (Schrems II),[1] it abruptly halted the operations of the EU-US Privacy Shield framework (Framework). It also caused significant disruption for the numerous US and EU/EEA businesses and organizations that had relied on the Framework as the legal basis for transfers of personal data for commercial and business purposes between the two sides of the Atlantic.

After lengthy and challenging negotiations between representatives of the European Commission and the United States, a new proposed Trans-Atlantic Data Privacy Framework was published at the end of March 2022. According to the White House, the EU-US Trans-Atlantic Data Privacy Framework of March 2022 was intended to lay the groundwork for a legal basis for transatlantic data flows by addressing the concerns that the Court of Justice of the European Union raised in July 2020 in the Schrems II case.

Under the March 2022 EU-US Trans-Atlantic Data Privacy Framework, the United States made commitments to:

  • Strengthen the privacy and civil liberties safeguards governing the U.S. signals intelligence activities;
  • Establish a new redress mechanism with independent and binding authority; and
  • Enhance the existing rigorous and layered oversight of signals intelligence activities.

On October 7, 2022, President Biden signed an Executive Order on Enhancing Safeguards for United States Signals Intelligence Activities (E.O.), which defines the steps that the United States will take to implement the commitments it made in the March 2022 European Union-U.S. Trans-Atlantic Data Privacy Framework. The Executive Order addresses in depth the three commitments made in the Trans-Atlantic Data Privacy Framework, as detailed below.

[1] Strengthening the Privacy and Civil Liberties Safeguards

The October 7, 2022 Executive Order requires that U.S. signals intelligence activities be conducted only in pursuit of defined national security objectives; take into consideration the privacy and civil liberties of all persons, regardless of nationality or country of residence; and be conducted only when necessary to advance a validated intelligence priority and only to the extent and in a manner proportionate to that priority.

Principles and Objectives

Section 2(a) of the EO defines the principles that will be used to determine whether signals intelligence activities may be authorized and conducted. Section 2(b) of the EO identifies the objectives that will be deemed legitimate and those that will be prohibited.

Privacy and Civil Liberties Safeguards

Section 2(c) of the Executive Order focuses on the safeguards that must be used to ensure that privacy and civil liberties are integral considerations in the planning and implementation of signals intelligence activities.

  • Collection of Signals Intelligence

Section 2(c)(i) identifies general requirements that apply to all forms of such intelligence activities, while Section 2(c)(ii) provides specific requirements in the event of bulk collection of signals intelligence. Bulk collection may be used only in the pursuit of specified objectives, such as protection against espionage or sabotage, or against cybersecurity threats created or exploited by or on behalf of foreign persons, organizations, or governments.

  • Handling of Personal Information Collected Through Signals Intelligence

In Section 2(c)(iii), the EO defines mandatory handling requirements for personal information collected through signals intelligence activities. It also extends the responsibilities of legal, oversight, and compliance officials to ensure that appropriate actions are taken to remediate incidents of non-compliance.

The most prominent requirement is minimization of the dissemination and the retention of personal information collected through signals intelligence. In addition, there are specific requirements for data security and limitation of access to the information. Other provisions focus on ensuring data quality, accuracy and objectivity.

  • Policies and Procedures to be Updated within One Year

Section 2(c)(iv) focuses on policies and procedures. Elements of the U.S. Intelligence Community are required to update their policies and procedures to reflect the new privacy and civil liberties safeguards contained in the Executive Order within one year of its date. These updates must be prepared in consultation with the Attorney General, the Civil Liberties Protection Officer of the Office of the Director of National Intelligence (CLPO), and the Privacy and Civil Liberties Oversight Board (PCLOB).

  • Review of the Policies and Their Implementation

The Executive Order provides for several levels of review, such as a review of the updated policies and procedures by the Privacy and Civil Liberties Oversight Board (PCLOB), once they have been issued, to ensure their consistency with the enhanced safeguards contained in the Executive Order. It also provides for rigorous legal oversight and for periodic compliance reviews of signals intelligence activities by designated officials, including an Inspector General, a Privacy and Civil Liberties Officer, and compliance officers charged with ensuring compliance with applicable US law.

[2] Establishment of a New Redress Mechanism

Section 3 of the Executive Order provides for a redress mechanism to review qualifying complaints, transmitted by the appropriate public authority in a “qualifying state”,[2] concerning United States signals intelligence activities for any covered violation of US law.

The new redress mechanism will be multi-layered, independent, and binding. It is intended to enable individuals in qualifying states and regional economic integration organizations, as designated under the E.O., to obtain independent and binding review and redress of claims that their personal information collected through U.S. signals intelligence was collected or handled by the United States in violation of applicable U.S. law, including the enhanced safeguards in the E.O.

Initial Investigation of Qualifying Complaints by the CLPO

Under the first layer, the Civil Liberties Protection Officer in the Office of the Director of National Intelligence (CLPO) will conduct an initial investigation of qualifying complaints received to determine whether the enhanced safeguards or other applicable U.S. law were violated and, if so, to determine the appropriate remediation.

The process to be followed by the CLPO will be established by the Director of National Intelligence, in consultation with the Attorney General. Section 3(c) of the Executive Order defines in minute detail the elements of that process, including review of the information necessary to investigate the complaint, determination of whether there was a violation, preparation of a classified report on the alleged violation, and issuance of a classified decision. The complainant or the element of the Intelligence Community affected by the decision may seek review of the CLPO’s decision by the Data Protection Review Court. Otherwise, the decision becomes binding.

Independence of the CLPO

One of the issues raised in the Schrems II decision of the European Court of Justice was that the oversight of the data processing conducted by the US intelligence agencies, as defined under the 2016 version of the EU-US Privacy Shield, lacked independence from the US government. Section 3(c)(iv) of the Executive Order, titled “Independence”, specifically provides that the Director of National Intelligence shall not interfere with a review by the CLPO of a qualifying complaint, and shall not remove the CLPO for any action taken unless there has been misconduct, malfeasance, neglect of duty, or incapacity.

Data Protection Review Court

The second layer of review, described in Section 3(d) of the Executive Order, is provided by a Data Protection Review Court. Section 3(d) directs the Attorney General to establish the Data Protection Review Court (DPRC) to provide independent and binding review of the CLPO’s decisions, upon an application from the individual or an element of the Intelligence Community. The EO directs the Attorney General to promulgate regulations establishing the court, along the lines defined in the EO, within sixty (60) days of the publication of the EO.

Independence of the Data Protection Review Court

In line with the focus on ensuring the Court’s independence, discussed above, the judges designated to serve on the DPRC must be appointed from outside the U.S. Government, review cases independently, and enjoy protections against removal. In addition, they must have relevant experience in the fields of data privacy and national security.

Further, Section 3(d)(iv), titled “Independence”, specifically mandates that the Attorney General shall not interfere with a review by a Data Protection Review Court panel of a determination made by the CLPO regarding a qualifying complaint, and shall not remove any judge appointed to serve on that court except in cases of misconduct, malfeasance, breach of security, neglect of duty, or incapacity.

Binding Effect

Decisions of the DPRC regarding whether there was a violation of applicable U.S. law and, if so, what remediation is to be implemented will be binding. Under Section 3(d)(iii), each element of the Intelligence Community, and each agency containing an element of the Intelligence Community, is required to comply with any determination by a Data Protection Review Court panel to undertake appropriate remediation.

Annual Review of the Redress Process by PCLOB

In addition to the reviews and oversight described above, Section 3(e) of the Executive Order “encourages” the Privacy and Civil Liberties Oversight Board (PCLOB) to conduct annual reviews of the processing of qualifying complaints by the redress mechanism discussed above, with respect to issues such as timeliness, full access to information, and whether the elements of the Intelligence Community have fully complied with determinations made by the CLPO and the Data Protection Review Court. The role and powers of the PCLOB are discussed in the next section.

[3] Enhancement of Oversight of Signals Intelligence Activities by the PCLOB

The CJEU decision in Schrems II voiced concern about the lack of oversight of the intelligence activities and the weakness of the protection granted to the personal data being collected and processed. The October 7, 2022 Executive Order gives specific authority to the Privacy and Civil Liberties Oversight Board (PCLOB) to review Intelligence Community policies and procedures to ensure that they are consistent with the Executive Order and to conduct an annual review of the redress process, including to review whether the Intelligence Community has fully complied with determinations made by the CLPO and the DPRC. The role of the PCLOB is detailed in several sections of the Executive Order, as explained below.

Participation in the Drafting of the New Policies and Procedures

First, in Section 2, which defines the rules concerning privacy safeguards, Section 2(c)(iv)(B) provides for PCLOB participation in the drafting of the updates to the policies and procedures. In this case, PCLOB only has a consultative role, and the goal is to ensure that the updates to the policies and procedures required by the Executive Order implement the privacy and civil liberty safeguards outlined in the Executive Order.

Review of the Final Policies and Procedures

Once the policies and procedures have been updated and issued as described above, Section 2(c)(v)(A) encourages the PCLOB to conduct a review of the updated policies and procedures to ensure that they are consistent with the enhanced safeguards contained in the Executive Order. In addition, Section 2(c)(v)(B) requires that, within 180 days of the completion of the PCLOB review, the head of each element of the Intelligence Community “carefully” consider and implement or otherwise address all recommendations contained in the PCLOB review, consistent with applicable law.

Participation in the Appointment of Judges to Serve on the Data Protection Review Court

Section 3 of the Executive Order, which focuses on the signals intelligence redress mechanism, allocates a role to the PCLOB in connection with the activities of the Data Protection Review Court. Section 3(d)(i)(A) of the Executive Order provides that the Attorney General must consult with the PCLOB, as well as with the Secretary of Commerce and the Director of National Intelligence, to appoint individuals to serve as judges on the Data Protection Review Court.

Annual Review of the Redress Process

Finally, in addition to the consultation, reviews, and oversight described above, Section 3(e)(i) of the Executive Order “encourages” the Privacy and Civil Liberties Oversight Board (PCLOB) to conduct annual reviews of the processing of qualifying complaints by the redress mechanism discussed above, including whether

  • the CLPO and the Data Protection Review Court processed qualifying complaints in a timely manner;
  • the CLPO and the Data Protection Review Court are obtaining full access to necessary information;
  • the CLPO and the Data Protection Review Court are operating in a manner consistent with the Executive Order;
  • the safeguards established in the Executive Order are properly considered in the processes of the CLPO and the Data Protection Review Court; and
  • the elements of the Intelligence Community have fully complied with the determinations made by the CLPO and the Data Protection Review Court.

To assist the PCLOB in its review, Section 3(e)(ii) instructs the Attorney General, the CLPO, and the elements of the Intelligence Community, among others, to provide the PCLOB with access to the information necessary to conduct the review. In addition, Section 3(e)(iii) provides for the preparation of a classified report to be provided to the President and the congressional intelligence committees, among others, and requires the PCLOB to make an annual public certification as to whether the redress mechanism is processing complaints consistent with the terms of the Executive Order, and to release to the public an unclassified version of the report.

[4] Designation of the Qualifying States for Purposes of the Redress Mechanism

Several provisions of the Executive Order refer to the rights granted to citizens of a “qualifying state.” Section 3(f) provides the criteria for a country or regional economic integration organization to be deemed a “qualifying state” for purposes of the redress mechanism defined in the Executive Order. Section 3(f)(i) grants the Attorney General the authority to designate a country or regional economic integration organization as a “qualifying state”. The designation must be made in consultation with the US Secretary of State, the US Secretary of Commerce, and the Director of National Intelligence.

The criteria for determining that a state or economic integration organization is a “qualifying state,” as listed in Section 3(f)(i)(A), include:

  • the laws of the country, organization, or member of the organization require appropriate safeguards in the conduct of signals intelligence activities for United States persons’ personal information that is transferred from the United States to the territory of the country or a member of the organization;
  • the country, organization, or member of the organization permits, or is anticipated to permit, the transfer of personal information for commercial purposes between the territory of that country or those member countries and the territory of the United States; and
  • such designation would advance the national interests of the United States.

The designation may be revoked or amended.

[5] Next Steps and Ultimate Goal

The next steps in the development of a new EU-US agreement on trans-Atlantic data transfers and data protection will likely focus on developing or updating the building blocks necessary for the preparation of an adequacy evaluation package, which will ultimately be presented to the European Commission for review, approval, and issuance of a new adequacy decision. Once the formalities have been completed, entities that wish to take advantage of the updated cross-border personal data transfer framework will continue to be required to adhere to the EU-US Privacy Shield Principles, or an updated version of them. Those that had self-certified under the 2016 version of the EU-US Privacy Shield Framework will have to re-certify their adherence to the Principles through the US Department of Commerce, and update their legal terms accordingly.

[1] Data Protection Commissioner v. Facebook Ireland Ltd and Maximillian Schrems, CJEU Case C-311/18, decision of July 16, 2020.

[2] While Section 4(k), in the Definitions section, provides the criteria for a complaint to be deemed a “qualifying complaint”, there is no similar definition of the term “qualifying state”. Instead, the criteria for a state to be deemed a “qualifying state,” and the method to be used for identifying one, are defined in Section 3(f) of the Executive Order.

Posted in Europe, US Law

Meet the Upcoming California Privacy Rights Act (CPRA)

Posted by fgilbert on November 12th, 2020

California voters approved Proposition 24 on November 3, 2020, paving the way for the California Privacy Rights Act (CPRA). Starting in January 2023, CPRA will expand California consumers’ ability to limit the use of their personal information in the context of targeted advertising, beyond the rights already acquired under the current provisions of CCPA, and will create additional rights for consumers. There will also be additional obligations and restrictions for businesses related to the use of consumers’ personal information, including limits on data collection and retention, among others.

Unfortunately, all of this takes 52 pages of clauses that are anything but clear and easy to understand.

In practice, there will be additional benefits for consumers, and additional administrative and financial burdens for businesses. CPRA is not merely a CCPA 2.0. It introduces new concepts that have not yet permeated US laws, such as data minimization and retention limitation, which are likely to require most businesses within its scope to re-evaluate their activities and develop new processes beyond those they may have just finished implementing to comply with CCPA.

CPRA is intended to replace the California Consumer Privacy Act (CCPA) in 2023. Most of CPRA will become operative on January 1, 2023, and the law will apply to personal information collected after January 1, 2022. There will be a 6-month delay between the effective date of the act and its enforcement, with enforcement actions commencing on July 1, 2023. In the meantime, CCPA will remain in full force and effect until it is superseded by CPRA.

New or Updated Definitions

CPRA changes existing definitions and introduces new terms. The most noticeable changes include the following:


CPRA introduces “sharing” as an activity different from “selling”. “Sharing” is defined as disclosing, making available, transferring, or communicating a consumer’s personal information to a third party for “cross-context behavioral advertising”, whether or not for monetary or other valuable consideration. The new definition is especially relevant to affiliate advertising networks, advertisers and data brokers in the context of re-targeting and behavioral advertising, in which advertisements are targeted to a consumer based on information derived from information collected about that consumer’s activities across different websites, applications or services.


CPRA revises the definition of “business”, i.e., the entities subject to the law. The current definition under CCPA identifies three thresholds: gross revenue, number of records processed, and percentage of revenue derived from the sale of personal information. The threshold associated with the number of records purchased or sold is increased from 50,000 to 100,000, and the percentage-of-revenue threshold is now computed by combining revenue from selling and revenue from “sharing” personal information.

Contractor; Service Provider

CPRA introduces the notion of “contractor” and updates the definition of “service provider” to keep the two definitions consistent. Under CPRA, a business “makes available” personal information to a “contractor” for a business purpose pursuant to a written contract that prohibits the contractor from selling or sharing the personal information and includes other restrictions.

The definition of Service Provider is modified to include the new concept of “sharing”. A service provider is a person that “receives personal information” from, or on behalf of, a business and processes the information on behalf of that business for a business purpose pursuant to a written contract that prohibits the service provider from selling or sharing the personal information and includes other restrictions.

Sensitive Information

CPRA creates the concept of “sensitive personal information”, which includes, among others, Social Security numbers and other identity-related information; financial account or payment card information in combination with an access code; precise geolocation data; race, ethnic origin, and religion; sexual orientation; genetic data and biometric information when used to uniquely identify a consumer; and certain health information outside the context of HIPAA.

New Rights for Individuals

The CPRA introduces several new consumer rights. Some of these rights are similar to those found in most data protection laws, such as Canada’s PIPEDA or the EU General Data Protection Regulation. Examples of new rights include:

Right to Know what Personal Information is Sold or Shared

The right to know under CPRA is an expanded version of the “Right to Know” under CCPA. It is a consequence of the introduction of the concept of sharing personal information as a restricted activity. It will be important to keep in mind that the definition of “sharing” is limited to “cross-context behavioral advertising”.

Right to Opt-out of Information Sharing / Behavioral Advertising

Consumers will be granted the right to opt-out of information sharing with third parties for behavioral advertising across websites. This right supplements the pre-existing right to opt-out of the sale of personal information. The new provisions concerning the use of personal information for marketing purposes are detailed below.

Right to Limit the Use of Sensitive Information

Consumers will have the right to direct a business that collects sensitive personal information about them to limit its use of that information to that which is necessary to perform the services or provide the goods, as “reasonably expected by an average consumer who requests such goods or services”. The details of the definition are left to upcoming Regulations.

Right of Correction

Consumers will have the right to request the correction of inaccurate information. Businesses that receive requests for correction will be required to use commercially reasonable efforts to correct inaccurate personal information, as directed by the consumer.

Right to Object to Automated Decision Making and Profiling

Consumers will have the ability to object to the use of their personal information for automated decision making and profiling. Profiling is defined as automated processing of personal information to evaluate certain aspects relating to a natural person, such as economic situation, health, personal preferences, interests, reliability, behavior, location, movements, or performance at work.

New Obligations for Businesses

The CPRA creates new obligations for businesses, some of which are similar to those found in other data protection laws worldwide.

Updated Content of the Notices to Consumers

CCPA requires that different types of notices be provided to consumers at different stages of the interaction between the consumer and the business. CPRA modifies the content of these notices to match the new rights of consumers and obligations of businesses.

Retention Limitation

CPRA introduces a data retention requirement. CPRA makes it a “general duty” for a business that collects personal information not to retain personal information for longer than necessary for the purposes for which it was collected. Businesses will also be required to inform consumers of the length of time they retain each category of personal information or, if that is not possible, the criteria used to determine that period.

Data Minimization

Data Minimization is another “general duty” introduced by CPRA. CPRA requires that the collection, use, retention and sharing of personal information be “reasonably necessary and proportionate to achieve the purposes for which the personal information was collected or processed”, and prohibits the further processing of the data for a purpose incompatible with the disclosed purpose.

Reasonable Security Measures

CPRA significantly expands the obligation of businesses to implement reasonable security measures and practices for personal information. These measures are discussed later in this article.

Contract with Service Providers, Contractors and Third Parties

CPRA imposes mostly similar direct or contractual obligations on service providers and contractors and significantly expands those that are currently imposed under CCPA. As a result, businesses will have to review their contracts with their service providers and contractors to ensure these contracts contain all of the newly required provisions. Overall, the new data processing agreements will have significant similarities – and differences – with the corresponding provisions required by GDPR Article 28.

Use of Personal Information for Cross-Context Behavioral Advertising

One of the key changes from CCPA is the introduction of the term “sharing” as the practice of disclosing or communicating a consumer’s personal information for cross-context behavioral advertising, whether or not for monetary or other valuable consideration, including transactions between a business and a third party. Under CPRA, consumers will have the right to opt-out of the sharing of their personal information. This addition is likely to have a significant impact on businesses that use digital marketing techniques to target California consumers.


Security Measures

CPRA gives security measures a more prominent place.

General Duty to Use Security Measures

First, CPRA makes it a general duty for businesses to implement reasonable security procedures and practices appropriate to the nature of the personal information to protect the personal information from unauthorized or illegal access, destruction, use, modification or disclosure. Regulations will be needed to clarify whether the obligation applies to all categories of personal data, or to a subset.

Security Audits and Privacy Risk Assessments

CPRA will also require security audits and privacy risk assessments in certain circumstances. At this point, there is limited detail: CPRA points to upcoming Regulations and provides only a handful of general requirements. These obligations will apply only to businesses whose processing of consumers’ personal information “presents a significant risk to consumers’ privacy or security”.

Security Breaches

CCPA provides for a limited private right of action in the event of a data breach resulting from a failure to provide adequate security, and statutory damages in the case of a breach affecting certain categories of personal information. CPRA makes a minor addition to the types of personal information that may trigger an action for damages: unauthorized access to an email address in combination with a password or security question.


Increased Protection of Children’s Personal Information

CPRA increases the protection of personal information of children under the age of 16 by tripling the statutory penalties currently imposed by the CCPA. CCPA §1798.155(b), as amended by CPRA, will impose penalties of up to $7,500 for “violations involving the personal information of consumers whom the business, service provider, contractor or other person has actual knowledge is under 16 years of age”.

California Privacy Protection Agency

CPRA establishes the California Privacy Protection Agency (CPPA) as a regulatory body with full administrative power and jurisdiction to enforce CPRA violations. The CPPA will enforce consumer privacy laws and impose fines. Among its numerous responsibilities and powers, the CPPA will be responsible for providing guidance to businesses regarding their duties and responsibilities, and for appointing a “Chief Privacy Auditor” to conduct audits of businesses to ensure compliance with the law and its regulations.

Employee and B2B Exceptions

While most provisions of CPRA will enter into force in January 2023, several provisions have an effective date of January 1, 2021. As a result of amendments to CCPA adopted in October 2019, CCPA contains partial exemptions for the handling of personal information collected in an employer/employee relationship (employees, job applicants, and independent contractors) and of information obtained in the context of a B2B relationship. That exemption was due to expire on January 1, 2021. CPRA extends the moratorium period through the end of 2022.


Upcoming Regulations

CPRA requires the development of regulations on a wide range of topics relating to definitions, exemptions, technical specifications for opt-out preference signals, automated decision making, cybersecurity audits, risk assessments, and the monetary thresholds in the definition of a “business”. The final regulations must be adopted by July 1, 2022.


Conclusion

California voters have approved Proposition 24, and CPRA is here to stay. Starting in January 2023, CPRA will expand California consumers’ ability to limit the use of their personal information in the context of targeted advertising. But CPRA does more than that. It has significant implications for privacy and data management as they currently exist in the United States, creating a significant paradigm shift towards concepts found in most privacy laws outside the United States.

CPRA imposes specific new restrictions on data collection and data retention, making them part of the “general duties” of businesses that collect personal information of California consumers. Both concepts were shaped in the 1970s and laid down in the 1980 OECD Privacy Principles. While they have been an integral part of most foreign privacy laws for decades, United States laws have, for the most part, stayed away from these restrictions, allowing enterprises to collect and retain large amounts of data so long as they disclosed these practices in their privacy notices.

Requiring data minimization and storage limitation paves the way for drastic changes to the framework in which personal data is collected and processed, and to the way businesses monetize personal information in the United States. These changes will require that businesses assess the nature and scope of their personal information collection and use practices, and balance those activities against their actual needs or legal obligations, to determine whether they can justify why certain information is needed or whether it is stored longer than necessary.

Posted in California, US Law
Comments Off on Meet the Upcoming California Privacy Rights Act (CPRA)

European Court of Justice Decision Creates Havoc in Global Digital Exchanges: One Shot Down, One Seriously Injured; 5,300 Stranded

Posted by fgilbert on July 16th, 2020

At long last, the European Court of Justice (EUCJ) has published its decision in the “Schrems 2” case. The EUCJ was tasked with reviewing the effectiveness of the mechanisms used in the context of cross-border data transfers. A key question was whether standard contractual clauses (SCC) used as a means of establishing “adequate protection” for personal data transferred out of the European Union or European Economic Area did in fact ensure the level of “adequate protection” defined in the EU General Data Protection Regulation and the European Charter of Fundamental Rights.

The decision, published on July 16, looked at both the EU-US Privacy Shield and the SCCs. It invalidated the Privacy Shield, thereby destroying the virtual bridge that allowed 5,378 US-based Shield self-certified organizations to conduct business with entities located in the European Union and European Economic Area. It preserved the SCC (controller-to-processor) ecosystem, but created significant challenges by imposing new constraints and obstacles on the countless organizations, located both in the US and abroad, engaged in global digital trade with their European partners.

The Basic Premise

The premise of the decision is that US national security, public interest, and law enforcement laws currently have primacy over the fundamental rights of persons whose personal data are transferred to the US. These laws do not take into account the principle of proportionality and are not limited to collecting only the data that is necessary. In addition, according to the EUCJ decision, US law does not grant data subjects actionable rights before the courts against US authorities.

EU-US Privacy Shield Invalidation

The EUCJ determined that the protection provided to personal data in the United States is inadequate to meet the level of protection of privacy and privacy rights guaranteed in the EU by the GDPR and the EU Charter of Fundamental rights.

According to the decision, the US surveillance programs are not limited to what is strictly necessary, and the United States does not grant data subjects actionable rights against US authorities. Further, the Ombudsperson program does not provide data subjects with any cause of action before a body that offers guarantees substantially equivalent to those required by EU law. Therefore, the EU-US Privacy Shield is no longer a valid legal instrument for the transfer of personal data from the EU to the US.

The immediate consequence of the invalidation of the EU-US Privacy Shield is that more than 5,000 US organizations, and their trading partners throughout the European Union and the European Economic Area, are left stranded with no way out. The invalidation declared by the EUCJ takes immediate effect. These transfers must cease. This is likely to prove a catastrophic hurdle for many companies already weakened by the Covid pandemic.

Standard Contractual Clauses

The Standard Contractual Clauses for the transfer of personal data to processors established in third countries remain valid. However, the Court found that, before a transfer of data may occur, there must be a prior assessment of the context of each individual transfer that evaluates the laws of the country where the recipient is based, the nature of the data to be transferred, the privacy risks to such data, and any additional safeguards adopted by the parties to ensure that the data will receive adequate protection, as defined under EU law. Further, the data importer is required to inform the data exporter of any inability to comply with the standard data protection clauses. If such protection is lacking, the parties are obligated to suspend the transfer or terminate the contract. Thus, while the SCC (controller-to-processor) remain valid, their continued validity is subject to an additional step: the obligation to conduct the equivalent of a data protection impact assessment to ensure that adequate protection is and will be provided and, subsequently, continuously monitored.

What’s Next?

  • Organizations that exchange or have access to personal data of residents of the EU or EEA should promptly assess the mechanisms currently in place to ensure the legality of their transfer of personal data outside the European Union.
  • If the organization has relied only on the EU-US Privacy Shield as a mechanism to ensure the legality of its personal data transfers, it should immediately halt the transfer of personal data out of the EU.  It should evaluate alternative means, most likely in the form of Standard Contractual Clauses.  For transfers that cannot be covered by SCCs, derogations under Article 49 of the GDPR might apply.
  • If the organization – whether located in the United States or anywhere else in the world – already has SCCs in place, the EUCJ decision adds a significant hurdle in the form of a requirement for a prior evaluation of the protection to be offered to individuals, and for ongoing monitoring.
  • As always, ensure that these decisions and analysis are adequately documented, and proper records kept.
  • Remember to ensure integration and consistency with existing documents such as the organization’s privacy policy or its records of processing activities.
  • Keep in mind that while the Privacy Shield is invalidated as a means to legalize cross-border data transfers, US organizations that have signed up with the Shield program remain responsible for continuing to protect previously collected data in accordance with the promises and representations made in their self-certifications.
  • Stay informed of the developments in the next few days. It is expected that EU/EEA member state data supervisory authorities will publish useful guidance on how to react to the decision.  Some have already published comments and provided guidance.

Proposed Principles for Artificial Intelligence Published by the White House

Posted by fgilbert on January 19th, 2020

A draft memorandum outlining a proposed Guidance on Regulation of Artificial Intelligence Applications (“Memorandum“) for agencies to follow when regulating, and taking non-regulatory actions affecting, artificial intelligence was published by the White House on January 7, 2020. The proposed document addresses the objectives identified in Executive Order 13859 on Maintaining American Leadership in Artificial Intelligence (“Executive Order 13859”), published by the White House in February 2019.

The Memorandum sets out policy considerations that should guide oversight of artificial intelligence (AI) applications developed and deployed outside the Federal government. It is intended to inform the development of regulatory and non-regulatory approaches regarding technologies and industrial sectors that are empowered or enabled by artificial intelligence and consider ways to reduce barriers to the development and adoption of AI technologies.

Principles for the Stewardship of AI Applications

The memorandum sets forth ten proposed principles:

  • Ensure public trust in AI
  • Public participation in all stages of rulemaking process
  • Scientific integrity and information quality
  • Consistent application of risk assessment and management
  • Maximizing benefits and evaluating risks and costs of not implementing
  • Flexibility to adapt to rapid changes
  • Ensure Fairness and non-discrimination in outcomes
  • Disclosure and transparency to ensure public trust
  • Promote safety and security
  • Interagency cooperation

Details on each of these principles are provided below.

  1. Public Trust in AI.

Government regulatory and non-regulatory approaches to AI should promote reliable, robust and trustworthy AI applications that contribute to public trust in AI.

  2. Public Participation.

Agencies should provide opportunities for the public to provide information and participate in all stages of the rulemaking process. To the extent practicable, agencies should inform the public and promote awareness and widespread availability of standards, as well as the creation of other informative documents.

  3. Scientific Integrity and Information Quality.

Agencies should hold information that is likely to have substantial influence on important public policy or private sector decisions governing the use of AI to a high standard of quality, transparency, and compliance. They should develop regulatory approaches to AI in a manner that informs policy decisions and fosters public trust in AI. Suggested best practices include: (a) transparently articulating strengths, weaknesses, and intended optimizations or outcomes; (b) bias mitigation; and (c) appropriate uses of the results of AI applications.

  4. Risk Assessment and Management.

The fourth principle cautions against an unduly conservative approach to risk management. It recommends the use of a risk-based approach to determine which risks are acceptable, and which risks present the possibility of unacceptable harm, or harm whose expected costs are greater than expected benefits. It also recommends that agencies be transparent about their evaluation of risks.

  5. Benefits and Costs.

The fifth principle provides that agencies should consider the full societal costs, benefits, and distributional effects before considering regulations related to the development and deployment of an AI application. Agencies should also consider critical dependencies when evaluating AI costs and benefits because data quality, changes in human processes, and other technological factors associated with AI implementation may alter the nature and magnitude of risks.

  6. Flexibility.

When developing regulatory and non-regulatory approaches, agencies should pursue performance-based and flexible approaches that can adapt to rapid changes and updates to AI applications. Agencies should also keep in mind international uses of AI.

  7. Fairness and Non-Discrimination.

Agencies should consider whether AI applications produce discriminatory outcomes as compared to existing processes, recognizing that AI has the potential of reducing present-day discrimination caused by human subjectivity.

  8. Disclosure and Transparency.

The eighth principle comments that transparency and disclosure may increase public trust and confidence. These disclosures may include identifying when AI is in use, for instance, if appropriate for addressing questions about how an application impacts human end-users. Further, agencies should carefully consider the sufficiency of existing or evolving legal, policy, and regulatory environments before contemplating additional measures for disclosure and transparency.

  9. Safety and Security.

Agencies are encouraged to promote the development of AI systems that are safe, secure, and operate as intended, and to encourage the consideration of safety and security issues throughout the AI design, development, deployment, and operation process. Particular attention should be paid to the controls in place to ensure the confidentiality, integrity, and availability of the information processed, stored, and transmitted by AI systems. Further, agencies should give additional consideration to methods for guaranteeing systemic resilience, and preventing bad actors from exploiting AI system weaknesses, cybersecurity risks posed by AI operation, and adversarial use of AI against a regulated entity’s AI technology.

  10. Interagency Cooperation.

Agencies should coordinate with each other to ensure consistency and predictability of AI-related policies that advance innovation and growth in AI, while appropriately protecting privacy, civil liberties, and allowing for sector- and application-specific approaches when appropriate.

Non-Regulatory Approaches to AI

The Memorandum recommends that an agency consider taking no action or considering non-regulatory approaches when it determines, after evaluating a particular AI application, that existing regulations are sufficient, or the benefits of a new regulation do not justify its costs. Examples of such non-regulatory approaches include: (a) sector-specific policy guidance or frameworks; (b) pilot programs and experiments; and (c) the development of voluntary consensus standards.

Reducing Barriers to the Development and Use of AI

The Memorandum points out that Executive Order 13859 on Maintaining American Leadership in Artificial Intelligence, instructs OMB to identify means to reduce barriers to the use of AI technologies in order to promote their innovative application while protecting civil liberties, privacy, American values, and United States economic and national security.  The Memorandum provides examples of actions that agencies can take, outside the rulemaking process, to create an environment that facilitates the use and acceptance of AI. One of the examples is agency participation in the development and use of voluntary consensus standards and conformity assessment activities.

Next Steps

The Memorandum points out that Executive Order 13859 requires that implementing agencies review their authorities relevant to AI applications and submit plans to OMB on achieving the goals outlined in the Memorandum within 180 days of the issuance of the final version of the Memorandum. In this respect, each agency plan will have to:

  • Identify any statutory authorities specifically governing agency regulation of AI applications;
  • Identify collections of AI-related information from regulated entities;
  • Describe any statutory restrictions on the collection or sharing of information (such as confidential business information, personally identifiable information, protected health information, law enforcement information, and classified or other national security information);
  • Report on the outcomes of stakeholder engagements that identify existing regulatory barriers to AI applications and high-priority AI applications; and
  • List and describe any planned or considered regulatory actions on AI.


This draft guidance defines a concrete structure for outlining regulatory and non-regulatory approaches to AI. Businesses should evaluate the extent to which their own AI strategies address the ten principles.

In addition, since the development of AI strategies is likely to have global consequences, they should also take into account similar initiatives that have been developed elsewhere around the world, such as by the OECD (with the “OECD Recommendation on Artificial Intelligence”), the European Commission (through its “Ethics Guidelines for Trustworthy Artificial Intelligence”) or at the country level, for example in France (with the “Algorithm and Artificial Intelligence: CNIL Report on Ethics Issues”).

Posted in Trends, US Law
Comments Off on Proposed Principles for Artificial Intelligence Published by the White House

Facebook : Record Settlement

Posted by fgilbert on July 12th, 2019

Facebook might be required to pay a $5 billion fine to the Federal Trade Commission to settle the investigation of the Cambridge Analytica data scandal, according to a report published by Bloomberg Law.

Customarily, these settlements are published for consultation, and become final several weeks or months later.

Posted in FTC, US Law
Comments Off on Facebook : Record Settlement

CCPA – California Consumer Privacy Act – A Primer

Posted by fgilbert on April 15th, 2019

The California Consumer Privacy Act of 2018 (CCPA), codified as Cal. Civ. Code §1798.100 et seq., is California’s current attempt at regulating the collection and use of personal information of California residents. The statute has numerous similarities with the GDPR – the EU General Data Protection Regulation – especially those provisions of the GDPR that define the rights of individuals.

CCPA grants California consumers the right to know what personal information about them is collected by a business, and how the business uses it. It also gives consumers the means to prevent the sale of their personal information to third parties. The statute becomes effective on January 1, 2020. Regulations are being drafted.  Enforcement actions may not be brought by the Attorney General until the earlier of (i) the publication of the final regulations or (ii) July 1, 2020.

CCPA has been the focus of much attention due to its far-reaching provisions. Within California, numerous bills have been introduced in attempts to amend it. Outside California, several states, such as Washington, are evaluating bills with similar goals. At the federal level, there is also significant activity: hearings are held regularly to evaluate the possibility of a federal data protection law that would supersede the California statute and address the patchwork of inconsistent state data protection laws derived from CCPA that might be adopted in the meantime.

For now, it is not clear whether a federal bill will have sufficient support to pass both houses and be signed by the President before the end of December 2019. If a federal law is not signed before the end of 2019, entities, worldwide, that collect personal information of California residents and meet the CCPA definition of a “business” must be prepared to post a privacy notice that meets the CCPA requirements, and have in place processes and procedures to respond to consumers’ requests for access to, copies of, or erasure of information about them, or requests to block the sale of their personal information by that business.

Who is Subject to CCPA?

CCPA protects all individuals who are California residents, whether they are interacting with a business in the context of the needs of their households, or as part of an employment relationship.

CCPA applies to “businesses.” A “business” is an entity that does business in the State of California, is organized or operated for profit, collects consumers’ personal information, determines the purposes and means of the processing of such information; and meets at least one of the following criteria:

  • Annual gross revenues in excess of twenty-five million dollars ($25,000,000);
  • Buys, sells, receives or shares for commercial purposes, the personal information of 50,000 or more consumers, households, or devices annually; or
  • Derives 50% or more of its annual revenues from the sale of personal information.

In addition, any entity that controls or is controlled by a business, as defined above, and that shares common branding with the business is also a “business” subject to CCPA.
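The baseline conditions plus "at least one of three thresholds" structure described above can be sketched as a simple predicate. This is an illustrative sketch only; the function name, parameters, and data model are assumptions, not statutory text, and a real applicability analysis is fact-specific.

```python
# Hypothetical sketch of the CCPA "business" definition described above.
# All names here are illustrative, not from any official source.

def is_ccpa_business(
    does_business_in_ca: bool,
    for_profit: bool,
    collects_and_determines_purposes: bool,
    annual_gross_revenue: float,
    consumers_households_devices: int,   # bought/sold/received/shared annually
    share_of_revenue_from_selling_pi: float,  # fraction between 0.0 and 1.0
) -> bool:
    """Return True if the entity meets the CCPA definition of a 'business'."""
    # All baseline conditions must hold...
    if not (does_business_in_ca and for_profit and collects_and_determines_purposes):
        return False
    # ...plus at least one of the three statutory thresholds.
    return (
        annual_gross_revenue > 25_000_000
        or consumers_households_devices >= 50_000
        or share_of_revenue_from_selling_pi >= 0.50
    )
```

Note that the thresholds are disjunctive: a small company that trades in data on 50,000 or more consumers, households, or devices is covered even if its revenue is far below $25 million.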

What Personal Information is Protected by CCPA?

CCPA applies to all forms of personal information (paper or digital). It defines “personal information” as information that identifies, relates to, describes, is capable of being associated with, or could reasonably be linked, directly or indirectly, with a particular consumer or household. The statute provides a detailed, non-exhaustive list of 11 categories of personal information, which include, among others, identifiers, customer records, commercial information, biometric information, online activity, geolocation data, biological data, professional information, education information, and inferences drawn from other information.

Medical information, financial information, credit information, driver’s license information, and information that is deidentified or aggregated are excluded to the extent that they are regulated under other laws.

Right of Access to Information

Consumers are granted the right to request a business to disclose the categories and specific pieces of personal information that it has collected. The business must be able to identify the categories of personal information that it collected about the consumer; categories of sources from which the personal information is collected; the business or commercial purpose for collecting or selling personal information; categories of personal information that the business sold or disclosed for a business purpose; and categories of third parties to whom the personal information was sold or disclosed.

In response to a consumer’s request for information, a business must promptly disclose and deliver the required information, by mail or electronically, and free of charge. It is not required to provide such information to a consumer more than twice in a 12-month period. It has 45 days to respond to a verified consumer request.
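The two timing rules above (a 45-day response deadline, and no obligation to respond more than twice in a 12-month period) lend themselves to a small sketch. The helper names and the rolling 365-day window used here are assumptions for illustration, not language from the statute.

```python
# Illustrative sketch of the CCPA access-request timing limits described above.
# Function names and the data model are hypothetical.
from datetime import date, timedelta

def response_deadline(request_date: date) -> date:
    """A business has 45 days to respond to a verified consumer request."""
    return request_date + timedelta(days=45)

def must_respond_again(prior_request_dates: list, new_request: date) -> bool:
    """A business need not respond more than twice in a 12-month period
    (modeled here as a rolling 365-day window, an assumption)."""
    window_start = new_request - timedelta(days=365)
    recent = [d for d in prior_request_dates if d >= window_start]
    return len(recent) < 2
```

For example, a verified request received on January 1, 2020 would be due by February 15, 2020 under this sketch.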

Consumer’s Right of Erasure

Consumers have the right to request the deletion of any personal information that the business has collected from the consumer (with exceptions).

Sale of Personal Information: Opt-Out / Opt-In Rights

CCPA allows businesses to sell personal information of individuals older than 16 years of age unless the individual has opted out of such sale. For children under 16, the sale is prohibited unless the child (if between 13 and 16) or the child’s parent or guardian (if the child is younger than 13) opts in to the sale. Consumers may authorize third parties to opt out on the consumer’s behalf.
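The age-tiered logic above can be expressed compactly. This is a hedged sketch: the function and parameter names are hypothetical, and the treatment of age 16 itself (placed in the opt-out tier here) is an assumption drawn from the "under 16" opt-in language.

```python
# Hypothetical sketch of CCPA's age-tiered opt-out / opt-in regime for the
# sale of personal information, as summarized above.

def sale_permitted(age: int, opted_out: bool = False,
                   child_opt_in: bool = False,
                   parental_opt_in: bool = False) -> bool:
    """Return whether a sale of the consumer's personal information is allowed."""
    if age >= 16:
        return not opted_out      # opt-out regime: sale allowed unless refused
    if age >= 13:
        return child_opt_in       # 13 to 15: the minor must affirmatively opt in
    return parental_opt_in        # under 13: a parent or guardian must opt in
```

The key design point is the inversion of the default: above the age threshold the sale is permitted until blocked, while below it the sale is blocked until permitted.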

Businesses must inform consumers that they have the “right to opt-out” of the sale of their personal information. A clear and conspicuous icon must be displayed on the business’s website or app homepage, titled “Do Not Sell My Personal Information.” The icon must be linked to a page that enables a consumer, or a person authorized by the consumer, to opt-out of the sale of the consumer’s personal information.

Discrimination Based on Exercise of Consumer Rights

Businesses are prohibited from discriminating against a consumer who has exercised any of the rights provided by CCPA. They may not deny goods or services to the consumer, charge different prices or rates for goods or services, or provide a different level or quality of goods or services. However, they are permitted to charge different prices or rates, or provide different levels or quality of goods or services if that difference is “reasonably related to the value provided to the consumer by the consumer’s data.”

Privacy Notice

Businesses that collect personal information must disclose, at or before the point of collection:

  • the categories of personal information to be collected and the purposes for which they will be used;
  • the categories of personal information that the business has collected in the preceding 12 months;
  • the categories of sources from which the personal information is collected;
  • the specific pieces of personal information that the business collects;
  • the categories of personal information that the business has sold;
  • the categories of personal information that the business has disclosed for a business purpose or, if the business has not sold or disclosed personal information for a business purpose, a statement to that effect;
  • the business or commercial purpose for the collection or sale; and
  • the categories of third parties with whom the business shares personal information.

In addition, the privacy notice must inform consumers of their right to know which information the business has collected, which information has been sold or disclosed, and that consumers have the right to request the deletion of their personal information.  The notice must be updated at least once every 12 months.

Interaction with Service Providers and Third Parties

Businesses that disclose personal information to a service provider or third party should ensure that they enter into written contracts that prohibit the recipient from selling the personal information and from retaining, using, or disclosing it other than for performing the services or business purpose outlined in the contract. They should also ensure that the recipient of the personal information understands these prohibitions. If they do so, and the service provider or third party nevertheless violates these restrictions, CCPA makes the service provider or third party liable for those violations and exempts the business from liability for activities that are contrary to its instructions.

Enforcement, Injunctions and Fines

Any business, service provider, or other person that is found to violate CCPA may face an injunction and a civil penalty of two thousand five hundred dollars ($2,500) for each violation, or seven thousand five hundred dollars ($7,500) for each intentional violation.

Consumers’ Private Right of Action in Case of Security Breaches

CCPA grants consumers the ability to institute a civil action for injunctive relief and damages in the event of a security breach that affects specified categories of personal information, such as social security number; driver’s license number; account number, credit or debit card number, in combination with access code; or medical information and health insurance information. The business must be able to prove that it has met its duty to implement and maintain “reasonable security procedures and practices appropriate to the nature of the information to protect the personal information” as required by California’s Civil Code Section 1798.81.5. Liquidated damages may reach up to seven hundred fifty dollars ($750) per consumer per incident.
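The per-unit figures quoted above (civil penalties of $2,500 per violation or $7,500 per intentional violation, and liquidated damages of up to $750 per consumer per incident) scale multiplicatively, which is what makes exposure large. A hypothetical back-of-the-envelope sketch, with invented helper names:

```python
# Illustrative arithmetic for the CCPA penalty and damages figures above.
# Helper names are hypothetical; actual amounts are set by courts.

def civil_penalty(violations: int, intentional_violations: int = 0) -> int:
    """$2,500 per violation; $7,500 per intentional violation."""
    return 2_500 * violations + 7_500 * intentional_violations

def max_statutory_damages(consumers_affected: int, incidents: int = 1) -> int:
    """Liquidated damages may reach up to $750 per consumer per incident."""
    return 750 * consumers_affected * incidents
```

Even a modest breach affecting 1,000 consumers could, at the statutory ceiling, expose a business to $750,000 in liquidated damages alone.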

Next Steps

While it is not clear at this time what the California or US privacy law landscape will look like by the end of 2019, it is certain that a consumer privacy law will govern at least a significant percentage of companies that do business with California residents. Those potentially affected entities should start evaluating their current data handling practices and, at a minimum, collect sufficient information to establish a data map of their activities related to personal information so that they can easily identify, with specificity, the categories of personal information that the business collects, the sources from which the personal information is collected, and the third parties with whom the business shares personal information. Businesses should also be able to identify whether they sell or share personal information with third parties, and for what purpose, as well as the recipients of this information.

CCPA grants California residents numerous rights. It is likely that the next privacy law applicable to California residents, whether CCPA or a federal law, will grant them “privacy rights.” These rights will allow individuals to request copies of personal information about them and, at times, its modification or erasure. Responding to these requests is frequently costly and time-consuming. Businesses that are within the jurisdiction of CCPA should start evaluating how they would address individuals’ access and other requests concerning personal information about them.


Posted in California, US Law
Comments Off on CCPA – California Consumer Privacy Act – A Primer

Social Networking App to pay $5.7 M Fine in COPPA Case

Posted by fgilbert on February 27th, 2019

On February 27, 2019, the operators of the video social networking app now known as TikTok agreed to pay a $5.7 million fine to settle allegations by the Federal Trade Commission that the company illegally collected personal information from children.[1] This amount is the largest ever obtained by the FTC in a children’s privacy case.
The app was widely used: since 2014, 65 million accounts have been registered in the United States and more than 200 million worldwide. The complaint[2] noted that the app’s operators were aware that a significant percentage of users were younger than 13 and had received thousands of complaints from parents that their children under 13 had created accounts.
In its complaint, the FTC alleged that the company violated COPPA and the COPPA Rule[3] by failing to notify parents about the collection and use of personal information from users under 13, failing to obtain parental consent before such collection and use, and failing to delete personal information at the request of parents.
To register, users had to provide first and last name, user name, a short biography, and a profile picture, as well as an email address and phone number. The app allowed users to create short videos lip-syncing to music and share those videos with other users.  It also allowed users to interact with other users by commenting on their videos and sending direct messages.
According to the complaint, user accounts were public by default; a child’s profile bio, username, picture, and videos could be seen by other users. While the site allowed users to change their default setting from public to private so that only approved users could follow them, users’ profile pictures and bios remained public, and users could still send them direct messages. The complaint noted that there had been public reports of adults trying to contact children through the app.
Among other things, the settlement includes a $5.7 million fine, an obligation to take offline all videos made by children under the age of 13, and an ongoing obligation to comply with COPPA.[4]

[1]Settlement Order available at
[2]Complaint available at:
[3]COPPA Rule available at:
[4]Settlement Order available at

Posted in Children, FTC, US Law
Comments Off on Social Networking App to pay $5.7 M Fine in COPPA Case

GDPR and Privacy Shield: Different Tools for Different Goals

Posted by fgilbert on November 26th, 2018

By Paola Zeni, Francoise Gilbert, Max Calehuff

Paola Zeni is the senior director of global privacy at Palo Alto Networks.

Francoise Gilbert is a shareholder in Greenberg Traurig LLP, where she focuses her practice on US and global data privacy and cybersecurity.

Maxwell Calehuff is an attorney in the Cybersecurity and Privacy Group of Greenberg Traurig.

US-based organizations are realizing that they must comply with the EU General Data Protection Regulation (GDPR) — even if they do not do business anywhere in Europe — because their practices include the collection or processing of personal data of individuals located in the European Union (EU) or the monitoring of their activities. Unlike its predecessor – Directive 95/46/EC, known as the EU Data Protection Directive – the GDPR was drafted to apply to many organizations established outside the EU, so that the protection follows the data when the data is moved or processed abroad.

GDPR Art. 3 is the key provision regarding the territorial reach of the GDPR. Under Article 3(1), the GDPR applies to the processing of personal data in the context of the activities of the establishment of an entity in the European Union. In practice, the protection extends as well to individuals located in Norway, Iceland, and Liechtenstein because, like most laws of the European Union, the GDPR is incorporated into the laws of these three countries; its scope thus covers the entire European Economic Area (EEA), which is comprised of the European Union and these three additional countries.

Article 3(2) extends the territorial scope of the GDPR beyond the EU/EEA borders. It states that the GDPR applies to the processing of personal data of individuals who are in the EU/EEA by a data controller or processor established outside the EU/EEA, when the processing is related to the offering of goods or services to such individuals, or the monitoring of their behavior. Article 3(2) reaches numerous US entities and requires them to comply with the entire GDPR.
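The two-pronged territorial test of Article 3, as summarized above, can be sketched as a predicate. This is a rough, non-authoritative illustration; the function and parameter names are invented, and a real applicability analysis turns on detailed facts.

```python
# Hedged sketch of GDPR Art. 3 territorial scope as summarized above.
# Names are hypothetical; this is not a legal determination.

def gdpr_applies(established_in_eea: bool,
                 data_subjects_in_eea: bool,
                 offers_goods_or_services: bool = False,
                 monitors_behavior: bool = False) -> bool:
    # Art. 3(1): processing in the context of an EEA establishment.
    if established_in_eea:
        return True
    # Art. 3(2): extraterritorial reach over non-EEA controllers/processors
    # targeting or monitoring individuals who are in the EU/EEA.
    return data_subjects_in_eea and (offers_goods_or_services or monitors_behavior)
```

Note how a US company with no EEA establishment still falls within scope the moment it offers goods or services to, or monitors, individuals who are in the EU/EEA.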

Some organizations assume that self-certifying their adherence to the EU-US Privacy Shield (Privacy Shield) is sufficient to address all 99 articles of the GDPR. This is incorrect. While the Privacy Shield and the GDPR overlap in some areas, the GDPR is much broader and contains many more requirements.

This article compares the Privacy Shield and the GDPR, to highlight commonalities, but also gaps that organizations need to address to achieve compliance under both frameworks.


The EU-US Privacy Shield framework, which relies on the Privacy Shield Principles and Supplemental Principles (collectively, Shield Principles), is a cross-border data transfer mechanism. It was developed in consultation between the US Department of Commerce and the European Commission and finalized in July 2016. It addresses the restrictions on the transfer of personal data outside the EU or EEA under Articles 44-50 of the GDPR (and, before that, Articles 25-26 of the EU Data Protection Directive 95/46/EC). These provisions require the data exporter to ensure that EU or EEA data subjects will continue to benefit from effective safeguards and protection after their data has been transferred outside the EU or EEA. This assurance can be provided through different means; the EU-US Privacy Shield framework is one of them.

The Privacy Shield framework was not drafted to meet the requirements of the GDPR or as an alternative to the GDPR. It was drafted separately from the GDPR and is not even mentioned in it. The Shield Principles address only a small aspect of the GDPR. The Shield is limited to providing a legal ground for the processing of EU or EEA data in the United States, and to establishing for EU or EEA individuals and regulators a means of reaching US-based organizations and initiating enforcement. It is a data transfer mechanism only. It also addresses some concerns regarding access by US national security agencies to EU or EEA data stored in the United States; this aspect of the Privacy Shield framework is not discussed here.

Common elements of the Privacy Shield Principles and GDPR

There are similarities and, at times, overlap between the Shield Principles and the GDPR. The latter is significantly broader, deeper, and more specific than the Shield Principles. In this section, we look at the seven basic Principles of the EU-US Privacy Shield and compare them with the equivalent provisions found in the GDPR.

1. Notice

The Notice Principle requires an organization, among other things, to inform individuals about its commitment to process all personal data received from the EEA in compliance with the Privacy Shield Principles and in reliance upon the Shield; the fact that the organization is subject to investigatory and enforcement powers of the Federal Trade Commission or the US
Department of Transportation; the requirement to disclose personal data in response to lawful requests; the possibility of invoking binding arbitration; how to contact the organization with
inquiries and complaints; and the independent dispute resolution body designated to address such complaints.

An organization must also inform individuals of the types of personal data collected, the
purposes for which it collects and uses personal data about them, the individuals’ rights to access their data, the choices and means the organization offers them to limit the use and dissemination of their personal data, the identity of third parties to which the data is disclosed, and the organization’s liability in cases involving transfer to third parties.

Most of these requirements are found in GDPR Art. 5(1)(a) [Lawfulness, Fairness, and Transparency] and GDPR Art. 5(1)(b) [Purpose Limitation], and further detailed in GDPR Art. 12 [Transparent Information] and Art. 13 and 14 [Information to be Provided], among others.

2. Choice

Under the Choice Principle, an organization must offer individuals the opportunity to opt out of having their personal data disclosed to a third party or used for a purpose materially different from the purpose for which it was originally collected. It is unnecessary to provide choice when the disclosure is made to a third party acting as an agent of the organization. However, the organization must enter into a contract with the agent.

For sensitive information (medical or health condition, information specifying the sex life of the individual, racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership), organizations must obtain the individual’s express affirmative consent before such information is disclosed to a third party or used for a purpose materially different from the purpose for which it was originally collected.

Most of these requirements are found in the GDPR, for example in Articles 6(4) [Lawfulness of the Processing], 7 [Conditions for Consent], and 9 [Special Categories of Data], as well as GDPR Article 5(1)(a) [Lawfulness, Fairness, and Transparency] and Article 5(1)(b) [Purpose Limitation].

The Choice Principle requires offering individuals the opportunity to opt out of the disclosure of their personal data to a third party, or the use of the data for a purpose materially different from the one originally announced. GDPR Art. 21 [Right to Object] grants individuals the right to object to the use of personal data for the legitimate interest of the data controller, and to the use of personal data for marketing purposes.

Notably missing from the Privacy Shield framework are the right of EU or EEA individuals not to be subjected to automated decision-making, including profiling, found in GDPR Art. 22(1), and the right to restrict the processing of their personal data, such as when it is contested or no longer needed, found in GDPR Art. 18(1).

3. Accountability for onward transfer

To transfer personal data to a third party acting as a data controller, organizations must comply with the Notice and Choice Principles and enter into a contract with that controller. The contract must specify that personal data may only be processed for limited and specified purposes consistent with the consent obtained from the individual. The contract must also specify that the recipient will provide the same level of protection as the Shield Principles, will notify the organization if it can no longer meet this obligation, and will take reasonable steps to remediate.

To transfer personal data to a third-party agent, organizations must transfer the personal data only for limited specified purposes, and ensure that the agent provides at least the level of protection required by the Shield Principles. They must take reasonable and appropriate steps to ensure that the agent effectively processes the personal data transferred in a manner consistent with the organization’s obligations under the Shield Principles. They must also require the agent
to notify the organization if it can no longer comply with the Principles, and must take reasonable steps to remediate unauthorized processing.

Under the GDPR, when a US-based data controller wishes to transmit data to a data processor located outside the EU or EEA, two sets of provisions apply: GDPR Art. 28 deals with the use of a processor. GDPR Art. 44 and 46 address the adequacy of the safeguards to be provided by the foreign entity; these provisions focus on cross-border data transfers and further transfers to third parties and are consistent with the Shield Onward Transfer Principle.

The comprehensive GDPR Art. 28 outlines in detail the required content of the contract between the controller and the processor. For example, the contract must stipulate that the processor may process the data only on documented instructions of the controller, must assist the controller in responding to data subjects’ exercise of their rights, must obtain the controller’s consent before engaging a subprocessor, and must notify the controller if the controller’s instructions would infringe applicable law.

4. Security

The Security Principle requires organizations that self-certify compliance with the Shield to take reasonable and appropriate measures to protect personal data from loss, misuse, unauthorized access, disclosure, alteration, or destruction. GDPR Art. 5(1)(f) [Integrity and Confidentiality] also requires organizations to ensure appropriate security of personal data. GDPR Art. 32 [Security of Processing] provides additional parameters for the identification and choice of security measures, including a number of specific security measures that organizations must undertake when handling personal data originating from the EU or EEA.

The Shield Principles do not deal with the impact of security breaches. While the Security Principle requires the use of appropriate measures to protect data from loss, misuse, unauthorized access, disclosure, alteration, or destruction, it does not address the potential effect of a security incident or require any form of notice to supervisory authorities or affected data subjects.

On the other hand, GDPR Articles 33 and 34 detail with great specificity the actions to be taken in the event of a data breach. Among those, the affected data controller must notify the supervisory authority or authorities within 72 hours of becoming aware of a personal data breach, unless the breach is unlikely to result in a risk to the rights and freedoms of individuals. They must also notify the affected individuals “without undue delay” if the breach is likely to result in a high risk to the rights and freedoms of those individuals.

Data processors who suffer a data breach must notify the controller without undue delay after becoming aware of the breach. Further, GDPR Art. 28(3)(c) and Art. 28(3)(f) flow down these requirements to processors and their own subprocessors.
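For teams operationalizing these requirements, the Art. 33(1) window is a hard deadline measured from the moment of awareness, not from the breach itself. A minimal sketch of that calculation (illustrative only, not legal advice; the function name is ours, and the separate “without undue delay” standard for notifying individuals cannot be reduced to a fixed clock):

```python
from datetime import datetime, timedelta, timezone

# GDPR Art. 33(1): notify the supervisory authority within 72 hours
# of the controller becoming aware of the breach.
ART_33_WINDOW = timedelta(hours=72)

def authority_notification_deadline(aware_at: datetime) -> datetime:
    """Latest time to notify the supervisory authority, measured from
    when the controller becomes aware of the personal data breach."""
    return aware_at + ART_33_WINDOW

# Example: controller becomes aware on 1 March 2024 at 09:00 UTC.
aware = datetime(2024, 3, 1, 9, 0, tzinfo=timezone.utc)
print(authority_notification_deadline(aware))  # 2024-03-04 09:00:00+00:00
```

Note that the 72 hours run continuously, including weekends; Art. 33(1) permits a late notification only if accompanied by reasons for the delay.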

5. Data Integrity, purpose, retention

The Shield Principles require that the collection of personal data be limited to what is relevant for the purposes of processing. An organization must take reasonable steps to ensure that personal data is reliable, accurate, complete, and current, and must retain the data in a form that
makes the individual identifiable only for as long as reasonably necessary to serve the purpose for which it has been collected and to which the individual has consented.

GDPR Art. 5(1)(b) [Purpose Limitation], GDPR Art. 5(1)(e) [Storage Limitation] and GDPR Art. 5(1)(f) [Integrity and Confidentiality] cover similar issues.

6. Access

The Access Principle grants individuals the ability to have access to personal data about them that an organization holds. They are also able to request the amendment or deletion of information that is inaccurate or was collected in violation of the Privacy Shield Principles.

The scope of individuals’ rights under the GDPR is much greater; it extends beyond the rights of access, correction, and deletion. Art. 20 provides the right to data portability, while Art. 21 [Right to Object] includes, for example, the right to object to certain uses of personal data and the right to object to the use of personal data for marketing purposes. GDPR Art. 22 [Automated Individual Decision-Making] grants the right not to be subject to a decision based solely on automated processing.

The right of erasure, under GDPR Art. 17, is also more complex and more nuanced. The Privacy Shield limits the right of deletion to situations where the data is inaccurate or was collected in violation of the Shield Principles. The GDPR right of erasure or “right to be forgotten” provides for the right to have data deleted when the individual withdraws consent on which the processing is based, if there are no other legal grounds for the processing. It also includes a provision for the deletion of data about children that has been collected in connection with the use of internet services.

7. Recourse, enforcement, and liability

Both the Shield Principles and the GDPR require organizations to have mechanisms in place for ensuring compliance with the applicable rules. In the Privacy Shield, the Recourse Principle requires the use of independent recourse mechanisms (such as the American Arbitration Association, or the Better Business Bureau). The mechanisms must be readily available at no cost to the individual. The recourse mechanism also must allow for the award of damages in
accordance with applicable law or the rules of the recourse mechanism. There must be follow-up procedures for verifying the accuracy of the assertions made by organizations about their data protection practices. Furthermore, organizations must respond promptly to requests from the Department of Commerce for information related to the Privacy Shield and to complaints referred by EU / EEA Member State supervisory authorities through the Department of Commerce.

In addition to the independent recourse mechanisms, violation of the Shield Principles, or misrepresentation as to compliance with them, may be subject to investigations by the Federal Trade Commission (FTC). When an organization becomes subject to an FTC or court order based on non-compliance, it must make public any relevant Privacy Shield-related sections of any compliance or assessment report submitted to the FTC, to the extent consistent with confidentiality requirements. The Recourse and Enforcement Principle allows affected individuals to bring their complaints directly within the purview of US-based enforcement
authorities, private or governmental, which might make enforcement easier, faster, and more effective. The Recourse and Enforcement Principle does not identify specific administrative fines. FTC consent decrees issued after investigations of non-compliance with the Shield Principles have included significant obligations, such as record keeping requirements for 20 years after the issuance of the order, which can present a significant financial burden, among other things.

GDPR Articles 77 to 84, on the other hand, provide extensive remedies and significant fines. Individuals have the right to lodge a complaint with a supervisory authority under GDPR Art. 77, and the right to a judicial remedy in the courts of the Member State where the individual resides, under GDPR Art. 79. Individuals can also mandate a nonprofit organization to lodge a complaint on their behalf, under GDPR Art. 80, and may receive compensation under GDPR Art. 82 [Right to Compensation]. Most important, GDPR Art. 83 [Administrative Fines] allows for the imposition of administrative fines that may reach €20 million or four percent of the total worldwide annual turnover of a global entity, whichever is higher.
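The Art. 83(5) cap is simply the greater of a fixed amount and a percentage of worldwide annual turnover. A sketch for illustration (the function name is ours; the actual fine is set case by case by the supervisory authority and is capped, not determined, by this formula):

```python
def art_83_5_fine_cap(worldwide_annual_turnover_eur: float) -> float:
    """Maximum administrative fine under GDPR Art. 83(5): the higher of
    EUR 20 million or 4% of total worldwide annual turnover."""
    return max(20_000_000.0, 0.04 * worldwide_annual_turnover_eur)

# A company with EUR 2 billion in worldwide turnover:
print(art_83_5_fine_cap(2_000_000_000))  # EUR 80 million (4% exceeds the EUR 20M floor)

# A company with EUR 100 million in turnover:
print(art_83_5_fine_cap(100_000_000))    # EUR 20 million (the fixed floor applies)
```

The lower tier of Art. 83(4) works the same way with €10 million and two percent.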

In the case of recourse and enforcement under the GDPR, it remains to be seen how EU or EEA authorities and courts will be able to assert jurisdiction or to enforce judgments, damages or fines over organizations located outside the EU or EEA. GDPR Art. 27 requires non-EU or EEA controllers and processors to appoint a representative located in the EU or EEA. The representative can be addressed in addition to, or instead of, the controller or processor by supervisory authorities and data subjects for ensuring compliance with the GDPR. GDPR Recital
80 indicates that the designated representative could be subject to enforcement proceedings in the event of non-compliance by the controller or processor.

At this time, there is little clarity on how enforcement proceedings could be conducted and what the potential outcome might be. Would the role of the representative be limited in most cases to that of an agent for receiving communications and providing responses, or could the representative become jointly and severally liable with the non-EEA entity? GDPR Art. 27 is silent, and so far no guidelines have been issued. In addition, it is not clear how a judgment rendered in the EU or EEA against an organization established abroad would be enforced against that foreign entity.

When addressing recourse and enforcement, the GDPR and the Privacy Shield adopt different routes and pertain to different subject matters. The Privacy Shield focuses on enforcement of violations of the Privacy Shield Principles in the United States, where the FTC is likely to have a significant role in stopping a US company from conducting non-compliant activities, and historically has been a tough enforcer.

GDPR focuses on enforcement in the EU or EEA, pertains to the entire GDPR, provides local government agencies with the ability to assess significant fines, and grants individuals a private right of action to seek damages. In the past, EU or EEA agencies have not been as aggressive as their US counterparts but the landscape is likely to change with the significant fines available under GDPR Art. 83.
It remains to be seen what will happen in practice, which of these avenues will be more frequently used in case of a dispute, what the outcome of enforcement action will be, and which mechanism will provide more effective enforcement or recourse for affected individuals or create more barriers or hurdles for organizations.

GDPR concepts that are not addressed in the Shield Principles

In the first part of this article, we showed that in six of the areas covered by the Shield Principles the GDPR takes a more comprehensive view and contains more stringent, detailed, and specific requirements. The seventh Shield Principle, Enforcement, differs significantly from the enforcement provisions of the GDPR. Given that enforcement of the Shield Principles has been limited to a handful of FTC actions, it is difficult to make a practical comparison between the
two enforcement mechanisms at this time.

When we move the analysis and the comparison to other areas, it becomes even clearer that a self-certification of adherence to the Shield Principles is insufficient to show compliance with all GDPR provisions that may be applicable to organizations. We provide several examples below:

1. Legal grounds for processing data

The Privacy Shield Notice and Choice Principles require organizations to disclose the purpose of collecting personal data and to obtain consent for certain activities, such as disclosure to third parties or use for a purpose materially different from the originally disclosed purpose. However, the Shield assumes, a priori, that the data have been legally collected or that consent was implied from the conduct of the parties.

GDPR Art. 6(1) requires that the collection and processing of personal data be lawful, and identifies only six grounds on which processing may legally be based. For example, processing is lawful if it is necessary for the performance of a contract to which the data subject is a party, or to comply with a legal obligation. Processing is also lawful if it is conducted for the legitimate interests of the controller or a third party, so long as these interests are not overridden by the fundamental rights and freedoms of the individual. In some cases, a data controller may have no choice other than seeking and obtaining the explicit consent of the individual (opt-in consent) to provide the required legal basis for the contemplated processing.

2. Obligations regarding data subject rights

In addition to providing extensive rights to individuals located in the EU or EEA, the GDPR imposes obligations on data controllers to facilitate the exercise of those rights. Controllers must provide individuals with information about their rights as data subjects and must facilitate the exercise of those rights electronically. Controllers must respond to a data subject’s request within one month, and provide information on actions taken or not taken in response to a request. In addition, data processors are contractually required to cooperate with the data controller to address such rights.

3. Data protection by design and default

GDPR Art. 25 [Data Protection by Design and by Default] requires data controllers to implement appropriate measures to embed the data protection principles into their processing. The processing must meet the GDPR principles and requirements, assure and protect the rights of the individual, and, by default, be limited to the personal data necessary for each specific purpose.

4. Documentation of processing and data protection impact assessment

GDPR Art. 30 [Record of Processing Activities] requires controllers and processors to keep electronic records of their processing activities, to be made available to supervisory authorities upon request. When processing activities are likely to result in a high risk for the rights and freedoms of individuals, GDPR Art. 35 [Data Protection Impact Assessment] requires data controllers to assess the impact of the envisaged processing on the protection of personal data. Both Articles 30 and 35 are likely to have a significant operational impact on organizations.


Even if a company does not do business in the European Union or the European Economic Area, it may be subject to the GDPR. Compliance with the GDPR requires significant effort, time, and financial investment.

The Privacy Shield Principles provide a simple, easy-to-use means for organizations to address their obligations under Chapter V, Articles 44-50 of the GDPR [Transfer of Personal Data to Third Countries or International Organizations]. However, the Shield serves only its original purpose: providing a means for US entities to show their commitment to protecting personal data originating in the EU or EEA when the processing is conducted in the United States, and to respond to complaints and enforcement actions that may be initiated in the EU or EEA and subsequently transmitted to US agencies. The Privacy Shield is not a data protection law or a comprehensive data protection compliance framework. It is a cross-border transfer mechanism.

As both the Privacy Shield and the GDPR are further explained and clarified, organizations should understand the narrow, limited, and specific role of the Privacy Shield, the significant gaps between the Privacy Shield and the GDPR, and that they cannot meet their obligations under GDPR solely through a self-certification of their commitment to observe the Privacy Shield principles.

Posted in Europe, US Law
Comments Off on GDPR and Privacy Shield: Different Tools for Different Goals

Privacy v. Data Protection. What is the Difference?

Posted by fgilbert on October 1st, 2014

I recently participated in a discussion about the difference between “privacy” and “data protection.” My response was “it depends.” It depends on the country. It may also depend on other factors.

When some countries use the term “privacy,” they may mean the same thing or refer to the same principles as what other countries identify as “data protection.” In other countries, “data protection” may be used to mean “information security” and to overlap only slightly with “privacy.” In this case, the term “data protection” may encompass more than just the protection of personal information (but only through security measures). It may cover as well the protection of confidential or valuable information, trade secrets, know-how, or similar information assets.

In the extensive research I conducted when writing my two-volume treatise, Global Privacy and Security Law, which provides an in-depth analysis of the laws of about 70 countries on all continents, I noticed that the use of the terms “privacy” and “data protection” varies from country to country. It may depend on the language spoken in that particular country. It may depend on the region where the country is located.

While in the United States the term “privacy” seems to prevail when identifying the rules and practices regarding the collection, use and processing of personal information, outside the United States, the term “data protection” tends to be more widely used than “privacy.” Among other things, this might be due to the idiosyncrasies of the languages spoken in the respective countries, as explained below.

— “Data Protection” Outside the United States

Throughout the world, “data protection” is frequently used to designate what American privacy professionals call “privacy”, i.e., the rules and practices regarding the handling of personal information or personal data, such as the concepts of notice, consent, choice, purpose, security, etc.


In Europe, “data protection” is a key term used, among other things, to designate the agencies or individuals supervising the handling of personal information. The 1995 EU Data Protection Directive identifies these agencies as “Data Protection Supervisory Authority.” See, e.g. 1995 EU Data Protection Directive, Article 28 defining the “Data Protection Supervisory Authority,” the agency that regulates and oversees the handling of personal data in an EU Member State. The individuals responsible for the handling of personal information within a company – a role similar to, but different from, that of the American Chief Privacy Officer – are designated as “Data Protection Official.” See, e.g. 1995 EU Data Protection Directive, Article 18(2) and Article 19.


Outside Europe, the term “data protection” is also frequently used to designate activities that Americans would describe as “privacy” centric. In Asia, for example, the laws of Malaysia, Singapore, and Taiwan are each named “Personal Data Protection Act.” The law of Japan is called “Act on the Protection of Personal Information.” South Korea’s laws, APICNU and the recent Personal Information Protection Act, also use the term “data protection.”


African countries also use the concept of “data protection” rather than “privacy.” South Africa named its new law “Protection of Personal Information Act.” Tunisia and Morocco also named their privacy laws “law relating to the protection of individuals with respect to the processing of personal data.”


In the Americas, Canada’s PIPEDA stands for Personal Information Protection and Electronic Documents Act. The new Mexican law is called “Ley Federal de Protección de Datos Personales.”

—  “Privacy” in Foreign Laws

On the other hand, the term “privacy” is seldom used to identify foreign laws or regimes dealing with the protection of personal information. There are, however, a few examples of the use of the term “privacy” outside the United States. APEC used the term “privacy” for its 2004 “APEC Privacy Framework.” The law of the Philippines is called the “Data Privacy Act.”

— Translations of “Privacy”

When analyzing which term is used to address the protection of personal data throughout the world, it is also important to keep in mind that the word “privacy” (as understood in the United States) does not exist in some languages.


It is very difficult to translate “privacy” into French. There is no such word in French, even though the French are highly private and very much concerned about the protection of their personal information. If you look for a translation, you will find that “privacy” is translated into French as “intimité,” which is inaccurate, or very narrow. The French “intimité” is actually equivalent to “intimacy” in English and has little to do with the US concept of “privacy” or “information privacy.” Indeed, the French law of 2004 does not refer to “intimacy” but is titled “Act relating to the protection of individuals with regard to the processing of personal data.”


There is a similar disconnect with the translation of “privacy” into Spanish, where “privacy” is translated as “privacidad,” which has a meaning closer to intimacy, remoteness, or isolation. Unsurprisingly, the Spanish law regarding data privacy is named the “Organic Law on the Protection of Personal Data.” The term “privacidad” is not used.


 — Data Protection as “Security”

On the other hand, in the US, the term “privacy” seems to prevail. We commonly refer to HIPAA or COPPA as “privacy laws.”

What about “data protection”? I have noticed that many US information security professionals tend to use the term “data protection” to mean protecting the security of information, i.e., the protection of the integrity and availability of data. In this case, they do not distinguish the protection of personal data from the protection of company data, because from a security standpoint the same tools may apply to both types of data. In other circles, the terms “information security,” “data security,” and “cybersecurity” are frequently used as well.

 — Online Searches

Finally, if you are based in the US, and you run an online search for “data protection”, you will see that the search results either provide links to “security” products (e.g. in my case, a link to McAfee Data Protection product that prevents data loss and leakage) or links to foreign laws dealing with what Americans call “privacy”, (e.g. in my case, a link to Guide to Data Protection from the UK Information Commissioner’s Office).

Posted in International, US Law
Comments Off on Privacy v. Data Protection. What is the Difference?

Verizon to pay $7.4 million to settle FCC privacy enforcement action

Posted by fgilbert on September 8th, 2014

The Enforcement Bureau of the Federal Communications Commission (FCC) reached a $7.4 million settlement with Verizon on September 3, 2014, after an investigation into the company’s use of customers’ personal information for marketing purposes. This $7.4 million payment is the largest in FCC history for settling an investigation related solely to the privacy of phone customers’ personal information.

Section 222 of the Communications Act, entitled “Privacy of Customer Information,” imposes a duty on every telecommunications carrier to protect the “proprietary information” of its customers. These obligations are further clarified in the FCC’s Customer Proprietary Network Information Rules (CPNI Rules).

Among other things, phone companies are prohibited from accessing or using certain personal information except in limited circumstances. To be able to use customers’ information for certain marketing purposes, phone companies must obtain the approval of their customers through an opt-in or an opt-out process. When that process is not working, the phone company must report the problem to the FCC within five business days.

The FCC investigation found that, beginning in 2006 and continuing for seven years thereafter, Verizon failed to notify approximately two million new customers, in their welcome letters or first invoices, of their privacy rights, including how to opt out of having their personal information used in marketing campaigns. Further, Verizon failed to discover this deficiency until September 2012, and failed to notify the FCC until January 2013, over four months later.

Verizon represented that it took remediation efforts following discovery of the problem, including sending opt-out notices, banning all marketing, and implementing a new program to place a CPNI opt-out notice on every invoice, each month, for all potentially affected customers (consumers and small and medium-size business customers).

In addition to the $7.4 million fine, to be paid to the US Treasury, Verizon will be required to improve its privacy practices, including, among other things, to:

  • Designate a senior corporate manager to serve as compliance manager responsible for implementing and administering Verizon’s compliance plan;
  • Notify all Verizon directors, officers, managers and employees of the terms of the consent order;
  • Establish operating procedures to ensure compliance with the consent order;
  • Develop and distribute a compliance manual regarding the handling of customer information;
  • Establish a compliance training program;
  • Notify customers of their opt-out rights on every bill;
  • Monitor and test its billing system and opt-out notice process on a monthly basis, to ensure that customers are receiving appropriate notices;
  • Report any detected problem to the FCC within 5 business days;
  • Report any non-compliance to the FCC within 30 calendar days.

Several of the compliance obligations listed above terminate three years after the date of the Consent Decree.
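Note that the reporting windows above mix business days (problem detection) and calendar days (non-compliance). A small sketch of the five-business-day calculation, skipping weekends (illustrative only; the helper name is ours, and federal holidays, which a real compliance calendar must also exclude, are ignored here):

```python
from datetime import date, timedelta

def add_business_days(start: date, days: int) -> date:
    """Return the date `days` business days after `start`,
    counting Monday through Friday and skipping weekends
    (federal holidays are ignored in this sketch)."""
    current = start
    remaining = days
    while remaining > 0:
        current += timedelta(days=1)
        if current.weekday() < 5:  # 0 = Monday .. 4 = Friday
            remaining -= 1
    return current

# A problem detected on Thursday, September 4, 2014 must be
# reported to the FCC within 5 business days:
print(add_business_days(date(2014, 9, 4), 5))  # 2014-09-11
```

The 30-calendar-day window for reporting non-compliance, by contrast, is a plain `timedelta(days=30)` addition.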

The Federal Trade Commission is only one of the federal agencies charged with the protection of personal information. Several other agencies have sectoral responsibilities as well. As discussed above, Section 222 of the Communications Act and the related CPNI Rules contain important provisions regarding the privacy of the personal information of phone users. These provisions are enforced by the Federal Communications Commission.


Posted in FCC, US Law
Comments Off on Verizon to pay $7.4 million to settle FCC privacy enforcement action