
Cybersecurity Strategy Advice for the Trump Administration: US-EU Relations

April 14, 2017


Alexander Kegel

Feature Series

Cybersecurity and the Trump Administration Series

Regional Recommendations for U.S. Cybersecurity Policy in the World

Central Challenge

The flow of data between the United States and the European Union has always been an essential element of international trade and transatlantic cooperation. Yet differing philosophies about how individuals access and use the Internet create barriers to trade.


  1. Adopt the Right to be Forgotten in the US.
  2. If the Right to be Forgotten is not adopted, instead clarify standards for information retention.
  3. Work to create law that can adapt to EU regulations and changing data realities.

With trade between the European Union (EU) and the United States (US) accounting for over 50 percent of world GDP and employing up to 15 million workers, it is important to eliminate policy differences that create barriers to trade, especially in digital services and access. In 2012, the US exported $140.6 billion worth of digitally deliverable services to the EU and imported $86.3 billion worth; the total rose to $260 billion in 2016. A European Court of Justice ruling in 2015 and the resulting Privacy Shield temporarily resolved transatlantic disputes over digital data flows. Another European Court of Justice ruling, on the “Right to be Forgotten,” remains a major area of contention between the US and the EU with regard to Internet governance and data security. The Court’s ruling enforced the right to edit and erase information in search results across Internet search engines used in the EU.

In dealing with this issue, we must understand that, first and foremost, we live in an increasingly digital world. While it is harder and harder to disconnect, regulations governing access to this digital world have been slow to develop mirrors of their physical counterparts. Unlike the physical world, our digital world is tailored. Based on an individual’s interests and, more importantly, her actions in the digital world, she receives unique, personalized search results and even ads from search engines such as Google. An individual who regularly visits the University of Washington’s webpage is more likely to see that webpage appear near the top of a list of results for “UW” than the webpage for the University of Wisconsin–Madison. However, if an individual searches for herself and finds incorrect information online, there is currently no internationally respected method for rectifying the mistake. Although the United States currently regulates the protection of copyright and intimate information online, such as credit card numbers or email passwords, that protection does not extend as far as the Right to be Forgotten, a policy in the EU made to address the issue of incorrect or out-of-date information online.

The Right to be Forgotten

In 2010, Mario Costeja González, a Spanish citizen, invoked the Right to be Forgotten in a case against both a Spanish newspaper and Google, as the Internet search “data controller,” citing irrelevant search results linked to his name. In 1998, the newspaper had published a story about real estate auctions held to repay González’s social security debt and reacquire his repossessed house. For González, who had resolved this debt more than a decade earlier and now owned the house, the search results linking to the online version of the story continued to misrepresent him as an economically unstable individual. González brought his case to remove these now-irrelevant search results before the Agencia Española de Protección de Datos (AEPD), the Spanish national data protection agency. While the case against the newspaper was dismissed and the article remained online, the case against Google was referred to the European Court of Justice, the highest court in the European Union, citing a conflict with the European Union’s 1995 Data Protection Directive.

The European Union’s 1995 Data Protection Directive already supports and protects half of the principle underlying the Right to be Forgotten – the Right to Erasure. Article 12 of the Directive states that individuals have the right to request from the controller of search data (whether Google, Bing, Yahoo, or others) the “rectification, erasure or blocking” of such data, in particular because of the “incomplete or inaccurate nature of the data.” The principle derives from the French droit à l’oubli – “the Right of Oblivion” – under which a convicted criminal who has served his time is considered rehabilitated and can request that the facts of his conviction and incarceration be removed from the public eye. Under this law, citizens of the European Union are given the unique ability to edit and rectify errors in their online presence across search engines within the European Union, a degree of control that American users lack.

What was not included, until recently, in European law was the Right to De-list, the second half of the Right to be Forgotten. Here, the discussion about de-listing search results revolves around three questions: 1) Does the 1995 Data Protection Directive apply to search engines? 2) Does EU law apply to Google Spain given that the company’s data processing server is in the United States? and 3) Does an individual have the right to request that their personal data be removed from the search results regardless of who uploaded the data?

In its May 13, 2014 ruling on Google v. Spain, the European Court of Justice confirmed that the 1995 Data Protection Directive does apply to search engines, thus requiring the data controller, such as Google or Yahoo, to take responsibility for erasure or rectification. Furthermore, European Union data protection laws apply to any search engine with a branch in a European Union member state that “promotes the selling of advertising space offered by the search engine.” In effect, search engines must edit the results on all European versions of their services, regardless of the nationality of the company or the location of its data centers. American search engines, such as Google and Microsoft’s Bing, must apply European Union law to their search results as long as they continue to make money from European citizens for the services provided.

Finally, the Court of Justice decided that individuals have the right to request the de-listing of links in which the information is inaccurate, inadequate, irrelevant, or excessive. This includes, but is not limited to, old financial issues, juvenile misdemeanors, and the online actions of underage individuals. While this ruling does require search engines to de-link the material, the court does not require them to remove the de-listed material from the search index; it must remain accessible when individuals search under any terms unrelated to the name of the individual who requested the de-linking.
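The court’s distinction, suppressing a link only for queries based on the requester’s name while leaving the index itself intact, can be illustrated with a toy sketch. All names, URLs, and data structures here are hypothetical; real search engines implement this at vastly greater scale:

```python
# Toy model of EU-style de-listing: a result is suppressed only when
# the query mentions the person who requested de-listing. The de-listed
# URL stays in the index and remains reachable via unrelated queries.
# All names and URLs below are hypothetical examples.

delisting_requests = {
    # requester name (lowercase) -> URLs de-listed for name-based queries
    "mario costeja gonzalez": {"example-paper.es/auctions-1998"},
}

def filter_results(query: str, results: list[str]) -> list[str]:
    """Drop de-listed URLs, but only for queries naming the requester."""
    q = query.lower()
    suppressed = set()
    for name, urls in delisting_requests.items():
        if name in q:  # the query is "name-based" for this requester
            suppressed |= urls
    return [url for url in results if url not in suppressed]

index = ["example-paper.es/auctions-1998", "example-news.es/other-story"]

# Name-based query: the de-listed link is suppressed.
print(filter_results("Mario Costeja Gonzalez debts", index))
# Unrelated query: the same link remains accessible.
print(filter_results("1998 real estate auctions", index))
```

The design point the sketch captures is that de-listing edits the mapping from one person’s name to a result, not the underlying content or the index.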

In the same ruling, the Court of Justice clarified that the Right to be Forgotten is not absolute and does not cover information posted on social media platforms. Facebook posts, tweets, and other forms of social media posts are shielded from the Right to be Forgotten by the “household exception” in EU jurisprudence, under which the user who publishes the material, rather than the search engine, is the “data controller.” In the case of Facebook, in order to rectify or erase information that others post about an individual, said individual must leave the service entirely. Having recognized the Right to Erasure since the 1995 Data Protection Directive, the European Union grants individuals the right to request that all data pertaining to them be erased once they leave a social media service. In April 2016, the European Union member states voted to adopt the General Data Protection Regulation, set to go into effect in 2018, which seeks to strengthen and clarify the Right to be Forgotten in light of Google v. Spain and other cybersecurity decisions.

The Right to be Forgotten has been retained in this latest regulation and Article 17 reads:

“The data subject shall have the right to obtain from the controller the erasure of personal data concerning him or her without undue delay and the controller shall have the obligation to erase personal data without undue delay where … the personal data are no longer necessary in relation to the purpose for which they were collected or otherwise processed.”

In other words, if an individual can prove that information presented is no longer accurate or does not represent his current situation, he may request that Google, or any other data controller, remove such information. Unlike the previous iterations of this law, Article 17 specifically refers to the Right to Erasure and will require the search material to be removed rather than simply de-linked. However, Article 17 also clarifies that “[The right to erasure] shall not apply to the extent that processing is necessary for exercising the right of freedom of expression and information; for reasons of public interest in the area of public health; for the establishment, exercise or defense of legal claims.” Therefore, if the data controller can prove to a court that the information is important to public health, an ongoing court case or public safety, the individual’s request to delink or erase the data can be deemed invalid.
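Article 17’s structure, erasure by default on valid grounds but with enumerated exceptions, amounts to a simple decision rule. A minimal sketch follows; the grounds and exceptions are paraphrased from the regulation text quoted above, and the function itself is purely illustrative, not an actual compliance procedure:

```python
# Illustrative decision rule for an Article 17 erasure request.
# Grounds and exceptions are paraphrased from the GDPR; a real request
# is adjudicated by the controller and, on appeal, by courts.

ERASURE_GROUNDS = {"no_longer_necessary", "consent_withdrawn", "unlawful_processing"}
EXCEPTIONS = {"freedom_of_expression", "public_health", "legal_claims"}

def evaluate_request(grounds: set[str], controller_defenses: set[str]) -> bool:
    """Return True if the controller must erase the data."""
    if not grounds & ERASURE_GROUNDS:
        return False  # no valid ground for erasure was asserted
    if controller_defenses & EXCEPTIONS:
        return False  # a recognized exception overrides the request
    return True

print(evaluate_request({"no_longer_necessary"}, set()))             # must erase
print(evaluate_request({"no_longer_necessary"}, {"legal_claims"}))  # may retain
```

The key difference from the 2014 ruling is visible in the rule itself: a successful request under Article 17 results in erasure, not mere de-linking.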

Recommendation 1: Adopt the Right to be Forgotten in the US

Opinions on the Right to Remove "Irrelevant" Information from Search Results


Luciano Floridi was appointed by Google as an ethics advisor on the Right to be Forgotten ruling. He worries that the current debate over the ruling “may be exploited to fight proxy wars across the Atlantic, between different schools of political and economic thought.” While opinions on rights and values differ across the Atlantic on many economic and political matters, the general European and American publics broadly agree that the Right to be Forgotten is a good idea.

Opinions on Whether Search Results Can Be Damaging, Misleading


As the graphs above, taken from a 2014 study, indicate, when Americans were asked whether irrelevant search results about individuals should be removed upon the individual’s application, 61% favored some degree of removal. Only 18% opposed removal, and did so because the information is a matter of public record rather than because of difficulty defining what is relevant. Particularly interesting is that only 19% of the same respondents somewhat or completely agreed that search results could be damaging or misleading, meaning that even some of those who considered search results harmless still wished to have some information forgotten.

Recommendation 2: If the Right to be Forgotten is not adopted, instead clarify standards for information retention

With the removal of information from the public eye a prominent concern for its critics, some have claimed that the Right to be Forgotten is a form of censorship. However, a similar process is already used to remove bank account numbers, social security numbers, and revenge porn; such removals target sensitive personal data, not the exercise of free expression.

During her time as Vice-President of the European Commission for Justice, Fundamental Rights, and Citizenship and as European Commissioner for Information Society and Media, Viviane Reding claimed that the Right to be Forgotten is no harder to enforce than copyright law. Google currently handles millions of requests to remove content that infringes copyright, includes personal legal information, or is of an intimate nature not authorized by one of the individuals involved. Google receives 23 million such requests each year, roughly 63,000 per day. By contrast, as of February 2016, Google had received only 386,038 total removal requests under the Right to be Forgotten since the right emerged from the 2010 Spanish case.

Of these 386,038 requests, Google has authorized and removed 42%, which demonstrates that the Right to be Forgotten is not granted automatically. As both Article 12 of the Data Protection Directive and Article 17 of the General Data Protection Regulation state, should the data controller decide that the information is still relevant or accurate, it may reject the individual’s claim in accordance with European Commission law. Within the member states of the European Union, Viviane Reding insists, European Union law must apply “not just [to] EU companies but all those who use our internal market as a goldmine,” as the right to privacy is considered greater than any economic interest of data controllers.

In other words, clarify the standards for what information is necessary to keep in the public eye. The key is necessity. Inaccurate data aside, the Right to be Forgotten cannot be applied absolutely: the right to privacy can only extend so far before the right to information must be defended. The histories of political figures, for example, must come with their own rules regarding privacy, since their public and private worlds are so intertwined and so relevant to their competence. A final clarification must address social media and those who live online and wish to delineate their digital and physical selves. In the end, the European Union and its citizens have decided to implement this law in their region, and it is in the best interest of our government and of American companies operating in the European Union that we help define, recognize, and apply these standards to all data within the European Union.

Recommendation 3: Work to create law that can adapt to EU regulations and changing data realities

With cybersecurity policy constantly adapting and expanding to cover topics that would not have been a concern even five years ago, those who create, control, and disseminate data need to be prepared to adapt as well. It is not appropriate for data controllers to disseminate inaccurate or unnecessary data simply because doing the least possible editing serves their own interests. With all the information people store in the cloud, post on social media, and send via email, the Internet is determined to remember everything. In light of all this data, people should have the option to edit certain parts of their digital history.

Ask Luciano Floridi and even he will tell you that “It is trivial to remark that today we save by default and erase by choice [and] yet our memory is also very forgetful: inaccessible like your floppy disks, rewritable like your web page, fragile like your malware-prone laptop, limited like the Gigabytes in your smartphone, editable like your social media platform.” No two individuals have the same search results anymore; they have been tailored and edited as much as the information posted on a Facebook page. Governments and data controllers can take certain actions to authorize and manage the amount of tampering while still ensuring that the relevant information comes to the surface. Above all, the European Union, as the representative of its populace, and the United States government, as the representative of the world’s major data controller, Google, can work to ensure that the judicial figures who preside over international data law are as competent and well-informed on the issue as they can be.


61% of Americans believe that a Right to be Forgotten law is worth considering for the US. With the trends of cybersecurity as they are, this discussion will surely happen in the United States within the next five years and we can be prepared if we help support and apply such standards in the European Union.

This publication was made possible in part by a grant from Carnegie Corporation of New York. The statements made and views expressed are solely the responsibility of the author.