Google Plans 'Forgotten' Results Flag

Search Giant Seeks 'Right to be Forgotten' Ruling Compliance

Expect Google's search engine results to soon flag entries that have been removed in response to a "right to be forgotten" request.

To comply with a recent European high court ruling concerning data protection rules and the so-called "right to be forgotten," the search giant also plans to update its biannual anti-censorship transparency report, which details the number of requests governments worldwide have made to Google to excise specific information from search engine results, so that it also reflects the number of right-to-be-forgotten requests with which it's complied, the Guardian first reported.

A Google spokesman declined to comment on the accuracy of the Guardian report, or on whether the "forgotten" alerts would be displayed only in search results returned to users in the EU's 28 member states.

Google's moves come in the wake of the May 13 Court of Justice of the European Union ruling that search engines must comply with the EU's rules for data controllers, as defined by the 1995 EU Data Protection Directive, which includes protection for some types of personal information.

The ruling states the "operator of the search engine ... is, in certain circumstances, obliged to remove links to Web pages that are published by third parties and contain information relating to a person from the list of results displayed following a search made on the basis of that person's name."

The court's ruling on the right to be forgotten stems from a case involving a man in Spain who argued that Google's search results disclosed details about the auction of his repossessed home over unpaid debts. "[The man] stated that the proceedings concerning him had been fully resolved for a number of years and that reference to them was now entirely irrelevant," the ruling states.

Rapid Uptake

Two weeks after the ruling, Google posted a Remove Information From Google page for Europeans to request takedowns. As of June 2, 2014 - four days after the page was created - the company reported receiving 41,000 takedown requests, and said they were continuing to pour in at a rate of 10,000 per day.

While that number might seem high, EU Justice Commissioner Viviane Reding has contrasted it with the millions of copyright-related takedown requests Google receives. "This is a small thing as compared to the copyright things," she said last week on BBC Radio 5 Live. "It is possible to handle the copyright question, so it should also be possible to handle the takedown requests on personal data questions."

Reding also emphasized that the court ruling was only holding search engines accountable to the existing 1995 data protection law. "This decision has been taken in 1995, and we have a ... European law that is applied in all member states since that moment, and the only ones who refused to apply European law on European territory were some American companies - and it took the European Court of Justice to remind those companies that they are not over and above the law, but they have to apply the law like everyone else in Europe does."

Time to Adjust

In the wake of the ruling, EU data regulators have promised to give search engine providers time to figure out their approach. "We won't be ruling on any complaints until the search providers have had a reasonable time to put their systems in place and start considering requests," says David Smith, the UK's data protection director, in a blog post. "After that, we'll be focusing on concerns linked to clear evidence of damage and distress to individuals."

At the same time, however, search engine companies and data commissioners are still trying to define what compliance with the ruling might look like.

After the ruling, the Article 29 Data Protection Working Party, which comprises EU member states' national data protection authorities, issued a statement clarifying that when it comes to forgetting, the court was only referring to search engine results, and not original content. "The ECJ concluded that Web users have the right to directly request from the search engine the deletion of the links to Web pages containing information breaching their rights under the directive, even if the publication of the information on the Web pages in question is lawful in itself," it said.

Similarly, Smith said the right to be forgotten didn't mean people could automatically excise any online information they wanted, simply by filling out a form. "There is no absolute right to have links removed," Smith said. "Also, the original publication and the search engine are considered separately: the public record of a newspaper may not be deleted even if the link to it from a search website is removed."

Define "Forgetting"

According to the court ruling on search engines, the right to be forgotten "applies where the information is inaccurate, inadequate, irrelevant or excessive," but it said that right was "not absolute," and must always "be balanced against other fundamental rights, such as the freedom of expression and of the media."

When pressed in the BBC radio interview for details about how search engines should evaluate takedown requests, however, EU Justice Commissioner Reding couldn't offer simple guidance, saying instead that such requests must be decided on a "case by case" basis. "It is not about wiping out history, it is not about taking away what has been written in the newspapers, or taking away the newspapers' data," she said. "But it is about not putting in a prominent position data that is irrelevant or inaccurate."

In other words, the right to be forgotten might more accurately be described as a right to suppress private information that appears in search results. Untangling those nuances might remind people that the right to be forgotten is just one translation of the original French "droit à l'oubli," which, as Google Chief Privacy Officer Peter Fleischer says, could also be translated as the "right to delete" or even the "right to oblivion."

But per what's dubbed the Streisand Effect, attempting to mask or delete information online may incite people to publicize it instead.

Google vs. Anonymity

When it comes to forgetting, some U.S. privacy experts have accused the EU of offering little - if any - guidance about what that looks like in practice. "Exactly what is that responsibility?" asks Jennifer Granick, director of civil liberties at the Stanford Center for Internet and Society, via Twitter. "Very hard [question] and EU punts, forces Google to muddle out an answer."

As befits what originated as a French concept, the ruling has also sparked an existential debate about the extent to which people are defined by what's online. For example, Google's Fleischer has reacted to the court's ruling by predicting the rush of people who would soon attempt "to delete and edit their life histories, or at least the public fiction of their life histories."

From a business standpoint, however, Google has a lot invested in that public fiction. In recent years, the company has made it more difficult to post anonymously to Google+ or YouTube.

But EU Justice Commissioner Reding said that Google has no inalienable right to that information. "The European law is the 1995 legislation on protection of personal data, because we do have a basic right in Europe that data belongs to the individual, it does not belong to a company," she told BBC Radio 5 Live.

Right-To-Know Concerns

But some U.S. legal experts have also accused the EU of prioritizing privacy rights over freedom of speech. "I feel bad for Google, which is stuck trying to administer this preposterous ruling," says Stewart Baker, a Washington-based attorney with the firm Steptoe & Johnson. He formerly led the policy directorate at the U.S. Department of Homeland Security, where he negotiated a number of privacy and personal-data-related issues with European governments.

Baker has launched a contest to see who can make "the most outrageous - and successful - takedown request."

Others, meanwhile, have countered that Google has no inalienable right to telegraph people's personal information in its search results. "You can debate whether this is a good idea or not," says Trend Micro CTO Raimund Genes in a blog post. "Europeans like myself tend to think this is a good idea - after all, who else should control [your data] but you, right? Americans tend to look at it as a free speech issue. There is a cultural divide here that will not be easy to resolve."


About the Author

Mathew J. Schwartz

Executive Editor, DataBreachToday & Europe, ISMG

Schwartz is an award-winning journalist with two decades of experience in magazines, newspapers and electronic media. He has covered the information security and privacy sector throughout his career. Before joining Information Security Media Group in 2014, where he now serves as executive editor of DataBreachToday and oversees European news coverage, Schwartz was the information security beat reporter for InformationWeek and a frequent contributor to DarkReading, among other publications. He lives in Scotland.



