John Carr on the GDPR: Poor process, bad outcomes

In this article, we hear from John Carr OBE on how the provisions of the new General Data Protection Regulation will affect children and young people.

Date: 2016-03-31 | Author: John Carr OBE
To recap
In January 2012, the European Commission published a consultation document setting out a proposal for a General Data Protection Regulation (GDPR). This followed a ‘pre-consultation consultation’ which started in 2009. The GDPR was a long time coming, which made what happened to children and young people at the 59th minute of the 11th hour all the more surprising and disappointing.
 
The Commission told the world the GDPR is a vital building block in a larger, strategic plan to develop an EU-wide Digital Single Market (DSM). Whichever way you look at it, there is no doubt the GDPR is a monumental legislative achievement of the highest importance, yet in respect of children's rights and online safety it is seriously flawed.
 
The final text was adopted by the LIBE Committee on 17 December 2015. It still has to be ratified; this is expected in May, although an amendment under discussion might put that date back by a month or two.
 
On behalf of the European NGO Alliance for Child Safety Online (eNACSO), I spent a couple of days in Brussels talking to people who had been closely involved in writing the GDPR. Here I report on what I learned from those conversations.
 
It is a shocking story.
 
Substantial, inexcusable and unacceptable
There are positive features in the GDPR which will benefit children and young people, the right to be forgotten probably being the best known, but at the same time the GDPR's shortcomings are substantial, inexcusable and unacceptable. The EU constantly tells us it takes children's and young people's issues seriously. This episode paints an entirely different picture.
 
The two problems
  • The GDPR completely fails to address the fact that millions upon millions of children across Europe, including very young children, have become and remain members or users of social media sites and other online services which are not meant for them.
  • Article 8 of the GDPR makes 16 the default minimum age at which a young person can decide for themselves whether or not to join online services such as Facebook. Up to that age, parents will have to give permission. This limitation breaches Articles 12 and 13 of the UN Convention on the Rights of the Child (UNCRC).
Article 12 speaks of States' obligations to guarantee a child who is capable of forming his or her own views the right to express those views freely in all matters affecting the child.
 
Article 13 uses similar language: ‘The child shall have the right to freedom of expression; this right shall include freedom to seek, receive and impart information and ideas of all kinds, regardless of frontiers, either orally, in writing or in print, in the form of art, or through any other media of the child's choice.’
 
Note the absence of any reference to an age limit or the need for parental consent.
 
How has this tangle arisen?
Part of the answer to that question lies in the opaque, calculated and, in the end, extremely hurried process that was followed. Low-rent politics drove decisions. Evidence and principles played no part.
 
The EU hands over policy-making to the USA?
In relation to the minimum age at which a young person can decide for themselves whether or not to join an online service, the Commission originally proposed 13. However, no impact assessment was carried out to explain or justify that choice. In the wider study published by the Commission when the draft GDPR was released, it merely said (at page 67) that the choice of 13 took:
 
"…..inspiration…..from the current US Children Online Data Protection Act (sic) of 1998 and are not expected to impose undue and unrealistic burden (sic) upon providers of online services."
(bold added by the author of this piece for emphasis).
 
The US law in question is actually called the Children's Online Privacy Protection Act (COPPA). It was intended to protect children (for these purposes defined as persons below the age of 13) from commercial exploitation. This Federal law in effect made 13 the minimum age for every US-based company (e.g. Facebook and Instagram), which explains why 13 had become a de facto standard in a great many countries. From that point of view it is not hard to imagine why Commission officials thought 13 would get an easy passage. It turns out they got that very wrong.
 
Nevertheless, we have to ask whether the Commission routinely sub-contracts its policy-making functions to Washington DC. COPPA predates the social media explosion that emerged from the development of Web 2.0 and is widely acknowledged to be hopelessly out of date and ineffective in a number of key ways.
 
Whatever the reason, the Commission's failure to carry out their own, independent impact assessment in respect of 13 was an egregious error. But it was not the result of carelessness or a lack of resources. It was part of a deliberate strategy.
 
More than merely ironic
As we shall see, the lack of a proper impact assessment was going to have considerable consequences. This is more than merely ironic because, in the GDPR itself, Article 33 expressly requires everyone else to carry out a data protection impact assessment which takes into account the nature, scope, context and purposes of any proposed data processing where that processing is likely to result in a high risk to the rights and freedoms of individuals. Children and young people are individuals. And they have rights.
 
Refusing to face the issue
How do we explain the decision not to undertake an impact assessment? It seems that, right from the beginning, everyone on the inside track in the Commission, and others elsewhere, anticipated that the question of age was going to be tricky.
 
Remember, the current (1995) Directive had been silent on the point. This time around there was some support for maintaining that silence, but a majority finally accepted it was now untenable.
 
The Commission's reason for not doing an impact assessment was therefore simple: an assessment would only draw attention to the problem, so why do one?
 
Rather than face any national or other sensitivities around age and debate them openly (there could even be a question about whether the EU had competence in this space), the plan was to finesse (read ‘manipulate’) the process. Officials believed they could pull it off and end up with what they wanted: 13. It's not hard to imagine the conversation:
 
“The age business could mess up and delay everything, so let's leave it until as late as possible, after we have made lots of progress with all the other stuff. If we get the timing right, everyone will be fed up with the GDPR. They'll want it done and out of the way. They'll have to agree something. 13 is the only show in town. Hold your nerve. Keep your eye on the glittering prize.”
 
The results of absence
Perhaps inevitably, because there was no impact assessment, there were no robust arguments to hand to defend or justify 13 as the minimum age when it came under attack. Simply saying ‘that's the way the Americans have been doing it for years, so we've all got used to it’ clearly didn't cut any ice.
 
In early December 2015, just as everybody thought the process was drawing to a close, disaster struck. 16 suddenly appeared from nowhere (again with no impact assessment attached to it) and supplanted 13. Word of this leaked. A media storm broke out. Everyone involved then went into an undignified, panic-driven flip-flop. A frantic scramble took place to change policy; 48 hours later a new one emerged. Young people's interests were sacrificed on the altar of expediency amid worries about a few here-today-gone-tomorrow headlines. At no point did the parties to the GDPR negotiations seek any expert counsel on what the policy ought to be.
 
Not one but four
What have we ended up with? Not one age (which would make some kind of sense if building a DSM is the overarching objective) but four.
 
As already noted, the GDPR makes 16 the default age, but Member States now also have the option to choose 15, 14 or 13 instead. Absent any evidence justifying them, 16 and 15 will be impossible to defend within the terms of the UNCRC. Still, the menu does seem to have diverted the media's attention. Mission accomplished.
 
Balancing the differences, resolving the tensions
How do we balance a laudable desire to protect young people from commercial exploitation with their undoubted right to express themselves? Are commercial exploitation and its associated data collection practices the only relevant factors to be considered anyway? Not everything that matters on the internet is about money. Other things being equal within any single jurisdiction, is it right to have but one age governing every aspect of young people's privacy?
 
Erecting new internal barriers
In other areas the EU is intent on tearing down internal barriers. Here it is erecting them. Why? The menu of 16, 15, 14 or 13 cannot be about subsidiarity. Those ages do not fit with every Member State's existing data protection laws (the ones the GDPR is otherwise harmonising). I know of two large countries that will have to change their law if the GDPR remains as it is, the UK being one of them. And it is not hard to predict what will happen.
 
The mighty US companies will lobby country by country for 13 (their status quo). They will win in some and lose in others. The temptation to stay with the default of 16 is likely to be the path of least resistance but it will be interesting to see how the balance pans out and learn what that teaches us about EU decision making.
 
Is the EU happy to contemplate or encourage the emergence of diverging youth cultures within the Union? Isn't that the obvious implication of the decision it has made? The ramifications of such a development are potentially quite profound. They should be talked about, not allowed to creep in under the radar.
 
And non-compliance?
The non-compliance problem among children was acknowledged by the Commission (at page 23 of its study) back in 2012. Now it is a great deal worse. Look at the levels of non-compliance with current age rules shown in a recent study published by the BBC: 75 per cent of 10- to 12-year-olds in the UK have social media accounts with sites or services which specify a minimum age of 13. Glance also at the work of EU Kids Online, which showed similarly high levels of non-compliance across the whole Union in 2011. All of the percentages will have gone up since.
 
The GDPR does nothing to address this. On the contrary, by setting higher age limits without any corresponding requirement to carry out age verification, the GDPR will teach or entice even larger numbers of children to misrepresent their age so as to get into the otherwise forbidden places where they believe all the best things are going on.
 
This sorry tale does not point to failings on the part of a particular individual or European institution. Rather it points to a systemic or collective failure.
 
We have to find a way to ensure things like this cannot happen again.
 
The views and opinions expressed in this article are those of the author and do not necessarily reflect the views of the Better Internet for Kids Portal, European Schoolnet, the European Commission or any related organisations or parties.
 
About the author of this article:
John Carr is Expert Adviser to the European NGO Alliance for Child Safety Online (eNACSO) and to Bangkok-based global NGO, ECPAT International. He is also Secretary of the UK Children's Charities' Coalition for Internet Safety (CHIS), a Member of the UK Government's Council for Child Internet Safety (UKCCIS) and a Senior Visiting Fellow at the LSE (London School of Economics and Political Science).
 
John was formerly a Senior Expert Adviser to the United Nations (International Telecommunication Union) and an Adviser to the European Network and Information Security Agency (ENISA). He is a former Vice President of MySpace and a former member of Microsoft's Policy Board for Europe, the Middle East and Africa.
 
He is the author of ‘The role of the internet in the commission of crime’ and ‘Out of sight, out of mind – global response to child pornography on the internet’, and co-author, with Sonia Livingstone and Jasmina Byrne, of ‘One in three: internet governance and children's rights’. John is also a member of the Europol Expert Platform.
