You may be wondering what impact, if any, Europe’s sweeping new data law, the GDPR (General Data Protection Regulation), has on the parenting tech users in your life. After all, it went into effect today, and you may have seen headlines like the New York Times’s about how it makes Europe the “world’s leading tech watchdog” or the piece in Ad Age pointing out the irony that a data privacy law triggered a tsunami of spam in our email inboxes (Quartz actually sent an email with the subject: “This is not that kind of email”).
Probably the biggest adjustment where kids are concerned is that 13 is no longer the worldwide default “minimum age” for kids in social media; it remains the default everywhere but Europe. And there, it’s all over the map. The GDPR raised the default minimum age to 16 for EU member nations, giving individual countries the option to lower it. The age refers to the point at which apps and services no longer need to obtain parental consent in order to allow a young person to use their service, which amazingly means, for example, that in some countries, such as France, “the age of consent to sex is the same as or lower than the age of consent for data purposes,” as UK online child protection expert John Carr put it in his blog a couple of months ago. [I’d welcome your thoughts, in Comments below, on whether a 15-year-old should need a parent’s consent for social media use (researchers, unlike the GDPR drafters, did survey parents — find out what they heard here).]
More protection, more confusion
The problem is, no one — from researchers to companies — is completely sure how the companies will obtain and verify a parent’s consent. And, practically speaking, how much time will parents really have to go through whatever hoops will be part of providing their consent to multiple companies? GDPR is adding fresh fuel to digital-age parent shaming.
In fact, the GDPR contains more confusion about protecting minors than about protecting everyone else, and no teeth for doing so — ironic, given that the law refers to children as “vulnerable individuals” deserving “specific protection.”
For example, in this passage…
Children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data. Such specific protection should, in particular, apply to the use of personal data of children for the purposes of marketing or creating personality or user profiles and the collection of personal data with regard to children when using services offered directly to a child. The consent of the holder of parental responsibility should not be necessary in the context of preventive or counselling services offered directly to a child.
…there’s a question as to whether “services offered directly to a child” means only apps and services specifically designed for children (such as Disney’s or PBS Kids) or also social media services that are widely used by minors. And what about when, in an effort to abide by the regulation, companies over-restrict children’s use in countries with lower minimum ages? Prof. Sonia Livingstone at the London School of Economics gives the example of Facebook-owned WhatsApp: “Currently used by 24% of UK 12–15 year olds, [WhatsApp] announced it will restrict its services to those aged 16+ regardless of the fact that in many countries in Europe the digital age of consent is set at 13.” And 13–15 year-old WhatsApp users in the UK, too, are now out of luck.
CNN reported a statement from WhatsApp that it “had to make a tradeoff between collecting more information or deciding to keep it simple and raise the minimum age of users to 16 across Europe.” Which brings up an important point about unintended consequences: age verification means *more* data is gathered, not less. The more data companies gather and store on people of any age, the more vulnerable those users are to data breaches, identity theft, etc. (and minors’ data are extremely attractive to identity thieves). On the other hand, where appropriate, user data can be used by law enforcement to protect innocents and catch criminals. And when companies don’t verify ages, they have no way of knowing which users are vulnerable in order to provide protective features for them.
Does GDPR now reinforce the unintended consequence of COPPA (the U.S.’s longstanding Children’s Online Privacy Protection Act), meaning that many 13–15 year-olds, too, will be lying about their age in order to use their favorite apps? I think there’s no question it will.
Maybe consult the rights holders?
As Prof. Sonia Livingstone points out in the Parenting for a Digital Future blog, if we’re talking about children’s and teens’ rights, why aren’t we talking with the rights holders? “Children’s voices and experiences have been signally lacking from these debates, largely ignored precisely by the states and European regulatory bodies that have officially promised to recognise their right to be heard.” Two and a half years ago, when the GDPR was being ratified, I wrote that Europe was in effect taking youth digital rights backwards. Because unexpectedly and inexplicably — in a closed-door session and without consulting child online protection experts, much less youth themselves — the policymakers who drafted the GDPR made a modification that raised the EU’s digital age of consent to 16.
So to answer the question of what GDPR means for American parents and kids, not a lot of change, once we get past all the additional spam and news coverage. At least not right away, because, as the Times reports, nobody, including Europeans, will really have a handle on the impacts for years. Reporter Adam Satariano does add, though, that “even if you don’t notice big changes, the new law provides important privacy rights [for users of all ages] worth knowing about.”
Maybe all the news, emails and discussion will get us thinking about involving our kids in family and public discussions — maybe even policymaking — about policies that concern them going forward! That’s the dream.
- A fundamental right of youth, digital and otherwise — enshrined in the UN Convention on the Rights of the Child — is to be consulted in the making of policies that concern them. That right was a full-on oversight by the creators of the GDPR. So I was thrilled to read that Prof. Livingstone has started and will lead a new research project to surface “how children themselves understand how their personal data is used and how their data literacy develops through the years from 11–16 years old.” This should’ve been initiated by Brussels years ago, but at least it’s happening now. Read more about it here.
- About 2011 research on the unintended consequences of COPPA: Did any policymakers in Europe happen to have a look at that?
- Youth digital rights: My post about their framework almost 4 years ago (I’ll be blogging more on this soon)
- A helpful explainer of GDPR from the New York Times
- About EU members’ various digital ages of consent: from the European Commission
- About all that spam: New York Times tech writer Brian Chen advises people to read, not just delete, the emails and explains why
- TechCrunch on Facebook’s new Youth Portal educating young users on how their data’s being used – aimed, I think, more at teens and parents in other countries, where teen Facebook use is still huge
- Age verification challenges: Wired has details.
- Trying to do so much: Partly because it aims to “ease restrictions on data flows,” give citizens power over their data, protect their right to privacy and spur economic growth, the GDPR is necessarily complex and ambiguous. But there’s another reason, suggests Prof. Alison Cool at the University of Colorado, Boulder, in an opinion piece in the New York Times: “What are often framed as legal and technical questions are also questions of values. The European Union’s 28 member states have different historical experiences and contemporary attitudes about data collection. Germans, recalling the Nazis’ deadly efficient use of information, are suspicious of government or corporate collection of personal data; people in Nordic countries, on the other hand, link the collection and organization of data to the functioning of strong social welfare systems. Thus, the regulation is intentionally ambiguous, representing a series of compromises.”
- An analysis of GDPR’s effects by Julia Powles, Cornell Tech and New York University researcher, in The New Yorker
- Our “wakeup call”: My piece last month about this spring’s pivotal “moment” (or set of converging developments) when “big data” suddenly got personal for people in many societies, especially the US and UK
This piece was originally posted in Anne Collier’s blog at NetFamilyNews.org.