In the EU and UK, the personal data rights of anyone under 18 are protected by the General Data Protection Regulation (GDPR), which states that children “may be less aware of the risks”. The EU created the regulation amidst public concerns around children’s safety online.
This article analyses the overarching aims of personal data protection regulation and the shortcomings that emerge when the regulation meets the realities of children’s use of online services, e.g. apps, games, connected toys and websites.
It may seem odd to start an article analysing the lived experience of European children with the story of Teeth, a 13-year-old student in the US. However, his story introduces the needs of young people very clearly. I will continue by describing crucial aspects of the regulation and draw upon insights from research into children’s lives and childhood to suggest three types of context which may produce different lived experiences of children’s data rights. I will conclude that further insights into these contexts are necessary to address the complexity of designing services that enable children to act comprehensively upon their rights and needs.
I will use the terms ‘children’ and ‘young people’ interchangeably in this text, using specific ages only where necessary for context.
Teeth’s story
“[In] September 2021, when in social studies my teacher asked (…) for us to make a presentation sharing pieces of media that are important to us (…) I chose was this song (…) a coping mechanism after my suicide attempt (…) and it was one of the songs I listened to a lot during music therapy (…) I wrote a shorter version of that in my slideshow on the school computer and Gaggle just didn’t get any of the context, just the word ‘suicide’ and flagged that and then of course both me and my mom found that the software was on there and were understandably shocked and horrified, because that was not something that I meant for anyone besides that one teacher to see, because I decided to trust him”
(In Machines We Trust)
Teeth’s story is part of a podcast exploring the trade-offs between safety and privacy in schools. Teeth recalls how Gaggle, software installed on his school computer, scanned his work and identified it as a potential threat. The software sifts through text locally and flags words that might indicate a student is struggling. For now, the technology cannot discern context or intent. Flagged words are sent to human moderators, who triage risk levels and notify responsible adults. The owner of Gaggle says the software prevented 1,400 student deaths by suicide in the USA in 2021.
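To make the limitation concrete, here is a minimal sketch of the kind of context-blind keyword matching described above. This is purely illustrative, not Gaggle’s actual implementation; the watch-list and function names are assumptions.

```python
# Hypothetical sketch of keyword-based flagging: the scanner matches
# words with no understanding of context or intent. NOT Gaggle's code.
import re

FLAGGED_WORDS = {"suicide", "self-harm"}  # illustrative watch-list only


def flag_document(text: str) -> list[str]:
    """Return the flagged words found in a document, ignoring context."""
    words = re.findall(r"[a-z-]+", text.lower())
    return sorted(set(words) & FLAGGED_WORDS)


# A sentence about coping and recovery is flagged just like a threat:
flag_document("This song was a coping mechanism after my suicide attempt.")
```

Because the match is on isolated words, Teeth’s account of recovery is indistinguishable, to the scanner, from an expression of intent; only the human moderators downstream can supply context.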
Whilst the example is taken from the US, schools in Europe can also use software of this nature. However, the personal data rights of children in Europe are under GDPR’s special protection. So what are those rights? Furthermore, what actions can children take concerning their data?
Children’s personal data rights online
In 2016, 80% of Europe’s population had internet access through various personal and public devices; a third of those users are thought to be children under 18. Though people’s use of online services is diverse, the exchange of personal information is common. Digital technologies are seen as both an opportunity and a risk: they afford children many opportunities to interact with others and with information through learning and playing, as well as risks to their well-being, privacy, agency and safety online. The regulation aims to address these risks.
The ubiquitous affordances of online technologies mean the lived experience of personal data protection is messy, which complicates the adoption, auditing and enforcement of the relevant regulations.
GDPR came into force in 2018, and the UK has adopted the regulation. Notably, the UK’s Information Commissioner’s Office (ICO) also created an ‘age-appropriate design code’, which all online services accessed by UK children must implement from 2020. The regulation intends to protect people’s online contributions, participation and habits, and children bear the same rights as adults:
- to be informed and consent to what personal data is collected, processed and used for;
- to access, rectify, erase and move their data;
- concerning automated decision-making, profiling and targeting.
However, children have additional rights:
- collecting the personal data of children under the age of consent (16 under GDPR, lowered to 13 in the UK and some EU member states) requires parental consent. Children between the age of consent and 18 still require special protection.
- service providers must help children of all ages understand the risk of providing their data, for example, by providing privacy notices written for children.
Personal data is any information that can be connected to a person, e.g. their name or shopping history. The ability to participate and exchange information online, freely and trustingly access information and services whilst keeping personal data protected, is a crucial part of the right to privacy, as in Article 16 of the UNCRC.
Westin defined privacy as “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others”. Privacy has also been connected to the construction of trust within human relationships and to a sense of control over one’s space in the absence of others, e.g. parents.
Teeth articulated that his privacy should be respected. Teenagers have expectations about who will see the information they share online, depending on where it is shared, and about how that audience will interpret and use it. Furthermore, they know that information shared online can affect their future opportunities: it can stay online forever and therefore be used out of context, e.g. to assess suitability for a job. Thus, both the organisations providing services and the children using them must be clear about what information can and cannot be used for.
To address this complexity, I suggest we look at three levels of context, which may have an impact on and produce different lived experiences of children’s data rights:
- Interactions: assessing the relative affordances of diverse digital services beyond the fears of adults in the news;
- Intentions: recognising the needs of children and their intent within online environments;
- Relations: understanding the tensions within the relationships between children and diverse authority figures.
1. Interactions
Teeth used the school computer to write a presentation for a project. The software used on that computer did not recognise Teeth’s aims as he was writing. Schools decide what words are flagged, who should be notified, and what this software is used for, e.g. to detect threats and prevent harm or recognise unwanted behaviour and punish perpetrators. However, what are adults ultimately worried about when children use online devices?
Researcher Kirsten Drotner suggests two perspectives to analyse children’s media in the context of overarching “socio-cultural practices”:
- how children use these services and what for across intended and unintended uses;
- how children’s use of these services may be mediated, restricted or adapted for their use.
Understanding the sentiments of news discourse regarding children’s use of online services can help understand its impact on the regulation. As Elizabeth Denham CBE, previous Information Commissioner, says, the regulation and its advice came from listening to the public’s concerns.
The news and public discourse around children’s digital use have been described as ‘media panics’. These panics are often reignited when a new form of media begins to be widely used by children. News outlets report on the dangers of social media use, and the concerns are diverse:
- exposure to unsuitable content, sexual or violent;
- being victims of grooming, bullying, and sexual assault;
- harm to physical and mental health, e.g. suicide and self-harm;
- the impact of too much screen time on children’s development;
- linking violent content to committing crimes, e.g. video games and school mass shootings.
Some argue that content should be constrained, others that children should be educated. Faced with a rapidly and ever-changing range of online services and devices, parents find it hard to keep track of what to do and resort to constraining screen time or avoiding it altogether. This leads to what has been described as ‘paranoid parenting’, driven by a limited understanding of what children are doing and a lack of easy means to act upon it.
In light of the regulation and the news, providers of online services used by children currently restrict use through age gates, age bands or mechanisms like parental consent and controls that can be applied on device, service or app level.
Age gates mean users must be older than a certain age to be granted access to the service. Age bands mean the service adapts to the age of the person accessing it and has a child-specific version. For example, TikTok allows users under 13 in the US to use its service with a ‘limited app experience’. TikTok chose to set defaults to off instead of removing the features entirely, giving young people the opportunity to make a deliberate choice. Whether that choice is informed remains unclear.
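The two mechanisms can be contrasted in a short sketch. The threshold, feature names and defaults below are assumptions for illustration, not any provider’s actual policy.

```python
# Illustrative contrast between an age gate (deny access outright) and
# age bands (adapt the experience). Ages and features are hypothetical.
from dataclasses import dataclass

AGE_OF_CONSENT = 13  # e.g. the UK; 16 by default under GDPR


@dataclass
class Experience:
    can_post_publicly: bool
    direct_messages: bool


def age_gate(age: int) -> bool:
    """Age gate: users below the threshold are denied access entirely."""
    return age >= AGE_OF_CONSENT


def age_band(age: int) -> Experience:
    """Age bands: younger users get a limited experience, with riskier
    features defaulted to off rather than removed."""
    if age < AGE_OF_CONSENT:
        return Experience(can_post_publicly=False, direct_messages=False)
    return Experience(can_post_publicly=True, direct_messages=True)
```

Note that both mechanisms depend entirely on the age the user declares, which is exactly the weakness discussed next.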
Research has found that many children find ways to circumvent age limits and have accounts before the age of consent. Many choose to lie about their age when faced with being unable to access services because they are too young. The service will subsequently see them as adults, and their rights as children will not be protected.
The challenge with the ‘media panics’, and with screen-time restrictions being the most accessible tool for parents, is that they draw attention away from children’s needs and realities. Parents and children discuss screen time more than the activities conducted on those screens. More research is needed to understand the qualities of various online services and children’s uses of them, to make these experiences more equitable for children.
So what are children’s needs and intentions concerning online services? Furthermore, in what ways might they want to act upon their rights?
2. Intentions
Researcher David Oswell suggests that Media, alongside Family and School, are crucial “institutions through which children become socialised”. Children have been characterised as keen to try and experiment with new technologies. They perceive access to digital media as a right, and as a source of freedom and privacy.
When Gaggle flagged his words, Teeth had not shared his work yet. Furthermore, while writing, he deliberately included a very intimate personal story intended for a specific teacher. Neither Teeth nor his mum knew the software was on the computer.
The regulation states that organisations providing services must consider children’s best interests. In the past three decades, there has been some research on children’s and families’ use of digital and online devices. However, more research is needed on how children’s needs and personal data rights have been considered in the design of digital and online services, and on the impact on their lived experience. The regulation has been criticised for the absence of children’s voices in how it was constructed. Policies and information made for children, and child-specific provisions to request their data and act upon it, are scarce.
The particular need for children’s protection is linked to how children are perceived, “what it takes to be a vulnerable or a competent child”. Even within Europe, attitudes toward children and their competency vary culturally, as is evident in the different ages of consent that countries have set.
Researchers Valerie Steeves and Priscilla Regan argued that the way children are being educated about privacy and online use “fails to resonate with young people’s lived experiences and the kinds of privacy problems they are concerned about.”
Children are seen to “post everything” and are therefore assumed not to care about their privacy on social media. However, research has shown that teenagers make conscious choices about what to share, where, and with whom. They are also concerned with what others share about them and seek control over it. For example, a child has pursued legal action against their parents over photos the parents shared on social media. Moreover, danah boyd’s research, alongside many other examples, shares the story of a teen who laboriously deletes comments after reading them to keep her Facebook page clean, even though the service was not designed with this intent.
If privacy is an integral part of their lives and they are to act on their rights, young people need and want ways to take ownership of their personal data, from understanding their rights and the risks of different information exchanges to accessing tools to control their data and its use.
If we acknowledge children as rightful actors and consumers in this space, we also need to consider the realities of their diverse family lives and how they often rely on adults as gatekeepers to technology. So what power structures are children part of that might need to be considered, and who might “hold power over them”?
3. Relations
Children and their parents and carers constantly negotiate and decide on privacy and freedom to use devices and online services. Though Oswell points out that family practices have tended towards the ‘individualisation’ of children, as actors taking more and more responsibility within family life and relationships, children’s right to personal data privacy cannot be decoupled from these ways of relating to each other. Parents are expected to consent to the privacy policies of the services their children use. The need for parental control and supervision strains the relationship between carers and their children. For example, Teeth’s mum was notified about him potentially being in danger, triggered by the words he wrote on the school computer.
The reality of those caring for children is one of shared devices, unsupervised use, and constant testing of boundaries. The diversity of families and the relations within them represents diverse media literacies, forms of access, households (and houses), attitudes and responsibilities.
The regulation assumes parents are informed, understand the technology and its risks, and are more competent than their children regarding online services. On the one hand, constraining children’s use of online services too much leads them to choose other means to share and act online without fear of punishment or repercussion from parents and other authority figures. In turn, adults find ingenious ways to infringe on children’s privacy, such as using a different account to follow them. As danah boyd articulates, this risks perpetuating the idea that young people “do not have the social status necessary to deserve rights associated with privacy”.
On the other hand, complete trust in what the child does online can be seen as ‘neglect’. News coverage judges parents as good or bad, fuelling parental guilt.
Placing full responsibility for a social problem on the individual (the child for their misbehaviour, the parent for their lack of good parenting) is not new and is rooted in neoliberal attitudes. Consequently, it falls to parents, perceived as the consumers who purchase these services for their children, to find ways to protect their children and to devise solutions for a problem bigger than family life. The regulation is devised by governments and implemented, to some degree, by services. However, parents are expected to enforce it through consent, parental controls and device restrictions. And so we see a shift of responsibility from the social sphere to the individual sphere, which book author Lavalette articulates as a neoliberal “reduction in the state’s role as provider of services.”
Neoliberals see this shift as the empowerment of individuals and the right to freedom and choice. Unfortunately, this focuses solutions on the individual level instead of urging examination of the overarching structures and systems that cause the problem. Regardless, more support is needed for parents to fulfil this role. Addressing children’s rights requires more than educating children and their carers and providing child-specific tools. It also requires a sense of collective responsibility and action shared between governments, communities, service providers, parents and educators, and mindfulness of the relationships between these actors.
Conclusion
In this article, I explored privacy needs raised by Teeth and analysed children’s data rights as defined by GDPR. Recognising that different realities might produce different experiences of data rights, I proposed three levels of context to understand children’s lived experiences. These contexts provide insight into what online services need to consider to meet the needs of children and empower them to act on their rights.
1. Interactions
I examined how worrying news about the dangers of social media has led parents and service providers to adopt restriction mechanisms, to which children have found workarounds. This can distract from understanding online services’ affordances and from discussing children’s activities, needs, intentions and expectations online.
2. Intentions
I established that though children are eager adopters of online services, these are not necessarily designed for them, and their voice is not widely heard. Children are seen as in need of protection and as careless about their privacy. However, they have demonstrated that they make conscious decisions about their actions online and create strategies to take control of their data, even though there are no straightforward means for them to do so.
3. Relations
It remains to be determined how children can act upon their rights alongside the adults who are often the gatekeepers of technology. Thus, my third point briefly described the potential power struggles and tensions that might be at play within children’s lived experience of their data rights and their use of online services.
Finally, to fulfil children’s data rights and needs, I advocate for collective responsibility. Further unbiased academic research, collaboration between service providers and policymakers, and co-creation of services with children and their parents are all necessary.
References
- The SAGE Encyclopedia of Children and Childhood Studies [Article, 2020]
- Social privacy in networked publics: teens’ attitudes, practices and strategies [Paper, 2011]
- The Future of Privacy in Social Media [YouTube, 2012]
- Children and Digital Media: Online, On Site, On the Go, The Palgrave Handbook of Childhood Studies [Book, 2009]
- Strengthening privacy and safety for youth on TikTok [Article, 2021]
- What should our general approach to processing children’s personal data be? [ICO]
- What are the rules about an ISS and consent? [ICO]
- Individual rights [ICO]
- Children [ICO]
- What is personal data? [ICO]
- Age appropriate design: a code of practice for online services [ICO]
- Best interests of the child overview [ICO]
- Who watches AI watching students? [Podcast, 2022]
- A boy wrote about his suicide attempt. He didn’t realize his school’s Gaggle software was watching [Article, 2021]
- ‘In Defence of Childhood’: Against the Neo-Liberal Assault on Social Life [Chapter, 2005]
- Children’s rights and digital technologies [Book Section, 2018]
- Young people, new media: report of the research project Children Young People and the Changing Media Environment [Research report, 1999]
- EU kids online: final report [Report, 2011]
- In the digital home, how do parents support their children and who supports them? [Article, 2018]
- Children: a special case for privacy? [Article, 2018]
- Digital by default: the new normal of family life under COVID-19 [Article, 2020]
- The Future Of: Children’s Online Privacy [YouTube, 2021]
- Family and household, The Agency of Children: From Family to Global Human Rights [Book, 2012]
- Youth Privacy and Data Protection 101 [Article, 2021]
- Mobile phones are destroying family life — but it’s the PARENTS who are to blame, study claims [Article, 2017]
- Young people online and the social value of privacy [Article, 2014]
- A summary of the UN Convention on the Rights of the Child
- Children and Media, The SAGE Encyclopedia of Children and Childhood Studies [2020]
This article is based on an essay written for a ‘Critical Theoretical Debates about Global Childhoods and Society’ module at UCL.