By Lukas Mol
Introduction
Looking back on my youth, I have developed a strong sense that I was part of the last generation to grow up at least partly without a smartphone. And even though I got a smartphone at the age of 13, my first one was not as advanced and addictive as today’s smartphones. It was a classic BlackBerry with an old-school physical keyboard, and since it already had WhatsApp, it made it easier to reach out to parents or friends. Still, this was not a revolutionary turning point in my youth, simply because I had already gained the necessary experience and independence in life without a phone, which formed a basis of trust between my parents and me. Yet while my experience was rooted in trust and gradual independence, the landscape of childhood has changed dramatically over the past decade.
As a result of the dominance of Big Tech companies, the development of tracking apps, digitalization, and the normalization of phone ownership from a young age, the upbringing of a child has undergone a rapid metamorphosis in recent years. This metamorphosis has produced a form of anxiety in which parents are more likely to control, or at least follow, their child in the digital world, resulting in a profound loss of the child’s privacy and, with it, their awareness of their own freedom.
This is why I will argue that children’s early exposure to digital surveillance normalizes oversight and undermines their awareness of privacy, thereby laying the groundwork for authoritarian control. I will look at how surveillance has become embedded in modern parenting practices, explore the role of big tech companies in facilitating this shift, and reflect on the broader societal consequences of growing up under constant watch. This analysis will adopt a comparative approach between the European Union and the United States, highlighting how different regulatory environments, cultural attitudes, and technological infrastructures shape children’s experiences with surveillance.
A monitored childhood
As Haidt (2024) argues in his book, childhood today can be described as a phone-based childhood. In the US, for example, 43% of 10-year-old kids already own a smartphone, compared to 58% in Germany (Vatu, 2023). Due to the technologically deterministic design of smartphones, driven by personal and constantly evolving algorithms, children in particular are likely to spend an extensive amount of time on these devices. Dutch research has shown that children between the ages of 6 and 12, on average, spend more than 3 hours a day on their screens – including gaming, TV, and social media (Beeldschermgebruik Bij Kinderen in Regio Haaglanden, 2024). Owing to this trend, parents have – rightly so – become increasingly concerned about the risks of their child encountering harmful and sexual content. As a result, almost 86% of parents in the US monitor their children’s online activity (Pandey, 2024), compared to 48% in Germany (Freedom and Responsibility, 2021). Monitoring in this case means checking screen time, downloads, app usage, websites visited, text messages, calls, and search history, and setting alerts for risky behaviour such as cyberbullying or sexual content.
Another form of monitoring takes place within schools and universities. In the case of the US, 89% of teachers said that their school uses online monitoring software (Hidden Harms: The Misleading Promise of Monitoring Students Online, 2022). Even though schools, as well as parents, may have justified reasons for monitoring online activity – in the US, a growing number of self-harm incidents and school shootings – the problem lies in the extensive and disingenuous use of this software. As the same study pointed out, the monitoring was often not limited to school hours, and it was used for discipline rather than for students’ safety (Hidden Harms: The Misleading Promise of Monitoring Students Online, 2022). Because students from low-income families, Black students, and Hispanic students were more likely to rely on school devices, they were subjected “…to more surveillance and the aforementioned harms, including interacting with law enforcement, being disciplined, and being outed, than those using personal devices.” (Center for Democracy and Technology, 2025)
As pointed out, it is not necessarily the fact that children are being monitored that causes the problem; it is the often incongruous and hierarchical use of that monitoring. In the EU, by contrast, stricter rules apply under the General Data Protection Regulation, which I will explain further later on. To give some examples: the use of monitoring software is only allowed if it is necessary, limited, and transparent, and video surveillance is heavily restricted – never in classrooms during lessons (European Data Protection Board, 2020).
The last form of monitoring I want to discuss is even more hidden: the design of apps targeted at children. Sun et al. (2023) analysed over 20,000 such apps. The study revealed that a substantial number of these apps request unnecessary permissions and incorporate third-party trackers, frequently violating privacy regulations. The authors highlight the discrepancy between regulatory frameworks and actual app practices, calling for stricter enforcement and better design guidelines.
Aside from monitoring, surveillance also becomes normalized through sharenting, the practice of parents sharing photos, videos, and personal details about their children on social media. This often begins before the child is even born, with the posting of an ultrasound image. In the US, 77% of parents participate in sharenting (The Digital Wellness Lab, 2024), and children have often not given their consent or are not even aware of it. A study from the University of Michigan found that 56% of children are aware of it, but only 26% felt they had a say in what their parents share about them. Of course, one can argue that parents have always shared photos and stories about their children – but, once shared online, the content stays forever, even when deleted. Parents usually do not intend to harm the child: “…they do not see the long-term consequences of their actions, including possible psychological impact to the child, identity theft or exposure to online predators.” (Iskül, p. 108) And as pointed out by Barclays Bank: “…by 2030, 7.4 million incidents of identity fraud per year could be linked to sharenting.” (The Digital Wellness Lab, 2024)
Both sharenting and monitoring contribute to the internalization of surveillance. In both cases, there is a lack of awareness of a child’s right to be forgotten. If you grow up in an environment in which constant surveillance is a predetermined given, you learn to adapt your behaviour to the norms of the surveillance society and, most importantly, to surrender your sense of privacy as a natural right.
Exploitation and marginalization through datafication
The content generated through sharenting, monitoring, or the child’s own use of apps and social media forms the foundation of the child’s datafication: the process of turning everyday activities, behaviours, interactions, and even emotions into quantifiable data that can be collected, analysed, and used. The collected data is not only used to personalize content; Big Tech also sells children’s data to data brokers, which leads to profiling and targeted advertising. This not only creates a new form of market control but also allows the government to include algorithmic classifications in its ways of governance. And as Mascheroni points out, “A growing literature shows how algorithmic classifications are used to distribute access to resources and systematically exclude already vulnerable groups, such as the poor, immigrants, and people with disabilities (Eubanks, 2018; Gangadharan, 2017; Madden, 2017; Marwick & Boyd, 2018, as cited in Mascheroni, 2018).”
A telling example is the Dutch childcare benefits scandal, in which thousands of parents were wrongly accused of fraudulently claiming childcare benefits. Algorithmic classifications made it possible to label parents with a migration background or a low income as fraudulent or suspicious. As a result of the lack of human oversight of these algorithms, parents were wrongly fined, sometimes up to tens of thousands of euros.
Aside from the government’s use, software, social media, and gaming companies use their collected data for commercial exploitation, and children are particularly vulnerable to this form of exploitation. Research by UNICEF found that almost half of the elementary school children who participated had spent money on loot boxes or extra avatars in games like Fortnite, Roblox, or Brawl Stars (VOS, 2024). Another study found that children spent, on average, €39 a month on virtual games (Trends, 2020). Even though it is almost always the parents’ money that is being spent, it often happens without their consent: 46% of parents have discovered their child used their credit or debit card without permission, a 59% increase from 2018 (LendingTree, 2024). However, financial exploitation is often easier to address, since it leaves behind traces such as bank debits, and tech companies are more frequently sued over it.
On the other hand, it is the most invisible form of data exploitation that harms the child’s privacy without the child being aware of it. UK research has shown that children can identify overt risks, such as sharing personal information, yet struggle to recognize more subtle threats like online tracking and targeted advertising (Mascheroni et al., 2019). This makes them a target not only for financial exploitation, as pointed out earlier, but also for sexual abuse. A recent global study found that 1 in 12 children is subjected to online sexual exploitation or abuse, and again, many of these children did not recognize the exploitative nature of these interactions at the time (Georgia State University News Hub, 2025).
This growing ecosystem of datafication and exploitation reveals just how vulnerable children are to both visible and invisible forms of harm online. While financial misuse can sometimes be traced and challenged, the long-term consequences of constant data collection, profiling, and manipulation remain largely unknown. As highlighted in the article “Children’s Privacy in the Big Data Era”, there is an urgent need for interdisciplinary research to fully understand how these practices impact children’s development, autonomy, and well-being in the long term (Mavoa et al., 2020).
Laws designed to counter this exploitation
To prevent or restrict the risks of the datafication of a child, there are laws in both the EU and the US that try to protect a child’s online privacy. In the EU, the General Data Protection Regulation (GDPR) requires explicit parental consent for data processing of children under age 16 (though member states can lower this to 13) and enforces the right ‘to be forgotten’. It also holds companies accountable for any data collection, even by third parties such as trackers in apps. In 2023, for example, TikTok was fined €345 million for, among other things, setting children’s accounts to public by default and lacking proper verification of parental consent for users between 13 and 16. This symbolizes an important change in the discourse on the regulation of big tech companies and proves that the GDPR can be enforced effectively against them (Data Protection Commission, 2023).
The problem is, as EDRi (2023) notes, that there is currently no satisfactory way of knowing whether an online user is a child or an adult, making age-based protections difficult to enforce without compromising user privacy. To name a few issues: self-declaration methods are easily bypassed, ID verification raises significant privacy concerns, and parental email consent often suffers from low compliance. Moreover, the legal language is often vague – terms like “legitimate interest” or “undue delay” – leaving organizations struggling to interpret and comply. This benefits large companies with legal teams, while small businesses risk noncompliance due to confusion.
In the US, the law that should protect children is even less protective and very outdated. COPPA was introduced in 1998 and ‘protects’ children under 13, but here, too, age verification is easily bypassed. Where the EU focuses on actual protection through data minimization and purpose limitation for tech companies, provided the child’s age is properly determined, the US focus is much narrower, aimed mostly at stopping the collection of personal data from children without parental permission. This does not protect children from manipulative design, algorithmic profiling, or addictive features. Moreover, COPPA overlooks technologies such as smart toys, home devices, location tracking, and biometric data – technologies that collect very sensitive data and did not exist when the law was introduced.
The way towards digital authoritarianism
“He is seen, but he does not see; he is the object of information, never a subject in communication.” (Foucault, 2008, p. 5) This quote mirrors the contemporary condition in which children grow up under constant surveillance. Even though childhood has always been characterised by some form of surveillance – that of the parents, the school, or the state – the rapid evolution of the digital world has brought us into a new universe with a new power dynamic, one that is even more hierarchical and, most importantly, more invisible. This invisibility, however, is not merely a consequence of technological advancement, but rather a fundamental feature of modern power structures, as exemplified in Foucault’s concept of the panopticon. For Foucault, the idea of the panopticon is not only that of an architectural structure: “…it is the diagram of a mechanism of power reduced to its ideal form; its functioning, abstracted from any obstacle, resistance or friction, must be represented as a pure architectural and optical system: it is in fact a figure of political technology that may and must be detached from any specific use.” (Foucault, 2008, p. 9)
In the case of children in the digital age, even though they may not always be aware of the surveillance, the principle of the panopticon remains effective as long as it leads to self-regulation. A recent study showed that teens develop so-called workarounds: “e.g., moving to social media chat when texting privileges are removed and keeping alternative social media profiles secret from parents.” (Modecki et al., 2022, p. 1686) This demonstrates a high level of strategic thinking and self-regulation in response to parental surveillance: children adapt their behavior to meet social expectations and boundaries, whether to comply with or subvert them.
What begins as parental control within the private sphere thus mirrors broader societal dynamics, where surveillance becomes internalized and normalized from a young age. This normalization is problematic because it creates a climate in which online privacy is routinely
undermined, laying the groundwork for a form of digital authoritarianism. By digital
authoritarianism, I do not mean the traditional notion of an authoritarian regime, where a
single dictator or ruling party holds absolute political power. Rather, I am referring to a more
subtle but equally concerning development: the increasing digital dependence of ordinary
citizens, coupled with an ever-growing asymmetry of power between individuals on the one
hand, and Big Tech companies and the state on the other. In this context, control is not always
enforced through overt political repression, but through data collection, algorithmic surveillance, and the quiet erosion of privacy norms. This creates a system in which consent is assumed, transparency is lacking, and citizens have little say in how their personal data is used – conditions that can enable and entrench undemocratic forms of control. And as Greenwald claims, it goes further than just the growing difficulty of organizing dissident movements: “…mass surveillance kills dissent in a deeper and more important place as well: in the mind, where the individual trains him- or herself to think only in line with what is expected and demanded.” (as cited in Dencik, Hintz, & Cable, 2016, pp. 177–178)
And still, despite this erosion of privacy norms, one can argue that a democratic state must occasionally interfere with the privacy of its citizens to prevent crime and ensure protection. But such an argument overlooks the power it grants governments: recent years have offered plenty of examples of democratic governments abusing this power as well. It is often used for “…far more than illegal acts, violent behaviour and terrorist plots. It typically extends to meaningful dissent and any genuine challenge.” (Dencik et al., 2016, p. 3) One of the most famous examples of privacy abuse, the Snowden leaks, showed that many politically active citizens are under scrutiny.
“For example, documents showed that government agencies in both the US and the UK have actively been engaging in the monitoring of political groups with a ‘watchlist’ including international organisations such as Medecins Du Monde (Doctors of the World), UNICEF, Amnesty International and Human Rights Watch, as well as prominent individuals such as Ahmad Muaffaq Zaidan (Al-Jazeera’s Pakistan Bureau Chief), Agha Saeed (a former political science professor who advocates for Muslim civil liberties and Palestinians rights), and groups such as Anonymous (Harding, 2014; Privacy International & Amnesty International, 2015).” (as cited in Dencik et al., 2016, p.3)
And of course, interfering in someone’s privacy is justified if a crime has been committed or there is a strong suspicion that one might be, but monitoring and suppressing dissent without any justified suspicion simply undermines democratic freedom and thereby lays the groundwork for digital authoritarianism. And even though the Snowden leaks made people more aware of the potential harm of privacy violations, Dencik et al. found that among the same group of politically active citizens, many tended to view digital surveillance as a separate or secondary issue, not central to their activism. And this lack of care for online privacy, I would say, is the reason why society is being watched in the first place.
Where we tend to care a lot more about our ‘real world’ privacy – in our own garden, our bedroom, or at school – we lack the same mentality for our digital world. For this reason, Dencik et al. (2016, p. 10) call for a collective data justice framework that directly links digital surveillance to social justice, emphasizing that surveillance is not neutral but shapes who is empowered or marginalized, simply because the nature of modern surveillance affects everyone’s ability to dissent and organize, with effects like self-censorship and repression.
Conclusion
The combination of early exposure to digital surveillance, commercial exploitation of child data, and ever-increasing government control is a dangerous cocktail that paves the way for digital authoritarianism. As this essay has shown, children today grow up in an environment where surveillance is a constant, from parental controls on smartphones to school monitoring and hidden data collection via apps.
While both the United States and the European Union recognize that children deserve extra protection, their approaches differ significantly. The EU has tightened its GDPR rules around data minimization, parental consent, and the right to be forgotten. However, enforcement has been hampered by vague legal language and the difficulty of reliably verifying ages online. The US, on the other hand, relies on the outdated COPPA legislation, which only protects children under 13 and largely fails to address modern risks such as algorithmic profiling or addictive design. In both cases, legal frameworks fail to provide real protection against the influence of Big Tech and the growing asymmetry of power between citizens and systems.
And as Jonathan Haidt (2024) argues in the final chapter of his book, solutions also lie in the way we shape childhood itself. He recommends delaying smartphone ownership until at least high school, limiting social media use during vulnerable developmental years, and restoring time for unstructured outdoor play. These steps are not just protective; they are liberating. They help rebuild the social, emotional, and cognitive resilience that constant connectivity erodes.
Alongside systemic change, empowering families and communities to reclaim childhood offline is key to resisting the digital enclosure of young lives. This is why an integrated, international approach is needed that goes beyond technical compliance. We must strive for a culture of digital justice in which privacy is protected as a universal right, and where Big Tech and government become more transparent in their use of datafication and algorithmic profiling. Only by combining awareness, strong legislation, and ethical technological development can we prevent this digital generation from trading its freedom for permanent visibility.
References
Beeldschermgebruik bij (jonge) kinderen in regio Haaglanden [Screen use among (young) children in the Haaglanden region]. (2024, June 19). https://epibul.ggdhaaglanden.nl/2024-nr-2/beeldschermgebruik_bij_jonge_kinderen_in_regio_haaglanden
Center for Democracy and Technology. (2025, April 11). Report – Hidden harms: The misleading promise of monitoring students online. https://cdt.org/insights/report-hidden-harms-the-misleading-promise-of-monitoring-students-online/
Data Protection Commission. (2023, September 15). DPC announces conclusion of TikTok inquiry and issues €345 million fine. https://www.dataprotection.ie/en/news-media/press-releases/dpc-announces-conclusion-tiktok-inquiry-and-issues-eu345-million-fine
Dencik, L., Hintz, A., & Cable, J. (2016). Towards data justice? The ambiguity of anti-surveillance resistance in political activism. Big Data & Society, 3(2), 1–12. https://doi.org/10.1177/2053951716679678
Digital Wellness Lab. (2024). Sharenting and child influencers. Boston Children’s Hospital. https://digitalwellnesslab.org/research-briefs/sharenting-and-child-influencers/
European Data Protection Board. (2021). Guidelines 07/2020 on the concepts of controller and processor in the GDPR (Version 2.0). https://edpb.europa.eu
Foucault, M. (2008). “Panopticism” from Discipline & Punish: The birth of the prison. Race/Ethnicity: Multidisciplinary Global Contexts, 2(1), 1–12. http://www.jstor.org/stable/25594995
Freedom and responsibility. (2021, November 25). Kaspersky. https://www.kaspersky.com/about/press-releases/freedom-and-responsibility-48-of-parents-use-parental-control-apps
Georgia State University News Hub. (2025, January 22). Study estimates 1 in 12 children subjected to online sexual exploitation or abuse. https://news.gsu.edu/2025/01/22/study-estimates-1-in-12-children-subjected-to-online-sexual-exploitation-or-abuse/
Greitens, S. C. (2020). Surveillance, security, and liberal democracy in the post-COVID world. International Organization, 74(S1), 169–195. https://doi.org/10.1017/S0020818320000338
Haidt, J. (2024). The anxious generation: How the great rewiring of childhood is causing an epidemic of mental illness. Penguin Press.
Hidden harms: The misleading promise of monitoring students online. (2022). Center for Democracy & Technology. https://cdt.org/wp-content/uploads/2022/08/Hidden-Harms-The-Misleading-Promise-of-Monitoring-Students-Online-Research-Report-Final-Accessible.pdf
LendingTree. (2024). 46% of parents say their child used their credit card without permission. https://www.lendingtree.com/credit-cards/study/kids-and-credit-cards-survey/
Livingstone, S., et al. (2024). Children’s rights and online age assurance systems: The way forward. The International Journal of Children’s Rights, 32(3), 721–747. https://doi.org/10.1163/15718182-32030001
Mascheroni, G. (2018). Datafied childhoods: Contextualising datafication in everyday life. Current Sociology. Advance online publication. https://doi.org/10.1177/0011392118807534
Mascheroni, G., et al. (2019). Children’s data and privacy online: Growing up in a digital age (arXiv preprint arXiv:1902.02635). https://arxiv.org/abs/1902.02635
Modecki, K. L., Goldberg, R. E., Wisniewski, P., & Orben, A. (2022). What is digital parenting? A systematic review of past measurement and blueprint for the future. Perspectives on Psychological Science, 17(6), 1673–1691.
Pandey, E. (2024, June 22). Some parents struggle to stop tracking their kids’ every move — even in college. Axios. https://www.axios.com/2024/06/22/parents-teens-location-tracking-college-adulthood
Trends. (2020). Kinderen betalen tientallen miljarden euro’s aan game-industrie: Virtuele munten zijn een deel van de magie [Children pay tens of billions of euros to the game industry: Virtual coins are part of the magic]. https://trends.knack.be
Vatu, G. (2023, September 12). How many kids have a mobile phone? SellCell.com Blog. https://www.sellcell.com/blog/how-many-kids-have-a-mobile-phone/