
Emerging tech, emerging mental health risks

In many parts of the world, a growing body of studies indicates that mental health is worsening across all age groups. According to the European Commission, around 46% of Europeans[1] experienced emotional or psychosocial distress such as anxiety or depression in 2022. For young people, the picture is particularly alarming, with Glenn Micallef, Commissioner for Intergenerational Fairness and Youth, labelling youth mental health “a silent crisis[2]”. Surveys in the UK[3] and US[4] suggest that nearly one in five young people suffer from a diagnosed mental health condition—a number that continues to climb. And the EU’s Future Shocks 2023 report[5] warned of further decline in societal well-being—particularly among the young, citing rising anxiety, loneliness, and an erosion of support systems.

There is no single explanation for this decline. Over the past decade, a cascade of crises—including the COVID-19 pandemic, economic stress and the high cost of living, climate anxiety, environmental factors, war and conflict—has put mounting pressure on individuals and communities. But one factor stands out in both public concern and policy attention: the role of digital technologies.

Smartphones and social media in particular have become central to everyday life, with at least two generations alive today who have never known a world without them.

These technologies have created and enabled opportunities for learning and play, communities for belonging and discussion, and long-distance connectivity across borders and regions that was never before possible. Yet, already a decade ago, researchers began to observe[6] shifts in behaviour, mood, and cognition, particularly among young users, raising questions about early digital exposure. Experts warned[7] that toddlers and infants were engaging with screens before the effects on brain development were even understood, with concerns about their future cognitive skills and emotional resilience.

Today, one prominent voice and author of The Anxious Generation, Jonathan Haidt, has spurred a new wave of public discourse on this topic, arguing[8] that the combination of smartphone use, algorithmic feeds, and changing social norms has created a “perfect storm” for youth mental health. Haidt points to how online platforms exploit the (developing) brain’s reward system—particularly dopamine-driven gratification—to drive engagement at the expense of user mental health. For example, empirical studies[9] have shown that prolonged social media use among adolescents—driven by engagement-focused design features—directly reduces the time they spend on healthier activities like sleep. Other studies link excessive screen time and multitasking with impaired executive function, lower academic performance[10], and delayed cognitive development[11].

But not everyone agrees that the evidence is conclusive. Critics[12][13] point to the lack of established causal linkages between smartphones, social media, and mental health, noting that much of the evidence to date is correlative. A 2024 meta-analysis of 226 studies by Stanford’s Jeff Hancock[14] found only “small effects.” Another review[15] of nearly 150 studies found a link between social media use and anxiety and depression in adolescents, though the varying impact suggests the findings shouldn’t be generalized. These studies caution that the science is not as clear-cut as public narratives suggest, and they highlight the trade-offs of over-focusing on the risks of digital technologies, which can downplay some of the possible benefits, like digital health and therapeutics. Furthermore, researchers[16] warn that focusing too narrowly on digital technologies risks ignoring other deep systemic causes—like poverty[17], discrimination[18], emergencies[19], and social isolation and loneliness[20]—all with well-researched detrimental effects on mental health.

While earlier academic debates centered on whether a link exists between technology use and the mental-health crisis, new movements are now emerging.

Experts stress the need for age‑specific approaches, noting that technology can affect children differently at each developmental stage. For example, recent analyses show that receiving a smartphone before age 13[21] is linked to negative outcomes in young adulthood—especially among females—including suicidal thoughts, dissociation, poor emotional regulation, and diminished self-worth. Other studies highlight sensitive periods of heightened vulnerability[22]: early puberty (ages 11–13 for girls, 14–15 for boys) and again around age 19. These findings have led advocacy groups to incorporate childhood developmental milestones[23] into their guidance for the digital environment.

Policy measures should therefore be carefully designed and implemented for specific age groups, rather than taking a one-size-fits-all approach.

Academia is also increasingly adopting a global, interdisciplinary perspective on the issue. A recent global panel of over 120 experts from 11 disciplines[24] reached several conclusions: heavy use of smartphones and social media is consistently associated with sleep disturbances, attention difficulties, behavioural addiction, and—particularly for girls—body dissatisfaction and increased exposure to mental health risks. They also noted that evidence on social deprivation and the effects of interventions like age limits or school bans remains limited and inconclusive.

Mental health risks on the horizon

On top of the already well-documented risks posed by social media and screen time, yet another wave of technologies is reshaping how young people engage with digital environments.

One major trend is the rapid rise of generative AI, the effects of which on mental health are becoming a pressing concern — from the psychological impact of AI-generated content, to questions about its influence on cognitive capacity, and even its potential to trigger or worsen mental health conditions. 

These are, at the moment, only early signals, but it is clear that research into them is urgent.

When it comes to content generation, troubling reports highlight how generative AI is being misused to bully classmates through sexually explicit deepfakes[25] or to recruit minors into extremist networks[26].

Another worrying phenomenon is the emergence of so-called brain rot, a term that carries two different meanings. Among young people, it’s used informally to describe AI-driven content on platforms like TikTok — repetitive, overstimulating, and algorithmically optimized for attention. Despite the negative connotations of the slang, many young people do not seem concerned[27], describing brain rot content as a welcome relief from the doomscrolling of conflict and climate change news.

By contrast, early research[28] employs the term more literally, referring to potential cognitive and behavioral decline linked to excessive digital media use, particularly in childhood. This research calls for a precautionary approach, noting that children are especially vulnerable in early developmental stages and that more research across disciplines is urgently needed.

Beyond the content itself, another area of concern is how AI changes the way we think, particularly through cognitive offloading. While it is not new to outsource mentally burdensome tasks to machines—mental arithmetic to calculators, wayfinding to GPS—what is new is the depth and frequency of this delegation: thinking itself can now be outsourced to AI. While AI is expected to boost efficiency and productivity across society, the consequences of this new level of offloading are as yet unknown. A widely shared MIT pre-print study[29] on ChatGPT and “cognitive debt” found that heavy LLM use was linked to consistently lower performance across neural, linguistic, and behavioural measures. Similarly, a study[30] of 666 UK residents showed that the more participants relied on AI tools, the less critical thinking they exhibited. Experts[31] warn that, to protect the rights of children, the rapid rise of AI must not go unchecked.

Another emerging concern is the sycophantic tendency[32] of generative AI: these systems can overly flatter or agree with users—supposedly to maintain engagement—even when the direction is harmful. A striking example occurred when an update to GPT-4o made it noticeably more sycophantic, validating negative emotions and endorsing impulsive behavior, leading OpenAI to roll it back.[33] Paradoxically, the much more neutral tone of GPT-5 led to a strong user backlash (and to another roll-back)[34], demonstrating the sheer power of such sycophancy on the human psyche and prompting calls for an emotional benchmark for AI.[35]

The optimisation of AI chatbots for user engagement raises concerns, especially for vulnerable people. Clinicians and researchers have begun documenting “AI psychosis”[36], in which individuals—often teenagers or the elderly, and often without prior psychiatric history—develop delusional beliefs, emotional dependencies, or distorted perceptions of reality, spiraling into severe mental health crises. The OECD AI Policy Observatory[37] has now flagged AI psychosis as an emerging trend, and several deaths and suicides have been recorded[38][39][40], echoing the worst possible implications of AI sycophancy and prompting several wrongful‑death lawsuits[41][42] against AI companies.

This raises pressing ethical questions about the extent to which AI chatbots should be emotionally attuned, and how to strike the balance between engagement, helpfulness, and safeguards for human wellbeing.

Another emerging technology with major implications is neurotechnology—devices that sense or influence the brain. These tools are becoming[43] increasingly cheap and portable, and could soon be embedded into mainstream consumer wearables like headphones, glasses, and watches. In theory, they could support mental well-being, reduce anxiety, and improve focus. But there is an ironic risk that these novel tools could also be used as a digital band-aid for mental health disorders while the underlying root causes go unaddressed—particularly outside of clinical contexts. Worse, neuromarketing practices[44]—using brain-derived insights on how content is perceived—could amplify[45] the same harmful dynamics already seen with social media algorithms, if used to create hyper-compelling content tuned to emotional vulnerabilities, perhaps even in real time[46] at the individual level.

Focus on how technology shapes young minds—not just whether

In view of these emerging concerns, there is a crucial opportunity to expand the public discourse on mental health and digital technology, from just whether to whether and how digital technologies shape young minds. This shift could unlock urgent policy action for mitigating these risks before it’s too late.

The current discourse tends to focus narrowly on whether digital technology is causing the mental health crisis. While establishing causal links between digital technologies and mental health is indeed vital, it requires long-term research which can effectively only inform reactive and incremental policy. Considering the global scope and speed of the otherwise unchecked technological innovation behind these emerging mental health risks, societies can’t afford to wait until neuroscientists establish causal pathways as evidence for policy action.

Expert and public discourses must be broadened to focus on how digital technologies shape the environments in which young people grow: how they learn, how they build relationships, and how they form their identity and sense of self.

Online platforms and digital services that mediate social interaction, drive compulsive use, and moderate content all influence how young people experience the world, making them some of the most consequential actors shaping future generations.

As such, both precautionary and reactive public oversight are essential to ensure these digital environments support, rather than erode, the mental health of young people.

Policy is catching up

The good news is that alarm bells are finally ringing inside our political institutions.

World Health Organization (WHO) experts called last year for tobacco-style warning labels[47] on social media platforms, citing growing concern over compulsive use, comparison anxiety, and mood disorders. The WHO also warned[48] that current mental health systems are not equipped to meet the scale or urgency of today’s needs. In the U.S., the former Surgeon General urged public action[49], stating “the risk of not acting could be someone’s life.”

In Europe, political momentum is clearly building. In addition to its 2023 conclusions on mental health[50], the Council of the EU adopted new conclusions[51] in June 2025 stressing the urgent need to act. Members of the European Parliament’s committee to protect consumers (IMCO) unanimously agree[52] that Europe’s online protections must be strengthened[53]. Last year, the European Commission’s Joint Research Centre noted[54] that effective governance requires more than a single solution, calling for coordinated action across caregivers, educators, tech companies, policymakers, researchers, and youth themselves—through improved content systems, proportionate age assurances, enhanced media literacy, and better accountability for design choices that shape user behaviour. And pushing for speed, the Swedish Minister for Digitalisation warned[55] the Commission to “act fast,” noting that delays could lead to long-term harms that are harder to reverse.

EU institutions have already taken important steps to integrate mental health and youth protection across governance frameworks. The 2022 Better Internet for Kids[56] strategy, the 2023 Commission Communication on a Comprehensive Approach to Mental Health[57], and the 2024 Political Guidelines for the Next Commission[58] all recognise that digital safety is a public health issue—not just a consumer protection or digital policy matter.

Today, much of the necessary legal architecture is already there. More than ten existing EU regulations govern how social media and online platforms operate—from the General Data Protection Regulation (GDPR) and Unfair Commercial Practices Directive, to the Digital Services Act (DSA) and the Artificial Intelligence Act.

Together, these laws are an important toolbox for lawmakers and regulators to use for addressing these systemic risks online.

Although many useful laws are in place, their enforcement remains uneven and unsteady[59]. Without causal evidence, case law, or tested standards, concerns persist about the speed[60], consistency[61], and political vulnerability[62] of their application.

We don’t necessarily need new laws. What we need is for regulators and lawmakers to direct their attention to enforcement as a priority—to give it the focus, resources, and visibility it deserves. As the Knight-Georgetown Institute (KGI) and Panoptykon Foundation pointed out[63] in their joint submission to the European Board for Digital Services (in cooperation with the Commission), “systemic risk assessment and mitigation requirements are core components of the DSA framework and can be more fully integrated into the day-to-day operations of platforms.”

Looking forward, there are concrete opportunities to build on the existing legislative framework toward a more proactive, comprehensive governance protecting and promoting mental health in the digital age, especially for youth.

First, the European Commission launched a public consultation[64] on draft guidelines[65] for better protection of minors under the DSA, which are expected to include new standards, requirements, and controls for putting youth safety first.

Second, the Commission is preparing[66] a proposal for the Digital Fairness Act (DFA), which is expected to fill gaps not addressed by existing digital policy—particularly around addictive platform design, harmful default settings, and persuasive user interfaces. For youth mental health online, this new legislation could mark a turning point, from reactive content moderation to proactive design accountability.

And third, to better address youth mental health across governance frameworks—and generations—the Council Conclusions adopted in June 2025 gave the European Commission a clear mandate for more ambitious action. As the Polish Deputy Minister of Health urged[67], “the time has come for universal standards to help us understand how to respond to challenges and find our way in this new reality.”

With this, EU policymakers must step up[68] on all fronts – enforcing existing policies, and ensuring that policy action is forward-looking and fit for all generations. In particular, the Danish Presidency of the Council of the EU has an opportunity to ensure that protecting children and young people online is high on the EU’s political agenda. Indeed, this would be in line with its stated priorities, which include “protecting children and young people from harmful online content, addictive algorithms, screen consumption, unethical business models and extensive data harvesting and profiling[69]”. The Danish Presidency must also ensure that the EU Strategy for Intergenerational Fairness[70] and the EU Mental Health Strategy[71] called for by the European Parliament can adequately address existing and emerging technologies.

This is a rare political window. With the Council bringing digital mental health into the spotlight, the Parliament aligning across parties on the need for better protection, and the Commission advancing new enforcement efforts and strategies, the conditions are right to make meaningful progress.

EU institutions also do not have to carry the burden alone. A wide spectrum of non-government researchers, practitioners, and activists have already come together, united around a list of common-denominator policy demands[72] for better practices that make digital technology brain healthy by default. These demands are:

  • Turning user profiling off by default 
  • Optimising digital platforms and services for values other than engagement 
  • Prompting conscious user choice, including opening up content curation to third party services
  • Integrating positive friction to disrupt compulsive behaviour and trigger reflection
  • Banning addictive design features

These practices are a baseline, not a ceiling—offering policymakers and regulators a shared reference point for shaping a healthier digital environment. Crucially, they are already legally supportable under the EU’s GDPR, AIA, DSA, Digital Markets Act (DMA), and Unfair Commercial Practices Directive (UCPD). 

Specifically, already-enforceable demands include turning user profiling off by default, prompting conscious user choice, and banning addictive design features—it is simply up to regulators to uphold these laws accordingly.

Other demands, such as optimising digital platforms for values beyond engagement and integrating positive friction to disrupt compulsive behaviour and trigger reflection, are not (yet) strictly required under these laws but are supportable and strongly encouraged within the same legal framework. 

Looking forward—what’s next for policy action

As discussed above, at CFG we advocate for a broader, multidisciplinary, and complementary discussion on mental health and technology — one that focuses on how both contemporary and emerging technologies shape the environments in which people grow, learn, and form their sense of self.

Our focus was further consolidated and validated during our CPDP.ai 2025 roundtable on youth mental health and emerging technology, where participants from across disciplines and sectors — including government bodies, academia, and civil society representatives — endorsed this approach.

It is clear that we are at a moment of political momentum: the intersection of mental health and technology is firmly on the agenda, and we are already contributing to civil society’s policy demands for urgent and stronger mitigating measures.

At the same time, we recognize that technological change is fast-moving, and that implementing every aspect of our approach cannot happen all at once. This requires sustained efforts to monitor new trends while also continuing to advocate for measures that may be less visible or politically attractive but are no less important.

To support this work, CFG is launching a new workstream on “Tech and the Brain.” This initiative will explore how emerging technologies—from AI and neurotech to immersive platforms—are reshaping mental and brain health. The workstream will generate evidence, foster interdisciplinary collaboration, and support EU policymakers with concrete, forward-looking recommendations. By examining risks and opportunities, it aims to inform brain-positive governance of digital technologies that protects cognitive integrity and promotes mental health, particularly for vulnerable groups.

Authors

Mehmet Onur Cevik

Governance Analyst – Policy

Maria Koomen

Governance Director

Virginia Mahieu

Neurotechnology Director

Endnotes

[1] European Commission, ‘Mental Health’ (n.d.), https://health.ec.europa.eu/non-communicable-diseases/mental-health_en, (accessed 7 August 2025)

[2] European Commission, ‘Statement by Commissioner Micallef at European Parliament on “Silent Crisis: the Mental Health of Europe’s Youth”’ (2025) https://ec.europa.eu/commission/presscorner/detail/en/speech_25_627, (accessed 16 February 2025)

[3] National Health Service England, Mental Health of Children and Young People in England, 2023 – Wave 4 Follow-Up to the 2017 Survey, (2023), https://digital.nhs.uk/data-and-information/publications/statistical/mental-health-of-children-and-young-people-in-england/2023-wave-4-follow-up, (accessed 7 August 2025)

[4] Children’s Mental Health, ‘Data and Statistics on Children’s Mental Health’, (2025), https://www.cdc.gov/children-mental-health/data-research/index.html, (accessed 7 August 2025)

[5] European Parliament Think Tank, Future Shocks 2023: Anticipating and Weathering the Next Storms, (2023), https://www.europarl.europa.eu/thinktank/en/document/EPRS_STU(2023)751428, (accessed 7 August 2025)

[6] Twenge JM, ‘Have Smartphones Destroyed a Generation?’, (2017), The Atlantic https://archive.ph/KbVsr, (accessed 7 August 2025)

[7] Haughton C, Aiken M and Cheevers C, ‘Cyber Babies: The Impact of Emerging Technology on the Developing Infant’, (2015), 5 (9) Journal of Psychology Research 504–518, https://www.researchgate.net/publication/284028810_Cyber_babies_The_impact_of_emerging_technology_on_the_developing_InfantPsychology_Research, (accessed 7 August 2025)

[8] Haidt J, The Anxious Generation: How the Great Rewiring of Childhood Is Causing an Epidemic of Mental Illness, (Allen Lane 2024)

[9] Alonzo R, Hussain J, Stranges S and Anderson KK, ‘Interplay Between Social Media Use, Sleep Quality, and Mental Health in Youth: A Systematic Review’, (2020), 56 (2) Sleep Medicine Reviews 101414, https://www.researchgate.net/publication/347816353_Interplay_Between_Social_Media_Use_Sleep_Quality_and_Mental_Health_in_Youth_A_Systematic_Review, (accessed 7 August 2025)

[10] Muppalla SK, Vuppalapati S, Reddy Pulliahgaru A and Sreenivasulu H, ‘Effects of Excessive Screen Time on Child Development: An Updated Review and Strategies for Management’, (2023), 15 (6) Cureus e40608, https://pmc.ncbi.nlm.nih.gov/articles/PMC10353947/, (accessed 7 August 2025)

[11] Zhang Z et al., ‘Associations Between Screen Time and Cognitive Development in Preschoolers’, (2021), 27 (2) Paediatrics & Child Health 105–110, https://pmc.ncbi.nlm.nih.gov/articles/PMC9113848/, (accessed 7 August 2025)

[12] Montgomery B, ‘The Anxious Generation Wants to Save Teens, but the Bestseller’s Anti-Tech Logic Is Skewed’, (2024), The Guardian, https://www.theguardian.com/books/2024/apr/27/anxious-generation-jonathan-haidt, (accessed 7 August 2025)

[13] Brown A, ‘The Statistically Flawed Evidence That Social Media Is Causing the Teen Mental Health Crisis’, (2023), Reason, https://reason.com/2023/03/29/the-statistically-flawed-evidence-that-social-media-is-causing-the-teen-mental-health-crisis/, (accessed 7 August 2025)

[14] Hancock JT, Liu SX, Luo M and Mieczkowski H, ‘Social Media and Psychological Well-Being’, (2022), https://gwern.net/doc/sociology/technology/2022-hancock.pdf, (accessed 7 August 2025)

[15] McPhillips D, ‘Science of Social Media’s Effect on Mental Health Isn’t as Clear Cut as a Warning Label Might Suggest’, (2024), CNN, https://edition.cnn.com/2024/06/24/health/social-media-mental-health-limited-science-wellness/index.html, (accessed 7 August 2025)

[16] Odgers CL, ‘Book Review: “The Great Rewiring: Is Social Media Really Behind an Epidemic of Teenage Mental Illness?”’, (2024), Nature, https://www.nature.com/articles/d41586-024-00902-2, (accessed 7 August 2025)

[17] World Health Organization, ‘Mental Health’, (2022), https://www.who.int/news-room/fact-sheets/detail/mental-health-strengthening-our-response, (accessed 7 August 2025)

[18] World Health Organization, ‘Mental Health of Adolescents’, (2024), https://www.who.int/news-room/fact-sheets/detail/adolescent-mental-health, (accessed 7 August 2025)

[19] World Health Organization, ‘Mental Health in Emergencies’, (2025), https://www.who.int/news-room/fact-sheets/detail/mental-health-in-emergencies, (accessed 7 August 2025)

[20] World Health Organization, ‘Social Isolation and Loneliness’, (2025), https://www.who.int/teams/social-determinants-of-health/demographic-change-and-healthy-ageing/social-isolation-and-loneliness, (accessed 7 August 2025)

[21] Thiagarajan TC, Jane J and Swaminathan S, ‘Protecting the Developing Mind in a Digital Age: A Global Policy Imperative’, Journal of Human Development and Capabilities 26, no 3, (2025), https://doi.org/10.1080/19452829.2025.2518313, (accessed 8 August 2025)

[22] Hughes V, ‘Does Social Media Make Teens Unhappy? It May Depend on Their Age’, The New York Times, (28 March 2022), https://www.nytimes.com/2022/03/28/science/social-media-teens-mental-health.html, (accessed 8 August 2025)

[23] 5Rights Foundation, Digital Childhood – Updated Report, (3 October 2023), https://5rightsfoundation.com/resource/digital-childhood-updated-report/, (accessed 8 August 2025)

[24] ‘A Consensus Statement on Potential Negative Impacts of Smartphone and Social Media Use on Adolescent Mental Health’, (2025), Seton Hall Law School Legal Studies Research, forthcoming HEC Paris Research Paper No MKG-2025-1567 https://papers.ssrn.com/sol3/papers.cfm?abstract_id=5256747, (accessed 7 August 2025)

[25] Long C, ‘First reports of children using AI to bully their peers using sexually explicit generated images, eSafety commissioner says’, (16 August 2023), ABC News https://www.abc.net.au/news/2023-08-16/esafety-commisioner-warns-ai-safety-must-improve/102733628, (accessed 7 August 2025)

[26] Europol, European Union Terrorism Situation and Trend Report 2025 (EU TE-SAT 2025) (Luxembourg, Publications Office of the European Union 2025), https://www.europol.europa.eu/cms/sites/default/files/documents/EU_TE-SAT_2025.pdf, (accessed 7 August 2025)

[27] Owens E, ‘Why teenagers are deliberately seeking “brain-rot” on TikTok’, (8 October 2024), Psyche, https://psyche.co/ideas/why-teenagers-are-deliberately-seeking-brain-rot-on-tiktok, (accessed 7 August 2025)

[28] Gross T, Guglielmucci F, Narváez Carrión C R and Zerrouki Y, Brainrot Reality? Evidence-Based Neuropsychiatric Perspectives on Early-Life Media Use, White Paper, Zentrum für Medienpsychologie und Verhaltensforschung, (2025), https://www.researchgate.net/publication/390398670_BRAINROT_REALITY_Evidence-Based_Neuropsychiatric_Perspectives_on_Early-Life_Media_Use, (accessed 7 August 2025)

[29] Kos’myna N, ‘Your Brain on ChatGPT: Accumulation of Cognitive Debt when Using an AI Assistant for Essay-Writing Task’, (10 June 2025), MIT Media Lab https://www.media.mit.edu/publications/your-brain-on-chatgpt/, (accessed 7 August 2025)

[30] Gerlich M, ‘AI Tools in Society: Impacts on Cognitive Offloading and the Future of Critical Thinking’, (2025), 15 (1) Societies 6, https://www.mdpi.com/2075-4698/15/1/6, (accessed 7 August 2025)

[31] 5Rights Foundation, Children & AI Design Code, (2025), https://5rightsfoundation.com/children-and-ai-code-of-conduct/, (accessed 7 August 2025)

[32] Caulfield M, ‘AI Is Not Your Friend’, (9 May 2025), The Atlantic https://www.theatlantic.com/technology/archive/2025/05/sycophantic-ai/682743/, (accessed 7 August 2025)

[33] OpenAI, ‘Sycophancy in GPT-4o: what happened and what we’re doing about it’, (29 April 2025), https://openai.com/index/sycophancy-in-gpt-4o/, (accessed 7 August 2025)

[34] Tangermann V, OpenAI Announces That It’s Making GPT-5 More Sycophantic After User Backlash, (2025), Futurism https://futurism.com/openai-gpt5-more-sycophantic, (accessed 28 August 2025)

[35] Knight W, GPT-5 Doesn’t Dislike You—It Might Just Need a Benchmark for Emotional Intelligence (2025), Wired https://www.wired.com/story/gpt-5-doesnt-dislike-you-it-might-just-need-a-benchmark-for-empathy/, (accessed 28 August 2025)

[36] Wei M, ‘The Emerging Problem of “AI Psychosis”’, (21 July 2025), Psychology Today, https://www.psychologytoday.com/us/blog/urban-survival/202507/the-emerging-problem-of-ai-psychosis, (accessed 28 August 2025)

[37] OECD AI Policy Observatory, Incident: AI Psychosis Flagged as Emerging Trend, (2025), https://oecd.ai/en/incidents/2025-08-12-4268, (accessed 28 August 2025)

[38] Montgomery B, Character.AI Chatbot Linked to Death of Teenager Sewell Setzer (2024), The Guardian https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death, (accessed 28 August 2025)

[39] Horwitz J, Special Report: Meta AI Chatbot Implicated in Death (2025), Reuters https://www.reuters.com/investigates/special-report/meta-ai-chatbot-death/, (accessed 28 August 2025)

[40] Godoy J, OpenAI and Sam Altman Sued Over ChatGPT’s Role in California Teen’s Suicide (2025), Reuters https://www.reuters.com/sustainability/boards-policy-regulation/openai-altman-sued-over-chatgpts-role-california-teens-suicide-2025-08-26, (accessed 28 August 2025)

[41] Montgomery B, Character.AI Chatbot Linked to Death of Teenager Sewell Setzer (2024), The Guardian https://www.theguardian.com/technology/2024/oct/23/character-ai-chatbot-sewell-setzer-death, (accessed 28 August 2025)

[42] Godoy J, OpenAI and Sam Altman Sued Over ChatGPT’s Role in California Teen’s Suicide (2025), Reuters https://www.reuters.com/sustainability/boards-policy-regulation/openai-altman-sued-over-chatgpts-role-california-teens-suicide-2025-08-26, (accessed 28 August 2025)

[43] Bernaez Timon L and Mahieu V, Neurotech Consumer Market Atlas: How the Sector Is Making Moves into the Mainstream, Centre for Future Generations, (2025), https://cfg.eu/neurotech-market-atlas/, (accessed 7 August 2025)

[44] Mouammine Y and Azdimousa H, ‘An overview of ethical issues in neuromarketing: discussion and possible solutions’, (2023), 18 (4) Marketing Science & Inspirations 29–47, https://msijournal.com/overview-ethical-issues-neuromarketing/, (accessed 7 August 2025).

[45] Mahieu V et al., Towards Inclusive EU Governance of Neurotechnologies, Centre for Future Generations, (2024), https://cfg.eu/towards-inclusive-eu-governance-of-neurotechnologies/, (accessed 7 August 2025)

[46] Baradari D, Kosmyna N, Petrov O, Kaplun R and Maes P, ‘NeuroChat: A Neuroadaptive AI Chatbot for Customizing Learning Experiences’, (2025), https://arxiv.org/abs/2503.07599, (accessed 7 August 2025)

[47] Chiappa C, ‘Control smartphones like tobacco, says leading WHO expert’ (2 October 2024) Politico Europe, https://www.politico.eu/article/education-stronger-regulation-protect-kids-social-media-misuse-smartphones-who-ban-addiction/, (accessed 7 August 2025)

[48] World Health Organization, ‘WHO highlights urgent need to transform mental health and mental health care’, (17 June 2022), https://www.who.int/news/item/17-06-2022-who-highlights-urgent-need-to-transform-mental-health-and-mental-health-care, (accessed 7 August 2025)

[49] Murthy V H, ‘Surgeon General: Why I’m Calling for a Warning Label on Social Media Platforms’, (17 June 2024), The New York Times, https://www.nytimes.com/2024/06/17/opinion/social-media-health-warning.html, (accessed 7 August 2025)

[50] Council of the European Union, ‘Mental health: member states to take action across multiple levels, sectors and ages’, (Press release, 30 November 2023), https://www.consilium.europa.eu/en/press/press-releases/2023/11/30/mental-health-member-states-to-take-action-across-multiple-levels-sectors-and-ages/, (accessed 7 August 2025)

[51] European Council, ‘Council calls for greater efforts to protect the mental health of children and teenagers in the digital era’, (Press release, 20 June 2025), https://www.consilium.europa.eu/en/press/press-releases/2025/06/20/council-calls-for-greater-efforts-to-protect-the-mental-health-of-children-and-teenagers-in-the-digital-era/ (accessed 7 August 2025)

[52] European Parliament, Protection of Minors Online (2025/2060(INI)) – Legislative Observatory procedure file, (2025), https://oeil.secure.europarl.europa.eu/oeil/en/procedure-file?reference=2025/2060(INI), (accessed 7 August 2025)

[53] Datta A, ‘Five questions with MEP Christel Schaldemose on protecting minors online’, (EURACTIV, 25 June 2025), https://www.euractiv.com/section/tech/news/five-questions-with-mep-christel-schaldemose-on-protecting-minors-online/, (accessed 7 August 2025)

[54] European Commission Joint Research Centre, Minors’ Health and Social Media: An Interdisciplinary Scientific Perspective, (2025), https://publications.jrc.ec.europa.eu/repository/handle/JRC141090, (accessed 7 August 2025)

[55] Government Offices of Sweden, ‘Letter from the Swedish Minister for Social Affairs and Public Health to Commissioners Breton and Kyriakides’, (26 June 2024), https://www.regeringen.se/globalassets/regeringen/dokument/socialdepartementet/fokhalsa-och-sjukvard/jakob-forssmeds-skrivelse-till-kommissionarerna-den-26-juni-2024-pa-engelska.pdf, (accessed 7 August 2025)

[56] Better Internet for Kids, ‘Homepage’, (n.d.), https://better-internet-for-kids.europa.eu/en, (accessed 7 August 2025)

[57] European Commission, Communication on a Comprehensive Approach to Mental Health COM(2023) 298 final, (2023), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=celex:52023DC0298, (accessed 7 August 2025)

[58] von der Leyen U, Europe’s Choice: Political Guidelines for the Next European Commission 2024–2029, (2024), https://commission.europa.eu/document/download/e6cd4328-673c-4e7a-8683-f63ffb2cf648_en?filename=Political%20Guidelines%202024-2029_EN.pdf, (accessed 7 August 2025)

[59] Çevik M O, ‘Enforcement Spotlight – Spring 2025’, Centre for Future Generations, (2025), https://cfg.eu/enforcement-spotlight-spring-2025/, (accessed 7 August 2025)

[60] Government Offices of Sweden, ‘Letter from the Swedish Minister for Social Affairs and Public Health to Commissioners Breton and Kyriakides’, (26 June 2024), https://www.regeringen.se/globalassets/regeringen/dokument/socialdepartementet/fokhalsa-och-sjukvard/jakob-forssmeds-skrivelse-till-kommissionarerna-den-26-juni-2024-pa-engelska.pdf, (accessed 7 August 2025)

[61] European Parliamentary Research Service, TikTok and EU Regulation: Legal Challenges and Cross-Jurisdictional Insights, (Briefing 2025), https://www.europarl.europa.eu/RegData/etudes/BRIE/2025/775837/EPRS_BRI(2025)775837_EN.pdf, (accessed 7 August 2025)

[62] Windwehr S, ‘Trump vs Europe: The Role of the Digital Services Act’, Heinrich-Böll-Stiftung, (18 February 2025), https://eu.boell.org/en/2025/02/18/trump-vs-europe-role-digital-services-act, (accessed 7 August 2025)

[63] Knight-Georgetown Institute and Panoptykon Foundation, ‘Comments on European Board for Digital Services and European Commission Report on Systemic Risks and Mitigations under the Digital Services Act’, (2025), https://en.panoptykon.org/comments-european-board-digital-services-and-european-commission-report-systemic-risks-and, (accessed 7 August 2025)

[64] European Commission, ‘Commission publishes draft guidelines on protection of minors online under the Digital Services Act’, (2025), https://digital-strategy.ec.europa.eu/en/news/commission-publishes-draft-guidelines-protection-minors-online-under-digital-services-act, (accessed 7 August 2025)

[65] European Commission, ‘Commission publishes draft guidelines on protection of minors online under the Digital Services Act’, (2025), https://digital-strategy.ec.europa.eu/en/news/commission-publishes-draft-guidelines-protection-minors-online-under-digital-services-act, (accessed 7 August 2025)

[66] European Commission, Commission Staff Working Document – Fitness Check of EU Consumer Law on Digital Fairness, (2024), https://commission.europa.eu/document/download/707d7404-78e5-4aef-acfa-82b4cf639f55_en?filename=Commission%20Staff%20Working%20Document%20Fitness%20Check%20on%20EU%20consumer%20law%20on%20digital%20fairness.pdf, (accessed 7 August 2025)

[67] Polish Presidency of the Council of the European Union, ‘Closer Together – Young People Co-Create the Future of Mental Health in Europe’, (2025), https://polish-presidency.consilium.europa.eu/en/news/closer-together-young-people-co-create-the-future-of-mental-health-in-europe/, (accessed 7 August 2025)

[68] Polish Presidency of the Council of the European Union, ‘Closer Together – Young People Co-Create the Future of Mental Health in Europe’, (2025), https://polish-presidency.consilium.europa.eu/en/news/closer-together-young-people-co-create-the-future-of-mental-health-in-europe/, (accessed 7 August 2025)

[69] Danish Presidency of the Council of the European Union, Programme of the Danish EU Presidency, (2025), https://danish-presidency.consilium.europa.eu/en/programme-for-the-danish-eu-presidency/programme-of-the-danish-eu-presidency/, (accessed 7 August 2025)

[70] Joint Research Centre, ‘Paving the way for an EU Intergenerational Fairness Strategy’, Policy Lab blog, (25 February 2025), https://policy-lab.ec.europa.eu/news/paving-way-eu-intergenerational-fairness-strategy-2025-02-25_en, (accessed 7 August 2025)

[71] European Parliament, ‘Parliament calls for action to protect mental health’, (24 June 2024), https://www.europarl.europa.eu/topics/en/article/20220624STO33809/parliament-calls-for-action-to-protect-mental-health, (accessed 7 August 2025)

[72] People vs Big Tech, Safe by Default: Moving Away from Engagement-Based Architecture, (Report 2024), https://peoplevsbig.tech/safe-by-default/, (accessed 7 August 2025)
