Empowering digital futures: Collaboration is key in making the digital world safe for children  

From the concerning challenges circulating on platforms like TikTok to the rise of algorithmic addictions, the digital world is affecting children. Amidst these challenges, the call for collaboration becomes a resounding imperative – a united effort to bridge the gaps and ensure that the benefits of digital progress are not monopolized but instead become an inclusive force for societal upliftment.  

To seek cross-sectoral collaborative solutions, Save the Children Finland organized the Nordic Digital Citizenship conference on Children’s Rights in the Digital World on December 12th in Helsinki. The conference ventured through the intricate landscape of digital disparities, urging organizations, professionals, and disciplines to join forces in crafting sustainable solutions that safeguard the well-being and rights of our children in the ever-evolving digital age. 

“Cooperation is crucial in this changing world of digital environments, which is why we wanted to bring together a cross-sectoral group of experts on these issues”, says Veera Träskelin from Save the Children Finland, the driving force behind the conference.  

Hanna Markkula-Kivisilta, Secretary General of Save the Children Finland and Antti Järvinen, Country Lead, Google

“Set yourself on fire or crack your friend’s skull” – what are our children exposed to on social media? 

Ella Kangasalusta, 21, is a social media influencer on TikTok and Instagram. Her audience is mostly young, and having previously worked as a teacher’s aide, she feels she has a good grasp of the children of today. In addition to her career as an influencer, she has also taught classes to children on healthy social media use. As someone who spends much of her time on social media with young people, she is worried.  

“Adults can pretend to be children everywhere online, so actual children don’t have the chance to be children there”, she says.  

There are images and language circulating on TikTok that, according to Kangasalusta, children should never be exposed to.  

“Students as young as 10 years old are already pressured by beauty standards”, she says, describing what young children can encounter on social media.  

And it’s not only that. Mikko Alasaarela, the CEO of Equel Oy, has seen the development of social media (or anti-social media, as he calls some of the applications) from a variety of perspectives, including from Silicon Valley when Facebook was only starting out. 

“Challenges circulating on TikTok in the past few years have included, for example, a challenge to strangle yourself until you pass out, a challenge to crack your friend’s skull, a challenge to drink poison, a challenge to set yourself on fire”, Alasaarela lists.   

These challenges and other dangerous content keep spreading. What is happening to children online? 

Algorithmic Addiction and Anti-Social Media 

Let’s back up a little bit. What is Mikko Alasaarela talking about when he is talking about anti-social media? 

According to Alasaarela, the transformation began in the games industry, which has shifted its focus from selling traditional products to offering free-to-play games where in-game purchases play a significant role. One notable consequence is the rise of so-called algorithmic addictions: automated pathways leading users into addictive behaviours within games. 

Mikko Alasaarela, CEO, Equel Oy

In the realm of social media, companies initially focused on the concept of “friending” people, as seen on platforms like Facebook. However, this has evolved into a one-directional model of following, giving rise to influencers or creators. The advent of platforms like TikTok introduced an algorithmic model that differs significantly from traditional social media platforms. While Facebook provides algorithmically selected content from your friends, TikTok exposes users to content from a wide array of creators, contributing to its addictive nature. 

“It is quite ironic that there is an existential threat to humanity that can solve another existential threat to humanity.” 

Mikko Alasaarela

The prevalence of algorithmic content is transforming social media platforms into what can be termed “anti-social media” platforms. Studies reveal that U.S. teens spend a significant portion of their free time, approximately 4.8 hours a day, on social media, primarily on platforms like YouTube, TikTok, and Instagram, which are identified as the most anti-social. 

Overall screen time spent on digital devices has been linked to mental health issues. According to Jonathan Haidt, a social psychologist and professor of ethical leadership at New York University, the four foundational harms associated with increased screen time are social deprivation, sleep deprivation, attention fragmentation, and addiction. These factors collectively shape individuals’ perceptions of the world in the digital age. 

Algorithmic addictions, according to Alasaarela, create a sort of digital drug trade. And exponentially growing AI development is only accelerating the pace.  

But perhaps there are also positive sides.  

“Currently, AI is the only technology that is growing fast enough to solve climate change”, Alasaarela points out.  

“It is quite ironic that there is an existential threat to humanity that can solve another existential threat to humanity.” 

Everything is fake and nothing is private 

While the growth of AI can impact the algorithms and make them even more addictive, it also influences the content. Author Nina Schick has predicted that 90% of all content on the internet will be AI-generated by the end of 2025. Is there any way to know what is true anymore? 

The EU’s new Artificial Intelligence Act states that deepfakes and other AI-generated content will have to be labelled as such. This is a much-welcomed change, but in a world of fast-paced change, laws and regulations always lag behind. In addition to mis- and disinformation, another big question is privacy. 

“There are babies being born with already existing Instagram accounts”, Ella Kangasalusta points out.  

“Raising children also requires ensuring their rights. Including their right to privacy.” 

“Kids as young as eight perfectly understand privacy”, says Marie Potel-Saville, Founder & CEO of Fair Patterns and Amurabi. 

“However, it is often buried in jargon.” 

Marie Potel-Saville, Founder & CEO of Fair Patterns and Amurabi

”It’s unreasonable to build addictive services and then tell kids off for spending so much time online.”

Marie Potel-Saville

Potel-Saville has been working on questions such as how to create privacy policies that are readable for children and how to make difficult concepts like “cookies” understandable to them. 

Imagine you are about to enter an amusement park that has several different themed areas. At the gate, a large computer scans you. The scan tells the computer what you have browsed, what you have purchased, and what you have talked about with your friends online, and based on the results, it directs you to the dinosaur-themed area. You recently bought a T-rex toy, after all. You spend a fun day and leave the park. However, you will never know that the park also had a space-themed area, a cowboy-themed area, a knight-themed area, and so on.  

According to Potel-Saville, when presented with this idea in workshops, children often complained: they did not get to see all the other parts of the amusement park or discover new things. A similar thing happens when algorithms only direct you towards specific types of content based on your previous behaviour and the cookies you have accepted.   

While we need laws and regulations to catch up, we also need to develop ways of explaining to children their rights in the online world. However, the main pressure should not be on children or even their parents. 

”It’s unreasonable to build addictive services and then tell kids off for spending so much time online”, Potel-Saville argues.  

No one wants to talk about it 

While cookies might be a complicated topic to talk about, the online world contains something even harsher and more difficult. The internet has caused the phenomenon of child sexual abuse material (CSAM) to explode, and according to Emily Slifer, Director of Policy at Thorn, we don’t even know the full scale of it.  

“It’s a crime that no one wants to talk about”, Slifer says.  

“But it seems that we are getting past it. We are talking about it and that is important.” 

Emily Slifer, Director of Policy, Thorn

“Sexuality and violence are hard topics to discuss”, agrees analyst Maarit Mustonen from Save the Children Finland’s hotline Nettivihje. 

“The terms have to be correct, but also understandable to everyone. You cannot be safe from danger if you don’t understand that it is happening to you.” 

While child sexual abuse online is a growing phenomenon, it is also spreading in new ways. Sextortion means blackmailing children with sexual images they have previously sent. The victims are often boys, and the criminals use the images to demand money or more pictures. Meanwhile, one in four young people agree that it is normal to share nudes.  

Another growing trend is AI-generated content, where even an innocent image can be used for malicious purposes. As finding victims and perpetrators online is already like looking for a needle in a haystack, AI-created content makes the haystack even bigger.   

“You cannot be safe from danger if you don’t understand that it is happening to you.” 

Maarit Mustonen

“We need safety by design”, Slifer says.  

This could mean, for example, clean training datasets for AI tools and a focus on ongoing maintenance: if a model fails a safety evaluation, access to it should be removed. Regulation happens at the EU level and is currently stalled with the member states.  

“Contact your representatives”, Slifer encourages.  

“We can get ahead of this problem!” 

Collaboration of organizations, professionals, and disciplines 

The landscape of digital advancements presents a stark reality: those already advantaged stand a better chance of reaping the benefits, further exacerbating existing inequalities. Conversely, those facing disadvantages may find it challenging to access and leverage digital opportunities and might face danger more often.  

“Together, we must craft sustainable solutions to protect the well-being and rights of our children in the evolving digital age.”

Veera Träskelin

However, amidst these challenges lies a crucial realization – sustainable solutions can only emerge through collaborative efforts. The intricate web of issues surrounding digital disparities demands a united front, where individuals, communities, and organizations work together to bridge the gaps and ensure that the benefits of digital progress are accessible to all.  

The call for collaboration echoes as a resounding imperative to not only address the current imbalances but also to pave the way for a future where digital advantages become an inclusive force for societal upliftment. 

“In the face of digital challenges impacting our children, from concerning content to the rise of algorithmic addictions, a united call for collaboration emerges. Together, we must craft sustainable solutions to protect the well-being and rights of our children in the evolving digital age, ensuring that digital progress becomes an inclusive force for societal upliftment”, concludes Veera Träskelin.

Conference Workshops Illuminate Crucial Issues: AI, Privacy, and Protection in Focus

Save the Children organized the Nordic Digital Citizenship conference on Children’s Rights in the Digital World on December 12th. The conference served as a dynamic platform for cross-sectoral dialogue, collaborative innovation, and the exchange of inspiring solutions, fostering a shared vision of providing safe and inclusive digital opportunities for all children.

It gathered together professionals from NGOs, tech companies, governmental institutions, and think tanks to develop an updated and enriched understanding of important and current perspectives related to digital childhood in the Nordic context.

The conference’s various workshops shed light on critical issues related to AI, privacy, and protection.

The two AI workshops – Disinformation as a challenge to democracy and children’s rights, facilitated by Rosa Haavisto (P/CVE Advisor, Save the Children Finland) and Milka Sormunen (Postdoctoral Researcher, Faculty of Law, University of Helsinki), and ChatGPT and Beyond: AI’s Influence on Children’s Everyday Experiences, facilitated by Veera Träskelin (Thematic Advisor, Save the Children Finland), Mikko Alasaarela (CEO, Equel Oy) and Henriikka Vartiainen (Senior Researcher, Generation AI programme) – delved into the profound impact of algorithms, emphasizing the challenges posed by the business models of social media companies. The discussion extended to the necessity of multi-level and multi-agency cooperation, particularly in addressing the threats posed by deepfake videos and audio. The consensus was that collective action is crucial to tackle these challenges.

”Children are especially vulnerable to algorithmic manipulation; we have to build safer algorithms for them”, Alasaarela says.

Workshop on ChatGPT and Beyond: AI’s Influence on Children’s Everyday Experiences

The two Privacy Workshops – Childhood Under Surveillance: Weighing Safety Against Privacy, facilitated by Fredrik Lindén (MyData4Children) and Douha Kermani (Digital Advisor, Save the Children Sweden), and Empowering Children: Demystifying Privacy in Child-Centric Terms, facilitated by Nora Musto (Project Manager, GDPR4CHLDRN project, the Office of the Data Protection Ombudsman) and Heikki Lauha (Specialist, Digital Power and Democracy project, the Finnish Innovation Fund Sitra) – emphasized the importance of privacy and encouraged prevention efforts to address issues like the sharing of explicit content. The discussions highlighted the need for better platforms that prioritize safety and fairness, emphasizing the balance between privacy and safety.

In the two Protection Workshops – How to Protect Vulnerable Children from Sexual Abuse Online, facilitated by Maarit Mustonen (Advisor, Finnish Hotline Nettivihje, Save the Children Finland) and Milla Ruuska (Education Planner, EHYT ry), and Countering the Threat: Safeguarding Against Sextortion, facilitated by Kaisa Rissanen (Head of Legal and Operations, Someturva – SomeBuddy), Anni Pätilä (Leading Expert, Digital Youth Work, Sua varten somessa / Loisto setlementti ry), Christoffer Borup (Child Protection Advisor, Save the Children Denmark) and Rebecca Cronfeld (Child Protection Advisor, Save the Children Denmark) – attention was drawn to vulnerable groups and the challenges they face in accessing information and help. The need for increased funding, interest, and an improved legal framework was stressed. Language barriers, particularly in understanding terms and conditions, were identified as obstacles, and the importance of child participation was emphasized. The workshops advocated for clear conceptual frameworks and the sharing of stories to bring attention to these critical issues.

“We should raise our voices, bring these problems to everyone’s attention and call other people into action to make the internet a safer place for children”, Maarit Mustonen sums up.

Text: Noora Isomäki

Photos: Anna Autio and Mira Ahlstedt


The conference was organized as part of Digital Citizenship, a Google Foundation-funded pan-Nordic program that aims to provide every child with safe and non-discriminatory access to digital opportunities. It also seeks to explore opportunities for Nordic collaboration, engage ecosystem stakeholders for mutual learning, and establish new strategic partnerships to advance children’s rights in the digital world. The conference is organised in collaboration with Google.org and Google.