Safer Internet Day is the annual punctuation point when governments, NGOs, citizens, and businesses worldwide recognise the importance of creating a safer internet for all. In the early 2000s, I was one of the people who advocated for an annual day, having successfully bid for funding from a programme backed by the European Commission that supported the setup and operation of internet safety centres in every member state across Europe. The day’s purpose was, and still is, to keep internet safety front and centre in people’s minds whilst serving as a day of reflection on what has not yet been achieved: a safer internet for children.
We have certainly come a long way since then and the shape of the internet has changed drastically in that time. Some strides have been made in improving safeguarding provisions for children. But it is still a major challenge for regulators and companies alike to develop these provisions in line with the incredible rate at which internet use has grown and changed.
Over the last 20 years, we have seen the growth of corporate surveillance and the rise of the attention economy. Worryingly, in part because of this new reliance on monitoring users, the incidence of child sexual abuse spiked during the COVID-19 pandemic.
One of the major contributing factors is the implementation of harmful data-driven operations – specifically, recommendation algorithms that seamlessly connect adults with a sexual interest in children with children on social media and gaming platforms. Unintentionally, AI is facilitating child sexual abuse.
So how has this been allowed to happen? The pace of change has posed a challenge to establishing long-term safeguarding strategies. Also, self-regulation has been one of the defining features of the tech industry. These two facts go hand in hand: the pace at which things evolve in the industry has meant that companies in the sector can argue that traditional state-mandated regulation is too inflexible a model to appropriately respond to the decentralised and dynamic networks involved. Unfortunately, in the years since the self-regulation model was instituted, it has become apparent that it is not working.
Over the last few years, governments and regulatory bodies have been reclaiming their mantle, writing more prescriptive regulation and legislation to help shape the future of the internet. Perhaps the best known is the General Data Protection Regulation (GDPR), whose enforcement regularly makes headlines as data protection authorities levy major fines on some of the best-known companies in the world.
The GDPR also contains major provisions for the protection of children’s data, recognising children as vulnerable data subjects. These provisions can also combat other online harms: because children are vulnerable data subjects, recommendation systems that facilitate online abuse should either exclude them entirely or include them under conditions distinct from those applied to adults.
One of the most significant child safety provisions in the GDPR is Article 8, which requires companies to obtain parental consent before processing children’s data. This clause has the potential to be the most significant positive impact on internet safety in the last two decades.
Companies are required to age check their users so they can identify who is a child and, for younger children, obtain parental consent. Platforms that know the ages of their users are able to create safer spaces online.
Effective enforcement of the GDPR, alongside the proposed Online Safety Bill, the EU Digital Services Act, and the United Nations Convention on the Rights of the Child, will constitute a regulatory regime that requires all companies to exercise a duty to protect children and young people.
Alongside this increase in regulatory intervention is a change in the way that investors, consumers, and businesses value social responsibility. We are witnessing a transition away from self-regulation towards greater accountability as a result of enhanced oversight and consumer interests.
Businesses are recognising the opportunity for digital transformation in the wake of the rise of ethical consumerism, whereby users choose what to consume and which sites to use in light of their environmental, social, and governance (ESG) impacts. From this new consumer-driven moral imperative emerge greater opportunities for businesses to differentiate themselves from others on the basis of trust, respect for the law, and children’s safety.
It is critical that businesses stay ahead of this new wave of child-oriented regulation, both to avoid penalties and to meet consumers’ ethical expectations, thereby playing their part in shaping a safer internet for our children and future generations.