
After months of anticipation and debate over the government’s controversial move, under-16s woke up to find themselves locked out of popular platforms such as TikTok, Instagram and YouTube, according to media reports.
The ban aims to protect young people from online abuse such as cyberbullying, exploitation and exposure to harmful content, all of which are detrimental to their mental health and well-being.
Bans could backfire
As other governments contemplate similar moves, the UN Children’s Fund (UNICEF) cautions that age-related restrictions alone won’t keep children safe.
“While UNICEF welcomes the growing commitment to children’s online safety, social media bans come with their own risks, and they may even backfire,” the agency said in a statement.
For many children, particularly those who are isolated or marginalised, social media is a lifeline for learning, connection, play and self-expression, UNICEF explained.
Moreover, many will still access social media – for example, through workarounds, shared devices, or use of less regulated platforms – which will only make it harder to protect them.
Protection and respect for human rights
“Age restrictions must be part of a broader approach that protects children from harm, respects their rights to privacy and participation, and avoids pushing them into unregulated, less safe spaces,” the statement said.
“Regulation should not be a substitute for platforms investing in child safety. Laws introducing age restrictions are not an alternative to companies improving platform design and content moderation.”
The UN human rights chief also weighed in during his end-of-year press conference in Geneva.
“We know how difficult it is for societies to grapple with the issue of how to keep children safe online,” Volker Türk said in response to a journalist’s question.
“We have had the social media platforms launched now quite a few years ago, but I don’t think at the stage when they were launched that a human rights due diligence impact assessment was actually done.”
Make the internet safe
UNICEF urges governments, regulators and tech companies to work together with children and families to build a digital space that is safe, inclusive and respects children’s rights.
Authorities must ensure that age-related laws and regulations do not replace companies’ obligations to invest in safer platform design and effective content moderation.
Furthermore, social media products must be redesigned with child safety and well-being at the centre, while regulators must adopt systemic measures to effectively prevent and mitigate online harm.
Support for parents
Other recommendations include helping parents and caregivers to improve their digital literacy.
“They have a crucial role but currently are being asked to do the impossible to protect their children online: monitor platforms they didn’t design, police algorithms they can’t see, and manage dozens of apps around the clock,” UNICEF said.
The UN rights chief noted that countries are trying to keep up with technological developments, and Australia is not alone in its response. The state of California in the US has a similar law to shield minors online, while the European Union is debating draft legislation.
“It’s very important to keep monitoring what works, what doesn’t work,” said Mr. Türk.
“But it is also very clear from a human rights perspective that the best interest of the child has to be taken into account in all of this, including the protection and safety concerns that children face.”
