On Monday, a representative for major social media platforms urged an Australian Senate committee to postpone implementing laws that would prohibit children under 16 from accessing these platforms. Rather than rushing the legislation through Parliament this week, the representative called for a delay until at least next year.
Sunita Bose, managing director of Digital Industry Group Inc., which represents platforms such as X, Instagram, Facebook, and TikTok, appeared at a one-day Senate committee hearing to discuss the proposed world-first legislation introduced last week. Bose emphasized that lawmakers should hold off until the government completes its evaluation of age verification technologies, which is expected by June.
"Parliament is being asked to pass a bill this week without knowing how it will work," Bose said.
Potential Fines and Concerns Over Safety
The legislation would impose fines of up to 50 million Australian dollars ($33 million) on platforms for systemic failures to prevent young children from holding accounts.
While platforms like TikTok and Instagram acknowledge the need for improvement, they warn of challenges in implementing technologies that ensure compliance without compromising user privacy.
Nevertheless, the bill seems likely to pass Parliament by Thursday with the support of the major parties.
The ban would take effect a year after the bill becomes law, giving platforms time to work out technological solutions that would also protect users' privacy.
Debate Over Accountability and Platform Responsibility
Opposition Sen. Ross Cadell asked how his 10-year-old stepson was able to hold Instagram, Snapchat, and YouTube accounts from the age of 8, despite the platforms setting a nominal age limit of 13.
Bose replied that this is an area where the industry needs to improve.
She said the proposed social media ban risked isolating some children and driving them to darker, less safe online spaces than mainstream platforms.
Bose said her concern with the proposed law was that it could compromise the safety of young people, prompting a hostile response from opposition Sen. Sarah Henderson.
"That's an outrageous statement. You're trying to protect the big tech giants," Henderson said.
Independent Senator Jacqui Lambie questioned why social media platforms don't leverage their algorithms to block harmful content from reaching children. These algorithms, she noted, have been criticized for fostering technology addiction among young users and exposing them to damaging material, such as content promoting suicide and eating disorders.
"Your platforms are capable of addressing this issue. The only thing holding them back is their own greed," Lambie remarked.
Bose said algorithms were already in place to protect young people online, including functions that filter out nudity.
"We need to see continued investment in algorithms and ensuring that they do a better job at addressing harmful content," Bose said.
Opposition Senator Dave Sharma asked Bose if she knew how much advertising revenue the platforms she represented earned from Australian children, to which she admitted she did not. Bose also said she was unfamiliar with a study by the Harvard T.H. Chan School of Public Health that reported X, Facebook, Instagram, TikTok, YouTube, and Snapchat generated $11 billion in advertising revenue from U.S. users under 18 in 2022.
Why Is This Law Being Introduced?
The proposed law is being introduced as part of the Australian government's efforts to enhance online safety for children. Some of the major concerns are:
1. Protection from Harmful Content: Ensuring that children are not exposed to inappropriate or harmful material, such as violent or sexual content, on social media platforms.
2. Combating Online Exploitation: Addressing risks of grooming, cyberbullying, and other forms of exploitation that children might encounter online.
3. Accountability of Social Media Companies: Holding platforms responsible for enforcing their own minimum age limits, which are often ignored.
4. Supporting Parental Efforts: Aiding parents in regulating their children's social media use, especially when platforms lack robust verification systems.