Google, Meta, Microsoft, and Apple will need to submit a report to the eSafety Commissioner every six months explaining how they are dealing with child abuse material on their platforms.
The eSafety office also requires social media services, including Discord and WhatsApp, to go into more detail and outline how they are tackling AI-generated deepfake material of children, live-streamed abuse, and sexual extortion, on top of child abuse material.
Failure to respond could result in services being fined up to $782,500 (US$517,000) a day.
eSafety Commissioner Julie Inman Grant said the notices aimed to pressure tech giants into increasing their efforts to protect children online.
The action follows the companies' answers in previous reports about online child safety, which had been "alarming but not surprising."
“In our subsequent conversations with these companies, we still haven’t seen meaningful changes or improvements to these identified safety shortcomings,” Ms. Inman Grant said.
She revealed that Apple and Microsoft do not proactively detect child abuse material stored in their cloud services, despite those services being known to host such content.
Meanwhile, apps such as FaceTime, Discord, and Skype did not use any technology to detect abuse during live streams.
eSafety also found that a number of Google services, including YouTube, do not block links to websites that are known to house child abuse material.
Ms. Inman Grant also noted the response time differed between platforms.
“Back in 2022, Microsoft said on average it took two days to respond … Snap, on the other hand, reported responding within 4 minutes,” she said.
“Speed isn’t everything, but every minute counts when a child is at risk.”
The primary focus of this notice round is curbing the ability of adults to contact children online, sexual extortion risks, livestreaming, and AI-generated deepfakes.
“These notices will let us know if these companies have made any improvements in online safety since 2022/3 and make sure these companies remain accountable for harm still being perpetrated against children on their services,” Ms. Inman Grant said.
“We know that some of these companies have been making improvements in some areas—this is the opportunity to show us progress across the board.”
Companies have until Feb. 15, 2025, to submit their first response.
Protect the Most Vulnerable
UNICEF welcomed the move to protect children online and hold big tech companies to account.
“We all have a role to play in protecting young people online. This includes making sure tech companies are accountable for their part, and provide information to the eSafety Commissioner about what they are doing to protect the most vulnerable,” UNICEF Australia’s digital policy lead John Livingstone said.
“Fundamentally, the online world needs to be a safe environment for young people to explore, connect, and learn. UNICEF Australia wants Australia to be the safest place in the world for children to go online.”
He also renewed calls for stronger measures in the Online Safety Act to provide the "highest level of protection" possible for children.
This comes after the eSafety Commissioner issued notices to tech companies requiring them to draft enforceable codes to bar children from accessing pornography.
“It’s not just porn sites we are talking about here, with 60 percent of young people telling us they were exposed to pornography on social media.”