The Australian eSafety commissioner has served Apple, Microsoft, Facebook, WhatsApp, Instagram and other tech giants with world-first legal orders to reveal exactly what they are doing to detect and report child sex abuse material, or else face fines of $550,000 a day.
The tech giants have been put on notice under the Australian government’s new Basic Online Safety Expectations, a key part of the Online Safety Act 2021.
These expectations set out the minimum safety requirements expected of tech companies which wish to operate in Australia, along with the steps they should take to protect Australian users from harm. The expectations will be accompanied by mandatory codes.
The move will help eSafety “lift the hood” on what companies are doing – and not doing – to protect their users from harm, says Australian eSafety commissioner Julie Inman Grant.
“Some of the most harmful material online today involves the sexual exploitation of children and, frighteningly, this activity is no longer confined to hidden corners of the dark web but is prevalent on the mainstream platforms we and our children use every day,” Inman Grant says.
“As more companies move towards encrypted messaging services and deploy features like live-streaming, the fear is that this horrific material will spread unchecked on these platforms. Child sexual exploitation material that is reported now is just the tip of the iceberg – online child sexual abuse that isn’t being detected and remediated continues to be a huge concern.”
The spread of child sexual exploitation material online is a global scourge: last year, 29.1 million reports were made to the National Center for Missing & Exploited Children. The office of the Australian eSafety commissioner has handled more than 61,000 complaints about illegal and restricted content since 2015, the majority involving child sexual exploitation material.
“We know there are proven tools available to detect this horrific material and stop it being recirculated, but many tech companies publish insufficient information about where or how these tools operate, and too often claim that certain safety measures are not technically feasible,” Inman Grant says.
“Industry must be upfront on the steps they are taking, so that we can get the full picture of online harms occurring and collectively focus on the real challenges before all of us. We all have a responsibility to keep children free from online exploitation and abuse.”