Australia's eSafety Commissioner said on Thursday that some of the world's largest IT companies, including Apple and Microsoft, are not doing enough to combat child abuse material.
In August, the regulator required Apple, Meta (owner of Facebook and Instagram), WhatsApp, Microsoft, Skype, Snap, and Omegle to answer a series of specific questions about how they counter the spread of child sexual abuse material.
Based on the responses received, the regulator issued a report finding that Apple and Microsoft "are not proactively trying to detect child abuse material" in their iCloud and OneDrive storage services. In addition, the companies do not use any technology to detect live streams of sexual content involving children in Skype, Microsoft Teams, or FaceTime video chats.
This is despite the wide availability of Microsoft's own PhotoDNA service for detecting such material, which is reported to produce roughly one false match in 50 billion checks, the regulator stressed.
The commissioner said the report's findings will be used to raise safety standards across the IT industry.
Recall that in 2020, the Australian Department of Infrastructure, Transport, Regional Development and Communications introduced the Online Safety Bill, which granted new powers to the eSafety Commissioner for dealing with "abhorrent violent" content. The bill defines such content as extremely cruel audio and video material, including recordings of terrorist attacks, murders, torture, and rape.