Australia takes Apple and Microsoft to task over online child protection.
After using new powers to force the tech giants to share information about their methods, an Australian regulator said that Apple Inc. (AAPL.O) and Microsoft Corp. (MSFT.O) weren’t doing enough to stop child exploitation content from being posted on their platforms.
The e-Safety Commissioner, an office set up to protect internet users, said responses to legal demands for information sent to some of the world's biggest internet companies showed that Apple and Microsoft do not proactively screen their storage services, iCloud and OneDrive, for child abuse material.
In a report released Thursday, the commissioner said the two companies also confirmed that they do not use any technology to detect live-streaming of child sexual abuse on Microsoft's Skype and Teams or Apple's FaceTime video services.
A Microsoft spokesperson said that the company was committed to stopping the spread of abuse material, but that “as threats to children’s safety continue to change and bad actors get better at what they do, we continue to push ourselves to change our response.”
Apple did not immediately respond to a request for comment.
The disclosures show that some of the world's biggest tech companies are not doing enough to protect children, and will bring public pressure on them to do more, the commissioner said. The companies that received legal demands also include Meta Platforms Inc. (META.O), owner of Facebook, Instagram, and WhatsApp, and Snap Inc. (SNAP.N), owner of Snapchat.
In a statement, commissioner Julie Inman Grant said that the responses as a whole were “alarming” and raised concerns about the “clearly inadequate and inconsistent use of widely available technology to detect child abuse material and grooming.”
Microsoft and Apple "do not even attempt to proactively detect previously confirmed child abuse material" on their storage services, she said, even though law enforcement agencies already use a Microsoft-developed product that does exactly that.
Under pressure from privacy advocates, Apple announced last week that it would scrap a plan to scan iCloud accounts for signs of child abuse. Inman Grant called the decision "a major step backwards from their responsibility to help keep children safe."
She also said that both companies' failure to detect live-streamed abuse showed that "some of the biggest and richest technology companies in the world turned a blind eye and didn't take the right steps to protect the most vulnerable from the most predatory."