As children’s charity NSPCC has warned, this proposed update to Messenger risks “failing to protect children from avoidable harm.” The charity points out that “10% of child sexual offences on Facebook-owned platforms take place on WhatsApp, but they account for less than 2% of child abuse the company reports to police because they can’t see the content of messages,” which makes this new update very high-risk.

Facebook’s response is to bulk up metadata AI analysis. “We’re building strong safety measures that are designed to prevent harm from happening in the first place,” the company has told me, “and give people controls to respond if it does,” like restricting interactions between adults and minors. “Working together also gives us more information to identify abusive accounts and allows us to introduce safety features.”

We saw this last year, where Facebook evidence was critical to the capture of “one of the web’s most dangerous pedophiles,” which, investigators say, would not have been possible with end-to-end encryption.

Unsurprisingly, the new report echoes this, recommending in Meta’s words that “we continue to invest in effective harm prevention strategies such as metadata and behavioral analysis, user education and robust user reporting, among other tools.”

That said, the report does criticize client-side scanning, which it says “would undermine the integrity of E2EE and disproportionately restrict people’s privacy and a range of other human rights.” Bad news for Apple, which has embarked down the client-side scanning route.

In response to child safety concerns, Meta now says “the impacts of E2EE go far beyond such a simplistic ‘privacy versus security’ or ‘privacy versus safety’ framing.” Again, this is right as regards dedicated messaging, but wrong where social media platforms are concerned. Ironically, in this regard Meta’s criticism of Apple on the privacy front is well justified.