UK offers cash for CSAM detection tech targeted at e2e encryption

The UK government is preparing to spend over half a million dollars to encourage the development of detection technologies for child sexual abuse material (CSAM) that can be bolted on to end-to-end encrypted messaging platforms to scan for the illegal material, as part of its ongoing policy push around Internet and child safety.

In a joint initiative today, the Home Office and the Department for Digital, Culture, Media and Sport (DCMS) announced a “Tech Safety Challenge Fund” — which will distribute up to £425,000 (~$584k) to five organizations (£85k/$117k each) to develop “innovative technology to keep children safe in environments such as online messaging platforms with end-to-end encryption”.

A Challenge statement for applicants to the program adds that the focus is on solutions that can be deployed within e2e encrypted environments “without compromising user privacy”.

“The problem that we’re trying to fix is essentially the blindfolding of law enforcement agencies,” a Home Office spokeswoman told us, arguing that if tech platforms go ahead with their “full end-to-end encryption plans, as they currently are… we will be completely hindered in being able to protect our children online”.

While the announcement does not name any specific platforms of concern, Home Secretary Priti Patel has previously attacked Facebook’s plans to expand its use of e2e encryption — warning in April that the move could jeopardize law enforcement’s ability to investigate child abuse crime.

Facebook-owned WhatsApp already uses e2e encryption, so that platform is a clear target for whatever ‘safety’ technologies might result from this taxpayer-funded challenge.

Apple’s iMessage and FaceTime are among other existing mainstream messaging tools which use e2e encryption.

So there is potential for very widespread application of any ‘child safety tech’ developed through this government-backed challenge. (Per the Home Office, technologies submitted to the Challenge will be evaluated by “independent academic experts”. The department was unable to provide details of who exactly will assess the projects.)

Patel, meanwhile, is continuing to apply high-level pressure on the tech sector over this issue — including aiming to drum up support from G7 counterparts.

Writing in a paywalled op-ed in the Tory-friendly newspaper The Telegraph, she trails a meeting she’ll chair today, where she says she’ll push the G7 to collectively pressure social media companies to do more to address “harmful content on their platforms”.

“The introduction of end-to-end encryption must not open the door to even greater levels of child sexual abuse. Hyperbolic accusations from some quarters that this is really about governments wanting to snoop and spy on innocent citizens are simply untrue. It is about keeping the most vulnerable among us safe and preventing truly evil crimes,” she adds.

“I am calling on our international partners to back the UK’s approach of holding technology companies to account. They must not let harmful content continue to be posted on their platforms or neglect public safety when designing their products. We believe there are alternative solutions, and I know our law enforcement colleagues agree with us.”

In the op-ed, the Home Secretary singles out Apple’s recent move to add a CSAM detection tool to iOS and macOS to scan content on users’ devices before it’s uploaded to iCloud — welcoming the development as a “first step”.

“Apple state their child sexual abuse filtering technology has a false positive rate of 1 in a trillion, meaning the privacy of legitimate users is protected whilst those building huge collections of extreme child sexual abuse material are caught out. They need to see th[r]ough that project,” she writes, urging Apple to press ahead with the (currently delayed) rollout.
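
Apple has not published every parameter behind that one-in-a-trillion figure, but the basic arithmetic of thresholded matching is easy to sketch. As a purely illustrative calculation (the per-image false-match probability and library size below are assumptions, not Apple’s figures, though its published threat-model summary described a match threshold of roughly 30 images), if each image false-matches independently, the chance that an account crosses a multi-image threshold is a binomial tail that collapses as the threshold rises:

```python
from math import exp, lgamma, log

def log_binom_pmf(n: int, k: int, p: float) -> float:
    """Log of the binomial PMF, via lgamma to avoid overflow at large n."""
    return (lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)
            + k * log(p) + (n - k) * log(1 - p))

def false_flag_probability(n: int, t: int, p: float) -> float:
    """P(at least t false matches among n images), assuming independence.
    Terms far past t are negligible here, so the tail sum is truncated."""
    return sum(exp(log_binom_pmf(n, k, p))
               for k in range(t, min(t + 200, n) + 1))

# Illustrative parameters only; these are not Apple's published numbers.
# A 10,000-photo library, a one-in-a-million per-image false match rate,
# and a 30-match threshold yield an astronomically small account-level rate.
print(false_flag_probability(n=10_000, t=30, p=1e-6))  # ~4e-93
```

The sketch shows the structural point rather than verifying Apple’s exact figure: requiring many matches before flagging an account pushes the account-level false positive rate far below the per-image rate. As the backlash covered below makes clear, critics’ objections were less about this arithmetic than about what else the same infrastructure could be ordered to match against.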

Last week the iPhone maker said it would delay implementing the CSAM detection system — following a backlash led by security experts and privacy advocates who raised concerns about vulnerabilities in its approach, as well as the contradiction of a ‘privacy-focused’ company carrying out on-device scanning of customer data. They also flagged the wider risk of the scanning infrastructure being seized upon by governments and states who might order Apple to scan for other types of content, not just CSAM.

Patel’s description of Apple’s move as just a “first step” is unlikely to do anything to assuage concerns that once such scanning infrastructure is baked into e2e encrypted systems it will become a target for governments to widen the scope of what commercial platforms must legally scan for.

However the Home Office’s spokeswoman told us that Patel’s comments on Apple’s CSAM tech were only intended to welcome its decision to take action in the area of child safety — rather than being an endorsement of any specific technology or approach. (And Patel does also write: “But that is just one solution, by one company. Greater investment is essential.”)

The Home Office spokeswoman wouldn’t comment on which types of technologies the government is aiming to support via the Challenge fund, either, saying only that they’re looking for a range of solutions.

She told us the overarching goal is to support ‘middleground’ solutions — denying the government is trying to encourage technologists to come up with ways to backdoor e2e encryption.

In recent years, the UK’s GCHQ has also floated the controversial idea of a so-called ‘ghost protocol’, which would allow state intelligence or law enforcement agencies to be invisibly CC’d by service providers into encrypted communications on a targeted basis. That proposal was met with widespread criticism, including from the tech industry, which warned it would undermine trust and security and threaten fundamental rights.
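
To make that objection concrete, here is a toy sketch of the trust gap critics pointed to, using hypothetical names and a trivial XOR stand-in for real public-key encryption. In many group-messaging designs the client encrypts one copy of each message per member key on a server-supplied roster, so a server compelled to append a hidden ‘ghost’ key would receive readable copies without weakening the encryption itself:

```python
from dataclasses import dataclass
from itertools import cycle

def encrypt(plaintext: bytes, public_key: bytes) -> bytes:
    """Toy stand-in for real public-key encryption; not actual cryptography."""
    return bytes(b ^ k for b, k in zip(plaintext, cycle(public_key)))

@dataclass
class Member:
    name: str
    public_key: bytes
    visible: bool = True  # a 'ghost' member would be hidden from the UI

def fan_out(plaintext: bytes, roster: list[Member]) -> dict[str, bytes]:
    """Encrypt one copy per member on the server-supplied roster.
    The client cannot tell a legitimate member from a ghost unless
    the roster itself is authenticated end-to-end."""
    return {m.name: encrypt(plaintext, m.public_key) for m in roster}

roster = [
    Member("alice", b"alice-key"),
    Member("bob", b"bob-key"),
    Member("lawful-intercept", b"li-key", visible=False),  # the 'ghost'
]
ciphertexts = fan_out(b"hello", roster)  # the hidden member gets a copy too
```

Unless the membership list is itself verified end to end, the sender’s client simply encrypts to whatever keys it is handed, which is the core of the trust objection the proposal drew.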

It’s not clear if the government has such an approach, albeit with a CSAM focus, in mind here as it tries to encourage the development of ‘middleground’ technologies able to scan e2e encrypted content for illegal material.

In another concerning development, earlier this summer, guidance put out by DCMS for messaging platforms recommended that they “prevent” the use of e2e encryption for child accounts altogether.

Asked about that, the Home Office spokeswoman told us the tech fund is “not too different” and “is trying to find the solution in between”.

“Working together and bringing academics and NGOs into the field so that we can find a solution that works for both what social media companies want to achieve and also make sure that we’re able to protect children,” she said, adding: “We need everybody to come together and look at what they can do.”

There is not much more clarity in the Home Office guidance to suppliers applying for the chance to bag a tranche of funding.

There it writes that proposals must “make innovative use of technology to enable more effective detection and/or prevention of sexually explicit images or videos of children”.

“Within scope are tools which can identify, block or report either new or previously known child sexual abuse material, based on AI, hash-based detection or other techniques,” it goes on, further noting that proposals need to address “the specific challenges posed by e2ee environments, considering the opportunities to respond at different levels of the technical stack (including client-side and server-side).”
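
For a sense of what the ‘previously known’ flavour of hash-based detection involves, here is a minimal sketch: hash each file and look it up in a set of known-bad digests. The digest set and function names here are hypothetical. Production systems typically rely on perceptual hashes (Microsoft’s PhotoDNA is the best-known example), which still match after an image is resized or re-encoded, where an exact cryptographic hash such as SHA-256 would not:

```python
import hashlib

# Hypothetical known-bad digest set, for illustration only. In real
# deployments the hash lists come from bodies such as NCMEC or the IWF.
KNOWN_BAD_DIGESTS = {
    "5f4dcc3b5aa765d61d8327deb882cf99aa1e4c9f1b2d3e4f5a6b7c8d9e0f1a2b",
}

def sha256_of_file(path: str) -> str:
    """Hash a file in chunks so large files need not fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            digest.update(chunk)
    return digest.hexdigest()

def matches_known_material(path: str) -> bool:
    """Exact-match lookup against the known-bad digest set."""
    return sha256_of_file(path) in KNOWN_BAD_DIGESTS
```

Exact hashing only catches byte-identical copies, which is why the Challenge text also points to AI-based techniques for surfacing new, previously unseen material.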

General information about the Challenge — which is open to applicants based anywhere, not just in the UK — can be found on the Safety Tech Network website.

The deadline for applications is October 6.

Selected applicants will have five months, between November 2021 and March 2022, to deliver their projects.

When exactly any of the tech might be pushed at the commercial sector isn’t clear — but the government may be hoping that by keeping up the pressure on the tech sector, platform giants will develop this stuff themselves, as Apple has been doing.

The Challenge is just the latest UK government initiative to bring platforms in line with its policy priorities — back in 2017, for example, it was pushing them to build tools to block terrorist content — and you could argue it’s a form of progress that ministers are not simply calling for e2e encryption to be outlawed, as they frequently have in the past.

That said, talk of ‘preventing’ the use of e2e encryption — or even fuzzy suggestions of “in between” solutions — may not end up being so very different.

What is different is the sustained focus on child safety as the political cudgel to make platforms comply. That seems to be getting results.

Wider government plans to regulate platforms — set out in a draft Online Safety Bill, published earlier this year — have yet to go through parliamentary scrutiny. But in one change that’s already baked in, the country’s data protection watchdog is now enforcing a children’s design code which stipulates that platforms need to prioritize kids’ privacy by default, among other recommended standards.

The Age Appropriate Design Code was appended to the UK’s data protection bill as an amendment — meaning it sits under wider legislation that transposed Europe’s General Data Protection Regulation (GDPR) into law, which brought in supersized penalties for violations like data breaches. And in recent months a number of social media giants have announced changes to how they handle children’s accounts and data — which the ICO has credited to the code.

So the government may be feeling confident that it has finally found a blueprint for bringing tech giants to heel.
