
Childnet International: Age Verification Positions & Industry Ties

Date: 2026-03-14
Subject: UK Charity 1080173 - Childnet International
Focus: Age verification positions, Meta funding ties, independence concerns


1. EXECUTIVE SUMMARY

Childnet International's public positions on age verification are notably non-specific about methods. Despite being one of the UK's most prominent child safety charities and sitting on Meta's Safety Advisory Council since 2009, Childnet has avoided taking a clear position on where age verification should occur (platform-level vs. device/OS-level vs. app store). Its positions consistently emphasize education, parental involvement, and multi-faceted approaches over strong platform-level mandates.

Key finding: We found no evidence that Childnet has explicitly advocated for Meta's preferred device-level/OS-level age verification approach. However, we also found no evidence that Childnet has advocated against it, or for the alternative position (requiring platforms themselves to verify age). Their silence on this critical policy question -- while being a member of Meta's Safety Advisory Council -- is itself notable.

Childnet does list Meta as a Tier 2 supporter/funder on its website. The Charity Commission is currently assessing concerns about Childnet's independence from tech company funders.


2. CHILDNET'S PUBLIC POSITIONS ON AGE VERIFICATION

2.1 Position on Age Verification for Pornography (2016 DCMS Consultation)

Childnet submitted written evidence to the DCMS consultation on age verification for pornography in February 2016. This is the most detailed statement Childnet has made on age verification methods.

Key positions from the 2016 consultation (source: childnet.com/wp-content/uploads/2021/11/Childnet-response-to-the-DCMS-consultation-into-age-verification-for-pornography-providers-Feb-2016.pdf):

  • Supported mandatory age verification for pornography: "Childnet are broadly in support of the approach set out by the Government"
  • Supported credit card verification as a method: "Verification by credit card ownership is an effective approach that has been implemented successfully by the gambling industry"
  • Explicitly favored "age verification, rather than identity verification" to "ensure the minimum data required is being collected"
  • Supported Ofcom-led and BBFC-supported civil enforcement
  • Critically, stated that "a single approach can only ever be moderately effective" and pushed a multi-pronged approach including education
  • Supported requiring pornography providers to make content detectable by filters (e.g., XML-labeling)

What was NOT said: The 2016 submission did not mention device-level verification, app store verification, or OS-level verification. The focus was entirely on website-level verification and regulatory enforcement against non-compliant sites.

2.2 Position on Ofcom Children's Codes (2025)

When Ofcom's Protection of Children Codes came into force, Will Gardner stated:

"This new regulatory regime means that the protections that we provide children and young people offline in relation to age-inappropriate content are now to be implemented online, and we support this step in better protecting children from harmful content online."

"At Childnet young people's voice is at the heart of everything we do and we think it is vital that with changes like this young people's opinions are listened to and taken into account."

Notable: Gardner welcomed the Ofcom codes (which place the burden on platforms to implement age checks) but did NOT specify which age verification methods should be used. Childnet called this "an important first step" but said "there is still more work to be done."

2.3 Position on VPN Circumvention of Age Verification (2025-2026)

Childnet published research arguing that the reported surge in VPN use following age verification enforcement on pornography sites was not attributable to children:

"Although the widely reported spike in VPN use in July has often been linked to the enforcement of the Online Safety Act (OSA) age verification requirements on online pornography providers, this spike cannot be attributed to children."

Key data cited:

  • 23% of young people started using VPNs in the 3 months coinciding with age verification enforcement
  • 21% started using VPNs one year prior
  • Only 10% used VPNs to access age-restricted content
  • 38% cited privacy/safety as the reason for VPN use

Significance: This research effectively provides cover for the age verification regime by arguing that children are not successfully circumventing it. This supports the narrative that platform-level age verification is working, which arguably cuts against Meta's argument that device-level verification is needed instead.

2.4 Position on Social Media Age Bans (January 2026)

Childnet was a signatory (along with 41 other organizations and individuals) to a joint statement opposing social media bans for under-16s, hosted by the Molly Rose Foundation (January 18, 2026).

Key quotes from the joint statement:

"Though well-intentioned, blanket bans on social media would fail to deliver the improvement in children's safety and wellbeing that they so urgently need."

"Bans are the wrong answer to a vital question. They risk unintended consequences that could leave children at greater risk of harm by treating the symptoms, not the problem."

"We want to see a requirement on platforms to use highly effective age assurance that robustly enforces minimum age limits."

"Companies should be required to set minimum age limits based on their functionalities and risk level to deliver age-appropriate experiences."

Significance: The joint statement explicitly places the burden on platforms to implement age assurance ("a requirement on platforms to use highly effective age assurance"). This runs counter to Meta's preferred device/OS-level approach. However, Childnet was only one of 42 signatories, and this was a coalition position rather than an individual Childnet statement.

Other signatories included: NSPCC, 5 Rights Foundation, Internet Watch Foundation, SWGfL, UK Safer Internet Centre, Internet Matters, Parent Zone, Marie Collins Foundation, Full Fact, Breck Foundation, Institute for Strategic Dialogue, and several bereaved families.

2.5 Position on Social Media Age Restrictions (General)

Childnet's general guidance on age restrictions is educational rather than policy-focused:

  • Advises that "it's always better to wait until the required age to join any social media service"
  • Notes that platforms "may ask users to declare their age during sign up" (acknowledging the weakness of self-declaration)
  • Warns that using fake ages causes loss of age-appropriate protections
  • Emphasizes parental involvement and family discussions
  • Does NOT discuss specific age verification technologies or methods

2.6 Position on Mandatory Age Verification for Pornography (General)

Will Gardner stated:

"Protecting children from exposure...to adult content is incredibly important, given the effect it can have on young people. Steps like this to help restrict access...are key."

Childnet consistently frames the issue as requiring a multi-faceted approach: age verification + parental controls + education. They describe education as "essential" alongside technical restrictions.


3. META FUNDING AND FINANCIAL TIES

3.1 Direct Meta Funding

Confirmed: Meta is listed as a Tier 2 supporter on Childnet's official supporters page (childnet.com/who-we-are/supporters/).

Other Tier 2 supporters include: Amazon, BBFC, BBC, Community Fibre, Roblox, Vodafone, Snapchat, MPA, Sony Interactive Entertainment.

Tier 1 supporters (higher tier): Apple, Disney, Global Witness Foundation, LexisNexis, Nominet, Supercell, Tesco Mobile.

3.2 Charity Commission Financials (Year Ending 31 March 2025)

  • Total income: GBP 738,877
  • Donations and legacies: GBP 587,021 (79% of income)
  • Charitable activities: GBP 149,679
  • Government contracts: GBP 11,051
  • 13 employees, none earning over GBP 60,000
  • No trustee remuneration

Note: The Charity Commission filing does not break down individual corporate donations within the "Donations and legacies" category. The specific amount Meta contributes is not publicly disclosed through this channel.

3.3 ConnectSafely Relationship

  • Both Childnet and ConnectSafely are founding members of Meta's Safety Advisory Board/Council since December 2009
  • The original five founding members were: Common Sense Media, ConnectSafely, WiredSafety, Childnet International, and FOSI
  • ConnectSafely is NOT listed on Childnet's supporters page
  • ConnectSafely's 2024 IRS Form 990 (via ProPublica) shows total revenues of $784,500, but grant recipients are not detailed in the summary filing
  • The specific $100,000/year ConnectSafely-to-Childnet grant was not confirmed through publicly accessible 990 summaries; full Schedule I review would be needed

3.4 Safety Advisory Council Membership

Childnet prominently describes its SAC membership on its "Who we are" page:

"[Childnet] achieves a wider impact through giving young people a voice and influencing best practice and policy, both in the UK and internationally, sitting on Meta's Safety Advisory Council, and the Executive Board of the UK Council for Internet Safety."

In January 2025, Childnet joined the Safety Advisory Council's open letter criticizing Meta's decision to end third-party fact-checking and reduce content moderation. The letter demanded Meta:

  1. Prioritize mental health support for young people
  2. Account for global impact of policy decisions
  3. Commit to "Safety by Design"
  4. Champion media literacy education globally

Childnet stated: "Safety has to be a priority - any change that has safety implications for users must have safety considered at the outset."


4. INDEPENDENCE CONCERNS AND CRITICISM

4.1 Safer Internet Day Censorship Scandal (2024-2026)

The allegation: In 2024, Childnet's young ambassadors Lewis Swire and Saamya Ghai prepared speeches for Safer Internet Day. Critical comments about Snapchat (a Childnet funder) were removed from their speeches. Specifically, a line stating "Social media companies are in bed with the very same psychology used to exploit gambling victims" was cut, along with a claim that scrolling online is making people "sick."

Childnet's response: "We did not censor what our young speakers had to say. Time was short (four minutes per speaker) and they wanted to cover a lot of ground. All points, even those which were negative and challenging to the tech industry, were included but needed to be succinct."

Childnet also stated: Financial supporters "do not influence what we say to young people" or how the organization operates.

4.2 Charity Commission Referral

A group of peers and MPs signed an open letter to the Charity Commission calling for an investigation into Childnet and the suspension of Safer Internet Day. Signatories included:

  • Baroness Spielman (former Children's Commissioner/Ofsted Chair)
  • Baroness Jenkin
  • Neil O'Brien MP (Conservative, Harborough, Oadby and Wigston)

They described the evidence of censorship as "credible and compelling" and requested investigation into potential conflicts between Childnet's independence and its corporate sponsorship from tech companies.

Charity Commission status: "We are assessing concerns raised with us about Childnet to determine what regulatory role there is, if any, for the commission."

4.3 Tech Transparency Project Findings

The Tech Transparency Project (TTP) published "Inside Meta's Spin Machine on Kids and Social Media" documenting Meta's use of child safety organizations to counter concerns about platform harms. While Childnet was not individually profiled in the report, the report documented:

  • ConnectSafely CEO Larry Magid defended Meta's Messenger Kids as "training wheels for social media"
  • ConnectSafely created parent guides for Messenger Kids, Horizon Worlds, and Meta AI
  • Meta's Safety Advisory Council members raised concerns about the January 2025 moderation changes (the same letter Childnet signed)
  • Meta funded PROJECT ROCKIT whose CEO endorsed Instagram Teen Accounts without disclosing Meta funding

4.4 Broader Pattern: Digital Childhood Alliance

The Deseret News reported on Meta's involvement with the Digital Childhood Alliance (DCA), which promotes the App Store Accountability Act (ASAA). This is the specific legislative vehicle for Meta's device-level/app store age verification strategy. The DCA's executive director refused to name tech company funders when asked by Louisiana Senator Jay Morris.

Connection to Childnet: None found. Childnet has not been linked to the DCA or the ASAA. This is a US-focused initiative. However, the underlying policy position (shifting age verification from platforms to device/OS/app store level) is the same position Meta advocates globally.


5. ANALYSIS: ALIGNMENT WITH META'S POSITION

What Meta Wants

Meta's preferred age verification approach is to shift the burden to device/OS makers and app stores (the ASAA model), away from requiring individual social media platforms to verify users' ages.

Childnet's Actual Positions

| Issue | Childnet's Position | Aligned with Meta? |
|---|---|---|
| Age verification for porn | Supports platform-level verification | No |
| Ofcom Children's Codes | Welcomes platform-level obligations | No |
| Social media ban for under-16s | Opposes blanket bans | Partially (Meta also opposes bans) |
| Device-level age verification | No stated position | Silence |
| VPN circumvention | Says children aren't circumventing AV | No (supports platform AV efficacy) |
| Education emphasis | Strongly emphasizes education over regulation | Partially (dilutes regulatory focus) |
| Multi-stakeholder approach | Consistently advocates | Partially (distributes responsibility) |

Assessment

Childnet's positions are more nuanced and less directly aligned with Meta than might be expected given the funding relationship. Key observations:

  1. The joint statement on social media bans explicitly calls for "a requirement on platforms to use highly effective age assurance" -- this directly contradicts Meta's preferred device-level approach.

  2. Childnet has never publicly advocated for device-level or app-store-level age verification, which is Meta's primary policy goal.

  3. However, Childnet consistently de-emphasizes regulation in favor of education, parental controls, and multi-pronged approaches -- which could be seen as diluting pressure for strong platform-level mandates.

  4. The silence is notable: On the most contested policy question (who bears responsibility for age verification -- platforms or device/OS makers), Childnet has simply declined to take a position, despite being one of the UK's most influential child safety organizations with a seat on Meta's advisory council.

  5. The VPN research is the most interesting data point: it provides empirical cover for the current age verification regime (platform-level), suggesting it works. This could be read as supporting the status quo rather than Meta's push for a device-level alternative.


6. OUTSTANDING QUESTIONS

  1. What is the specific amount of Meta's financial support to Childnet? The Tier 2 listing suggests a significant but not dominant funding relationship.

  2. Does the ConnectSafely $100K/year wire to Childnet exist? Not confirmed through publicly available 990 data. Full Schedule I from ConnectSafely's Form 990 would be needed.

  3. Has Childnet submitted a response to the March 2026 UK government consultation on social media bans? The consultation opened March 2 and closes May 26, 2026. Childnet's individual submission (vs. the January 2026 joint statement) would be highly relevant.

  4. Has the Charity Commission opened a formal investigation? As of our research, they are still "assessing concerns."

  5. Has Childnet ever been asked directly about device-level vs. platform-level age verification? No evidence of this found.


7. KEY SOURCES