Regulatory Requirements

The CSAM Deterrence Centre is committed to helping companies create a safer online environment. With new industry codes coming into effect in Australia, the UK and around the world, we are here to support technology companies in meeting these new requirements. Transparency reporting has been introduced so that platforms and service providers demonstrate their safety measures to regulators and the public.

In Australia, the Online Safety Act provides for mandatory Industry Codes & Standards that impose the following requirements on companies operating in Australia:

  • Social Media Services, Messaging Services, File Storage and Sharing, Gaming, Websites: Proactively detect and remove CSAM and ‘disrupt and deter’ its creation, access or distribution.
  • Generative AI services: In addition to the above detection, disruption and deterrence obligations, certain generative AI services must:
    • Ensure that end-users in Australia specifically seeking CSAM images are presented with prominent messaging that outlines the potential risk and criminality of accessing the material;
    • Ensure that material generated using terms with known associations to CSAM is accompanied by information or links to services (e.g. reporting to law enforcement or sharing support links).
  • Search Engine Providers: Remove search results that link to CSAM, preventing users from easily finding and accessing it, and ensure that search results for queries specifically seeking images of known CSAM are accompanied by deterrent messaging that outlines the potential risk and criminality of accessing such images.
  • Hosting Services: Must have and enforce policies related to CSAM on hosted services.

The CSAM Deterrence Centre aims to assist companies in implementing such measures, including warning messages, and to evaluate their effectiveness.

There is also a range of requirements on these services to prevent children’s access to age-inappropriate material, such as online pornography; these requirements will come into effect later this year and into 2026.

In addition to industry-specific codes and standards under the Online Safety Act, Australia’s eSafety Commissioner administers a set of Basic Online Safety Expectations (BOSE), which state that providers need to take proactive and preventative measures to address CSAM. Under the Online Safety Act, the eSafety Commissioner has transparency powers that require digital platforms to report publicly, through the BOSE, on their efforts to address illegal and harmful content such as CSAM. These reports must provide clear data on the nature and effectiveness of the safety measures implemented.

The CSAM Deterrence Centre supports this by providing robust, evidence-based evaluations of deterrence initiatives, equipping companies with the data needed to help them meet their reporting obligations and demonstrate their commitment to user safety.