The Proposed Regulation on Combatting Child Sexual Abuse focuses on the role that online service providers should play in protecting children from online sexual abuse, introducing mandatory detection measures not only for known child sexual abuse material (CSAM) but also for new CSAM and grooming. While there is wide agreement on the need to protect children online swiftly and effectively, concerns have been raised that mandatory detection measures, as imposed under the current form of the proposed regulation, pose a threat to fundamental rights, particularly the rights to data protection, respect for private life, and confidentiality of communications. The goal of this panel is not to criticize the proposed regulation by elevating privacy to an absolute right, but rather to take a pragmatic approach, informed by the real-world affordances of currently available automation tools for child protection, and to assess the feasibility and implications of integrating such technologies into the envisioned legal framework.