Denmark proposes a voluntary detection regime for Child Sexual Abuse Material (CSAM) in the European Union, moving away from mandatory detection amid concerns over privacy and effectiveness.
- The Danish Council presidency, led by the justice minister, announced a shift from mandatory detection orders to a voluntary detection regime in the EU's CSAM proposal.
- The change aims to tackle child sexual abuse material online while easing the privacy concerns, backed by Germany, that were raised over earlier drafts of the proposal.
- The European Commission has faced backlash from platforms such as WhatsApp, which oppose mandatory detection measures, underscoring the need for a balanced approach in Brussels.
Why It Matters
This shift reflects growing tension in the European Union between privacy and child protection, highlighting the difficulty of regulating online platforms while safeguarding user safety and rights. The decision could significantly influence future legislation on online child protection across member states.