Protecting children by acting on abusive online content
Telecom operators and other companies can act to fight child sexual abuse material. Telia Company uses various innovative technologies and cooperates with police and NGOs to support their work.
- One of the targets of Global Goal 16, "Peace, justice and strong institutions", is to end abuse and all forms of violence against children. Through our work with Children Online we can have an impact, and we have a number of concrete activities supporting this goal, says Anne Larilahti, Head of Group Sustainability Strategy, Telia Company.
Telia has taken an active position to participate in the fight against child sexual abuse material (CSAM) online. One measure is to maintain filters on our networks that block access to websites known to host CSAM.
- Telia stands for and promotes an open internet, and this is the only content that we voluntarily block. It is very important for us to work long term on protecting and strengthening children online as part of our daily operations, Anne continues.
However, while blocking has an impact, it does little by itself to achieve the real purpose: preventing the sexual abuse of children. So more needs to be done. One approach is to work in partnerships in which many actors pursue the same goal. In this context, Telia works with NGOs such as World Childhood Foundation, with technology companies such as NetClean, and with law enforcement. We regularly share our experiences and discuss with others how best to move forward through innovative technology and partnerships.
We have also activated a technology in our Nordic and Baltic operations that raises an alert if CSAM is detected anywhere on Telia's own internal IT systems. When such incidents occur, police are notified and a process is started to investigate and, ultimately, possibly press charges and dismiss the employee.
- The core purpose of cleaning up our internal IT systems from child sexual abuse material is to rescue children from ongoing or future abuse. By detecting and reporting people who watch child sexual abuse material on workplace equipment, we can help the police rescue children from ongoing sexual abuse and protect others from abuse in the future, Anne concludes.
According to international studies, more than 50 percent of people who watch CSAM - surprisingly often on their work computers - also abuse children. Hence, making this material as difficult as possible to access, and intervening when access does happen, makes a real difference. It is worth it. Even one single child is worth it.