Online platforms have a responsibility to protect children from harm



Facebook whistleblower Frances Haugen’s testimony about Instagram’s impact on teenage girls was unequivocal: Facebook’s own studies found that 13% of UK teens said Instagram had elicited suicidal thoughts, and 17% of teenage girls said Instagram made their eating disorders worse.

These statistics, however, are only part of the bigger picture when it comes to the overall safety of teens online.

It is estimated that there are over 500,000 active sexual predators on the internet every day. In 2020, more than 21.7 million suspected cases of child sexual exploitation were reported to the National Center for Missing & Exploited Children’s CyberTipline. Online enticement reports – which detail instances of a person communicating with a child via the internet with the intent to exploit them – increased by more than 97% over the previous year.

Reports of online predators are on the rise, but online predatory behavior is as old as Netscape.

My family got our first PC in 1999. I started out on gaming platforms like Neopets and Gaia Online. Soon I was posting my thoughts and chatting with other users on Myspace and Tumblr. As my online world grew, I encountered older men pretending to be tweens. At one point, I started a “relationship” with a 17-year-old boy when I was only 12. Of course, I didn’t talk about any of this, mostly out of shame. I didn’t know I was being groomed – I had never even heard the word until I began working in the gender-based violence field myself.

Grooming is subtle and, to an inexperienced teenager, undetectable. A predator works to establish trust and an emotional connection with a child or adolescent so that they can manipulate, exploit, and abuse them. It can look like an older teen asking to video chat and slowly coaxing a child into inappropriate acts, like turning around on camera or changing into something “cuter,” or a digital “friend” pressuring someone to engage in cybersex. Predators sometimes pretend to be young themselves to obtain personal details such as photos or a sexual history; they then weaponize that information for their own gratification.

I only recently realized that there is CSAM – child sexual abuse material – of me on the internet. Images of me may still reside on someone’s old cell phone or on a hard drive collecting dust. They could one day be shared on private Discord servers or Telegram channels.

My own experience as a teenager on the internet is part of what led me to create a nonprofit online background check that allows anyone to see whether the person they are talking to has a history of violence – ideally before the first in-person meeting. We recently decided to allow users as young as 13 to access our public records database in the future. While we may never be able to completely prevent children and teens from being exploited online, we can at least arm them with tools and technology to understand whether someone they meet online has a history of harmful behavior.

Of course, a background check is only one tool in the safety arsenal – people frequently lie about their names and identities. When predators groom or exploit a child, they often do so anonymously, in isolation, and in secrecy.

That is why it is essential to educate young people so they can avoid the dangers that lurk online. This may involve teaching them to identify early red flags like love bombing, extreme jealousy, and boundary-pushing. We can also show young people what a healthy, safe, and consensual relationship looks like – “green flags” as opposed to red ones.

There are also practical skills we can incorporate into children’s education: teach them to be selective about which photos they share and which follow requests they accept, and to bring an adult along if they ever meet someone they know online in real life.

When the adults around them openly and consistently discuss the dangers of online dating and internet communication, children and teens learn to recognize the risks. This can go a long way toward preventing serious trauma. Conversations about online safety, like sex education, are often left to parents, who in turn assume children are having them at school. These discussions can be difficult to navigate, especially for parents who don’t always understand online culture, but it is essential that parents seek out resources to educate themselves.

As Haugen pointed out, online platforms also bear responsibility. Trust and safety departments at online platforms are relatively new, and there is still a lot to learn and improve.

On most digital platforms, content moderation teams are understaffed, underpaid, and under-trained. Online platforms should prioritize protection over profit and invest in additional training and mental health support for the people responsible for keeping their platforms safe. By giving trust and safety teams the tools and the time they need to think critically about questionable content, platforms enable them to carry out their mandate efficiently and prudently.

While the internet can create environments that enable abuse, it can also be a powerful tool for educating young people about warning signs and the realities of the world, including giving them access to information about the people they talk to online.

Reactive measures to tackle abuse – from the criminal justice system to platform moderators – are a band-aid on a bleeding wound. Preventing sexual abuse before it happens is the best protection we can offer our children. By taking responsibility – whether as platforms, politicians, or parents – for the potential harm done online, we can begin to create a safer world for all of us.

