Becca, age 19 from Essex, is currently in a psychiatric ward. “Time goes by really slowly in here,” she says. To keep herself occupied while in hospital, Becca enjoys making TikToks: “It’s really fun and makes the time go by a little bit quicker.”
Becca is part of the app’s self-harm recovery community, with over 13,400 followers to her name. Much of her content uses humour to talk candidly about self-harm. “I think it’s a really good coping mechanism for me, because it allows me to be funny and lighthearted, and it also allows me to use my voice to hopefully help other people,” she says.
But none of Becca’s videos are tagged #selfharm or #selfharmrecovery. If you search either of these terms on TikTok, no videos will come up. Instead, you’ll find a support resources page and the Samaritans number.
Naturally, censoring this one hashtag doesn’t mean that TikTok is free from content that discusses self-harm. Tech-savvy users like Becca have taken to creating ‘secret’ self-harm hashtags on the app, where largely unmoderated content is free to circulate.
Aoibheann, a 17-year-old from Dublin, is another user who posts under these secret hashtags. Like Becca, Aoibheann enjoys creating funny videos, using dark humour to talk openly about self-harm. “I find it helps, because people who’ve gone through the same thing relate,” she explains. Aoibheann’s frank, upfront approach to talking about self-harm clearly resonates with a lot of people, as she currently boasts an impressive 22,900 followers on TikTok.
It’s no surprise that there’s a market for Aoibheann and Becca’s content. According to Mental Health First Aid England, 25.7 per cent of women and 9.7 per cent of men aged 16 to 24 report having self-harmed at some point in their life. Research published in February 2021 found that the rate of self-harm among young children in the UK has doubled over the last six years, with an average of 10 children aged between nine and 12 being admitted to hospital each week after intentionally injuring themselves.
While it’s wrong to assume that these ‘secret’ self-harm hashtags are entirely full of dangerous, pro-self-harm content, it seems that by pushing users into using these secret tags, TikTok has inadvertently made it easier for people to stumble across triggering videos. “I have been triggered by some content. If there’s like, obvious open wounds and they haven’t used appropriate warnings, it can catch you off-guard,” Becca says.
Aoibheann too has been triggered by content found under these secret hashtags. “It’s not really their fault,” she says, speaking of the creators behind the videos which caused her distress. Aoibheann makes the point that triggers are complicated, and users sometimes unknowingly upload content which can cause others to spiral: “Like personally, if I see that someone has worse scars than me, I’m like, ‘Oh, I’m not that bad, theirs is worse.’ But that’s not their fault.”
Aoibheann’s story offers an insight into the complexities of self-harm triggers. Censoring ‘triggering’ content isn’t as straightforward as making the platform free of graphic imagery (although that certainly helps). Recovering self-harmers, like any other group, are not a homogenous mass to which a one-size-fits-all rule can be applied – so how does this work on social media? How do you moderate something where the response differs wildly from person to person? As Aoibheann says: “People post triggering things on TikTok sometimes without realising it – the TikTok mental health community is really sensitive.”
Speaking to Huck, a TikTok spokesperson says: “Our Community Guidelines make clear that we do not allow content depicting, promoting, normalising, or glorifying activities that could lead to suicide or self-harm. However, we do support members of our community sharing their personal experiences with these issues in a safe way to raise awareness and find community support.” But is this policy doing enough to protect vulnerable people on the app?
Dr Ysabel Gerrard, a social media researcher at the University of Sheffield, explains that speaking out about mental health struggles online can be a huge comfort to sufferers – but stresses that it’s vital this content is moderated effectively. “It’s really important that TikTok’s content moderation practices are effectively able to separate helpful from harmful self-harm related posts, as unfortunately, pro-self-harm content does exist,” she says.
Dr Gerrard currently sits on Facebook’s Suicide and Self-Injury Advisory Board and she suggests that there are “lots” of things TikTok could do to make the platform a safer place for recovering self-harmers. “For example, they could launch their own version of ‘sensitivity screens’, similar to Instagram’s, to warn users when they’re about to see a post about self-harm,” she says. “TikTok users can also indicate when they’re ‘not interested’ in a particular post and I’d encourage TikTok to take these requests incredibly seriously when they come from videos about self-harm.”
Moderating video-based content is complex, and TikTok is doing a good job in spite of this – 92.4 per cent of the 90 million videos removed in the second half of 2020 were taken down before a user reported them – but that’s not to say there’s no room for improvement. As Dr Gerrard says, “This is an unbelievably difficult area of social media policy-making and will never be resolved to everyone’s satisfaction, but that of course doesn’t mean we should ever give up or deprioritise it.”
Ultimately, however, this isn’t a problem that begins and ends with TikTok. Dr Gerrard points out that “self-harm existed long before TikTok did, so while the company evidently needs to make this issue one of its top priorities and treat it very carefully, they can’t take all of the blame.” There wouldn’t be so many triggering videos if fewer young people were struggling in the first place. According to the Children’s Commissioner, in 2017, 1 in 9 people aged between 5 and 19 had a probable mental disorder. In 2020, this rose to 1 in 6. There is no impending youth mental health crisis: we’re in the middle of the crisis already.
A decade of Tory austerity has left mental health services chronically underfunded. Becca has experienced these crippled services firsthand: “There’s a long waiting list, it’s very disorganised, and there’s not enough being done,” she says.
Child and adolescent mental health services (CAMHS) have been particularly neglected, accounting for under one per cent of total NHS spending. This is having a tangible effect on young people: 26 per cent of referrals to specialist children’s mental health services were rejected in 2018-2019. Young people waited an average of 56 days to begin mental health treatment in 2019. And heartbreakingly, people aged 10 to 24 years in England and Wales have seen one of the greatest increases in suicide rates over the past decade.
There’s often a lot of focus on the negative impacts of social media on mental health, when arguably, the real problem is the preexisting youth mental health crisis. Really, TikTok – or social media more generally – isn’t the crux of the issue. The crux of the issue is the fact that CAMHS has been woefully neglected and overstretched for years, with young people all over the country now bearing the impact.
So, yes, we must call on TikTok to ensure robust content moderation prevents users from seeing distressing content. But we must devote the same energy, attention and time to demanding that the government’s policies on child and adolescent mental health services are robust too. And until action is taken, we’d do well to remember what Dr Gerrard points out: “Social media is often people’s only real form of mental health support.”
Follow Serena Smith on Twitter.