TikTok Launches Mental Health Guide After Report About Instagram's Impact On Teens

TikTok shared a handful of new features on Tuesday designed to support users' mental well-being, including guides on how to engage with people who may be struggling and updated warning labels for sensitive content. The changes come as Facebook's internal research into its photo-sharing app Instagram, which launched TikTok competitor Reels last year, has reportedly raised concerns about Instagram's impact on the mental health of teens.

"While we don't allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders," TikTok said in a blog post, "we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community."

To more safely support these conversations and connections, TikTok is rolling out new well-being guides to help people sharing their personal experiences on the video app. The guides were developed along with the International Association for Suicide Prevention, Crisis Text Line, Live for Tomorrow, Samaritans of Singapore and Samaritans (UK), and they're available on TikTok's Safety Center.

The social video app is also sharing a new Safety Center guide for teens, educators and caregivers about eating disorders. The guide was developed with expert organizations including the National Eating Disorders Association, National Eating Disorder Information Centre, Butterfly Foundation and Bodywhys, and offers information, support and advice. Earlier this year, TikTok added a feature that directs users searching for terms related to eating disorders to appropriate resources.

In addition, when someone searches for words or phrases like #suicide, they're pointed to local support resources like the Crisis Text Line helpline to find information on treatment options and support. 

TikTok also said it's updating its warning label for sensitive content, so that when a user searches for terms that could surface distressing content, such as "scary makeup," the search results page will show an opt-in viewing screen. Users can tap "Show results" to view the content. 

The app is also showcasing content from creators sharing their personal experiences with mental well-being, information on where to get help and advice on how to talk to loved ones.

"These videos will appear in search results for certain terms related to suicide or self-harm, with our community able to opt-in to view should they wish to," TikTok said.

On Tuesday, The Wall Street Journal reported that in studies conducted over the past three years, Facebook researchers have found Instagram is "harmful for a sizable percentage" of young users, particularly teenage girls. For years, child advocates have expressed concern over the mental health impact of sites like Instagram, where it can be hard to separate reality from altered images. Advocacy groups and lawmakers have long criticized Instagram and parent Facebook for harboring harmful content and fostering anxiety and depression, particularly among younger audiences. 

A 2017 report by the UK's Royal Society for Public Health found that Instagram is the worst social media platform for young people's mental health. Reports earlier this year revealed Instagram is planning to launch a platform for kids under 13, stirring up more criticism from child health advocates concerned about threats to children's online privacy and mental well-being.

In response to criticism, both Facebook and Instagram said in May that they'd give all users the option to hide the number of likes their posts get from the public and to choose whether they can see like counts on all posts in their feed. Following the Journal report Tuesday, Instagram said in a blog post that it stands by its research to understand young people's experiences on the app. 

"The question on many people's minds is if social media is good or bad for people," Karina Newton, head of public policy at Instagram, wrote. "The research on this is mixed; it can be both. At Instagram, we look at the benefits and the risks of what we do." Newton added that Instagram has done "extensive work around bullying, suicide and self-injury, and eating disorders" to make the app a safe place for everyone.

Like TikTok, Instagram has its own well-being guides and eating disorder resources, created in partnership with organizations including the National Eating Disorders Association and The Jed Foundation. Facebook also has online well-being and emotional health hubs with resources from experts, as well as suicide prevention tools to provide users with resources like a one-click link to the Crisis Text Line. In July, Instagram launched a tool called Sensitive Content Control to give people more control over how much sensitive content appears on its Explore page.

Concerns about the impact of technology on young minds also extend to TikTok, which last month added more features aimed at protecting the privacy and safety of teenagers who use the app. TikTok was also sued in April over allegations that it illegally collects and uses children's data; the company says those claims lack merit.

If you're struggling with negative thoughts or suicidal feelings, here are 13 suicide and crisis intervention hotlines you can use to get help.

