By Ethan Wang
LOS ANGELES — Online conversations about mental health and illness, particularly over social media, are at an all-time high. A 2022 study indicated that Generation Z (Gen Z) is the generation most affected by mental health crises, with 42% having received a mental illness diagnosis.
As awareness of mental health and illness rises with each successive generation, so does advocacy for the corresponding healthcare. While this growing ubiquity is certainly positive, it comes with a simultaneous rise in discussions of mental illness on social media platforms, for better or for worse.
As reported by CBS, coverage of mental health and illness has proliferated on TikTok and other popular social media platforms, but much of it is inaccurate. In fact, 52% of videos about ADHD were deemed ‘misleading,’ and mental health professional Dr. Inna Kanevsky has spoken to her observations of armchair psychologists (unqualified individuals who provide mental health advice) on TikTok “overpathologiz[ing] normal behaviors.”
Beyond the overwhelming number of misleading TikTok videos, with 58% of videos about trauma containing inaccurate information, there is an alarming amount of content coming from unqualified users. A recent PlushCare study found that only 9% of TikTok creators who discuss mental health hold a proper credential to do so, and only 1% of these TikToks include a disclaimer about the creator’s lack of qualification.
Non-credentialed mental health content creators often fashion a facade of professionalism, assuming the role of a ‘teacher’ and establishing a false hierarchy of knowledge that casts the viewer as student. In fostering this parasocial relationship of credibility and trust, a video’s delivery supersedes its accuracy, and it becomes harder for average users to tell which videos are truthful and which are not.
The major concern with spreading misinformation on TikTok is the demographic of its primary audience: adolescents and teenagers, particularly 10- to 17-year-olds. Because they are so young, they are far less able to fact-check every piece of media they encounter than the older users of platforms like Facebook and Instagram.
TikTok’s primary audience of children and teenagers cannot be entrusted with, and entirely burdened by, rigorously verifying the truthfulness of each video, especially when many of these unreliable creators are adults, who bear far more responsibility for producing misleading content than their younger consumers do for failing to debunk it. And because Generation Z is the generation most affected by mental health crises, its members are also the most likely to seek mental health assistance in the form of less credible videos.
Several teenagers have offered testimonies about how social media, and TikTok in particular, led them to self-diagnose with mental health conditions that professionals would later determine they did not have. Additional research cites teenagers’ use of social media as a catalyst for developing more severe eating disorders, as well as for higher rates of suicide. Ultimately, it is clear that engagement with false mental health content on social media led these teenagers to endure further anxiety and stress, even exacerbating conditions they had genuinely been diagnosed with.
Social media also curates identity categories through aesthetics, particularly ones targeted toward young women and girls: themes of mental illness are glamorized and, in turn, commodified, as seen in TikTok catchphrases like ‘sad girl core’ and in users declaring that they are in their ‘Sylvia Plath era.’ By encouraging the creation and monetization of content that romanticizes suffering in this manner, TikTok’s algorithm reproduces harmful conceptions of mental illness while fostering a platform that rewards it.
Ultimately, TikTok’s algorithm is engineered to boost viral content rather than verified information, since the platform pays its content creators. Social media platforms are “created as businesses to make money,” aware that their survival and financial payoff depend on views and reposts, and the information disseminated across them is overwhelmingly shaped by consumerism. Under this algorithm, the videos that perform best are those offering a veil of professionalism, the illusion of an answer, and a cheap price.
Even if rising generations of teenagers and young adults were properly trained to recognize misleading information online, it would be unrealistic to burden them with that task, especially when older social media users are not held to the same expectation. Consumers cannot be held responsible for a faulty product, and it is important to remember that TikTok is one of the many platforms that allows misinformative content to proliferate in the first place.
Mental health awareness, and the accompanying demand for professional care, will only continue to rise, and it is those overseeing its social spread who must ultimately be held responsible for shaping how future generations come to understand it.