Today, we are launching our Online Suicide Prevention Kit (https://www.kokocares.org/suicide-prevention-toolkit). The goal is to help social networks and online communities better support at-risk individuals on their platforms.
Many social platforms have built-in lists of keywords that detect mental health-related search terms (e.g., “self-harm” or “depression”). There is an established practice of suppressing content or surfacing disclaimers for such searches: search “suicide” on most platforms and you’ll at least be shown a 1-800 number.
But there are a few problems with this. The keyword lists always have glaring omissions. Millions of young adults can still easily find dangerous content, such as tips on how to self-harm or kill themselves. And while some platforms redirect users to “emotional support” pages, the resources provided are often underwhelming and lack an evidence base. The most common approach is to provide an overwhelming list of crisis lines (which isn’t particularly helpful to someone who may already be overwhelmed themselves).
Here’s our solution: We have a privacy-first native library designed for social networks, streaming services, online communities, forums, etc. It catches common search terms like “kill myself”, “depressed” or “thinspiration”, as well as a huge long-tail of slang terms and evasive language (e.g., “sewerslide” or “an0rex1a”).
The library is written in Rust and matches in under a microsecond. It has language bindings for Python, Go, and Ruby, with bindings for all other major runtimes coming soon. Our keywords are sourced from over 12k known crisis posts and are hand-curated by social and clinical psychologists on our team. We also use text generators like GPT-3 to expand these lists with keywords beyond our user-generated corpus. The terms are updated regularly based on new patterns that emerge on our support platform, as well as co-listed terms on large social platforms.
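To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of matching described above: normalizing common character substitutions (leet-speak like “an0rex1a”) before checking a query against a curated keyword list. This is not Koko's actual library or API; the keyword list and substitution table below are hypothetical stand-ins, and the real system covers a far larger long-tail of terms.

```python
# Illustrative sketch only, not the Koko library. Shows one way to catch
# substitution-evasive spellings: map leet characters back to letters,
# then match against a (tiny, hypothetical) curated keyword list.

# Common character substitutions (hypothetical subset).
LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a", "5": "s", "@": "a", "$": "s",
})

# Stand-in for a hand-curated list; the real corpus is much larger.
KEYWORDS = {"sewerslide", "anorexia", "thinspiration", "kill myself"}

def normalize(query: str) -> str:
    """Lowercase the query and undo common character substitutions."""
    return query.lower().translate(LEET_MAP)

def matches(query: str) -> bool:
    """Return True if any curated keyword appears in the normalized query."""
    q = normalize(query)
    return any(kw in q for kw in KEYWORDS)

print(matches("an0rex1a tips"))   # → True ("an0rex1a" normalizes to "anorexia")
print(matches("weather today"))   # → False
```

A production matcher would use a multi-pattern algorithm such as Aho-Corasick rather than a linear scan, which is one way a Rust implementation can stay under a microsecond per query.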
We also provide evidence-based mental health interventions and resources, to help supplement what online platforms might already provide (though, frankly, many do essentially nothing). Our interventions can be accessed online, for free, without having to download an app. We provide users with online peer support, self-guided mini courses, crisis triage, etc. We have published seven peer-reviewed papers on these interventions and have two more in preparation. In a randomized controlled trial with Harvard, our services increased the conversion rate to crisis lines by 23%.*
This combination of search detection and evidence-based online interventions enables us to reach users where they are, right at the moment they are reaching out for help. Instead of showing a user an ad or, at worst, harmful content, we can display resources that are actually helpful. We have seen young people search for “proanorexia” content, then click our banner, then engage with our courses, and then show marked improvement in body image perception and a greater motivation to get help offline.
Our library collects no data and our interventions are anonymous (we do not collect emails, usernames, IP addresses, phone numbers, etc).
Online platforms are heavily (and rightly) criticized for contributing to the youth mental health crisis. But what’s missing from the discussion is how these platforms are uniquely positioned to do something about it. Every day, millions of people are crying out for help and the most anyone does is throw up a 1-800 number or offer suggestions to “go take a walk” or “reach out to a friend.”
Fortunately, we have partnered with a few large social networks that are eager to take the next step. We are now helping over 12,000 people a month with this approach. For users who complete our online interventions, we see significant improvements across clinical outcomes, including hopelessness, body image perception, and self-hatred.
This definitely won’t help everyone and nothing can replace direct human-to-human connection. Some at-risk users need far more than we can ever give them with our approach. But it does help some people in profound ways, and that inspires us to keep going.
Koko is something I started while I was a graduate student at MIT. I was severely depressed at the time, so I hacked together various technologies to manage my own mental health, as a way to fill the gaps between sessions with my therapist. That was almost ten years ago. I now have a kid of my own and I can see him struggle emotionally, just as I did.
Suicide rates for young people have increased dramatically over the past decade.* Since 2019, the rate of suspected suicides for girls aged 12-17 has increased by over 50%.* There is nothing more terrifying to me than the thought of a young person dying by suicide. If we can help avert at least one tragedy, it’ll be worth it.
We need your support. If you work at a large platform, or even if you just have a small Discord server or subreddit, you can help us by trying out our kit: https://www.kokocares.org/suicide-prevention-toolkit
And please donate! If you care about this issue, please support us: https://every.org/kokocares
We’re also curious whether there are opportunities we haven’t considered. We would love your feedback on what we’re building, and any technical ideas that might help improve it.
* Happy to provide references in the comments - just ask