Although many of us might not be quick to admit it, we’ve likely all used social media or internet searches to learn more about ourselves.
Whether it’s a Google search for “anxiety vs. heart attack” or a social media algorithm inclined to show us “weight loss hacks,” it’s not uncommon to turn to the endless expanse of life online as a way to get help for our mental health.
However, social media apps and tech giants aren’t built for that. In fact, according to Koko co-founder and CEO Rob Morris, they often end up making things worse.
And at a time when 95% of American teenagers have access to a smartphone and 13% of young people have reported attempting suicide, this tether between mental health and internet content must be severed — or, at least, reimagined.
That’s where Morris’s organization, Koko, enters the picture.
Koko is a nonprofit digital tool that helps internet platforms, like social networks and telehealth services, better support the mental health of their users.
This support comes through two key digital integrations: detecting high-risk content that could harm a user's mental health, and surfacing resources to users who are at risk.
“Our mission is to end the youth mental health crisis,” Morris said at Fast Forward’s 2023 Virtual Demo Day, a funding event for tech startups that do good.
While Koko started as a peer support app for people experiencing mental health challenges, ups and downs in the tech industry have brought it to a new iteration of the vision Morris first had as a student at MIT.
Now a nonprofit, Koko meets young internet users at the crossroads between risk and harm reduction. Morris shared an example of how Koko works:
Say a young person searches for specific content on a social media platform like TikTok or Instagram, hoping to find information or content about anything from eating disorders to suicidal ideation.
Koko steps in before this young person can find something harmful. Because, let’s face it: As good as one’s intentions may be, there is a lot of harmful stuff online.
Trained by developers to recognize a large library of online terminology and keywords, Koko's algorithms detect and suppress dangerous content (like unsafe weight loss "tips" or stories about "unaliving") and instead send the user a direct message.
“Everything okay?” the message asks. “If you or someone you know is struggling, you are not alone. For peer support, self-help courses, and other resources, please try Koko.”
Users are then redirected to a number of mental health options, all of which are free.
“One click, and they are on an entirely different mental health trajectory,” Morris said.
Once a user ventures into Koko’s interface, they can explore a whole world of resources that are actually supportive, like online peer support communities, global crisis lines, and personal safety plans.
“There are millions of young people reaching out right now on the platforms they use every day,” Morris said. “Imagine if all the millions of young people reaching out online actually got help. We could create the largest mental health intervention in history.”
Koko is embedded across the internet in the hope of reaching young people wherever they are, and over two million users have engaged with Koko in the last year.
Peer-reviewed studies have also shown Koko's efficacy. In 2022, a study from Stony Brook University showed that young people exposed to Koko saw a significant decrease in feelings of hopelessness compared with those receiving traditional crisis intervention methods.
"A lot of people express online that they are struggling, and we don't yet have a good understanding of how best to provide help and support," said Dr. Matthew Nock, the chair of Harvard University's Department of Psychology.
“Koko is filling this gap and providing scientifically-supported tools to help those who are struggling. They’re figuring out what works best and then putting it to work to help people.”
Now, Koko is hoping to grow its impact to reach 10,000 young people per day.
“We want to see a world where any teen in need can access a service like this for free — zero friction, completely ubiquitous,” Morris said. “This is about saving a generation.”
If you are in immediate need of safety or mental health support, please contact the following:
Emergency Medical Services
911
National Suicide Prevention Lifeline
suicidepreventionlifeline.org
988
Crisis Text Line
Text CRISIS to 741-741
crisistextline.org
And if you are not in immediate crisis and would still like access to more resources, visit our list of Mental Health Resources for more information.