A bill in the U.S. Senate would put more responsibility on social media companies to design their platforms to better protect children from online dangers. The Kids Online Safety Act, or KOSA, appeared set to pass Congress last year, only to stall in the House. It was reintroduced earlier this month.
Advocates say KOSA is necessary to keep kids from being fed harmful content through algorithms, such as posts encouraging eating disorders or suicide. But opponents such as the ACLU have argued that it could restrict free speech.
Kristin Bride is a social media reform advocate from Lake Oswego whose son died by suicide after he was bullied online. Nick Allen is the director of the Center for Digital Mental Health at the University of Oregon. They join us to talk about the new legislation and how families should weigh the pros and cons of youth social media use.
If you or someone you know is struggling, you can call or text 988 to reach the Suicide and Crisis Lifeline. Help is available 24 hours a day, seven days a week.
Note: The following transcript was transcribed digitally and validated for accuracy, readability and formatting by an OPB volunteer.
Dave Miller: This is Think Out Loud on OPB. I’m Dave Miller. We end this week with a conversation about an effort to safeguard kids from online dangers: a federal bill called the Kids Online Safety Act, or KOSA. Advocates say KOSA is necessary to keep kids from engaging with harmful content online. Opponents, including the ACLU, have argued that it could restrict free speech. The Senate voted for KOSA overwhelmingly last year, but then the bill died in the House. It was reintroduced in the Senate earlier this month.
In a few minutes we’ll be joined by Nick Allen, the director of the Center for Digital Mental Health at the University of Oregon. We start right now with Kristin Bride. She’s a social media reform advocate from Lake Oswego whose son Carson died by suicide in 2020 after he was bullied online. Kristin, thanks very much for coming in.
Kristin Bride: Thank you so much for having me, Dave.
Miller: Do you mind starting by telling us a little bit about your son Carson? What was he like?
Bride: Carson had a great sense of humor, an incredible smile. He was creative, he was into acting, could do all different accents. He really was the light of our family and it’s an incredible loss to live without him.
Miller: What role did social media play in his death?
Bride: He was viciously cyberbullied by his high school classmates in Portland, Oregon, who were using anonymous apps on the back end of Snapchat to hide their identities. We were conservative parents with technology. We waited until Carson was in high school for his first iPhone, with no social media. Then in 10th grade, he begged for Snapchat because that was the way all the kids were connecting, and Carson never made it to 11th grade.
Miller: Did you reach out to those social media apps after his death?
Bride: Yes, I reached out to YOLO. I knew he had Snapchat. I did not know that Snapchat had allowed anonymous apps on their back end. So we heard through the community that he was bullied over YOLO. I didn’t even know what that was. I found out that YOLO had promoted that they would monitor for cyberbullying, reveal the identities of those who do so and ban them from the app. Yet when I reached out to the company on four separate occasions, I was ignored.
Miller: What are the big goals of the Kids Online Safety Act?
Bride: It’s to create a duty of care for these companies, like every other company in America. When you look at Carson’s example, anonymous apps have existed over the last decade. When marketed to teens, they always lead to cyberbullying and then usually a suicide, and then they shut down. So I truly believe that if KOSA had been enacted a decade ago, Carson would still be alive.
The other really important thing about KOSA is that kids can look up whatever they want. What we object to is kids being fed content, pro-suicide and pro-eating disorder content, when they’re not looking it up.
Miller: When you say pro-suicide or pro-eating disorder, what do you mean by that?
Bride: I can give you an example of a parent who lost her son to suicide. He was actually feeling depressed and looking up inspirational quotes. The algorithms fed him content that said, “if your girlfriend breaks up with you, you should end your life,” which he ended up doing. It’s this dramatic and harmful content that keeps kids glued to their screens, sends them down really dark rabbit holes, and they’re not looking it up in the first place.
Miller: When you testified to Congress a few years ago, you said this: “I speak before you today with a tremendous responsibility to represent the many other parents who have also lost their children to social media harms.”
You mentioned some specific families, then you added this: “Our numbers continue to grow exponentially with teen deaths from dangerous online challenges fed to them on TikTok, sextortion over Facebook, fentanyl-laced drugs purchased over Snapchat, and deaths from eating disorder content over Instagram.”
How would the bill address those various situations? Those are four different platforms and four different kinds of content.
Bride: Well, the other thing with the challenges, the bill would allow for users who see harmful content, like the choking challenge, to actually reach out and get a response from the platform. Right now, when parents do that – and a lot of the parents whose children died from the choking challenge do this on a regular basis to try to help save other kids – they either get no response from the platform or they get a response back saying that it does meet community guidelines, when it certainly does not. So we really need to have a way to communicate with the platforms. I also get calls from parents whose kids are suicidal because something’s being virally spread and they cannot reach the platform.
Miller: I should note that for folks listening, if you or someone you know is struggling, you can call or text 988 to reach the Suicide and Crisis Lifeline. Help is available 24 hours a day, seven days a week.
The current bill says that platforms have to “exercise reasonable care in the creation and implementation of any design feature to prevent and mitigate various harms, including online harassment, if a reasonable person would agree that the design feature is a contributing factor to such harms.”
It doesn’t mention … the word anonymous, or anonymous messaging, is not specifically called out in the bill. But it seems like it’s your contention that this bill would prevent anonymous messaging from being a feature for apps that teens could use.
Bride: I think it’s the way it’s implemented, it’s looking at the actual design. In this case, they promoted to kids that this was safe, because there was a feature that allowed, if you were being harassed, to identify who was doing it. Carson, many times in struggling through this, asked kids to swipe up and identify themselves so they could talk it out in person. So that was a dangerous design flaw. I truly don’t believe that the company ever had the ability to do that.
I also filed an FTC complaint against another anonymous app. Kids were complaining about having to pay for subscriptions to reveal who was sending them the message and they weren’t getting that name. It turned out the two co-founders were sending the kids the harassing messages themselves. That is a design feature and that is dangerous.
Miller: They were making money by saying, if you give us money, we will divulge the identity of your harassers. But they were the harassers themselves?
Bride: Right. You can look it up, the FTC complaint. Yeah, it was NGL, “not gonna lie.”
Miller: The bill would also provide more parental control, if I understand it correctly. Over what kinds of things?
Bride: Over the ability to turn off the algorithms, which would be great because we want kids to be able to look up what they want to see, not what the companies want them to see that will keep them online longer. Time spent – those features are on there. Privacy settings – we don’t want strangers contacting our kids. So applications like that, [which] would help parents feel more comfortable.
Miller: Opponents, including the ACLU and Ron Wyden, one of only three senators out of 100 who voted against this, have said various things. But the complaint I’ve seen the most is that letting the government or these companies determine what harmful content is could impinge on free speech rights. What’s your response?
Bride: We are really specific in the bill about harmful content. Pro-eating disorder content, pro-suicide content. And one of the things that’s been really frustrating is that big tech lobbyists have promoted just that, that we are asking for other content to be taken down.
Miller: LGBTQ content, for example. A conservative attorney general in a red state could prevent a queer kid who’s looking for affirmation or support from finding that.
Bride: So the change made to the bill was that AGs cannot enforce duty of care. It’s got to be the FTC.
Miller: How have the last five years changed the way you think parents should be responding to their kids’ desires to be on social media?
Bride: I feel very hopeful that there is a movement out there. I think in the last five years … When this happened to Carson, I felt completely alone. Now we have a movement of so many parents, unfortunately, whose kids have died. Other parents who realize that their kids are not doing well with the hours that they’re spending on social media. I mean, the platform’s own research shows that one out of three girls on Instagram feel worse about themselves. So we see movements like Wait Until 8th, we see phone-free school movements passing in 19 states. Unfortunately, it did not pass in Oregon, but I have hope that it will next year.
Miller: There’s also the possibility of executive action from the governor.
Bride: Right, exactly. They’re finding the benefits of not being on phones and not giving your child a phone early, because the content that your child will be receiving is not appropriate and not healthy.
Miller: What about your own relationship to these companies, to social media? Are you interested in giving your eyeballs, your attention, to any of these companies?
Bride: No, in fact, I was never on Instagram, certainly not TikTok. I was on Facebook because I was encouraged many years ago, because I was on a board, to get on it. And last year I deleted it. One of the things I did at the Meta March in New York City was ask people to delete Meta for good – there’s a pun in there. But if this were a food or a toy that was this dangerous, we would not give it to our kids. We wouldn’t use it ourselves and we wouldn’t support it.
Miller: Has it been a challenge to push forward a movement in 2024, 2025 if you’re not on social media? I mean, you’re against this and for very clearly articulated reasons. Does it prevent you from organizing?
Bride: No, because we have different groups and we organize in different ways, over Signal. I use LinkedIn to promote my advocacy work and that has worked just fine.
Miller: Kristin Bride, thank you very much.
Bride: Thank you so much for having me.
Miller: Kristin Bride is a social media reform advocate from Lake Oswego.
I’m joined now by Nick Allen. He is a professor of clinical psychology and the director of the Center for Digital Mental Health at the University of Oregon. Nick Allen, welcome to the show.
Nick Allen: Thank you. Great to be here.
Miller: As we just heard, there are many heartbreaking individual stories of kids’ experiences with social media. But population-wide, as a researcher, how much can you definitively say about the effects that social media and other digital spaces are having on kids’ mental health?
Allen: Thanks for that question. Just before I get to it though, I do want to acknowledge Kristin’s story. It’s a story that many families have experienced. I have worked in the area of mental health in young people my entire career, and I know many families who’ve gone through similar things. Even in our own family, I have three children and one of my children really struggled with mental health issues during that period of life. So I definitely honor her bravery in sharing her story and in also taking meaningful action based on that experience.
I just wanted to say that to start with. But to get to your question about population-wide, what we know about the research – the picture is complex. What we know is that there is enormous variability in people’s experiences online. There are many young people who experience positive outcomes from being online. They connect with friends, they find important information, they find entertainment, they find community. And all of these things are well documented, in terms of the experiences that people have online.
Then there are other people who have much less positive experiences online, and the experience of bullying, which we’ve just heard about, is definitely one of the most harmful. There are other, perhaps less severe, experiences that are negative. A lot of people talk about the experience of fear of missing out, often referred to as FOMO, as a feeling that people have online.
But the bottom line is, when we look at the population-wide level, we do see some associations between being online and mental health problems. But the effects are generally small at the population level and they also vary a lot between the different studies. So our best understanding is that, like all social contexts in which young people find themselves, whether it might be at school, in their religious community, in their sporting organization or in their neighborhood, there’s actually a lot of different experiences that people have. The online environment is a social context that is similar in many ways, not identical, but similar in many ways to those other social contexts. So most of the research is pointing us to understand more about, not just being online, but what the specific experiences are that young people are having and how they might be either beneficial or harmful.
Miller: Well, let’s focus on some of the language in the current bill because it talks about forcing, in a sense, platforms to exercise reasonable care in the creation and implementation of design features that could prevent or mitigate various harms. Do you think that researchers at this point know enough to be able to say, these are the kinds of design features that should not be a part of a social media platform that a 14-year-old is using?
Allen: I know that this is in some ways quite an unsatisfying answer, but the fact is we still need more research on that, to understand it from a data-driven point of view. One of the issues here is, as I think you’ll be aware, there is some applicable law, one part of which is referred to as Section 230, which states that online companies that make products like social media products are not responsible for the content of what is shared on those products, but they are responsible for aspects of the design. So this distinction between design and content is actually a really critical one in the law.
Now, there are those who argue that Section 230 should be changed, and there are those who argue that it should be maintained because of concerns about limitations on free speech and ways in which legal approaches to content moderation could ultimately be misused – which is one of the issues that you discussed previously. But I think it’s important for people to understand that distinction. So, of course, all the places where young people are, we want them to be safe, whether they be schools, other community locations and so forth. So creating places that are safe by design is a very important proposal.
But understanding which design features, as distinct from content, is still something that we are seeking more research on. One of the issues that was discussed in the last interview was anonymity. And anonymity is one of the features that has a plausibility about it, because we know that one of the things that regulates our behavior interpersonally is the idea that when I behave towards someone in a certain way, I accrue some reputation for that kind of behavior. In other words, it has personal implications. For example, if I treat someone badly, I develop a reputation as someone who behaves that way and then that has an implication that I might weigh up in choosing how I behave. Obviously, anonymous posting allows people to behave in ways that avoid that reputational consequence. So I do think that looking at anonymous posting is an important issue. But as with all things, there are some complexities.
For example, you mentioned before the example of a young person who is LGBTQ or questioning, and they may not live in an environment where that experience is supported or understood. And they may want to go online and find a community where they can discuss those issues without feeling criticism and judgment. To do that, it may be helpful for them to use an anonymous identity to do that. Some of the kids that I’ve worked with do have multiple accounts, even within one specific product like Instagram, where they have different identities they explore in those different accounts. So, as with all things, you need to weigh the benefits of requiring that people are identifiable when they are online, versus some of the situations where requiring that may in fact produce negative outcomes for some people, that may restrict their freedom to explore or discuss issues that are important to them.
Miller: I take your overall point about the lack of conclusive research about these complex topics. But it makes me wonder, if we’re going to base regulation on really trustworthy research – the time scale for which could be very long – if that means that we’re just going to be at the whims of an ongoing experiment of a sort that is being perpetrated on us by a bunch of corporations, who are not focused on the well-being of their users, whether they’re 14 or 84, but they’re focused on their bottom line. If we wait until there’s research, what are the possible risks?
Allen: I think that’s a very good point. And just to be clear, I’m not saying that we shouldn’t take some action in order to explore the effects of these things. You asked me about where the research is and unfortunately, in terms of a lot of features, there are a whole lot of features that have been discussed … For example, is it good to have messages be ephemeral, that they disappear, versus, they’re always there? Is it good to let kids use filters on their photos? There’s a whole bunch of … Anonymity is one that we’ve talked about as well. I just wanted to point out the issue that there is a complexity when we haven’t got really good research and data to understand the harms and the benefits of these particular features.
However, in public policy, as you point out, sometimes we need to take our best bet and we need to try things out in order to see what the effects are. And I do think that there are some design features that could be implemented, that we could explore. One of the things that you may be aware of, and people listening to my accent will probably pick up, is that I live in the United States, but I’m from Australia. The Australian government has passed legislation banning the use of social media products by children under the age of 16. Now, that’s world-first legislation as I understand it, and they’re still, to my understanding, working out how in fact they can implement it, which is not going to be straightforward. But that will be an interesting test case to see if there are actually population-wide benefits for that kind of legislative change. So I’m certainly not saying that we need to sit around and wait until we have definitive research on everything before we take action, but I do think we should be aware of when we have definitive research and when we are making a best guess. So that then the legislation is open to being revised as new data comes in.
Miller: And just in terms of taking this down to the personal, the family, the household level, from the legislative one, what advice do you have for parents and families?
Allen: Well, one piece of advice is that it’s important to understand that if you think you know everything about what your kid is doing online, then that’s at least possibly or probably not true. Many parents have experienced the fact that they think they’ve got things locked down, they understand exactly which products kids are using and so forth. And the fact is that that evolves very quickly and changes. And the reason I’m making this point is because at the end of the day, the best way you have of understanding what your child is doing and experiencing online is to have an open relationship with them, where you can talk about this stuff in a way where they will tell you when they need to tell you about what’s happening there.
That’s not always simple to achieve, because one of the great attractions of the online space for young people is that it is private and particularly that it is private from parents. So young people may have some resistance to telling you everything that is going on online, just the same way they’re probably not telling you everything that’s going on in their peer group or everything that’s going on at school. So I think a critical task is to build the kind of relationship with your child where they will come to you when they need to, and that they will understand that that will be an open and non-judgmental conversation, and one where you’re primarily interested in their well-being and supporting them.
I do think that there can be a great value in sensible limitations to phone use, for example, certainly for younger adolescents, limiting phone use around bedtime. There’s some quite good evidence that using your phone around bedtime can have sleep implications for young people. So I think that that’s also a very sensible family rule.
I also think that modeling use of digital products for young people as an adult is very important too. A lot of us are focused on our phones a lot of the time. Young people see that and they take that as a model. So modeling the fact that you’re able to regulate your use of online products, that you make wise choices about how you use them, that you’re mindful of which online experiences are really helpful, fun and enjoyable, and which ones are not so much fun, are all things that are really good to have open conversations with kids about. I think what happens is sometimes kids get stuck online in ways that are not that fun or enjoyable, whereas other times their experiences are. And helping them to tune into that difference is a really valuable thing.
Miller: Nick Allen, thanks very much for joining us. I really appreciate it.
Allen: You’re welcome.
Miller: Nick Allen is a professor of clinical psychology and the director of the Center for Digital Mental Health at the University of Oregon.
“Think Out Loud®” broadcasts live at noon every day and rebroadcasts at 8 p.m.
If you’d like to comment on any of the topics in this show or suggest a topic of your own, please get in touch with us on Facebook, send an email to thinkoutloud@opb.org, or you can leave a voicemail for us at 503-293-1983.