This article was featured on The Los Angeles Blade on September 5, 2023.
By Laura Marquez-Garrett | SEATTLE, WA. – You’re fourteen years old, in a small, conservative Midwestern town, and queer. Where do you go? Afraid to approach anyone in person, you go online. You search on YouTube for “Gay Pride” and “Pride Parade,” then spend days, weeks, even months getting videos telling you that gay is bad and you are going to hell. Now where do you go?
This is what LGBTQ+ youth deal with every day. Not because there isn’t enough gay-positive content on the Internet, but because certain online platforms choose to prioritize engagement over safety. They push kids into excessive use through things like engagement with strangers, endless scroll, push notifications, and extreme content.
Which brings me to the Kids’ Online Safety Act (KOSA). KOSA was introduced in late 2021. As a parent, a member of the LGBTQ+ community, and an attorney who walked away from a successful business litigation practice in early 2022 to work with children and families harmed by certain online platforms, I admired its intent but could not unequivocally support it. Understanding, as I do, both the life-saving importance of resources for LGBTQ+ youth and the devastating harms these products cause millions of kids every day, I felt KOSA’s original wording fell just short of striking the necessary balance between the two.
But that KOSA is gone.
In response to feedback and concerns, KOSA’s co-authors met with LGBTQ+ organizations and communities and listened – as reflected in the changes they made – and a new KOSA was born. The new bill will protect young people from harmful design and programming decisions, while explicitly safeguarding youth autonomy to explore online.
Because, yes, online platforms do provide much-needed communities for LGBTQ+ youth. In 1995, my first year of college, we searched for and found local resources, as well as entire communities, via virtual bulletin boards and chatrooms on this new thing called the Internet. For most of us, me included, it was the first time we felt seen and heard. But those platforms helped us without also manipulating and exploiting us; the platforms at which KOSA takes aim – in fact, the only platforms at which KOSA takes aim – are exploiting and abusing the youth who need them the most. Don’t forget the Facebook documents exposed in late 2021 showing images of teenage brains, discussing vulnerabilities of youth and related design and programming “Opportunities.”
I see it every day. Transgender teens targeted with violent, suicide-themed, and transphobic content, no matter what they search; children so locked in to extended-use designs and social metric tools that they stop sleeping and self-destruct; young women – almost every one of them – flooded with connection recommendations to predatory adult users; middle and high school kids targeted with drug dealer “Quick Adds” and advertisements for vaping, alcohol, and other harms. What about the 16-year-old who goes through a break-up and searches for “positive affirmations” and “inspirational quotes,” and gets hundreds of videos advocating self-harm and suicide instead?
If you think these are hypotheticals, think again. I have investigated accounts and associated data for multiple 16-year-olds who asked platforms that would have been covered under KOSA to send them uplifting content, only to receive overwhelming amounts of self-harm and suicide content instead. Those children are gone. I have seen accounts used by 10- and 11-year-old girls where adult strangers reached out via direct message, saying how happy they were to find them and that the platforms’ technologies had recommended them, then abused and exploited them. Some of those children are gone as well.
This is the status quo for millions of American children, and these are the types of harms KOSA tackles, while also now making clear that content is not at issue. To accomplish this, it defines harms via existing statutes, definitions, and other objective metrics; it also now includes a “Limitation” carving out any circumstance in which a child deliberately and independently searches for or specifically requests content. The new and improved KOSA explicitly recognizes that it is the province of youth to search for LGBTQ+ content (or any content).
To say it plainly: if a child seeks out LGBTQ+ content, they would still be able to search for it under KOSA. What KOSA prevents is Big Tech algorithms pushing dangerous content onto that child’s feed without their consent.
For the record, KOSA does not require identity verification. Nor does it cover every type of online platform or service. Non-profits, for example, are exempt, meaning that any non-profit providing LGBTQ+ resources falls outside of KOSA’s reach entirely. These are just a few examples of how KOSA was changed to ensure the protection of LGBTQ+ youth on every side of this debate.
When did we become okay with companies treating people like this, much less children? And now that we know, are we actually thinking about waiting another year or two to see what happens?
For the adults: if you remain unsure, ask LGBTQ+ youth (or any teen) about their experience. Not open-ended questions like “Do you enjoy social media?” but questions like “Do these platforms ever give out your information to, or try to connect you with, predators or drug dealers?” and “Do you ever want something positive but get self-harm, disordered eating, or suicide content instead?”
For American youth: if you have yet to read KOSA, please do. Please ask questions and share your own experiences – good or bad – with your representatives. It’s easy to get caught up in panic and what-ifs. But I promise you, the biggest danger right now is not KOSA; it is relying on anyone but yourself to decide your future and the future of the kids who come after you.
This is your moment. Speak up so we can hear you.