A Parent’s Guide to Threads: Instagram’s Twitter-Like App

In July 2023, Instagram released the Threads app. This text-based platform is similar to Twitter and exposes children to many of the same dangers as Instagram, such as cyberbullying. With a growing body of evidence that social media affects mental health, parents of children who use social media should stay current on the latest apps to help keep their children safe online.

Written and edited by our team of expert legal content writers and reviewed and approved by Attorney Matthew Bergman


If you’re the parent of a tween or teen, you may have heard about the Threads app. Mainstream media have repeatedly compared it to Twitter, now known as X, primarily because of its text-based format and highly scrollable feed. But for parents, it can seem like yet another opportunity for children to encounter content they shouldn’t see.

At the Social Media Victims Law Center, we understand parents’ concerns about unfamiliar social media. We advocate for families whose children have suffered irreparable harm—and have even lost their lives—due to social media addiction and exposure to harmful content. If your child has been harmed by using Threads, we can help you hold Meta, its parent company, accountable.

What Is the Instagram Threads App?

The Threads app is a new social media platform owned and operated by Meta, which also owns Instagram. Threads launched with intense momentum in early July 2023, logging over 100 million active users in its first week. It set a world record as the fastest app to reach 150 million global downloads, hitting the benchmark in seven days, 5.5 times faster than the previous record holder, Pokémon Go.

According to The New York Times, Instagram created Threads to provide users a channel for text-based conversations while keeping the original Instagram focused on photos and videos. Users can scroll through a personalized text-based feed on Threads, though posting pictures and videos is also possible. However, unlike its primary competitor, Twitter, Threads doesn’t yet allow direct messaging.

People sign up for Threads using their Instagram account and username. You can create a Threads-specific profile, but your username will stay the same on both platforms. Once logged in, you’ll see threads from people you follow alongside content from creators the app recommends.

Is There a Minimum Age Limit to Use the Threads App?

Threads is open to teenagers and adults. Instagram has a minimum age of 13, which carries over to Threads. However, Instagram’s help center says children under 13 can have an account if a parent or other adult “manager” oversees it.

This may sound safe, but parents need to know that Instagram does not verify a child’s age or, if applicable, the identity of their manager. Unless your child selects an unreasonably old age—as when one reporter claimed to be 146 years old to test the system—they can enter a fake birth date and open an account.

Similarly, although Threads sets accounts for users under 15 years old to private, the company says nothing about whether the child can change the setting after the fact. Loopholes such as these leave children open to various types of harmful content.

What Threats Can Children Experience From the Threads App?

Like any social media app, Threads can expose young users to harmful content. The good news is that the likelihood of a child encountering nudity or other explicit content is lower on Threads than on Twitter, which permits sexual images and other sensitive content for users over 18.

Like Instagram, Threads does not tolerate nudity or related adult content. However, both apps do permit certain potentially sensitive photos, such as those shared in health-related contexts or as part of protests.

Parents may feel better knowing that Threads does not yet allow direct messages, or DMs. It’s tempting to assume that predators may have difficulty contacting your child without DMs, but many predators know how to lure children onto more private platforms. Also, recently leaked documents suggest that a DM feature is coming soon to Threads.

Even without DMs and nudity, the Instagram Threads app still leaves your child vulnerable to harmful content, including age-inappropriate posts and comments, harassment, and cyberbullying. As a text-based medium, Threads offers plenty of opportunities for children to send and receive threatening messages and for adults to prey on vulnerable young users.

What Information Does the Threads App Collect?

Threads may threaten children’s privacy due to the amount and type of information it collects about each user. According to the privacy details published on Apple’s App Store, which are supplied by Threads’ developer, the app can collect and store user data such as:

  • Names and usernames
  • Contact information
  • Physical location
  • Search history
  • Health and fitness data

Threads’ Supplemental Privacy Policy offers more detail, stating that the app collects information about each user’s activity. That includes:

  • The content you or your child create
  • The type of content you interact with
  • When and for how long you use Threads
  • Whom you follow and whose posts you comment on

Threads and its parent company, Meta, may use this information to deliver personalized experiences, including ads. It may also use your data or your child’s to suggest connections. For example, if your child follows someone on Instagram, Meta may tell that person about your child’s new Threads account and encourage them to follow your child.

Threads also makes certain profile elements public: even on private accounts, a person’s name, profile picture, username, and bio are visible to anyone.

Can You Delete the Threads App?

Be cautious about letting your child experiment with Threads, even for a short time. The app has already come under fire for permanently tying users’ Threads profiles to their Instagram accounts. The Threads Privacy Policy states:

You may deactivate your Threads profile at any time, but your Threads profile can only be deleted by deleting your Instagram account.

Because deleting a Threads profile requires deleting the linked Instagram account, young people with Instagram addictions have a strong incentive to remain on Threads, possibly increasing their risk of exposure to dangerous content.

Issues With Threads’ Parent Company, Meta

Meta’s existing apps, Facebook and Instagram, already enable the distribution of harmful content to young users. Meta has been aware that this content can damage young people’s mental health, yet it has not implemented reliable protections.

The Social Media Victims Law Center has filed multiple lawsuits against Meta to hold the company accountable for its wrongdoing.

One suit involves a 12-year-old girl who signed up for an Instagram account without her parents’ knowledge. Her parents attempted to restrict her use, but the app made it easy for her to evade their safeguards. Instagram led her to dangerous content that glamorized anorexic behavior. She developed an eating disorder that led to multiple hospitalizations and near-death experiences.

Documents show that Instagram has known for several years that its platform makes young girls feel worse about themselves, yet Meta has done nothing meaningful to stop these patterns from occurring. The platform has also enabled adults to send sexually explicit messages to young users, sometimes leading to abuse and youth suicide.

Despite repeated instances of social media addiction and harm to children, tweens, and teens using its platforms, Meta has failed to take accountability. The Social Media Victims Law Center helps families fight back.

Contact a Social Media Lawyer Today

Threads is new to the social media world, but it’s had plenty of time to expose children to harmful content. If you believe Threads has put your child at risk, the Social Media Victims Law Center can help.

We represent families against powerful social media companies and are committed to standing up for you. We’ll listen to your story and help you determine whether joining or filing a lawsuit is in your best interests. Contact us today for your free case evaluation.

