Suing Social Media for Addiction?

It is almost a truism that many people today have some degree of social media addiction; for most of us, it is low-level and manageable. But it raises real questions about the ethical and legal responsibilities of social media companies regarding addiction.

A recent case brought this to the fore: a woman is suing Facebook/Meta and Snap (the maker of Snapchat) over her daughter’s suicide after the girl became addicted to their platforms. The case raises interesting legal and ethical concerns. Let’s look at the case, then talk through the issues with social media addiction.

Suicide after Social Media Addiction

Social media icons (cc0 Pixabay)

Ironically, I first saw this report on social media. Insider reports:

A Connecticut mother is suing Meta, the company formerly known as Facebook, and Snap, alleging their “dangerous and defective social media products” played a role in her 11-year-old daughter’s suicide.

The complaint, filed by Tammy Rodriguez in San Francisco federal court earlier this week, claims Selena Rodriguez suffered from depression, sleep deprivation, eating disorders, and self-harm tied to her use of Instagram and Snapchat.

According to the filing, Selena began using social media roughly two years before her death by suicide in July 2021, during which time she developed “an extreme addiction to Instagram and Snapchat.” The filing also claims the 11-year-old missed school multiple times because of her social media use and that she was asked to send sexually explicit content by male users on both platforms.

Rodriguez wrote in the filing that she attempted to get her daughter mental health treatment several times, with one outpatient therapist saying she had “never seen a patient as addicted to social media as Selena.” At one point, Selena was hospitalized for emergency psychiatric care, according to the complaint. […]

In other documents retrieved by Facebook whistleblower Frances Haugen, the company found 13.5% of teen girls said Instagram makes thoughts of suicide worse, while 17% of teen girls said Instagram exacerbates eating disorders.

Read the rest over there.

Moral Analysis of Social Media

Unfortunately, under current law, this case is likely a slam dunk for the social media companies: they offered a product under certain terms, and she consented to use it under those terms.

The deeper questions are what the ethics are and what the law should be. Long ago, we realized cigarettes were dangerous and addictive, so we put big warnings on them.

Addictive Apps in General

It is clear that several categories of apps are both addictive and potentially dangerous or destructive. By lowering the barrier to access, they allow the negative side of the app to seep into every corner of life. Before, having to drive to a casino or an XXX video store across town created a barrier, both for those starting out and for those trying to quit. Sure, an addiction across town can still destroy your life, but for someone trying to quit, two clicks and fifteen seconds gives the conscience far less time to act than getting out of bed and driving twenty minutes.

Likewise, teens have always compared themselves to others and at times reacted badly to not measuring up to standards of beauty. But it was a lot different when the comparison came from going to the town square or the local mall on weekends to compete for young men’s attention, and from reading a fashion magazine or two a month. Now these same young women can get addicted to something that is constantly in their pocket, and in a single weekend they can see more images that make them feel inadequate than the average young lady saw in a year 20 to 40 years ago. This is amplified by the pressure to then use these same apps to send explicit pictures.

Restrictions to Prevent Addiction

Consumers are naturally at a disadvantage when a corporate entity offers an addictive product that is often dangerous to them. This is precisely the kind of situation where government regulation comes in.

For pornography, I definitely think there are some good ideas: either defaulting to filtered access, or requiring some kind of verification tied back to your real identity before you can view it. In some sense, no pornography would be ideal, but I doubt that is realistic. Also, restricted pornography may be one of those evils that is better tolerated than made illegal; Aquinas and Augustine both note this about prostitution, so applying it here is reasonable, though not required.

I also think we need some restrictions on online gambling. From the sheer volume of ads I see, I can only imagine how many families online gambling addiction is ruining. But I am not certain which restrictions would work here; I keep meaning to write something on it, but that would take more research.

Social Media Apps in Particular

Social media presents three additional challenges.

  • First, it is ethical if used moderately (no amount of pornography is ethical), and in many cases it can even be good (reading saint quotes, discussing important issues, etc.). As an extension of this, it is only certain interactions, not the whole system, that cause issues.
  • Second, social media is about connections, not specific content. Pornography sites host static content, so in theory at least, it could all be previewed by real humans to screen out the more horrendous forms; YouTube filters videos to make sure they aren’t pornography. What matters most on social media, though, is the connections and conversation. Certain big social media posts might be considered static content, but the main reason people keep coming back is the discussion and interaction with their social media connections. This can be good, and it can also be part of the addiction. I can message my sister to see if there are photos of something her kid did yesterday, but someone else could use the same messaging for bullying or sexual content.
  • Third, there is no obvious place to put legal restrictions. Pornography has an obvious place for a lock: a filter or real age verification before one gets onto the site. But where on social media? A lot of public social media is viewable without an account.

Conclusion

I’m not sure exactly what would work best for social media.

Maybe we should just ban minors from it, or ban them from certain features such as private messaging. Enforcing that would be a challenge, since it requires some connection to the person’s real identity; “Click here if you are over/under 18” seems inadequate. Twitter asked me to prove my birthday with a government document before verifying me, but I’m not sure that can be made universal from a practical standpoint. (Facebook does have a feature for messaging kids where messages go through their parent’s phone, and the parent then shows them to the child. I use it to send cool Lego things I see online to one nephew.)

Maybe we need warning labels on Instagram models’ photos to counter bad body image, even something as simple as identifying photos that have been photoshopped (a rough sketch of one naive way to flag edited photos follows below). Maybe the algorithm could be adjusted to move away from that kind of content, even if it costs these social media companies some money in the short term.
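
To make the “identify photoshopped photos” idea concrete, here is a minimal sketch of one naive approach, assuming Python with the Pillow library: check whether an uploaded image’s EXIF metadata records editing software and, if so, attach an “edited” label. The function name, the list of editing programs, and the file name are my own illustrative choices; this is nothing like what the platforms actually do, and metadata is often stripped, so treat it only as an illustration.

```python
# Rough sketch (my own illustration, not anything Instagram actually does):
# flag an image for an "edited photo" label if its EXIF metadata names
# editing software. Real labeling would need far more than this.
from PIL import Image            # assumes the Pillow library is installed
from PIL.ExifTags import TAGS

# Illustrative list of editing programs; a real system would be more thorough.
EDITING_SOFTWARE = ("photoshop", "lightroom", "gimp", "facetune")

def needs_edited_label(path: str) -> bool:
    """Return True if the image's EXIF 'Software' tag names an editing program."""
    exif = Image.open(path).getexif()
    for tag_id, value in exif.items():
        if TAGS.get(tag_id) == "Software" and isinstance(value, str):
            return any(name in value.lower() for name in EDITING_SOFTWARE)
    return False

# Hypothetical usage:
# if needs_edited_label("model_photo.jpg"):
#     print("Show 'This photo has been edited' label")
```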

Obviously, a lot more moderators to pick up on bullying and on minors sexting would be good. That could happen, but moderators can’t see everything, and at a certain point it becomes financially impossible.

We definitely need some new legal restrictions on social media, but exactly what they should be is unclear.

Please add comments below as this is an area where I would love more suggestions.
