
Social Media Addiction Case in L.A.

For nearly thirty years, Section 230 of the Communications Decency Act has protected online platforms from liability for content posted by users. Recently, however, a new trial in Los Angeles may change this.

At the center of roughly 1,600 similar cases across the U.S. is a 19-year-old Californian plaintiff identified as "Kaley G.M.," or K.G.M. She is one of three plaintiffs selected for bellwether trials, test trials used to guide future cases. Jury selection concluded in late January.

K.G.M. started watching YouTube at the age of 6, created an Instagram account at 11, began her Snapchat use at 13, and started using TikTok a year later. Her attorney, Joseph VanZandt, says that this “excessive and problematic social media use changed the course of her childhood.”

She is suing the social media platforms for damages, claiming she became addicted to the technology through social media use from an early age, which worsened her depression and suicidal thoughts. The lawsuit further claims that the companies engineered this addiction deliberately, designing their products to be addictive for profit. According to K.G.M.'s deposition, photo-altering filters on Instagram and Snapchat eroded her self-esteem and contributed to her body dysmorphia.

Victoria Burke, a therapist who diagnosed and treated K.G.M. when she was 13, testifies that, in her view, the teenager's social media use was a "contributing factor" to her mental health issues.

The lawsuit names Meta’s Instagram, Google’s YouTube, TikTok, and Snap, though the latter two settled and are now out of the trial.

Plaintiff's attorney Josh Autry argues that the harm did not come solely from third-party content; rather, he says, it is the result of the platforms being designed to be addictive.

Central to K.G.M.'s lawyers' argument that prolonged social media use is associated with mental health harms are addictive features, such as autoplay on YouTube and endlessly scrolling feeds, that are used to keep users on the platform.

On the other hand, Meta's attorney Ashley Simonsen insists that Kaley "was addicted to Instagram because of third-party content, not because of any design feature."

“Let’s be crystal clear,” says YouTube’s attorney Brian Willen. “Exposure to third-party content is at the heart of these claims.”

Internal documents revealed at trial, however, suggest otherwise. One of the documents, written by YouTube and Instagram employees about the companies' business strategy, reads: "The goal is not viewership, it's viewer addiction."

"If we want to win big with teens," a YouTube internal strategy memo reads, "we must bring them in as tweens."

"We're basically pushers… We're causing reward deficit disorder, because people are binging on Instagram so much they can't feel the reward," says an Instagram employee in an internal message.

Because the design features and algorithms these companies use have been central to the trial, understanding how recommendation algorithms work is essential. The algorithm is designed to keep users engaged: the more engaged users are, the more time they spend on a platform. More time spent means more ads can be displayed, generating more revenue.

To keep users engaged, algorithms show the content most relevant to each user's interests. They first gather data such as the user's scroll speed, how long they spend on similar content, and how recently a video was posted. A machine learning model then uses that data to predict the probability that the user will click on a video, watch it to the end, or comment on it. All of this information is used to build an optimized feed that keeps users locked in and engaged.
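In simplified terms, the ranking step described above can be sketched as follows. This is a minimal illustration only, assuming made-up signal names and hand-picked weights; real platform models are vastly more complex and proprietary.

```python
import math

def engagement_score(item):
    """Predict the probability (0..1) that a user engages with an item.

    Hypothetical signals and weights for illustration: more time spent on
    similar content raises the score, fast scrolling past it lowers the
    score, and older posts are penalized slightly.
    """
    z = (2.0 * item["similar_content_dwell"]        # affinity for similar content
         - 1.5 * item["scroll_speed"]               # fast scrolling = low interest
         - 0.5 * item["hours_since_posted"] / 24)   # fresher content ranks higher
    # Logistic function squashes the weighted sum into a probability.
    return 1 / (1 + math.exp(-z))

def rank_feed(items):
    """Order candidate items by predicted engagement, highest first."""
    return sorted(items, key=engagement_score, reverse=True)

candidates = [
    {"id": "a", "similar_content_dwell": 0.9, "scroll_speed": 0.1, "hours_since_posted": 2},
    {"id": "b", "similar_content_dwell": 0.2, "scroll_speed": 0.8, "hours_since_posted": 30},
]
feed = rank_feed(candidates)
print([item["id"] for item in feed])  # prints ['a', 'b']: the stickier item leads
```

The key design point the trial turns on is visible even in this toy version: the objective being optimized is predicted engagement, not any measure of user well-being.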

Proving that algorithms are behind addiction can be difficult because of the variety of factors that can play a role in an adolescent's mental health, including genetics, stress, and tensions within the family, as Meta's lawyer noted in an opening statement. Additionally, companies can always argue that it is ultimately the user who chooses to scroll or click on a video.

Even with the recent disclosure of seemingly incriminating internal documents, the plaintiff's claims still face many challenges. A stated intent to maximize addictive engagement is not direct evidence that a platform's features, rather than its content, specifically lead to addiction.

Regardless of the court's decision in this case, the trial underscores growing discontent over social media companies' accountability. The outcome will affect not only this case but also how the liability of tech companies is addressed for years to come.


Discover more from THE FORCE FILE
