TikTok must face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

    • t3rmit3@beehaw.org · 14 points · 4 months ago

      Yes, but that is not the entirety, or even the majority, of the problem with algorithmic feed curation by corporations. Reducing the visibility of those dumb challenges is one of many benefits.

    • schnurrito@discuss.tchncs.de · 5 points · 4 months ago

      No, it wouldn’t, but people would only see them if they were part of a preexisting community where such things are posted, or if they specifically looked for them.

      On the Internet, censorship happens through information overload: there is far more content than our limited time and attention span can absorb, so what gets recommended is what gets seen. That is why going after recommendation algorithms will work.