• Toribor@corndog.social · 10 days ago

    The problem is algorithmically driven content feeds and the lack of transparency around them. These algorithms optimize for engagement, which means prioritizing content that makes people angry, not content that makes them happy. The resulting feeds are full of misinformation, conspiratorial thinking, rage bait, and other negativity, and users have very little control to protect themselves, curate the feed, or get neutral access to news and politics.

    Lemmy sorts content very simply, based on user upvotes. If you want to know why you’re seeing a post, you can see exactly who upvoted it and which instances that traffic came from. It’s not immune to manipulation, but it can’t be manipulated secretly or in a centralized way.
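
    To make this concrete, a “Top”-style sort in that spirit is nothing more than an ordering by net score. Here’s a minimal illustrative sketch (hypothetical names, not Lemmy’s actual code):

    ```python
    from dataclasses import dataclass

    @dataclass
    class Post:
        title: str
        upvotes: int
        downvotes: int

    def top_sort(posts: list[Post]) -> list[Post]:
        # Rank purely by public net score: no per-user signals,
        # so every viewer sees the same ordering.
        return sorted(posts, key=lambda p: p.upvotes - p.downvotes, reverse=True)
    ```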

    Yet based on their actions, we already know that Facebook has levers it can pull to directly affect how much news people see about a specific topic, to say nothing of which sources on that topic they see. These big social media companies guard the proprietary algorithms that directly determine what news people see at massive scale. Sure, they claim to be neutral arbiters of content that just give people what they want, but why would anyone believe them?

    Lemmy is not the same thing, though it’s not without its own problems.

      • Toribor@corndog.social · 10 days ago (edited)

        Here is a bit of information on how Lemmy’s “Hot” sorting works.

        I’m not going to argue about how addictive any specific feed or sorting method is, but this method is content-neutral, does not adjust based on user behavior (beyond which communities you subscribe to), and is completely transparent, since all post interactions are public. With this type of sorting, users can be sure no content is prioritized over any other (outside of mod actions, which are also public). A neutral, straightforward ranking system that isn’t driven by user behavior is less addictive and less likely to form echo chambers. That makes it easier to see diverse content, reduces the spread of misinformation, and makes the feed much harder to manipulate.
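
        For reference, Lemmy’s ranking documentation describes the “Hot” rank as a logarithm of a post’s net score divided by a power of its age, so new posts rise quickly and then decay. Here is a rough sketch of that formula in Python (the gravity constant of 1.8 is the documented default, and the constant scale factor is omitted since it doesn’t affect ordering; exact values may differ by version):

        ```python
        from math import log

        def hot_rank(score: int, hours_old: float, gravity: float = 1.8) -> float:
            # score = upvotes - downvotes; gravity controls how fast posts decay.
            # The log dampens runaway scores; the time term pushes old posts down.
            return log(max(1, 3 + score)) / (hours_old + 2) ** gravity

        # A day-old post needs a far higher score to outrank a fresh one:
        assert hot_rank(score=10, hours_old=1.0) > hot_rank(score=10, hours_old=24.0)
        ```

        Because the only inputs are the public vote count and the post’s age, anyone can recompute a post’s rank and verify why it appears where it does.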

        • AstralPath@lemmy.ca · 10 days ago

          Thank you for posting this crucial context on the algorithm. I didn’t even know this information was available.

      • jeffw@lemmy.world (OP) · 9 days ago

        Except there’s no company (possibly pressured by governments) manipulating what shows up in those feeds, and the sorting algorithms are fully transparent.