Recommendation algorithms operated by social media giants TikTok and X have shown evidence of substantial far-right political bias in Germany ahead of a federal election that takes place Sunday, according to new research carried out by Global Witness.

The non-governmental organization (NGO) analyzed the social media content displayed to new users via algorithmically sorted “For You” feeds, finding that both platforms skewed heavily toward amplifying content that favors the far-right AfD party.

Global Witness’ tests identified the most extreme bias on TikTok, where 78% of the political content algorithmically recommended to its test accounts from accounts the test users did not follow was supportive of the AfD party. (It notes this figure far exceeds the party’s support in current polling, where it attracts backing from around 20% of German voters.)

  • redwattlebird@lemmings.world · 23 hours ago (edited)

    That was the original intent of social media, but it doesn’t function that way now. It’s also filled with bots, and you’ve got no idea if you’re actually talking to a human being. The only way to be sure is to touch grass, so you’re actually outside the sphere of influence of misinformation and of those who control the algorithm.

    Let’s go back to the topic at hand, which is whether or not we should remove fascist content from social media like TikTok and X. These platforms are not a gateway to that library you linked to; you have to go out of your way, i.e. purposely search for it, to access it. TikTok and X are full of sound bites that do more harm than good; posts with links to your library, for example, would not get the same exposure as, say, a sound bite promoting fascism. Case in point: the article.

    The generation that has grown up with 24/7 access to the Internet has no idea what it feels like to not have information streaming into your brain everywhere you look. When information is fed to you 24/7, you can’t learn to tell the difference between what’s bullshit and what’s not; the stream is so overwhelming that all you have left is the will to scroll for sound bites. The only way to counter that is to step out of the stream and slow down the incoming information so you can process it and critically analyse it. Our brains can only process a limited amount of information, far below what an algorithm can push at us.

    And that library link, what are you reading on there right now? Are you researching talking points against fascism? Are you looking at the history of previous fascist regimes and how they came to be?

    • Ledericas@lemm.ee · 21 hours ago

      Half of the comments on some subs, if not the whole of Reddit, are bots (mostly from countries like Russia and Israel, if you’re in a post specific to that region).

    • commander@lemmings.world · 23 hours ago

      > you’re actually outside the sphere of influence of misinformation and those that control the algorithm.

      You think people you talk to outside of the internet aren’t influenced by social media?

      • redwattlebird@lemmings.world · 23 hours ago

        Before the rise of corporate social media, yes. It was easy to speak with people who had different opinions on things because of their life experience.

        Now, absolutely not, because everyone is in some way connected to the stream of misinformation that is social media, which dictates the news cycle and determines the talking points for the day. But if society weaned itself off this social media drug, there would be a better chance for improvement. At the very least, rage bait would be a lot less effective.

        Getting back to the discussion, though: you don’t want censorship of certain ideologies because you believe that’s a form of control society doesn’t need. You want a ‘free’ flow of information to allow the user to decide for themselves.

        But that information doesn’t flow freely. It’s controlled by corporate interests. And removing fascist content and ideology on TikTok and X from the general, uninformed public? Hell yeah. If they want to look it up, they can read a history book, see what has actually happened under fascist regimes, and then decide for themselves if they want that.