The article mentions that more research is needed to confirm whether the effect is long-lasting, but personally I’m happy someone may have found a good, practical use case for LLMs.

  • gedaliyah@lemmy.world · +21/-1 · 6 days ago

    How do we keep forgetting over and over again as a society that people are spreading bigotry and conspiracy theories ON PURPOSE? We keep deluding ourselves into believing that if we just get the right tool to properly educate people then the problem will go away.

    LLMs can and are being used to spread misinformation and propaganda and conspiracy theories and bigotry at least as rapidly as they can counteract it.

  • ThePowerOfGeek@lemmy.world · +27/-1 · 7 days ago

    I find the fact that the Qanon loonies are appropriating the symbolism of legendary rock band Queen both hilarious and offensive.

  • 1984@lemmy.today · +29/-6 · 7 days ago (edited)

    Lol, no. Anyone who believes this has no idea what it means to believe in conspiracy theories. Trust me, it’s not that they can’t find the Wikipedia page with the official information, or can’t turn on the TV and listen to what the news says.

    It’s that they have no trust in those things. ChatGPT won’t change that.

      • Uruanna@lemmy.world · +1/-1 · 6 days ago (edited)

        Does the article say the headline is wrong? Or does it say conspiracy theorists listen to facts because it relies on a handful of willing participants who changed their minds when shown facts and reports? Because that’s not the crux of the crazy conspiracy theorists.

        Try again when the chatbot has talked to the likes of Graham Hancock or the hardcore MAGA death cult. Facts don’t matter.

        Rand pointed out that many conspiracy theorists actually want to talk about their beliefs. “The problem is that other people don’t want to talk to them about it.”

        Just look at this guy, who straight up pretends that no one has tried to talk to them before.

        It does mention the Gish gallop at the very end, and claims that the chatbot can keep presenting counterarguments, but it doesn’t actually say that this has worked.

  • DarkCloud@lemmy.world · +22/-3 · 6 days ago

    As long as they’re not hallucinating, which anyone (including conspiracy theorists) can ask them to do. Then they turn into conspiracy-confirming machines.

  • dhork@lemmy.world · +13/-1 · 7 days ago

    Alternatively, they spend so much of their time listening to bots on the Internet that only other bots make sense to them.

    • paraphrand@lemmy.world · +2 · 6 days ago

      The ones I’ve seen are trigger-happy about labeling anyone who disagrees a bot.

      It seems both sides are really into the idea that robots are discussing things with conspiracy theorists.

  • Kusimulkku@lemm.ee · +9 · 6 days ago

    I think some of them will just give the loonies more ammunition, since the chatbots happily give you wrong information.

  • Kowowow@lemmy.ca · +7 · 7 days ago

    Check out Knowledge Fight if you want to see the effect GPT is having on Alex Jones. If nothing else, it’s forcing him to be a better listener, because he can’t interrupt it without the bot losing its train of thought.

    • darvocet · +2 · 6 days ago

      “Hello, ChatGPT. If you don’t mind, could you tell us about Tampon Tim?”

  • Etterra@lemmy.world · +4/-1 · 6 days ago

    My former strategy was to just saturation-bomb them with new, increasingly ridiculous conspiracies so that they would overflow and start blurring into each other. That way the unhinged lunatics would lose all concept of reality, so that no matter what particular conspiracy they started talking about, they’d just sound like a rambling drunk on the subway.

  • AbouBenAdhem@lemmy.world · +3 · 7 days ago

    They don’t mention any kind of control. I guess an appropriate one would be having a human interact with the participants one-on-one, to see whether humans were as effective. (Although even if they were, the chatbots would likely be easier to implement in practice.)

  • ganksy@lemmy.world · +1 · 7 days ago

    The cane beetles are ravaging the continent! What if we found some special toads to eat the beetles??

  • technocrit@lemmy.dbzer0.com · +3/-4 · 6 days ago (edited)

    Does this work for accepted conspiracy theories too? For example, Harris recited debunked propaganda about sexual assault by Palestinians during the debates (similar to Trump’s propaganda about refugees). This conspiracy is embedded deep in USAian propaganda. Would AI have been able to debunk that during the debates? Because accepted conspiracies are the most dangerous kind. For example, they fuel genocides like the one we’re seeing now.