I asked ChatGPT about cognitive dissonance in the context of climate change. I thought the answer was interesting.

    • cerement@slrpnk.net · 1 year ago

      another metaphor to throw into the mix – “it’s just a fancy version of the predictive text on your phone”

    • Solar Bear@slrpnk.net · 1 year ago

      LLMs can be really good at answering simple to moderate questions on a subject if you prime them with a bunch of relevant material first, or, even better, fine-tune them on that material if you have the time, power, and access to do so. There’s a really good case to make for using them in search. But general models are pretty spotty at answering anything but the most basic questions on the most common topics.

      • silence7@slrpnk.net (mod) · 1 year ago

        E.g.: LLMs are good at answering simple to moderate questions if you do the work of figuring out what the truthful, relevant material is in the first place. But you can’t really do that if you don’t already have the background knowledge to write the answer yourself.

        • Solar Bear@slrpnk.net · 1 year ago

          Well, yes and no. You don’t have to be an expert in a subject to be able to find trustworthy sources. And sometimes all you’re really lacking is just the correct vocabulary to describe a concept.

          You can feed the model known trusted info, have it process that material looking for specific things, and then use its output to reference back to the source material and find exactly where it came from. This would dramatically cut the time spent looking things up, without having to hunt for exact keywords, as long as it was used correctly and verified afterwards. It’s a tool to speed things up, not a replacement for a human brain.

          Unfortunately, people keep trying to use it as exactly that. And it doesn’t help that the people leading the hype are trying to sell it as exactly that.
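The retrieve-then-verify workflow described above can be sketched roughly like this. This is a toy illustration, not a real LLM pipeline: a simple keyword-overlap scorer stands in for the model, and all document names and function names are made up for the example. The point is the shape of the output: every answer comes back paired with its source, so it can be checked against the original material.

```python
# Toy sketch of "retrieve from trusted material, then cite the source".
# The scoring here is naive keyword overlap, standing in for a real
# retrieval model; document names are illustrative only.

def retrieve(query, documents):
    """documents: list of (source_id, passage) pairs from trusted material.

    Returns the best-matching passage together with its source id,
    so the result can be verified against the original document.
    """
    terms = set(query.lower().split())
    best = max(
        documents,
        key=lambda doc: len(terms & set(doc[1].lower().split())),
    )
    source, passage = best
    # Always return the source alongside the text: the human verifies.
    return {"passage": passage, "source": source}

docs = [
    ("IPCC AR6 WG1 SPM", "Human influence has warmed the climate at an unprecedented rate."),
    ("NASA climate FAQ", "Carbon dioxide traps heat in the atmosphere."),
]

result = retrieve("what traps heat in the atmosphere", docs)
print(result["source"])  # points you back to the trusted document
```

The design choice worth noting is that the tool never answers "from memory"; it only surfaces passages from a curated set, which is what makes the afterwards-verification step possible.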

          • silence7@slrpnk.net (mod) · 1 year ago

            If you don’t have basic competence in a subject, you can’t really tell which sources are reliable. And since everybody starts out ignorant, there are always going to be a lot of people who can’t do that.

            You need to bootstrap with some basic knowledge and then have an epistemic model in order to get anywhere, and that’s pretty much impossible with the current generation of machine learning tools.

            • Solar Bear@slrpnk.net · 1 year ago

              That’s not true, though. If it were, nobody would be able to reliably learn something new without direct instruction from an experienced person. Basic media literacy is often all you need to get started.

              • silence7@slrpnk.net (mod) · 1 year ago

                A huge chunk of the population doesn’t have basic media literacy though, and for that population, trying to use an LLM probably makes things worse.

  • cxtinac@sh.itjust.works (OP) · 1 year ago

    OP here. CLARIFICATION: First apologies, I should have framed this better originally, I will try to correct that here*.

    My purpose in interacting with ChatGPT and asking about cognitive dissonance (c.d. hereafter) related to climate change denial was not to look for a clinical definition of c.d., nor to test how much ChatGPT “knew” about it, nor to illustrate how cute LLMs could be. It was only to get a working starting point for a discussion.

    My goal was, given a quick, reasonable “average user” working definition of c.d., to a) relate it to climate change denial, and b) ask what those of us who are not “deniers” (for want of a better label) could do to attempt to get beyond c.d. amongst those who are.

    Opinion: It seems to me that we have to somehow try to go beyond publicising link after link on the apocalyptic impacts of climate change, although those impacts clearly are apocalyptic, and the facts are still extremely important to get out there. Honestly, I feel I have to work towards some closure of the “c.d. gap”, which I believe is much more widespread than just out-and-out deniers (and includes myself).

    @Mod /u/silence7@slrpnk.net, I hope I’m not out of line here, if so, I apologise, please be gentle.

    *I considered editing the original post to add this, but I do not want to ‘move the goal posts’ on the discussion, hence added as another comment.