Full doc title: “The AI Doc: Or How I Became an Apocaloptimist”

Per wiki:

The AI Doc: Or How I Became an Apocaloptimist is a 2026 American documentary film directed by Daniel Roher and Charlie Tyrell. It is produced by the Academy Award-winning teams behind Everything Everywhere All at Once (Daniel Kwan and Jonathan Wang) and Navalny (Shane Boris and Diane Becker).

What to say here? This is a doc being produced by the producer and one of the directors of Everything Everywhere All at Once, who notably have been making efforts to, uh, negotiate? I guess? with AI companies vis-à-vis making movies. Anyway, the title is a piece of shit, and the trailer makes it look like this is just critihype: the movie. I guess we’ll hear more about it in the coming month.

Really interesting that they frame this as brought about by thinking about the director’s child, given Yud’s recent comments about how one should raise a daughter if one had certain beliefs about AI.

  • lurker@awful.systems · 7 points · 3 days ago

    my god I just cringed so hard. I thought the book would be the end….

    Also yeah, someone pointed this out on old SneerClub, but Yud loves using kids to illustrate his AI fears, and to beat a very dead horse here, that’s a weird thing to do in his case

    If anyone here wants to jump on the grenade and watch it/acquire a transcript for the rest of us to sneer at, you’ll be my hero

    • lurker@awful.systems · 6 points · 3 days ago

      also, what the fuck does “apocaloptimist” mean??? does it mean he’s optimistic about our chances of apocalypse??? (which makes no sense, just say pessimist) or has he finally gone crazy and is now saying that apocalypse is the optimistic outcome?

      • Architeuthis@awful.systems · 6 points · 2 days ago

        I mean, they mostly don’t have a problem with AI instances inheriting the earth as long as they’re sufficiently rationalist.

      • swlabr@awful.systems (OP) · 6 points · 3 days ago

        Pure speculation: my guess is that an “apocaloptimist” is just someone fully bought into all of the rationalist AI delulu. Specifically, someone who believes:

        • AGI is possible
        • AGI will solve all our current problems
        • A future where AGI ends humanity is possible/probable

        and who takes the extra leap, steeped in the grand tradition of liberal optimism, of believing that we will solve the alignment problem and everything will be ok. Again, just guessing here.

        • Soyweiser@awful.systems · 7 points · edited · 2 days ago

          According to a site: https://apocaloptimist.net/the-apocaloptimist/

          “An Apocaloptimist sees the trouble, but is optimistic we can do anything–including fixing all the world’s problems”

          So if Jesus wins the war during the second coming, all problems are fixed.

          (The thing is also nuts: “we are the people actually working on fixing things [by hoping AGI will fix it all for us]”. My brother in Eschatology, you are running a podcast. Sorry, the guy is unrelated to the AGI people, they are just using his term.)

          E: it does seem the site itself isn’t about AI, so they just stole this guy’s term. Nope, they just took this clean energy guy’s term. Sorry about sneering at him; he seems to actually want to introduce clean energy and works hard for it (though that seems to be a lot of conventions and blogging, so buying ourselves out of the capitalist problems), as far as I can tell.

        • lurker@awful.systems · 5 points · 2 days ago

          my personal guess is that “apocaloptimist” is just them trying to make a “better” term for “pessimist”