Literally just mainlining marketing material straight into whatever’s left of their rotting brains.

  • sooper_dooper_roofer [none/use name]@hexbear.net · edited · 1 year ago

    To me, it seems plausible that simply increasing processing power for a sufficiently general algorithm produces sentience.

    How is that plausible? The human brain has more processing power than a snake's, which in turn has more than a bacterium's (equivalent of a) brain, yet those last two are still experiencing consciousness/sentience. Bacteria will look out for their own interests; will ChatGPT do that? No, ChatGPT is a perfect slave, just like every computer program ever written.

    ChatGPT is to a freshman-year "hello world" program what a human being is to an amoeba.
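
    For scale, the "freshman-year 'hello world' program" side of that analogy is, in its entirety, something like this (a minimal sketch in Python; the language doesn't matter):

        # a freshman-year "hello world" program: one instruction, then it stops.
        # it has no goals and no interests of its own; it only does what it was written to do.
        print("hello world")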

    A human is a sentience made up of trillions of unicellular consciousnesses. ChatGPT is a program made up of trillions of data points, but those are still just data points, which have no sentience or consciousness.

    Both are something much greater than the sum of their parts, but in a human's case those parts were sentient/conscious to begin with. Amoebas will reproduce and kill and eat just like us; our lung cells, nephrons, and so on are basically tiny specialized amoebas. ChatGPT doesn't…do anything. It has no will.