• chaosCruiser@futurology.today · 6 months ago

      Optical Character Recognition used to be cutting-edge AI buzz in the 70s and 80s. Eventually, it got applied to all sorts of places, so OCR kinda lost some of its magic and sparkle. After that, people stopped thinking of it as AI, even though it relies on a neural network.

      • JohnEdwa@sopuli.xyz · 6 months ago

        A simple line of code that goes “if moisture < 0.25 then loaddone” or “water = weight * 0.43” isn’t AI, true.
        But when you start stacking enough of them, with the goal and result being “we could get a chef to check how the pizza is doing every few seconds and control all the different temperatures of this oven until it’s perfectly done, but we’ve made a computer algorithm that manages to do that instead”, then it’s quite hard to argue it isn’t software that is “performing a task typically associated with human intelligence, such as … perception, and decision-making.”
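
        Something like this hand-rolled controller is what I mean by stacking if clauses - every threshold and sensor name here is made up purely to illustrate the idea, not how this oven actually works:

        # Hypothetical rule-based oven controller: hand-tuned thresholds, no learning.
        def estimate_water(weight):
            # the simple water formula from above
            return weight * 0.43

        def control_step(moisture, surface_temp, base_temp):
            # Pick heater settings and a "done" flag from hand-written rules.
            if moisture < 0.25:
                return {"top_heat": 0.0, "bottom_heat": 0.0, "done": True}
            top = 0.3 if surface_temp > 230 else 0.8      # crust browning too fast?
            bottom = 1.0 if base_temp < 200 else 0.5      # base still pale?
            return {"top_heat": top, "bottom_heat": bottom, "done": False}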

        Especially if that algorithm wasn’t made (I have no idea if it was in this case, btw) by just stacking those if clauses and testing stuff manually until it works, but by using machine learning to analyze a mountain of baking data and create a neural network that does it itself. Because at that point, it definitely is artificial intelligence - it’s not an artificial general intelligence, which many people think is the only kind of “true AI”, but it is an AI.
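
        For comparison, a rough and purely illustrative sketch of that machine-learning route - fit a small neural network on logged baking data so it learns the control behaviour itself. The features and the “log” below are invented stand-ins, since I have no idea what data this oven actually records:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)

        # Stand-in baking log: sensor readings and the heater settings a chef picked.
        # Feature columns: moisture, surface_temp (°C), base_temp (°C).
        X = rng.uniform([0.10, 150.0, 150.0], [0.60, 280.0, 260.0], size=(500, 3))
        y = np.column_stack([
            np.clip((250.0 - X[:, 1]) / 100.0, 0.0, 1.0),   # fake chef's top-heat choice
            np.clip((230.0 - X[:, 2]) / 80.0, 0.0, 1.0),    # fake chef's bottom-heat choice
        ])

        # Small neural network mapping sensor readings to heater power.
        model = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=2000, random_state=0)
        model.fit(X, y)

        # Heater settings for one new sensor reading.
        print(model.predict([[0.30, 240.0, 190.0]]))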