• CluckN@lemmy.world

    Pshh, I’m working on an AI blockchain cloud-based customer-first smart-learning adaptive agile Air Fryer that will blow the competition away.

  • Pennomi@lemmy.world

    Me too, but I make pathfinding algorithms for video game characters. The truly classic Artificial Intelligence.
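
    (If anyone’s curious what that “classic AI” looks like in practice, here’s a minimal sketch of A* pathfinding on a grid. Purely illustrative Python, not anyone’s actual game code; the grid format and function names are made up for the example.)

    ```python
    import heapq, itertools

    def astar(grid, start, goal):
        """A* on a 2D grid: 0 = walkable, 1 = blocked. Returns a list of (row, col) cells or None."""
        def h(a, b):                      # Manhattan-distance heuristic
            return abs(a[0] - b[0]) + abs(a[1] - b[1])

        tie = itertools.count()           # tie-breaker so the heap never has to compare cells/parents
        frontier = [(h(start, goal), next(tie), 0, start, None)]
        came_from, best_g = {}, {start: 0}
        while frontier:
            _, _, g, cell, parent = heapq.heappop(frontier)
            if cell in came_from:         # already expanded via a cheaper route
                continue
            came_from[cell] = parent
            if cell == goal:              # walk parents back to the start to rebuild the path
                path = []
                while cell is not None:
                    path.append(cell)
                    cell = came_from[cell]
                return path[::-1]
            r, c = cell
            for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
                if 0 <= nr < len(grid) and 0 <= nc < len(grid[0]) and grid[nr][nc] == 0:
                    ng = g + 1
                    if ng < best_g.get((nr, nc), float("inf")):
                        best_g[(nr, nc)] = ng
                        heapq.heappush(frontier, (ng + h((nr, nc), goal), next(tie), ng, (nr, nc), cell))
        return None                       # no route exists

    # e.g. astar([[0, 0, 0], [1, 1, 0], [0, 0, 0]], (0, 0), (2, 0)) walks around the wall
    ```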

    • bobotron@lemm.ee

      I still remember fighting grunts in the original Half-Life for the first time and being blown away. Your work makes games great!

      • barsoap@lemm.ee

        It’s been a while since I looked at how Valve does it, but it could be called a primitive expert system. And while the HL1 grunts were extraordinary for their time, HL2’s Combine grunts are still pretty much the gold standard. Without the AI leaking information to the player via radio chatter, it would feel very much like the AI is cheating, because yes, HL2’s grunts are better at tactics than 99.99% of humans. It also helps that you’re a bullet sponge, so them outsmarting you, like leading you into an ambush, doesn’t necessarily mean you’re done for.

        OTOH they’re a couple of pages of state machines that would have no idea what to do in the real world.
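
        (Roughly the flavor of thing “a couple of pages of state machines” means, as a toy Python sketch. Not Valve’s actual code; the states and transition conditions here are invented for illustration.)

        ```python
        from enum import Enum, auto

        class State(Enum):
            IDLE = auto()
            ATTACK = auto()
            TAKE_COVER = auto()
            FLANK = auto()

        def next_state(state, sees_player, under_fire, squad_flanking):
            """Toy combat-grunt brain: pick the next state from a few booleans each tick."""
            if state is State.IDLE:
                return State.ATTACK if sees_player else State.IDLE
            if state is State.ATTACK:
                if under_fire:
                    return State.TAKE_COVER
                return State.FLANK if squad_flanking else State.ATTACK
            if state is State.TAKE_COVER:
                return State.TAKE_COVER if under_fire else State.FLANK
            if state is State.FLANK:
                return State.ATTACK if sees_player else State.IDLE
            return State.IDLE
        ```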

        Also, for the record: “AI” in gamedev basically means “autonomous agent in the game world not controlled by the player”. A “follow the ball” algorithm (it can hardly be called an algorithm) playing Pong against you is AI in that sense. Machine learning approaches are quite rare, and when they are used it’s something like NEAT, not the gazillion-parameter neural nets used for LLMs and diffusion models. If you tell NEAT to, say, drive a virtual car, it’ll spit out a network with a couple of neurons that’s very good at that and useless for anything else, but that doesn’t matter: you have an enemy AI for your racer. Which is probably even too good, again, so you have to nerf it.
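
        (And the Pong “AI” really is about this much code; a toy sketch, with the paddle/ball coordinates invented for the example.)

        ```python
        def pong_ai(paddle_y, ball_y, max_speed=4.0):
            """The entire 'enemy AI' for Pong: move the paddle toward the ball, speed-capped."""
            delta = ball_y - paddle_y
            # the cap is the difficulty knob: a low max_speed is how you make it beatable
            return max(-max_speed, min(max_speed, delta))

        # each frame: paddle_y += pong_ai(paddle_y, ball_y)
        ```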

  • Ekky@sopuli.xyz

    LLMs (or really ChatGPT and MS Copilot) having hijacked the term “AI” is really annoying.

    In more than one questionnaire or discussion:

    Q: “Do you use AI at work?”

    A: “Yes, I make and train CNN models (to find and label items in images), etc.”

    Q: “How has AI influenced your productivity at work?”

    A: ???

    Can’t mention AI or machine learning in public without people instantly thinking of LLMs.

    • smeg@feddit.ukOP

      I imagine this is how everyone who worked in cryptography felt once cryptocurrency claimed the word “crypto”

      • Ekky@sopuli.xyz

        Luckily that was only the abbreviation and not the actual word. I know language changes constantly, but I still find it annoying when a properly established and widely (within reason) used term gets appropriated and hijacked.

        I mean, I guess it happens all the time with fiction, and in the sciences you sometimes run into a situation where an old term just does not fit new observations, but please keep your slimy, grubby, way-too-adhesive klepto-grappers away from my perfectly fine professional umbrella terms. :(

        Please excuse my rant.

      • dvlsg@lemmy.world

        I’m still mad that ML was stolen and doesn’t make people think about the ML family of programming languages anymore.

    • funkless_eck@sh.itjust.works

      I had a first-stage interview with a large multinational construction company where I’d be “the only person in the organization sanctioned to use AI”.

      They meant: use ChatGPT to generate blog posts.

      • MonkeMischief@lemmy.today

        “That’s some high security clearance to have a computer rapidly tap auto-complete for entire paragraphs, hoss… wait, it pays how much? (Ahem) I shall take this solemn responsibility of the highest order so very seriously!” Lol

    • marcos@lemmy.world

      We are just taking “crypto” back to mean something useful. It was just a matter of some stupid people losing enough money.

      I hope in a few years we can take “AI” back too.

    • explodicle@sh.itjust.works

      They’re just betting on what will get bailed out because of their own bribes. It’s pure feedback at this point; all noise and no signal.

  • Waldowal@lemmy.world

    Speaking as an older developer: you could replace “machine learning” with “statistical modeling” and “artificial intelligence” with “machine learning”.

    • QuaternionsRock@lemmy.world

      I think people are hesitant to call ML “statistical modeling” because traditional statistical models approximate the underlying phenomena; e.g., a logarithmic regression would only be used to study logarithmic phenomena. ML models, by contrast, seldom resemble what they’re actually modeling.
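
      (A minimal sketch of the “traditional” side of that distinction: fitting y = a*ln(x) + b with scipy. Illustrative only; the data below is synthetic and the parameter values are arbitrary.)

      ```python
      import numpy as np
      from scipy.optimize import curve_fit

      # synthetic data generated from a known logarithmic law plus noise
      rng = np.random.default_rng(0)
      x = np.linspace(1, 100, 200)
      y = 2.5 * np.log(x) + 1.0 + rng.normal(scale=0.2, size=x.size)

      def log_model(x, a, b):
          """The model mirrors the assumed phenomenon: y = a * ln(x) + b."""
          return a * np.log(x) + b

      (a, b), _ = curve_fit(log_model, x, y)
      print(f"fitted a={a:.2f}, b={b:.2f}")  # recovers roughly a=2.5, b=1.0
      ```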

    • cmfhsu@lemmy.world

      We used to reserve “AI” for automatically/programmatically making a decision based on an ML model, but I’m guilty of calling it AI just for the wow factor, lol.

      Now I have to be careful because AI = LLMs in common language.

  • tory@lemmy.world

    My old coworker used to say this all the time back around 2018:

    "What’s the difference between AI and machine learning?

    Machine learning is done in Python. AI is done in PowerPoint."

  • Emptiness@lemmy.world

    There needs to be a new taunting community for this MLM, similar to the one for #buttcoin.

  • FaceDeer@fedia.io

    Machine learning is a subset of artificial intelligence, so I don’t see anything wrong here. The character’s using a more generic term when talking to a layperson.

    • apocalypticat@lemmy.world

      I think the point they’re making is that they used the latest buzzword for the people dishing out the dough.

      • FaceDeer@fedia.io

        Yes, and I’m saying there’s nothing wrong with that “buzzword.” It’s accurate, just more generic.

        I see a lot of people these days objecting that LLMs and whatnot “aren’t really artificial intelligence!” because they’re operating from the definition of artificial intelligence they got from science fiction TV shows, where it’s not AI unless it replicates or exceeds human intelligence in all meaningful ways. The term has been widely used in computer science for 70 years, though, applying to a broad range of subjects. Machine learning is clearly within that range.

        • Ephera@lemmy.ml

          There’s a distinction between “narrow AI” and “Artificial General Intelligence” (AGI).

          AGI is the sci-fi kind of AI, whereas narrow AI is only intelligent within one task, like a pocket calculator or a robot arm or an LLM.

          And as you point out, saying that you’re doing narrow AI is absolutely not interesting. So I think it’s fair enough that people assume that when “AI” is used as a buzzword, it doesn’t mean the pocket-calculator kind.

          Not to mention that e.g. OpenAI explicitly states that they’re working towards AGI.

          • exocrinous@startrek.website

            If I built a robot pigeon that can fly, scavenge for crumbs, sing mating calls, and approximate sex with other pigeons, is that an AGI? It can’t read or write or talk or compose music or draw or paint or do math or use the scientific method or debate philosophy. But it can do everything a pigeon can. Is it general or not? And if it’s not, what makes human intelligence general in a way that pigeon intelligence isn’t?

  • Ephera@lemmy.ml

    Whenever people say “AI”, I like to mentally insert an M, G and C: ✨Magic✨

    Or as it’s also known:
    ✨I don’t want to explain what I actually did, so here’s a meaningless word to stop you asking questions.✨

  • Socsa@sh.itjust.works

    One guy spends a summer implementing a backprop algorithm in CUDA and now my mom thinks butterflies are stealing her blood at night.