• Domi@lemmy.secnd.me · 8 months ago

        AMD’s ROCm stack is fully open source (except for the GPU firmware blobs). It’s not as good as Nvidia’s yet, but it’s decent.

        Mesa also has its own OpenCL stack, but I haven’t tried it yet.

          • Domi@lemmy.secnd.me · 8 months ago

            It does not.

            ROCm runs directly on top of the open-source amdgpu kernel module; I use it every week.

            • Possibly linux@lemmy.zip · 8 months ago

              How, and with what card? I have an XFX RX 590 and I just gave up on acceleration, since it was slow even after I initially got it set up.

              • Domi@lemmy.secnd.me · 8 months ago

                I use a 6900 XT and run llama.cpp and ComfyUI inside Docker containers. I don’t think the RX 590 is officially supported by ROCm; there’s an environment variable you can set to enable unsupported GPUs, but I’m not sure how well it works.

                AMD provides the handy rocm/dev-ubuntu-22.04:5.7-complete image, which is absolutely massive but comes with everything needed to run ROCm without dependency hell on the host. I just build llama.cpp and ComfyUI containers on top of that and run them.
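
                For reference, the environment variable is most likely HSA_OVERRIDE_GFX_VERSION (I’m going from memory here). A minimal Python sketch, assuming a ROCm build of PyTorch inside the container, to check whether the card is actually visible:

                  # Minimal check that a ROCm build of PyTorch can see the GPU.
                  # ROCm wheels reuse the torch.cuda API, so these calls work unchanged.
                  import torch

                  # For cards ROCm doesn't officially support you can try overriding the
                  # detected architecture before importing torch, e.g.:
                  #   os.environ["HSA_OVERRIDE_GFX_VERSION"] = "10.3.0"
                  # The value above is just an example (RDNA2), not a tested RX 590 setting.

                  if torch.cuda.is_available():
                      print("GPU:", torch.cuda.get_device_name(0))  # e.g. "AMD Radeon RX 6900 XT"
                  else:
                      print("No ROCm device visible; check /dev/kfd and /dev/dri passthrough")

                The container also needs /dev/kfd and /dev/dri passed through (the usual --device flags on docker run), otherwise ROCm won’t see the card at all.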

  • db2@lemmy.world · 8 months ago

    I can’t wait for this bullshit AI hype to fizzle. It’s getting obnoxious. It’s not even AI.

  • AutoTL;DR@lemmings.world (bot) · 8 months ago

    This is the best summary I could come up with:


    Ryzen AI is beginning to work its way out to more processors, but until now it hasn’t been supported on Linux.

    Then in October, AMD said it wanted to hear customer requests around Ryzen AI Linux support.

    Well, today they did their first public code drop of the XDNA Linux driver for providing open-source support for Ryzen AI.

    The XDNA driver will work with AMD Phoenix/Strix SoCs so far having Ryzen AI onboard.

    AMD has tested the driver to work on Ubuntu 22.04 LTS but you will need to be running the Linux 6.7 kernel or newer with IOMMU SVA support enabled.

    In any event, I’ll be working on getting more information about their Ryzen AI / XDNA Linux plans for future article(s) on Phoronix, as well as trying this driver out once the software support expectations are known.


    The original article contains 280 words, the summary contains 138 words. Saved 51%. I’m a bot and I’m open source!

  • ProgrammingSocks@pawb.social · 8 months ago

    A+ timing, I’m upgrading from a 1050 Ti to a 7800 XT in a couple of weeks! I don’t care too much for “ai” stuff in general but hey, an extra thing to fuck around with for no extra cost is fun.

    • Dremor@lemmy.world · 8 months ago

      Unfortunately not.

      “The XDNA driver will work with AMD Phoenix/Strix SoCs so far having Ryzen AI onboard.” So it’s only mobile SoCs with dedicated AI hardware for the time being.

      • Harbinger01173430@lemmy.world · 8 months ago

        Welp… I guess Radeon will keep being a GPU for gaming only instead of productivity as well. Thankfully, I no longer need to use my GPU for productivity stuff.

  • Pantherina@feddit.de · 8 months ago

    Is that the stuff used on servers, or just for small tasks on laptops? Because if it’s on servers, anything else would be stupid.