• 0 Posts
  • 1 Comment
Joined 1 year ago
Cake day: November 1st, 2023

  • If you are talking about AI and gaming workloads, you are going to need a GPU, so form factor is important. I squeezed a 1050 Ti into my Dell R720xd for a gaming VM a few years ago, and it was great, but nowhere near what is needed for running even simple inference on open-source models. I’d consider more of a gaming rig for AI workloads. See /r/localllama

    I ended up picking up last year’s top-of-the-line Mac Studio with 128 GB of shared memory, on sale at Microcenter, for AI inference workloads. I’m using its onboard 10GbE to connect to my R720 via an SFP+ to Cat 7/RJ45 transceiver, and I dig that. I have many secondary use cases for the Mac Studio, which is a pleasure to have at my desk. If I were a PC gamer, I might have gone that route.
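    For anyone weighing a small GPU against a big pool of shared memory for local inference, a rough back-of-the-envelope sketch (illustrative numbers only; real usage also includes KV cache, activations, and runtime overhead, and the function name here is just for illustration):

    ```python
    # Rough memory-footprint estimate for local LLM weights.
    # Hypothetical helper for illustration; not from any specific library.

    def model_weight_gb(params_billions: float, bits_per_weight: float) -> float:
        """Approximate weight memory in GB (1 GB = 1e9 bytes)."""
        return params_billions * 1e9 * bits_per_weight / 8 / 1e9

    # A 4 GB card (e.g. a 1050 Ti) vs 128 GB of shared memory:
    print(f"7B @ 4-bit:  {model_weight_gb(7, 4):.1f} GB")   # 3.5 GB -> fits neither comfortably on a 1050 Ti with overhead
    print(f"70B @ 4-bit: {model_weight_gb(70, 4):.1f} GB")  # 35.0 GB -> needs the shared-memory machine
    ```

    That is the basic reason a 128 GB shared-memory box can run model sizes a small discrete GPU simply cannot hold.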