• Glide@lemmy.ca · 10 months ago

    A 1080TI still plays every release at medium or higher settings. /shrug

    Unless you’re worried about 4k or VR, I wouldn’t upgrade anyway.

    • schmidtster@lemmy.world · 10 months ago

      If you care about frame rate, it matters. Not a lot of people can stand 30–40 fps with hard drops to single digits just to be able to play a game.

      • Glide@lemmy.ca · 10 months ago

        I’m curious: what game do you think drops to single-digit fps on medium settings with a 1080 Ti?

        I was playing Darktide on a 1060 with minimum 30fps recently, and that game is optimized like absolute trash.

        • schmidtster@lemmy.world · 10 months ago

          Starfield.

          The 1080 is the minimum card, and the Ti is decently more powerful (about 30%), but you’ve got to make concessions even on medium to hit 30 fps, and there are still drops.

          • Viking_Hippie@lemmy.world · 10 months ago

            To be fair though, that’s a VERY new game and they cared so little about optimizing it that they went out and said “you’re probably going to need a new computer to play this” …

            I hear the 1080ti runs Doom just fine 😛

            • schmidtster@lemmy.world · 10 months ago

              Every release means every release, and the requirements aren’t going to get lower. It’s a great card and I know people hate losing it, but it’s on its last legs and likely won’t be able to play new releases at all next year.

          • Perfide@reddthat.com · 10 months ago

            That’s Bethesda’s fault. There is no fucking reason that game can’t run well on a 1080ti for how mediocre it looks.

        • Psaldorn@lemmy.world · 10 months ago (edited)

          CoD MW2/MW3 are total crapshoots with frame rates; even on a 3080 set to performance, it can still just turn to crap. It seems to run more stably, at higher settings, on a 2070 laptop. I don’t understand. (I tried to get as much hardware running DMZ as possible for friends and family, so I’ve tested lots of machines.)

    • banneryear1868@lemmy.world · 10 months ago

      I actually sometimes prefer crisp edges without so many post-processing effects. Source engine games look great to me: minimal, crisp, clean geometry. I find a lot of modern graphics distracting, but it depends on the game. I do love really pushing graphics in a game like Skyrim.

      • spader312@lemmy.world · 10 months ago

        Modern game engines don’t use the amazing SSAA (super-sampling anti-aliasing). Most use post-processing anti-aliasing like FXAA or TXAA, which always makes edges look fuzzy. Source engine is one of those that still supports super sampling.

        • banneryear1868@lemmy.world · 10 months ago

          Yeah, that’s exactly it. MSAA isn’t too bad, but FXAA makes edges look pretty blurry. Temporal anti-aliasing also looks really blurry sometimes, while giving the impression that the edges could be crisp.
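The distinction above is easy to see in a toy example: SSAA renders at N× resolution and averages blocks of real samples down, whereas FXAA/TAA blur the already-rendered image. A minimal NumPy sketch of the SSAA downsampling step (the function name and the 8×8 test edge are mine, purely for illustration):

```python
import numpy as np

def ssaa_downsample(img, factor):
    """Average factor x factor blocks of samples: the core of super-sampling AA.
    The scene is rendered at `factor` times the target resolution, then each
    output pixel is the mean of a block of genuine shaded samples."""
    h, w = img.shape[0] // factor, img.shape[1] // factor
    blocks = img[:h * factor, :w * factor].reshape(h, factor, w, factor)
    return blocks.mean(axis=(1, 3))

# A hard diagonal edge rendered at 4x the target resolution...
hi = np.zeros((8, 8))
hi[np.tril_indices(8)] = 1.0

# ...downsamples to a 2x2 image whose edge pixels take intermediate values,
# i.e. the jagged step is genuinely smoothed, not just blurred after the fact.
lo = ssaa_downsample(hi, 4)
print(lo)
```

The cost, of course, is rendering 16× the pixels for 4× SSAA, which is why engines moved to cheap post-process AA in the first place.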

    • Khrux@ttrpg.network · 10 months ago (edited)

      My 980 Ti is still a toss-up between amazing and mediocre performance. The big issue is that I bought it for £600, which is a lot of money to me, and new GPUs cost three times that or more.

  • HexesofVexes@lemmy.world · 10 months ago

    I’ve said it before and I’ll say it again, when humanity is wiped out future species will find a Nokia on half battery and a fully working 1080Ti.

    • adhocfungus@midwest.social · 10 months ago

      My 960 runs Unity games like Overcooked at 4k, so I probably won’t be upgrading any time soon. With a toddler I don’t have time for AAA games anymore, but I’m guessing the frame rate would be painful.

    • crashoverride@lemmy.world · 10 months ago

      1080 here; got it used for like 220 before the graphics card price nightmare. I may look at getting a 2080 or 2070. I wonder if I can do that on a 550 or 600 watt PSU.
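For the wattage question, here's a back-of-the-envelope estimate. The GPU numbers are Nvidia's reference TDPs; the CPU and rest-of-system figures are assumptions, so treat this as a rough sanity check rather than a guarantee (transient power spikes and PSU quality matter too):

```python
# Rough PSU headroom estimate. GPU TDPs are reference-design values;
# CPU and rest-of-system draws are assumed placeholder figures.
gpu_tdp = {"GTX 1080": 180, "RTX 2070": 175, "RTX 2080": 215}  # watts
cpu_tdp = 95          # assumed mid-range desktop CPU
rest_of_system = 75   # board, RAM, drives, fans (rough guess)

for card in ("RTX 2070", "RTX 2080"):
    load = gpu_tdp[card] + cpu_tdp + rest_of_system
    headroom_550 = 550 - load
    print(f"{card}: ~{load} W under load, ~{headroom_550} W headroom on 550 W")
```

By this rough math, even a 2080 lands well under 550 W at full load, so a decent-quality 550 W unit should be fine.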

  • Coaster@lemmy.world · 10 months ago

    Having to run −400 MHz on the VRAM to prevent mine from crashing all the time, but I’m hanging in there 🥲👍

      • Coaster@lemmy.world · 10 months ago

        I thought it was driver problems at first because it happened so infrequently, but it has gotten worse this year.

        It took ages to narrow it down to memory faults since I’ve been running stock settings forever. Stumbled upon this tool: https://github.com/GpuZelenograd/memtest_vulkan/

        Found a load of errors, which went away after the downclock, and it’s been stable since. It must just be age-related degradation; temps were never high, but I know the repeated temperature swings have an impact.
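For anyone wanting to reproduce this, a rough sketch of the process on Linux. The build step assumes a working Rust toolchain (the repo also publishes prebuilt binaries), and the `nvidia-settings` attribute applies only to the proprietary driver on X11 with Coolbits enabled; the performance-level index `[3]` varies by card:

```shell
# Test VRAM for errors with memtest_vulkan
git clone https://github.com/GpuZelenograd/memtest_vulkan
cd memtest_vulkan
cargo build --release
./target/release/memtest_vulkan    # runs until interrupted; reports memory errors

# Apply a negative VRAM clock offset (proprietary Nvidia driver, X11).
# Note the offset is specified in transfer-rate MHz, and the performance-level
# index in brackets differs between GPU generations.
nvidia-settings -a "[gpu:0]/GPUMemoryTransferRateOffset[3]=-400"
```

On Windows the equivalent knob is the memory-clock offset slider in MSI Afterburner, as mentioned further down the thread.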

    • Max17@lemmy.world · 10 months ago

      Same problem as mine! I thought I was the only one. I’m using MSI Afterburner and turning the power limit down to 90%, sigh.

  • Kadath (she/her)@lemmy.world · 10 months ago

    Depends on the games. My 980TI can still rock 3440x1440 in most of the games I play.

    The fact that what I play is mostly metroidvania shouldn’t be an issue, right? 😅

    • smooth_jazz_warlady@lemmy.blahaj.zone · 10 months ago

      My 980 Ti still holds up pretty well at 1920x1440 (high-end CRT monitors were beautiful things; restart production, you cowards) for most 3D games I play on Linux, but it’s starting to have performance issues in some games, and I’m getting real sick and tired of the dumb shit Nvidia keeps pulling with their Linux drivers. The current driver gives me horrible black flickering in a lot of games, and of course it arbitrarily locks me out of maxing out my CRT monitor (CRTs don’t have a fixed resolution, only a balance of resolution vs. refresh rate, and the driver keeps blocking a whole range of refresh-rate/resolution combinations). So I confess I’m starting to eye the higher-end AMD 6000-series GPUs, and I would definitely try to grab one as cheaply as I could if I ever got a 3440x1440 ultrawide.
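For the record, on X11 you can sometimes work around those mode restrictions with a custom modeline. A sketch, assuming an output named `DVI-I-1` (check `xrandr` output for the real name); note the proprietary Nvidia driver may still reject out-of-EDID modes unless mode validation is relaxed in xorg.conf (e.g. `Option "ModeValidation" "AllowNonEdidModes"`):

```shell
# Generate CVT timings for 1920x1440 @ 85 Hz
cvt 1920 1440 85
# cvt prints a "Modeline" line; feed its numbers to xrandr:
xrandr --newmode "1920x1440_85" ...   # paste the numbers cvt printed here
xrandr --addmode DVI-I-1 "1920x1440_85"
xrandr --output DVI-I-1 --mode "1920x1440_85"
```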

      Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?

      • Kadath (she/her)@lemmy.world · 10 months ago

        Incidentally, how are ultrawides for having two or three windows open side-by-side at the same time?

        Awesome. For work (even though I’m a Linux system engineer) I need to use W11 due to corporate policy. I have two 34" in landscape and a 27" in portrait, and I split the screens with FancyZones.

        Time for my bad drawing skills, lol.

        In order:

        1. SSH
        2. SSH
        3. SSH
        4. Outlook
        5. Edge for work
        6. Teams
        7. Firefox with YouTube running. Firefox is the only browser that allows for in-window full screen.
        • smooth_jazz_warlady@lemmy.blahaj.zone · 10 months ago

          I see

          I’m debating getting a 3440x1440 monitor for coding, and because I hear they work well with tiling window managers (hence the question). It’s just annoying that I have almost no chance to try them out for free, and the cost is enough that I wouldn’t get one without serious consideration first. Although you have nudged me a bit closer to “maybe I could get one without testing, if it’s second-hand and cheap(er).”

          Also, I’d be replacing my existing 27-inch LCD with it and keeping the 4:3, 21-inch CRT, for a highly cursed monitor setup where everything gets letterboxed or pillarboxed. And then, to make things worse, I could grab a 16:10 monitor to put in portrait beside one of the other two, for maximum “what is 16:9 and why do I have black bars on everything.”

    • MystikIncarnate@lemmy.ca · 10 months ago

      I had the 1060 3GB version and it just couldn’t hack it anymore. Picked up a 20-series card this year and it was such an improvement.

        • MystikIncarnate@lemmy.ca · 10 months ago

          Yeah. I’m about 90% sure that if I had gotten the 1060 6G and I hadn’t gotten a really good deal on a 2080, I’d probably still be using the 1060.

          For now though, I don’t suspect I’ll be replacing the 2080 anytime soon… So when the 50 series comes out, this meme will be me with my 2080.

    • JPSound@lemmy.world · 10 months ago

      Same here. Works great and does all I need it to. It would be nice to have a new GPU but I’m driving this one until the wheels fall off.

  • AdrianTheFrog@lemmy.world · 10 months ago

    It’s still a decent card, probably can still do well at 1080p max settings in most games. Very similar to a 3060 in terms of performance, which is the card I have.

    • daq@lemmy.sdf.org · 10 months ago

      I’m playing D4 at 4K on medium settings. No complaints. This is on a three-year-old laptop with a 1080 Ti in an external enclosure hooked up via Thunderbolt.

  • NaoPb@eviltoast.org · 10 months ago

    This is how I’ve always felt, running on old second hand unstable rigs. But it’s better than nothing.

    • weeeeum@lemmy.world · 10 months ago

      Haha, bold to assume new rigs are stable. Speaking from a Ryzen 7 7800X3D and 7900 XTX: the thing can barely fucking boot every time I power it on.