4K is overkill enough. 8K is a waste of energy. Let’s see optimization be the trend in the next generation of graphics hardware, not further waste.
Yeah. Once games are rendering at 120 fps at native 6K, downscaled to an amazing-looking 4K picture, then maybe you could convince me it's time to get an 8K TV.
Honestly most people sit far enough from the TV that 1080p is already good enough.
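To put rough numbers on "far enough", here's a back-of-the-envelope sketch using the common one-arcminute-per-pixel acuity rule of thumb (the acuity figure and the 65-inch example size are assumptions, not anything measured here):

```python
import math

def max_useful_distance_m(diagonal_in, horizontal_px, aspect=16 / 9):
    """Distance beyond which a panel's pixels blur together, assuming
    roughly one arcminute of angular resolution for typical vision."""
    # Screen width in metres, derived from the diagonal and aspect ratio.
    width_in = diagonal_in * aspect / math.hypot(aspect, 1)
    pixel_m = width_in * 0.0254 / horizontal_px
    # Small-angle approximation: distance = pixel size / angle in radians.
    return pixel_m / math.radians(1 / 60)

for res, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f'65" {res}: pixels vanish beyond ~{max_useful_distance_m(65, px):.1f} m')
```

By that estimate a 65-inch 1080p panel is already at the eye's limit past roughly 2.6 m, 4K past roughly 1.3 m, and 8K only matters inside about 0.65 m, which is closer than almost anyone sits.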
I find 4K is nice on computer monitors because you can shut off anti-aliasing entirely without leaving jagged edges behind. 1440p isn't quite enough to get there.
Also, there are some interesting ideas among emulator writers about using those extra pixels to create more accurate CRT-like effects.
Oh yeah, I've read some very cool things about emulators being able to simulate the individual phosphors at 4K resolution. I've always been a sucker for clean, crisp pixels (that's what I was trying to achieve on the shitty old CRT I had for my SNES), so I haven't jumped into the latest CRT shaders myself.
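The core trick, as I understand it, looks something like this. A minimal numpy sketch of a phosphor-mask effect, purely illustrative and not taken from any particular shader (the 6x scale and the triad layout are assumptions):

```python
import numpy as np

def phosphor_mask(frame, scale=6):
    """Nearest-neighbour upscale an RGB frame, then tint each output column
    by its position inside the block so every source pixel becomes a
    red/green/blue phosphor triad. Purely illustrative."""
    up = frame.repeat(scale, axis=0).repeat(scale, axis=1).astype(np.float32)
    # Each third of the block passes mostly one channel, like an aperture grille.
    mask = np.zeros((scale, 3), dtype=np.float32)
    third = scale // 3
    mask[:third, 0] = 1.0           # left third of the block: red phosphor
    mask[third:2 * third, 1] = 1.0  # middle third: green
    mask[2 * third:, 2] = 1.0       # right third: blue
    mask = np.clip(mask + 0.15, 0.0, 1.0)  # slight bleed so gaps aren't pitch black
    columns = np.tile(mask, (up.shape[1] // scale, 1))  # one mask row per output column
    return (up * columns[None, :, :]).astype(np.uint8)

# A 256x224 SNES-sized frame becomes 1536x1344, which only really fits at 4K and up.
snes_frame = np.random.randint(0, 256, (224, 256, 3), dtype=np.uint8)
crt_like = phosphor_mask(snes_frame)
```

Real CRT shaders add scanlines, bloom, and curvature on top, but the point stands: you need a lot of physical pixels per emulated pixel before the mask reads as phosphors rather than noise.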
But anti-aliasing costs far less performance. And you have to mess about with scaling on a 4K monitor, which is always a pain. 1440p for life, IMHO.
Anti-aliasing also softens the image a bit. Image quality is better if you can leave it off.
*monkey’s paw curls*
Granted! Everything's just a 25%-scale internal render with massive amounts of TAA.
For TV manufacturers the 1K/4K/8K nonsense is a marketing trap of their own making - but it also serves their interests.
TV makers DON’T WANT consumers to easily compare models or understand what makes a good TV. Manufacturers profit mightily by selling crap to misinformed consumers.
Divide the resolution by 3, though: current-gen upscaling tech can give you that much. 4K = upscaled 720p and 8K = upscaled 1440p (the arithmetic is spelled out below).
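A quick sketch of that resolution math (treating the 3x linear factor as a given; it matches what's commonly quoted for DLSS's Ultra Performance mode, but that mapping is an assumption here):

```python
# Internal render resolution implied by a 3x linear upscale factor, as in
# the "divide resolution by 3" claim above. A 3x factor per axis means
# only 1/9 of the target pixels are actually rendered.
targets = {"4K": (3840, 2160), "8K": (7680, 4320)}
SCALE = 3  # linear factor per axis; assumed for "current gen" upscaling

for name, (w, h) in targets.items():
    iw, ih = w // SCALE, h // SCALE
    print(f"{name}: render {iw}x{ih} (~{ih}p), only {100 / SCALE**2:.0f}% of the pixels")
```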
can doesn’t mean should.
720p to 4K using DLSS is okay, but you start to see visual tradeoffs, strictly for the extra performance.
To me it really shines at 1080p to 4K, where it's basically indistinguishable from native for a still-large performance increase.
Or even 1440p to 4K, where it actually looks better than native with just a moderate performance increase.
For 8K the same logic holds: go for better-than-native or matching native visuals. There's no real need to go below native just to get more performance; at that point the hardware is mismatched.
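Rough pixel-count math backs up those tiers. Pixel count is only a crude proxy for render cost, and the pairings below are the ones from the comment above, nothing official:

```python
# Pixels rendered per frame at each internal resolution versus native 4K.
# Shows why 720p->4K buys so much more headroom than 1440p->4K.
pairs = [("720p", 1280, 720), ("1080p", 1920, 1080), ("1440p", 2560, 1440)]
TARGET_PX = 3840 * 2160  # native 4K

for name, w, h in pairs:
    ratio = TARGET_PX / (w * h)
    print(f"{name} -> 4K: {ratio:.2f}x fewer pixels rendered than native")
```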
Devs already use it instead of optimisation, so what makes you think bosses won't push it even further to hit deadlines and quarterly profits? Immortals of Aveum is an example, and we're not even at the end of the generation, only halfway through. (I agree with you from a user standpoint, though.)