- cross-posted to:
- pcgaming@lemmy.world
cross-posted from: https://lemmy.world/post/11840660
TAA is a crucial tool for developers - but is the impact on image quality too great?
For good or bad, temporal anti-aliasing - or TAA - has become a defining element of image quality in today’s games, but is it a blessing, a curse, or both? Whichever way you slice it, it’s here to stay, so what is it, why do so many games use it and what’s with all the blur? At one point, TAA did not exist at all, so what methods of anti-aliasing were used and why aren’t they used any more?
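At its core, TAA renders each frame with a tiny sub-pixel camera jitter and blends the result into an accumulated history buffer, so edges are averaged over time. Here's a minimal sketch of that accumulation step in Python (values are illustrative; real engines also reproject the history with motion vectors and clamp it against the current frame's neighborhood to limit ghosting):

```python
ALPHA = 0.1  # weight of the newest frame; engines tune this per pixel

def taa_resolve(history: float, current: float) -> float:
    """Exponential moving average: the core of temporal AA.
    The low ALPHA is what smooths jittered edges over many frames --
    and also what softens the image and causes ghosting in motion."""
    return (1.0 - ALPHA) * history + ALPHA * current

# Toy demo: sub-pixel jitter makes an edge pixel's coverage alternate
# between 0 and 1 on successive frames; accumulation converges to the
# true ~50% coverage, i.e. a smooth (and slightly soft) edge.
history = 0.0
for frame in range(64):
    history = taa_resolve(history, float(frame % 2))
print(round(history, 3))  # hovers around 0.5
```

That blend is both the blessing and the curse: the same averaging that removes jaggies also spreads detail across frames, which is where the blur complaints come from.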
The first things I always turn off are motion blur, anti-aliasing and ray tracing.
Motion blur just makes it look like you're drunk, anti-aliasing makes everything look like it's smeared with Vaseline, and ray tracing tanks your FPS for not much added quality.
I don't think I could stomach a game without AA. It's on par with playing at an unstable 30fps; it's just nauseating.
Try playing Forza without AA. Ray tracing tanks your performance, but it gives great visual enhancements; once you experience it, there's no going back.
I don't really play racing games or Forza, so maybe it's unique to Forza or racing in general, but every RPG, action, adventure, strategy, survival, shooter, and sim game I have played looks worse with AA, and ray tracing is not worth cutting your FPS in half for.
You must not notice aliasing and shimmering, then? Most find it very distracting to see everything flicker, shimmer, and stair-step with the slightest motion.
And ray tracing really depends on the game, implementation, and hardware. Ray-traced global illumination alone fixes the classic video-game look that stems from rasterized lighting errors (light leaking, default ambient light, etc.). It is the future for high-quality games, even non-photorealistic ones. Its expense is offset by both reconstruction and improved hardware. You won't be able to avoid it forever, even if you want to.
It has gotten much better in the last 7 years. I will say that I usually test 1.5× or 2× my resolution if possible, which can be less taxing depending on the engine, as I'm always trying to eke out a little extra on my 970.
2× on a 970? I struggled with my 970 at 1440p on low-medium settings until I got the 3080, and often had to drop scaling to 1080p. And that was on "last gen" titles; I can't imagine still trying to limp that thing along nowadays, as much as I loved it.
Depends on the game, but I don't usually pick up current-gen titles for a bit. Unless you count Switch emulation?
Someone hasn’t tried motion blur since 2004 GTA
It’s like you’ve used each thing once in some specific game where it was badly implemented and decided that’s how it looks in all games.
There is no objective "it looks like this"; every game does things slightly or very differently. I'm certain you're either unusually blind to detail, have serious vision problems, or are just very good at convincing yourself of your own bad ideas.
There are actually a few Unreal Engine games where you can't disable AA in the settings. I have tried to play with it on, but I just end up disabling it in the ini files anyway because it looks bad. I have not encountered AA that doesn't make the game look blurry.
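For anyone wanting to do the same, the usual UE4 approach is a `[SystemSettings]` override in the game's Engine.ini (the exact path varies per game, typically somewhere under `Saved\Config\WindowsNoEditor\`). A sketch; cvar support differs by engine version:

```ini
[SystemSettings]
; UE4: 0 disables post-process AA (2 = FXAA, 4 = TAA)
r.PostProcessAAQuality=0
; UE5 equivalent (0 = none, 2 = TAA, 4 = TSR):
; r.AntiAliasingMethod=0
```

Worth knowing that some games re-enable TAA-dependent effects oddly without it (dithered hair, screen-space reflections), so results vary per title.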
I have never met anyone who doesn't just disable motion blur outright, so I didn't think anyone would ever defend it.
Same. I also disable things like film grain and lens flares whenever possible.
I always have film grain enabled. It provides some half-decent dithering that helps mask color banding, which is especially noticeable on my low-end monitor.
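There's real signal-processing behind that: adding a little noise before the image is quantized turns structured banding steps into fine, unstructured grain that averages out to the true gradient. A minimal numpy sketch of the idea (illustrative only, not any particular game's grain pass):

```python
import numpy as np

LEVELS = 8  # deliberately coarse so banding is obvious
q = 1.0 / (LEVELS - 1)  # quantization step size

gradient = np.linspace(0.0, 1.0, 1920)  # a smooth horizontal ramp

# Without dithering: long flat runs of identical values = visible bands.
banded = np.round(gradient / q) * q

# With grain: per-pixel noise added before quantization randomizes which
# of the two neighboring levels each pixel snaps to, in proportion to its
# value, so the bands dissolve into fine noise.
grain = np.random.uniform(-0.5, 0.5, gradient.shape) * q
dithered = np.round((gradient + grain) / q) * q

def longest_run(img: np.ndarray) -> int:
    """Longest run of identical neighboring pixels: large for bands,
    small when dithering has broken the bands up."""
    change = np.flatnonzero(np.diff(img) != 0)
    edges = np.concatenate(([0], change + 1, [img.size]))
    return int(np.diff(edges).max())

print("banded:  ", longest_run(banded))    # hundreds of pixels wide
print("dithered:", longest_run(dithered))  # just a few pixels
```

Same trade-off as TAA, really: you accept a little noise to hide a structured artifact the eye locks onto.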