Pic above is 4K. For some context, I've been a mostly above-average (wouldn't say hardcore) exclusively console gamer for a good many years now. My first gaming experience as a kid was on a PC, but I quickly migrated to consoles because the Nintendos were so convenient and could be hooked up in my room. I'm 38 now, have all the major consoles (Xbox Series X, had a Series S in my office, PS5, and Switch OLED), and as of May have a top-flight PC.

I'm transitioning to PC full time because I've simply grown tired of devs not using the performance features of the systems we buy, and of Microsoft not pushing for those features to be used either. Also the low resolutions and the reliance on FSR reconstruction to upscale the image.

Now that I've been PC gaming for a while, I can say definitively that resolution is the biggest gap in visual impact versus consoles. Yes, path tracing looks way better, but you really don't pick up on most of its details unless you see the side-by-side. Resolution, on the other hand, is immediately and easily apparent. The next consoles really need to be able to consistently produce higher resolutions. Higher graphics settings matter much less; once you get to medium, anything higher is mostly diminishing returns versus performance. When I see what console graphics settings are actually set at in DF reviews, it makes complete sense: usually medium/high.

In summary, next-gen consoles need to hold medium settings and be able to run native 1440p. That's the biggest gap in visuals I've noticed going from console to PC.

submitted by /u/hammtweezy2192