Assuming that the 9th gen (excluding Switch) will mark the true beginning of 4K consoles and signal an immediate push for 8K and 16K from there (especially justifiable for VR), how long do you think developers will be playing catch-up? If 8K is near-par with the human eye and 16K is allegedly flawless, are we nearing a "final standard" for screen resolution? Until then, it seems we're caught in an indefinite loop of moving goalposts and performance sacrifice, where migration and development costs keep climbing and most games are capped between 15 and 30 FPS in favor of increased resolution and graphics.
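For a rough sense of why framerate keeps getting traded away, here's a quick back-of-the-envelope in Python (standard 16:9 pixel counts; the "shading cost scales linearly with pixels" line at the end is a simplification for illustration, not a measurement):

```python
# Pixel counts for common 16:9 resolutions, and how much bigger each is than 1080p.
resolutions = {
    "1080p": (1920, 1080),
    "4K":    (3840, 2160),
    "8K":    (7680, 4320),
    "16K":   (15360, 8640),
}

base = 1920 * 1080
for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name:>5}: {pixels / 1e6:6.1f} MP  ({pixels / base:3.0f}x 1080p)")

# If shading cost scaled linearly with pixel count (a big simplification),
# a game holding 60 fps at 1080p would land near 60 / 16 = 3.75 fps at 8K
# on the same hardware -- hence the pressure to cap framerate instead.
```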
We've already passed that point (limit of the human eye) some time ago wrt smartphones, yet they seem to continue increasing their resolution regardless (4K on a ~5 inch screen? Seriously?). Personally, I've long since stopped caring, but as long as there are people who feel the need to define themselves by "bigger, flatter, hires-er" TVs (and the video sources to match, of course), technology will continue to move on.
We might be approaching the limit for screens, but for VR, resolution increases could continue for a while. The human eye can perceive the rough equivalent of more than 500 megapixels, which is far more than 8K or even 16K. Also, you might need very high resolution if you have an extremely large display. Eventually, screens/displays of any kind will probably be replaced with direct brain interfaces.
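For what it's worth, here's roughly how estimates like that ~500 MP figure are usually derived (the 0.3-arcminute acuity and the 120° scanned field below are common assumptions, not hard specs):

```python
# One common back-of-the-envelope behind "the eye is 500+ megapixels":
# assume ~0.3 arcminutes per resolvable detail and a 120 x 120 degree field
# that the eye scans over time (both numbers are assumptions, not measurements).
acuity_arcmin = 0.3
field_deg = 120

pixels_per_axis = field_deg * 60 / acuity_arcmin      # 60 arcminutes per degree
print(f"Eye estimate: ~{pixels_per_axis ** 2 / 1e6:.0f} MP")   # ~576 MP

# Fixed displays for comparison:
print(f"8K : {7680 * 4320 / 1e6:.1f} MP")    # ~33 MP
print(f"16K: {15360 * 8640 / 1e6:.1f} MP")   # ~133 MP
```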
I'm still not sold on the whole "VR 2.0" craze that's currently going on. It's a nice (shiny, new!) toy, but unless they solve the problem of moving around, it'll probably fade away like it did in the 90s. And no, point-and-teleport is not a proper solution. Also, keep in mind that those "500 megapixels" you mention are for your full(!) field of vision, meaning you'd need a heavily curved (on multiple axes) UUUUHD display right in front of your mug to cover all of that area. And it'd likely cover your nose/mouth too, which might be a tad uncomfortable. I.e. I don't see "full retina replacement" VR happening any time soon, if ever...
Aside from optical gear, I'd only find 4K+ useful for projectors and (on the HW side) raw power for designing higher resolution assets. 1080p (even 720p) is quite nice for TV sets, especially with all the recent advancements in display tech.
It'll never really stop. After resolution it'll be colors; I expect 64K, 256-bits-per-pixel TVs in 2250. Btw, my 40" 720p TV has perfectly indiscernible pixels at a comfortable viewing distance of 9 to 10 feet. Maybe 1080p could help a little for very sharp, high-contrast images, but 4K would be totally overkill. I think really high defs are only pertinent for PC monitors, where you're expected to be close to the screen. I'd love a 32" 4K monitor to code on, for sure. But for TV gaming, I'd rather have photorealistic 480p @60fps than choppy tons of polygons with exaggerated HDR in 4K @18fps.
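That "can't see the pixels from 9 feet" claim checks out under the usual ~1 arcminute rule of thumb for 20/20 acuity (the threshold itself is an assumption; sharper eyes or high-contrast edges can do a bit better):

```python
import math

# Angular size of one pixel on a 40-inch 720p TV seen from 9 feet,
# compared against the ~1 arcminute rule of thumb for 20/20 vision.
diag_in, res_w, res_h = 40, 1280, 720
distance_in = 9 * 12                                    # 9 feet in inches

width_in = diag_in * res_w / math.hypot(res_w, res_h)   # 16:9 screen width
pixel_pitch_in = width_in / res_w

pixel_arcmin = math.degrees(math.atan2(pixel_pitch_in, distance_in)) * 60
print(f"One pixel subtends ~{pixel_arcmin:.2f} arcmin")  # ~0.87, just under 1.0
```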
VR headsets already take up almost your entire field of vision. I used a Vive recently and was pretty impressed with how smooth the experience was; the motion of the visuals relative to your head and body makes it seem pretty convincing (it tracks your head position as well as the angle). The one I was using was connected to a high-end PC. It looks pretty good, but it would definitely benefit from an increase in resolution. As far as movement, I think all you need is something like a treadmill that can move in any direction.

The difference between 1080p and 1440p on a 27" monitor is pretty noticeable if you're sitting close to it. I can tell the difference between 1440p and 4K/5K on a 27" monitor, but maybe not everyone would notice. I'll probably get a 4K monitor and/or TV eventually, but for right now my 1440p monitor is pretty cool.

As for direct brain interfaces: it's already been done experimentally. It's just a matter of time before it becomes advanced/cheap enough to be a consumer product. It might be a while, though. I'm glad it doesn't exist now, actually. Humanity isn't ready for that kind of technology.
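For reference, here's the pixel density those 27" comparisons work out to (simple geometry; whether a given density is actually noticeable still depends on viewing distance and eyesight):

```python
import math

# Pixel density (pixels per inch) of a 27-inch monitor at the resolutions above.
for name, (w, h) in {"1080p": (1920, 1080), "1440p": (2560, 1440),
                     "4K": (3840, 2160), "5K": (5120, 2880)}.items():
    ppi = math.hypot(w, h) / 27          # diagonal pixels / diagonal inches
    print(f"{name:>5}: {ppi:4.0f} ppi")
```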
Differences at such high resolutions are only noticed when you're looking for them (cherry-picking) or when you really want a lot of different information on the screen at the same time. And the latter is simply unrealistic for a mainstream entertainment product, because to be mainstream in the first place, said product needs to work across a wide range of screen setups and resolutions. For a specific professional-use computer, yes, that's very simple even today. Not for entertainment.
It's good for gaming too, but like I said, it's a lot more noticeable if you're close to the monitor. It's definitely more noticeable for things like the UI and text, but it makes a difference for gaming too.

It's often the case with technology that people say, "Oh well, blah blah is totally sufficient. There's no way you could ever need more than that." But inevitably it is upgraded, whether it's really necessary or not. What seems like overkill now will seem perfectly normal in the future compared to the other things that will exist at that point. As technology advances, our ability to make use of that technology also advances; they're very closely related.

I definitely think it could make a difference for VR. With the Vive, I was still able to see individual pixels, and I think that headset is 1080p for each eye (edit: it's actually 1080 x 1200 per eye). If you remove that issue, then it would look almost like reality.

Btw, I just now understood what the title is supposed to mean. I was thinking, how do you hire advancement? Whoever you hire, they better be hi-res.
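The visible-pixel thing falls out of some quick math, assuming roughly a 110° per-eye field of view for the Vive and the oft-quoted ~60 pixels/degree for "retina"-level sharpness (both figures are ballpark assumptions):

```python
# Why individual pixels are still visible in a first-gen Vive:
# pixels per degree is far below the ~60 px/deg often quoted for 20/20 vision.
per_eye_width_px = 1080      # horizontal pixels per eye
fov_deg = 110                # approximate per-eye field of view (assumption)

ppd = per_eye_width_px / fov_deg
print(f"~{ppd:.0f} px/deg now; ~{60 * fov_deg} horizontal px/eye for ~60 px/deg")
```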
Another side to this is that we really don't have the processing power for some of these very high resolutions yet. Not to any really affordable extent, anyway. And we may never have enough for scenarios like a 360-degree 16K 60+ FPS VR experience on a traditional setup. Barring some leap in architecture or physics, most scientists think around 5nm processors are the best we can do, and we're fast approaching that. Like 2025 fast. Yeah, sure, parallelism could solve that, but that might require some advances in cooling. While I do think we'll solve all of this eventually, it could be a major stumbling point for a time.
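Just to put numbers on that scenario (treating it as one 16K image per eye at 60 fps and counting only the final framebuffer writes, which is a deliberately generous assumption):

```python
# Raw output required for a "360-degree 16K, 60+ fps VR" setup, counting only
# the final framebuffer writes -- no shading, geometry, or overdraw included.
w, h, eyes, fps, bytes_per_px = 15360, 8640, 2, 60, 4

pixels_per_sec = w * h * eyes * fps
print(f"{pixels_per_sec / 1e9:.1f} Gpixels/s")                         # ~15.9
print(f"{pixels_per_sec * bytes_per_px / 1e9:.0f} GB/s of writes")     # ~64
```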
My 2 cents as a physicist: quantum effects, such as tunnelling, will probably be a big problem well before that. Don't expect 5nm CPUs for decades; anything under 10nm would be a tour de force. Before that, they'll probably start stacking transistors vertically, just like they do for flash memory nowadays. It'll require advancements in cooling, maybe graphene layers between levels or something similar, but at some point it'll be more economical than trying to circumvent the laws of nature. That, or completely redesigning the architecture to optimise each of those transistors, which would be enormously costly too.
Actually, I can confirm this is where things are headed! Intel is already redesigning queuing and memory management to live entirely on the processor side. The idea is that parts of the code that can feasibly be run asynchronously get run that way regardless of how the software is actually programmed, thus providing a speed improvement. Seamless optimization for parallelization. However, now that queuing is out of the OS's domain, doing things this way also lets them redesign the rest of the processor at a low level without having to worry much about software support.
So, not only are apps moving into the browser - now the OS is seeping into the silicon too? Not sure how I feel about that tbh...
Only the direct processor queue is going to be on silicon. One could argue that it should have been this way long ago, when we first started getting into multiple cores. Hardware wasn't quite as "smart" back then though, so it probably wouldn't have been great. There are plenty of other, non-Intel CPU examples of hardware managing its own queues though, so it's not really a new concept.