Generally yes, although there are certain games that an old CPU would definitely choke on. It’s a bit disingenuous to say the upgrade cycle is dead and then to spend more money on upgrading the GPU than the total value of everything else combined.
Battlefield 4 and Battlefield 1 both max out my CPU (a 3570K overclocked to 4 GHz) and leave my 1070 twiddling its thumbs, going by Task Manager and a GPU-utilization monitor (see the sketch below for one way to log both at once).

Either my CPU is slower than it should be, your CPU is faster than it should be, or you haven't checked one of the poster-child AAA game series.
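For what it's worth, here's a minimal sketch of the kind of side-by-side logging I mean, assuming an NVIDIA card with `nvidia-smi` on the PATH and the `psutil` package installed; the exact polling interval and output format are just placeholders, not any particular tool's behavior.

```python
# Rough CPU-vs-GPU utilization logger -- a sketch, not a benchmark tool.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH and `pip install psutil`.
import subprocess

import psutil


def gpu_utilization_percent():
    """Ask nvidia-smi for the current utilization of the first GPU."""
    out = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        text=True,
    )
    return float(out.strip().splitlines()[0])


if __name__ == "__main__":
    # Poll roughly once a second while the game runs. Games often peg a
    # single thread, so look at the busiest core rather than the average:
    # a core near 100% while the GPU idles points to a CPU bottleneck.
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        gpu = gpu_utilization_percent()
        print(f"busiest core {max(per_core):5.1f}%  GPU {gpu:5.1f}%")
```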
>I cannot think of a single AAA title in recent memory that causes my i7 2600k -- an 8 year old CPU -- to struggle
I cannot think of any modern, graphically demanding game which is bottlenecked by the CPU. I imagine you are harming your GPU performance to some degree with such an old CPU though. Not sure I buy what you're saying here. What settings are you using?
My point, of course, is that the only thing you’re going to measure a gaming rig by is gaming, which has very little to do with the age (or speed) of any component other than the GPU.
Do you suspect, though, that the standard home user (or user of a repurposed old machine with Lubuntu on it, maybe) is going to be pushing those other components harder than a modern AAA game?
Yeah, me neither.
As an aside, to answer your question: I can run pretty much any game on its highest settings just fine with this machine, which is a GTX 1070, a 2600K, and 16 GB of RAM driving a 120 Hz 1080p display. It was originally built with two HD 5990s in CrossFire.
It can play absolutely every single modern game, with the only component upgrade in its 8-year life being the GPU.