I'm going on record to suggest that people who used computers in the '80s and '90s have rose-tinted glasses on when remembering their experiences. Nostalgia is a bitch and lies to us all the time. Just loading a fucking program from a spinning-disk hard drive added a significant amount of startup time that we've completely imagined away from the "golden age" of our computer usage. At this point I'm convinced it's just a natural extension of "they don't make it like they used to," which has plagued mankind for basically every generation ever and completely ignores survivorship bias. I distinctly remember experiencing *major* performance gains when moving to more memory than my system and apps needed, and again when moving to solid-state drives. None of the "programmers don't know how to code today" nonsense even comes close to eclipsing the performance gains from those two changes.
DOS programs loaded fast, since there was only around 1MB of working memory. Granted, they did not do much, but they were responsive, except when they needed to do real compute; then you just waited forever.
Windows 3.x/9x brought the first real multitasking and swap, and things got quite sluggish. Early browsers were particularly eager to gulp RAM and bring everything to a halt.
Nowadays it feels somewhere in the middle: with copious amounts of RAM, stuff can be fast once it's up and running, but it seems every app either wants to load from the web at every step or tries to index your drive on every keypress.
Part of it is rose-tinted glasses, sure, but it's not entirely rose-tinted glasses.
The "Living Computer Museum" in Seattle was (up until covid closed it) an excellent way to experience the past.
I had this same insight and feeling then. The old machines, running their old operating system versions, felt responsive and crisp compared to performing similar tasks on modern machines with modern software equivalents.
So it's funny you mention that. I own a couple of decidedly retro machines from the 90s; one runs DOS and the other runs Windows 98. These are used both for gaming and for productivity, and both have recently been equipped with solid-state drives (I still have the original spinning rust).
Both of these machines are many times more responsive than anything put out in the last decade, despite being thousands of times less powerful. The applications they run are made to serve my needs as a user first. They get out of my way. The user interface is clearly designed for the mouse and keyboard I am obviously using. It is made to help me accomplish tasks more efficiently rather than stroking some designer's ego or chasing some fad. Most of the software I use was released "done" rather than released half-baked with the hope of future updates. I don't have to worry about having my privacy invaded.
It's not rose-tinted glasses when I'm not wearing any and can look behind me and see the color. Modern mass-market technology is "worse is better" writ large.
Yes and no. I am now running the equivalent of computing power that in the early 90s was reserved for some dedicated government agency [1]. I accept that some of that power is put to good use, but I do see wasted power in the OS [2] and even in games [3] on a semi-regular basis.
For the record, "they don't make 'em like they used to" is an absolutely valid complaint. Note that even now, with SSDs, ridiculous memory, and CPU, websites still manage to stutter (though I accept that online is its own animal). Part of me wants to go over Windows releases as an example of resource use across generations (and how it leaped).
> I distinctly remember experiencing major performance gains when moving to more memory than my system and apps needed to use and the move to solid state drives.
But do you still see it, or is that performance just assumed now (making lazy design decisions easier)?
It's just not monolithic. We've made serious improvements in a number of areas, but there are a few areas where we've clearly stepped backwards. Neither opinion really contradicts the other.