I used to dual boot Linux with Windows Vista on an old laptop. I had only installed the first Assassin’s Creed and Rome: Total War on it. Nothing else, and it was never really connected to the internet. After a year of barely using it, apart from a few Total War sessions, Vista was so slow that it was unusable. It spontaneously became slow for no reason. I completely removed it, left only Linux, and that laptop survived 7 years of intensive use and was still working 10 years later (just too old).
I don’t recall such issues with Win98 or XP
Dude, that was 22 years ago… I also remember Prince of Persia as if it were yesterday
I miss Windows Vista.
The arrow pointing downwards is about to be absolutely destroyed today. Edit: it turns out it wasn’t.
Vista was a scam
Good for you, I’m never gonna get convinced.
Vista’s problem was that it was ahead of its time.
I both agree and disagree with that statement.
Windows finally got animations and transparency after Mac OS had beaten it to them by 6 years. Truly an oomph moment.
The actual technological advancement of Vista was userspace graphics drivers.
Also correct.
Yeah, XP did that with most of the drivers other than graphics, which led to a reduction in BSOD crashes (because if a user thread crashes, the OS just kills it and continues on, but an unhandled kernel error will crash the entire OS to a generic “turn the screen blue, report an error, and log it if possible”).
Vista further improved this by moving most of the graphics driver code out of kernel land.
I sort of agree with you, but not in the way I think you meant it.
Vista’s problem was that its hardware requirements were too high for its time. Operating systems have very long development lifecycles, and early on Microsoft made a forward-looking estimate of where the PC market would be by the time Vista released, and they overshot. When it was almost ready to release to the world, Microsoft put out the initial minimum and recommended specs, and PC sellers (Dell, HP, Gateway) lobbied them to lower the numbers; the cost of a PC that met the recommended specs was just too high for the existing PC market, and it would have killed their sales numbers if they started selling PCs that met those figures.

Microsoft complied and lowered the specs, but didn’t actually change the operating system in any meaningful way - they just changed a few numbers on a piece of paper and added some configurations that let you disable some of the more hardware-intensive bits. The result was that most Vista users were running it on hardware that wasn’t actually able to run it properly, which led to horrible user experiences. Anyone who bought a high-end PC or built one themselves and ran Vista on it, however, seemed quite happy with the operating system.
I had no problems with Vista. I also built a new PC for it though.
Very similar story here: I bought a new computer that shipped with Vista.
I got horrendously tired of that Pentium 4 thing.
Blasphemy! Windows XP is the only King!
I don’t really like XP’s design anymore. I didn’t like it back then either.
Windows 2000 🫶
Because at this time the internet was still slow, not always on, and optional on most computers, and Microsoft did not know if and how they should integrate the internet into the OS. The only thing they had at the time was some link to MSN on the desktop, and ActiveX (???), where you could display websites on your desktop or within your program, but without the browser controls.