March 28th, 2020 at 1:24 AM
Unfortunately, that's common with most manufacturers these days.
Planned obsolescence is a thing, and they'll do just about anything to stick to it.
It's why they hate DIY techies like me: we tear the whole thing down, figure out the points of failure, then patch, repair, and improve them, making devices last 10, sometimes 15 or more, years.
If you only buy a computer once a decade or so, they aren't making much money off you.
And since the demands of general computing aren't advancing much anymore, you can get away with decade-old hardware with very few caveats, most of which can be worked around.
An HP laptop from 2012 with new capacitors on the board, an aftermarket extended-life battery, and an SSD is just as useful for day-to-day work as a brand new one, because unless you're a gaming enthusiast you're not going to see a real difference between 1080p and 4K video on a 1600x900 screen.
Throw a Linux distro with open-source drivers on there, add some tweaks to make it run as light as possible, and well... you get where I'm going with this.
The typical user's needs don't require the leaps and bounds of 20 years ago.
You don't need a GTX 1080 to run MS Word or browse the web.
And honestly, 16GB of RAM is all you really need unless you're gaming in 4K or VR; most software can't take advantage of more than that, or of multi-core CPUs beyond maybe quad core.
Though I expect that to change as Ryzen becomes more popular... even then, you don't need a 12+ core CPU at 3.5+ GHz, 32+ GB of RAM, and the latest and greatest GPU from even five years ago for everyday end-user stuff.
Windows 10 runs fine on a 2.5GHz quad core with 16GB of RAM and a GeForce GTX 980 for office apps, YouTube at 1080p60, and streaming services like Netflix and Hulu.
And that's still higher than or comparable in spec to a lot of common tablets and smartphones, which people use for exactly the stuff listed above.
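If you ever want to sanity-check that on your own machine, here's a rough Python sketch (assuming Python 3 and the third-party psutil package; the snippet is just my example, not tied to any particular laptop) that prints core count, total RAM, and how much is actually in use while you're doing that kind of everyday work:

# Rough sketch: report CPU cores, total RAM, and current usage, to see
# how close everyday workloads actually come to the hardware's limits.
# Assumes Python 3 with the psutil package installed (pip install psutil).
import psutil

physical = psutil.cpu_count(logical=False)   # physical cores
threads = psutil.cpu_count(logical=True)     # hardware threads
mem = psutil.virtual_memory()                # system-wide memory stats

print(f"CPU: {physical} cores / {threads} threads")
print(f"CPU load (1s sample): {psutil.cpu_percent(interval=1)}%")
print(f"RAM: {mem.total / 2**30:.1f} GB total, "
      f"{mem.used / 2**30:.1f} GB in use ({mem.percent}%)")

Run it with a browser, Word, and a 1080p video open and you'll get a concrete sense of how little of that hardware a typical day-to-day workload actually touches.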