Of course there’s a financial reason: they’ve probably done a cost/benefit analysis and decided that it’s financially better to screw over those customers than to spend money fixing it. But that’s exactly the issue!
I think what most people disagree with is that AMD’s active choice not to fix a very fixable issue is a choice they know leaves customers in a seriously bad position. This is something they choose to do to their customers, because they could just as well choose to help them.
What I meant was that apparently only compromised systems are vulnerable to this defect.
That is not correct. Any system where this vulnerability is not patched out by AMD (which is all of gen 1, 2 and 3 CPUs) is left permanently vulnerable, regardless of whether or not it is already compromised. So if your PC is compromised in a few months for some reason, instead of being able to recover with a reinstall of your OS, your HW is now permanently compromised and would need to be thrown out…just because AMD didn’t want to patch this.
Ryzen 3000 series CPUs are still sold as new, I even bought one six months ago; they’re nowhere near being classified as “old”, they’re hardly 5 years old. And this is not only an issue for already infected systems, because uninfected systems will intentionally be left vulnerable.
No, they are just choosing not to roll out the fix to a known issue, which is screwing customers over on purpose (to increase profits). It’s not a matter of goodwill: they sold a product that then turned out to have a massive security flaw, and now they don’t want to fix it even though they absolutely could.
They are 100% intentionally not patching old chips by not allocating resources to it. It’s a conscious choice made by the company; it is very much “on purpose”.
I bought a couple of 12 TB “used” drives from servershop24.de, and they all had less than 150 hours of runtime.
It’s not dependent on the circuit; things just need to be on the same phase. Our house uses three phases in total, so powerline adapters only work for 1/3 of the house here.
I think 3.5" are usually priced better per tb than 2.5" drives and performance is usually better too. So unless you feel like burning money for an inferior solution, are have some space constraints that doesn’t allow 3.5" drives, I wouldn’t go with 2.5" drives. They’re more energy efficient though, but you’d need a fuckton of drives for that to make a worthwhile difference in your power bill.
Well that’s a big ol’ “whoosh” on me then 😅
That does sound more like a user issue than a software issue though
I might not be paid a lot, but things like this make up for it.
Really? Being given stuff you don’t have a need or use for is good compensation to you?
I bought an old Intel NUC with a 2.x GHz i3, 8 GB RAM and a 120 GB NVMe used for $65, and upgraded it to 16 GB of RAM and a 1 TB NVMe for another $50. I run everything from that in either VMs or LXCs (HA, Jellyfin, NAS, CCTV, Pi-hole) and it draws about 10 W.
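For context on what that ~10 W means on the bill, here’s a quick check (the electricity price is an assumption; the 10 W draw is from my setup above):

```python
# Yearly energy use and running cost of a ~10 W always-on server.
POWER_W = 10.0    # measured draw from the post above
PRICE_KWH = 0.30  # assumed electricity price per kWh

kwh_per_year = POWER_W / 1000 * 24 * 365
print(f"{kwh_per_year:.1f} kWh/year")               # 87.6
print(f"~{kwh_per_year * PRICE_KWH:.2f} per year")  # ~26
```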
You’re completely missing the point…this will add an ugly extra box cluttering up an otherwise clean setup. Of course there is a workaround, there always is, but it’s far from optimal. It’s a bad solution to an annoying problem that shouldn’t exist.
None of the >40" monitors I’ve looked at today had any audio outputs. But finding one that isn’t an ultrawide format for gaming is probably the bigger issues it seems.
Not true at all…e.g. a Chromecast doesn’t have a dedicated audio output, and neither does the Apple TV; they only have HDMI output. Now, HDMI does also carry audio, but many amps, and especially soundbars, do not have HDMI passthrough and rely on getting the audio signal from the TV/monitor if you’re using those devices.
As I mentioned earlier, use a soundbar or dedicated speakers (most TV speakers suck anyways).
Yes, but there is no audio output (as in RCA, optical, etc., not a built-in speaker) to get the audio from the monitor to the amplifier.
Unless it’s “load this file onto a USB stick and plug it into your TV”-easy, it’s still out of reach for most consumers.
But my point is, it shouldn’t be necessary to do these things in the first place. Fucking drop the “smart” element from them completely, they always suck ass anyway and are laggy as hell to navigate.
Should you really be concerned about a system that can be physically ruined by malware? I would say definitely yes…