“Yay! We’ve created artificial general intelligence!”
“…Fuck, it’s an asshole.”
For single-player games, I absolutely agree. If you’re going to stop supporting the game, send out one last patch that turns off any always-online DRM and lets people keep playing their game.
For multiplayer games, it seems like it’s a bit more complicated. Who should be shouldering the cost to keep the game servers alive?
Assuming my setup is typical, the dryer is on a 240V circuit. The washing machine is on a 15A 120V circuit.
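For a sense of scale, here’s the rough power budget those circuits imply. The 15 A / 120 V washer figure is from the comment; the 30 A breaker size for the 240 V dryer circuit is an assumption (a common US residential dryer setup), so adjust for your panel.

```python
# Theoretical maximum draw for a branch circuit: P = V * I.
# 30 A for the dryer circuit is an assumed typical value, not
# taken from the comment above.

def max_watts(volts: float, amps: float) -> float:
    """Maximum continuous power a circuit can deliver."""
    return volts * amps

washer_limit = max_watts(120, 15)  # 1800 W
dryer_limit = max_watts(240, 30)   # 7200 W

print(f"Washer circuit limit: {washer_limit:.0f} W")
print(f"Dryer circuit limit:  {dryer_limit:.0f} W")
```

The gap is why electric dryers get their own 240 V circuit: a heating element that size simply won’t fit in a 15 A / 120 V budget.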
Depending on how much time your server spends with those CPUs actually under load, newer processors may not really help your energy bills. Even old processors idle at single-digit wattages. Most of the power consumption at idle (where most home servers spend 99% of their time) will come from fans, RAM, and storage.
This happens every time AMD starts cutting into Intel’s market share. Competition is a wonderful thing.
It’s a function of ZFS itself. Data to be written to the drives is first buffered in RAM, then flushed to the drives. One benefit of this is that if you’re moving a file smaller than the available RAM, the transfer won’t appear to be limited by the write speed of the drives.
ZFS. It will use as much RAM as you care to give it for caching (the ARC). So if you’re slinging a lot of data back and forth, more RAM is better, especially if you’re using HDDs instead of SSDs.
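If you want to see how much RAM ZFS is actually using, OpenZFS on Linux exposes its counters as a kstat file at `/proc/spl/kstat/zfs/arcstats` (that path and the “name type data” column layout are assumptions based on OpenZFS-on-Linux; check your platform). A minimal parser sketch, run here against a sample of that format:

```python
# Sketch of parsing ZFS ARC kstat output. SAMPLE mimics the
# /proc/spl/kstat/zfs/arcstats layout: two header lines, then
# "name type data" rows. Values below are made-up examples.

SAMPLE = """\
13 1 0x01 123 5904 1234567890 9876543210
name                            type data
size                            4    8589934592
c_max                           4    17179869184
"""

def parse_arcstats(text: str) -> dict:
    """Turn kstat 'name type data' rows into {name: int}."""
    stats = {}
    for line in text.splitlines()[2:]:  # skip the two header lines
        parts = line.split()
        if len(parts) == 3:
            name, _type, data = parts
            stats[name] = int(data)
    return stats

stats = parse_arcstats(SAMPLE)
print(f"ARC size: {stats['size'] / 2**30:.1f} GiB "
      f"(max {stats['c_max'] / 2**30:.1f} GiB)")
```

On a real box you’d read the file instead of `SAMPLE`; `size` is the current ARC footprint and `c_max` its ceiling, which is what grows when you add RAM.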
Worse. Terminally online edgelord.