Zooz 800 ZSE42 800LR. $30, the coin battery lasts about a year IME.
I AM THE LAW.
Ruaidhrigh featherstonehaugh
I have Z-Wave water detectors that are pretty easy to pair. The only problem is that every time they reset, they trigger an alarm event. Fortunately, this only happens when I upgrade zwavejs or change the batteries, so the false positives aren’t random. Still, it’s a weird design decision.
Mine is 3-pronged:

1. /root snapshots on change, plus one nightly /home snapshot. It’s pretty demanding on disk space, though, and doesn’t handle drive failure, so I also do […]

The only “restore the entire system b/c of screwing up the OS” prong is #1. I could (and probably should) make a whole-disk snapshot to a backup drive via #2, but I’m waiting until bcachefs is more mature; then I’ll migrate to it for the interesting replication options it allows, which would make real-time disk replication to slow USB drives practical. I’d only need to snapshot /efi after kernel upgrades, and if I had that set up and a spare NVMe on hand, I could probably be back up and running within a half hour.
Yeah, I left when it became impossible to really advance without coop play. Even strikes were annoying, but raids were impossible if you didn’t have a team, or were just a casual player. When Bungie obviously stopped giving a shit about casuals or PvE players, I stopped giving a shit about Destiny.
the practice of deliberately wasting enormous amounts of energy for the purpose of being able to prove that you’ve wasted enormous amounts of energy.
C’mon, that’s disingenuous. Back when Bitcoin was released, nobody was giving a thought to computer energy use. Wasted energy is a consequence of proof-of-work, but the focus on low-power modalities and throttling only developed in the intervening years. The prevailing paradigm at the time was, “your CPU/GPU is going to be burning energy anyway, you may as well do something with it.”
It was a poor design decision, but it wasn’t a malicious one like you make it sound. You may as well accuse the inventors of the internal combustion engine of designing it for the express purpose of creating pollution.
The best proof of advancements in the field of AI is Zuckerberg himself. He looks more and more like a real human every time I see a new picture of him.
OK, so, you’re right. Let’s be fair, though: this is capitalism. There are companies that make quality mice, and they are more expensive and don’t compete at the same scale Logitech does. If Logitech made quality mice, they’d be more expensive, and even more consumers would look at and choose cheaper mice from their competitors.
Part of this is absolutely “margins & profit.” Part is the veiled curse of online shopping: when you can’t feel and handle the product, much more of the buying decision comes down to price alone. This is the T-Shirt Effect: if two online products look identical but one is less expensive, most people will opt for the less expensive one. It’s put established companies known for quality out of business, or driven their product quality down to compete. Part of it is that there are few reliable, authoritative review sources; many are barely-disguised paid ads, or star manipulation. The net result is consumers voting with their dollars, and companies responding accordingly: sales are down, your competitors’ are up, people are choosing products you know are cheaper crap, so it’s obvious people prefer cheaper crap, so you make it.
It’s a lose-lose for everyone except those companies able to quickly clone reputable products, but with lower-quality components, and flood the online market with them.
Low-quality, low-cost mass manufacturing has put products in the hands of people who wouldn’t otherwise be able to afford them. But it’s also driven down quality, and driven waste up; the same decision process being used by low-income folks is also used by middle-class, and with nearly all shopping being online, consumers have few options for a better process.
The equation changes when you get to the wealthy, who can shop with companies who aren’t competing on volume, but reputation and margins: the Bang & Olufsens; the Breguets, and the Urban Jurgensens. People who can afford to shop with artisans shop differently, but all t-shirts look the same online.
That would be a different market, where the CEO benefits.
So, they’re essentially claiming they’ve found a way around Amdahl’s Law?
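For reference, Amdahl’s Law caps the speedup you can get from parallelism by the fraction of the work that stays serial. A minimal sketch of the formula (illustration only, not anyone’s benchmark):

```python
def amdahl_speedup(parallel_fraction: float, n_workers: int) -> float:
    """Amdahl's Law: speedup = 1 / ((1 - p) + p / n),
    where p is the parallelizable fraction and n the number of workers."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_workers)

# A task that is 95% parallelizable never exceeds 20x speedup,
# no matter how many cores you throw at it:
for n in (4, 64, 1024, 10**9):
    print(n, round(amdahl_speedup(0.95, n), 2))
```

That hard ceiling (1 / serial fraction) is why “a way around Amdahl’s Law” is such a strong claim.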
Beautifully summarized.
I think another factor will emerge: people are starting to realize that they’re paying $60 to rent a game. They don’t own it, and the game developer can shut it down at any time, and even if they don’t, it probably requires some online access for something, and the game stops working once the developer turns off those servers.
I don’t think we’ll see a revolt, but competition will force companies to offer rental models with little or no up-front cost. I think people will simply become less willing to pay $60 for a rental. At that point, I don’t know what happens to development studios, because they need seed funding to get to market. I think it’s already happening: as a very casual gamer, most of what I hear from the industry is pure-play game studios shutting down or being acquired by corporations like Sony or Microsoft, who have other revenue streams they can redirect into speculative game development.
Personally, I care about these factors for my desktops as well. CPU, GPU, memory, and (and this surprised me) SSD temps - how many fans do I need? At least three in a proper tower-style desktop. I feel like the Grinch: “all the noise, noise, noise, noise, noise!” And fans take power. Everything takes power.
So I’ve been running a micro-PC for a while: a Ryzen 7, integrated GPU, little 6x6x2 enclosure. It still has a fan in it, and I’ve got it in a space in my desk made for hiding computer devices and wires - I had to build a fan into that because it was getting warm in there and raising average temps on the computer.
My point is that these battery-optimized architectures are pretty important for the desktop market, too. Gaming rigs with GPUs bigger than the rest of the motherboard notwithstanding, the average desktop user would be fine with one of these micro computers. As long as you stay away from hog software like Electron and Java applications, they’re perfectly capable; heck, even rustc burns through compilations pretty fast, and that’s not exactly an efficient compiler. And Go programs compile in no time on a Ryzen 7, or even a 5. I suspect it’d even handle my mom and her Firefox with 200 tabs.
Hugo isn’t a server, per se; it’s basically just a template engine. It was originally focused on turning markdown into web pages, with some extra functionality around generating indexes and cross-references, which is really what sets it apart from a simple rendering engine. And by now, much of its value is in the huge number of site templates built for Hugo.

What Hugo does is take some metadata and whatever markdown content you have, and generate a static web site. You still need a web server pointed at the generated content. You run Hugo on demand to regenerate the site whenever there’s new content (although there is a “watch” mode, where it’ll watch for changes and regenerate the site in response). It’s a little fancier than that; it doesn’t regenerate content that hasn’t changed. You can have it create whatever output format you want; mine generates both HTML and gmi (Gemini) sites from the same markdown. But that’s it: at its core, it’s a static-site template rendering engine.
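Roughly, the multi-output setup looks something like this in the site config (a sketch from memory, not a copy of my actual config; check Hugo’s output-formats docs for the exact keys):

```toml
# Sketch: add a Gemini output format alongside the default HTML one.
[mediaTypes."text/gemini"]
  suffixes = ["gmi"]

[outputFormats.GMI]
  mediaType = "text/gemini"
  isPlainText = true

[outputs]
  home = ["HTML", "GMI"]
  page = ["HTML", "GMI"]
```

With that, each page’s templates render twice: once to HTML, once to a .gmi file.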
It is absolutely suitable for creating a portfolio site. Many of the templates are indeed such. And it’s not hard to make your own templates, if you know the front-end technologies.
It depends on how you want to write. If you want to use a web interface, WriteFreely is decent. If you like your text editor, Hugo is fantastic.
:shrug:
It’s trivial to host yourself, and super light on resources. Personally, I don’t use it; for blogging I write markdown and rsync it over to the server where Hugo picks it up and turns it into a blog. Now that I think about it, I should probably go shut my WriteFreely down. I have a few pages on it, but I hate web app interfaces, so I didn’t put much content in it.
But he was responding to someone who was uncomfortable with putting all their eggs in one basket. That’s not what backups are for.
RAID is necessary because drives fail, and sometimes you can’t afford (or don’t want) to be offline until you can get around to sourcing & installing a new drive and restoring from backup.
It is not. But backups are also not RAID.
Backups are important, but we were talking about drive failures. Backups help when you screw up the data; RAID6 helps when drives go bad. If you don’t trust the hardware, RAID.
Backups only means you’re down until you restore; RAID5/6 means you stay up.
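To be concrete about what keeps you up: the redundancy is just parity math. Here’s a toy Python sketch of the single-parity (RAID5-style) idea; pure illustration, nothing like how md or ZFS actually lay out stripes:

```python
from functools import reduce

def parity(blocks):
    """XOR equal-sized data blocks together -> one parity block (single parity)."""
    return bytes(reduce(lambda a, b: a ^ b, chunk) for chunk in zip(*blocks))

def reconstruct(surviving_blocks, parity_block):
    """Rebuild the one missing data block from the survivors plus parity,
    since XOR-ing everything (including parity) cancels the known blocks out."""
    return parity(surviving_blocks + [parity_block])

data = [b"disk0data", b"disk1data", b"disk2data"]  # stripes on three disks
p = parity(data)                                    # stored on a fourth disk

# "disk1" dies; the array rebuilds its contents from the rest and stays up:
rebuilt = reconstruct([data[0], data[2]], p)
assert rebuilt == data[1]
```

RAID6 does the same thing with a second, independent parity, which is why it survives two simultaneous failures instead of one.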
RAID6, my person. RAID6.
And, for computers, it was almost exclusively limited to monitors. In 2009, the current Energy Star specification was version 4.0, released in 2006. In that specification, the EPA’s objective was to get 40% of the computers on the market to have power management capabilities by 2010, the year after Bitcoin was introduced. Intel’s 2009 TCO-driven upgrade-cycle document mentions power management, but power use isn’t included in any of the TCO metrics.
All of the focus on low-power processing units in 2009 was for mobile devices and DSPs. Computer-oriented energy saving at the time focused on practices, e.g. manually powering down computers or using suspend and hibernation; there was very little CPU clock scaling available for desktop computers, so you turned them off to save power. DVFS didn’t become widely available (or effective) until 2006, and a study published in 2009 (again, the same year Bitcoin was introduced) found that “only 20% of initiatives had measurable targets.”
So, yes: technically, there were people thinking about these sorts of things, but it wasn’t a common consumer consideration, and the tools for power management were crude: your desktop was on and consuming power (always the same amount of power), or it was off. And people did power down their computers to save energy. But, like I said, if your desktop was on, it was consuming the same amount of energy whether or not you were running a miner. There was a motto at the time, bandied about by SETI@home, that your computer was using energy anyway, so you might as well do science with the spare CPU cycles. That was the mindset of most people who had computers at the time.