Out of curiosity, what do you think Berlin’s secret is in this regard? Like, do people naturally not litter e-waste, or are there easier recycling options, import restrictions, or enforced litter laws?
Noise may be something to look for when you’re shopping, depending on where your server lives. I have one IronWolf drive in my NAS (which sits in my living room), and it is way louder than the combined noise of the 3 WD Reds next to it.
As for failures, Backblaze publishes quarterly failure reports that I always brush up on before looking for a new drive.
It’s nice to see what my own version of hell is going to be like.
But in the meantime EC2 hasn’t gotten a meaningful feature that isn’t about accelerating training or inference since gp3, and folks are backing away from serverless-first designs because cost-control and other features we’ve been screaming for, for several years now, aren’t being addressed.
Edit to add: on the EC2 side I forgot we got Graviton3 processors like 18 months ago. That was appreciated for sure.
The majority of our household stuff is on a Synology DS920+ (x86). I installed Docker and Portainer on it and then run most of my local services (Immich, Invidious, Alexandrite (the Lemmy frontend), Miniflux, Dokuwiki, and Heimdall) using the Portainer UI.
I’m still running Plex as a manually installed Syno package, because I haven’t taken the time to figure out hardware transcoding for other setups.
The 920 also manages cameras (via Surveillance Station), handles all off-site backups (we all back up workstations to the 920 and it backs up online), runs private DNS and the reverse proxy for Docker, and hosts my personal VPN. I’m currently in the process of swapping the 4+ year old drives for new ones that will up my capacity (using SHR) from 12TB to 30 (with redundancy).
Good to know. Thanks for the breakdown.
Our nearest Pizza Hut delivers via Doordash whether you order direct or through DD, but if you order direct it’s 30% cheaper. I’m not sure who’s eating the markup.
I recognized the name AU10TIX, because I half-joked on Lemmy about a potential mass doxxing of Xitter’s most vile users back in September when they announced the partnership. I assumed they’d be a target for ransomware/hackers, not that they’d just leave their admin creds out in the open.
Clearly that’s what blu-tack is for.
“Secure that SSD in a bay and get the faceplate off my butterfly, you monster!” -Buster
I’m taking note of that combo feather teaser / ball track / butterfly toy. I think my big orange boy would lose his mind over it.
I can sort of see the appeal, but it’s not for me. If anything is ever going to rename files for me, it’s going to be a script that I’ve either written or at least read top to bottom. Not a black-box inference engine, and especially not one based on an LLM.
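For the record, this is roughly what I mean by a rename script you can read top to bottom. A minimal sketch (the date-prefixing pattern and the function name are just my illustrative example, not any particular tool):

```python
from datetime import datetime
from pathlib import Path


def rename_with_date(folder: str, dry_run: bool = True) -> list[tuple[str, str]]:
    """Prefix each file with its modification date, e.g. 2024-01-01_photo.jpg.

    With dry_run=True it only reports the planned renames, so you can
    eyeball the result before letting it touch anything.
    """
    planned = []
    for path in sorted(Path(folder).iterdir()):
        if not path.is_file():
            continue
        stamp = datetime.fromtimestamp(path.stat().st_mtime).strftime("%Y-%m-%d")
        if path.name.startswith(stamp):
            continue  # already renamed on a previous run
        target = path.with_name(f"{stamp}_{path.name}")
        planned.append((path.name, target.name))
        if not dry_run:
            path.rename(target)
    return planned
```

Twenty lines, deterministic, auditable, and it defaults to doing nothing. That’s the bar an LLM renamer can’t clear for me.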
Yeah, the headline sort of reads like Ars is daring Google to remove the flag.
But they specifically said in their blog post that it has “privacy you can trust.” Just imagine all the trust you have in Microsoft plus all the trust you have in the accuracy of AI and rest easy. Plus the AI runs locally so they can trust you to pay the power bill.
Don’t think about how much money they could make with their business customers, based on telemetry alone.
The Brennan monorail rides again!
Some of this technology may sound a bit “over-ambitious,” but keep in mind the project was inspired by a fully functional self-balancing monorail that mechanical engineer Louis Brennan designed and demonstrated back in the early 1900s.
Are we taking bets on how long it will be before Google Search ends up on killedbygoogle.com?
When I was running a site, I had special rules in my firewall to look for requests that claimed to be Googlebot but which didn’t come from one of Google’s published public IPs.
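Besides matching against the published IP ranges, Google also documents a reverse-then-forward DNS check for verifying Googlebot. A rough sketch of that approach (the helper names here are mine, not what I actually ran in the firewall):

```python
import socket

# Per Google's crawler-verification docs, a genuine Googlebot PTR record
# resolves to one of these domains.
GOOGLE_SUFFIXES = (".googlebot.com", ".google.com")


def claims_googlebot(user_agent: str) -> bool:
    """Cheap first pass: does the User-Agent even say it's Googlebot?"""
    return "googlebot" in user_agent.lower()


def verify_googlebot(ip: str) -> bool:
    """Reverse DNS on the IP, then forward-confirm the hostname.

    The forward lookup matters: anyone can set a fake PTR record, but
    only Google controls the A records under googlebot.com.
    """
    try:
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith(GOOGLE_SUFFIXES):
            return False
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False  # no PTR record, lookup failure, etc.
```

Requests that claim Googlebot in the User-Agent but fail the verification are exactly the ones worth dropping or rate-limiting.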
I mentioned this in another thread, but I do worry that Google is eventually going to infect the APIs that metasearch engines like DDG, Kagi, SearXNG, etc. depend on.
In my experience, a lot of the sysadmins who run high traffic sites will treat all bots as scrapers that have to be blocked or slowed to a crawl. Then they make special allowances for googlebot, bing/msnbot, and a few others. That means there is a massive uphill climb (beyond the technical one) to making a new search engine from scratch. With Google and MS both betting the farm on LLMs I fear we’re going to lose access to two of the most valuable web reverse indexes out there.
If I had 25 surprise desktops I imagine I’d discover a long dormant need for a Beowulf cluster.
Yeah, I’m hoping to get at why. Drop-shipped disposables took over Juul’s market in the US and then grew it by about 600%. It was so dramatic (in a business sense) that it’s caused ripples in US and UK trade policy, and I just assumed that blitz was happening everywhere.