• 2 Posts
  • 133 Comments
Joined 1 year ago
Cake day: July 2nd, 2023



  • h3ndrik@feddit.de to Selfhosted@lemmy.world · Cloudflare is bad. You’re right.
    13 up / 2 down · edited · 5 months ago

    Well, centralization, giving up your freedoms and letting someone else control you, is always kinda easy. The same applies to all the other big tech companies and their platforms, and I’d say it applies to other aspects of life, too.

    And I’d say it’s not far off from the usual setup. If you had a port forward and DynDNS like lots of people do, the DNS record would update automatically (a minimal sketch of such an update client is below); you’d just need to make sure the port forward is set up again if you got a new router, but that’s pretty much it.

    But sure, if it’s too inconvenient to put in the 5 minutes of effort it takes to set up port forwarding every time you move, I also don’t see an alternative to tunneling. Or you’d need to pay for a VPS.
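    To make the DynDNS part concrete, here is a minimal sketch of what such an update client does, assuming a provider that accepts the common dyndns2-style HTTP update (many do). The endpoint, hostname and credentials are placeholders, and api.ipify.org is just one of several public “what is my IP” services.

    ```python
    import base64
    import urllib.request

    # Placeholders: fill in your provider's update endpoint and your account details.
    UPDATE_URL = "https://members.example-dyndns.net/nic/update"  # dyndns2-style endpoint (placeholder)
    HOSTNAME = "myhome.example.net"                               # DNS name to keep pointed at home (placeholder)
    USERNAME = "user"                                             # placeholder
    PASSWORD = "secret"                                           # placeholder

    def current_public_ip() -> str:
        """Ask a public service which IP address our connection appears as."""
        with urllib.request.urlopen("https://api.ipify.org") as resp:
            return resp.read().decode().strip()

    def update_dns(ip: str) -> str:
        """Send the dyndns2-style update request with HTTP basic auth."""
        url = f"{UPDATE_URL}?hostname={HOSTNAME}&myip={ip}"
        req = urllib.request.Request(url)
        token = base64.b64encode(f"{USERNAME}:{PASSWORD}".encode()).decode()
        req.add_header("Authorization", f"Basic {token}")
        with urllib.request.urlopen(req) as resp:
            return resp.read().decode().strip()  # providers typically answer "good <ip>" or "nochg <ip>"

    if __name__ == "__main__":
        print(update_dns(current_public_ip()))
    ```

    In practice most routers, or a ready-made client like ddclient, run exactly this kind of request on a schedule, so you rarely have to write it yourself.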



  • I installed it about two weeks ago. As of now it’s still running and has a really low memory footprint compared to Synapse, but a lot of things aren’t implemented yet. Chatting works fine, though I get a lot of warning messages about unimplemented features, like my client (FluffyChat) trying to query some profile status … I’d say try it; I’ve done so. But I can really only give good advice after a few more weeks of using it. Maybe there is a dealbreaker.
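    If you want to see for yourself which parts of the client API a lightweight homeserver answers, you can poke a couple of endpoints directly. A rough sketch; the homeserver URL and user ID are placeholders, and interpreting the responses is approximate (a 404 with errcode M_UNRECOGNIZED usually means “not implemented”).

    ```python
    import urllib.error
    import urllib.request

    HOMESERVER = "https://matrix.example.org"      # placeholder: base URL of the homeserver to test
    USER_ID = "@alice:matrix.example.org"          # placeholder: a user on that server

    def probe(path: str) -> str:
        """Fetch a client-API path and report the HTTP status plus a snippet of the body."""
        try:
            with urllib.request.urlopen(HOMESERVER + path) as resp:
                return f"{resp.status} {resp.read(200).decode(errors='replace')}"
        except urllib.error.HTTPError as err:
            return f"{err.code} {err.read(200).decode(errors='replace')}"

    # Which spec versions does the server claim to support?
    print("/versions:", probe("/_matrix/client/versions"))
    # The kind of request a client like FluffyChat makes that may hit an unimplemented endpoint:
    print("/profile: ", probe(f"/_matrix/client/v3/profile/{USER_ID}"))
    ```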


  • Seems the two German supermarket chains really like to have the same infrastructure everywhere. Everywhere I go, the Aldi stores look exactly the same. They carry slightly different products depending on the country, but the price tags, the interior, … are basically the same. Okay, and we don’t have “Flaschenpfand” everywhere (the deposit on plastic bottles and the machines where you can return them). I bet all of this makes things a lot easier for their techs and management, and it could also explain why they sometimes redo a store that still looks fine and fit it with the latest shenanigans.

    And as an aside: I’ve shopped in the first Aldi store ever. It’s not far from where I live.



  • As of now, all the advice here is kinda missing the point or wrong… (except the one recommendation to do updates ;-)

    I wouldn’t use Cloudflare, as it’s really bad for freedom, it watches your traffic, and the most interesting features aren’t even in the free/cheap plans…

    You can’t restrict incoming connections to the “established” state, or nobody could ever open a new connection to your server…

    And SSH is a safe protocol; it mostly comes down to the strength of your passwords…

    And yeah, opening ports is never 100% safe. Neither is using computers. They can be hacked, but that observation doesn’t help…

    And I’d agree that using WireGuard or Tailscale would help. But you already said you don’t want a VPN…

    I didn’t have a proper look at the Forgejo Docker container, but I’d say it’s safe. It probably uses keys instead of passwords(?!), and I hope they configured it properly if that’s how they ship it by default. And it’s running sandboxed in your Docker container anyway, not a system shell on the machine.

    The issue with SSH is that there are lots of bots scanning the internet for SSH servers and testing passwords all day. Your server will be subject to a constant stream of brute-force attempts unless you take some precautions. Usually that’s done by blocking an attacker after some number of failed login attempts (see the toy sketch after this comment for the basic idea). This is either preconfigured in your Docker container (you should check, or watch the logs), or you’d need to use something like fail2ban on top, or you ignore the additional load and have all your users use good passwords.

    (What I do is use Git over HTTPS. That worked out of the box, while SSH would have required additional work. But I also have lots of other ports forwarded to several services on my home server, including SSH. No VPN, no Cloudflare … I have fail2ban and safe passwords, and I’m happy with that.)
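    To illustrate the blocking idea, and only illustrate it (fail2ban does far more: time windows, actual firewall bans, unbanning), here is a toy sketch that counts failed SSH password attempts per source IP in a syslog-style auth log. The log path and message format are assumptions; both differ between distros and containers.

    ```python
    import re
    from collections import Counter

    AUTH_LOG = "/var/log/auth.log"   # assumption: Debian/Ubuntu-style location; differs elsewhere
    THRESHOLD = 5                    # flag IPs with at least this many failed attempts

    # Typical sshd line: "... Failed password for invalid user admin from 203.0.113.7 port 51234 ssh2"
    FAILED = re.compile(r"Failed password for .* from (\d+\.\d+\.\d+\.\d+)")

    def count_failures(path: str) -> Counter:
        """Count failed SSH password attempts per source IP address."""
        hits = Counter()
        with open(path, errors="replace") as log:
            for line in log:
                match = FAILED.search(line)
                if match:
                    hits[match.group(1)] += 1
        return hits

    if __name__ == "__main__":
        for ip, count in count_failures(AUTH_LOG).most_common():
            if count >= THRESHOLD:
                print(f"{ip}: {count} failed attempts -> candidate for a ban")
    ```

    fail2ban packages this pattern properly: it tails the log, applies a time window, and inserts real firewall rules instead of just printing.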


  • It depends on the exact specs of your old laptop, especially the amount of RAM and the VRAM on the graphics card. It’s probably not enough to run any reasonably smart LLM, aside from maybe one of Microsoft’s small “Phi” models.

    So unless it’s a gaming machine with 6 GB+ of VRAM, the graphics card will probably not help at all, and without it, inference is going to be slow. For that kind of computer I recommend projects that are based on llama.cpp or use it as a backend (see the sketch at the end of this comment); it’s the best/fastest way to do inference on slow computers and CPUs.

    Alternatively, you could use online services or rent a cloud machine with a beefy graphics card by the hour (or minute).
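    For reference, running a small quantized model CPU-only on top of llama.cpp is roughly this simple via its Python bindings (llama-cpp-python). The model file name is a placeholder; any small GGUF model you have downloaded goes there, and the thread/context numbers are just starting points for a weak machine.

    ```python
    # pip install llama-cpp-python
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/phi-2.Q4_K_M.gguf",  # placeholder: path to any small quantized GGUF model
        n_ctx=2048,        # context window; keep modest on low-RAM machines
        n_threads=4,       # roughly the number of physical CPU cores
        n_gpu_layers=0,    # 0 = pure CPU; only raise this if you actually have enough VRAM
    )

    result = llm(
        "Q: Name three uses for an old laptop.\nA:",
        max_tokens=64,
        stop=["Q:"],
    )
    print(result["choices"][0]["text"].strip())
    ```

    Frontends like GPT4All or koboldcpp wrap the same kind of backend if you’d rather not touch code.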