• 19 Posts
  • 103 Comments
Joined 1 year ago
Cake day: June 5th, 2023

  • I can’t give you the technical explanation, but it works.
    My Caddyfile only has something like this:

    @forgejo host forgejo.pe1uca
    handle @forgejo {
    	reverse_proxy :8000
    }
    

    and everything has worked properly, including cloning via SSH with git@forgejo.pe1uca:pe1uca/my_repo.git

    My guess is git only needs the host to resolve the IP and then connects to the port directly.
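That would also explain why no port shows up in the clone URL: SSH just defaults to port 22. If Forgejo's SSH listener were ever moved to a non-default port, the usual fix is a client-side override; a sketch, where the port number is purely hypothetical:

```
# ~/.ssh/config — hypothetical override if Forgejo's SSH ran on a non-default port
Host forgejo.pe1uca
    Port 2222
    User git
```

With that in place the same clone URL keeps working unchanged, since git delegates the connection to ssh.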


  • One of my best friends introduced me to this series back in MH4U for the 3DS.
    As someone mentioned in another comment, these games are definitely not newbie friendly haha. I started it and dropped it after a few missions; I don’t remember what rank I was, but definitely still the starting village. Afterwards we finally got time to play and he mocked me since my character had less armor than his palico :D
    We played more often and he helped me reach higher ranks until G-rank.

    Each game has had a different kind of end game.
    For MH4U it was the guild quests, which were randomly generated. I loved this; it made the game not feel like a total grind. But it only made it feel that way, because it really was a grind both to get the correct quest and to level it up to get the relics you wanted.

    The one I enjoyed the least was MHGen/MHGU because there’s no end game loop: once you reach G-rank the game doesn’t have anything else to offer, so you just grind the same missions you already have. Of course this can be considered an end game loop in itself, since maxing your armor and weapons takes a long time (and IIRC some older fans mentioned this fit the theme of remembering the old games, since they were like that).

    For MHW it was the investigations, which felt a bit like MH4U guild quests but without the random map.
    The only downside of this game and the Iceborne expansion was the game-as-a-service aspect: you could only access some quests on certain days of the week, you had to connect to the internet to get them, and one of the last bosses is tied to multiplayer, which is impossible to finish properly if you have bad internet or only time for a single quest.

    I’ve bought each game, with around 200 hours minimum in each one. IIRC 450+ in MH4U and around 500 in MHW (mostly because it’s harder to pause on PS4).

    MHRise with the Sunbreak expansion is one of the most relaxing ones, since you can take NPCs on all missions; they help a lot to de-aggro the monsters so you can enjoy the hunt.

    I was with some friends from work when the trailer for MHW released and we literally screamed when we realized it was an MH game haha.

    The only change they’ve made between games that I found really annoying was to the hunting horn. It was really fun to have to adapt your hunt to each horn’s songs and keep track of what buffs were active and which ones you needed to re-apply (in reality you always rotated your songs over and over so you never ran out of your buffs).
    But in Rise each song is now just X -> X, A -> A, and X+A -> X+A; there are no combinations.
    Every hunting horn only has 3 songs, where previously some horns could have up to 5.
    Playing a song twice used to level up the buff it applied; well, in Rise they made it a single attack that plays all your songs twice.
    It feels like they tried to simplify the weapon, but two teams were put in charge of providing ideas and both solutions got implemented, which left the weapon with no depth at all.
    Also, previously you felt like the super support playing hunting horn: each time you applied a buff a message appeared showing which buff it was. Yeah, it was kind of spammy, but it felt nice having a hunting horn on the hunt.
    In Rise they decided to only display a message the first time you apply a buff and that’s it; if you re-apply it there’s nothing, even when you keep buffing your team. Ah, but if you use the bow, the arc shot does spam the buff message, so you feel like less of a support than the bow :/

    Due to work I haven’t followed all the news of MHWilds, but I’ll definitely buy it.


    For the next posts my recommendations would be the Sniper Elite, Mario & Luigi, Pokémon Mystery Dungeon, and Disgaea series.
    (Maybe also another theme of posts could be genre/mechanic, like tactics games or colony management in general)




  • I’ve only played P5 and currently P5R.
    The RPG part is amazing: the story, combat, dungeon crawling, interactions, etc., plus everything the other comments have already covered.

    My only con would be the strictness of the schedule for doing the story. Yeah, it’s an interesting part of the game which differs from other RPGs, but it’s frustrating that you might permanently miss something because you planned a bit off, or selected the wrong dialog option with a confidant and didn’t get enough points, which forces you to spend an extra day with them to increase their rank.
    Either you follow a guide or you accept the idea of missing some parts of the story.

    And even with a guide I think I still might not experience everything, since I won’t visit some of the hangout spots with confidants; only the main ranks and that’s it.

    Also, you can’t just focus on finishing one confidant, because I think all of them have some requirement, or they’re not available that day, so you need to do other stuff.
    For example, Yoshida is only available on Sundays, and Kawakami IIRC is only available the last days of the week, but not weekends, and only during the evening.

    But I plan to also play P3 and P4 since the stories are so good.


    My recommendation for the next post would be the Monster Hunter, Paper Mario, or Kingdom Rush series.



  • pe1uca@lemmy.pe1uca.dev OP, to Selfhosted@lemmy.world: “Any good linux voice changer?” (2 months ago)

    Text-to-speech is what Piper does.
    What I’m looking for is called a voice changer, since I want to change a voice which has already read something.

    That’s exactly what I want: “the thing in the Darth Vader halloween masks”, but for Linux, preferably via CLI so it can ingest audio files and be configured to change the voice however I want, not only to Darth Vader.






  • Borderlands 2, specifically the mechromancer class.
    It has a perk where you get more damage each time you unload your full clip, and it resets when you manually reload.
    On PC, the reload action has its own key.

    But I had a potato PC and could only play it at low settings. When I got a PS4 I bought the game again to play it with nice graphics. It quickly got very frustrating, since the reload action is bound to the same button as interact! Each time you tried to talk to someone, get into a vehicle, or even pick something up from the ground, you ran the risk of not aiming well enough and reloading by accident, which resets your buff!



  • I’ve used it to summarize long articles, news posts, or videos when the title/thumbnail looks interesting but I’m not sure if it’s worth the 10+ minutes to read/watch.
    There are other solutions, like a dedicated summarizer, but I’ve looked into them and they only extract exact quotes from the original text; an LLM can also paraphrase, making the summary a bit more informative IMO.
    (For example, one article included a quote from an expert talking about a company. The summarizer only extracted the quote, and the flow of the summary made me believe the company said it, while the LLM properly stated the quote came from the expert.)
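To illustrate the difference: a purely extractive summarizer just scores sentences and returns them verbatim. A toy sketch (the frequency-based scoring here is my own naive example, not how any particular tool works):

```python
import re
from collections import Counter

def extractive_summary(text, n=2):
    """Return the n highest-scoring sentences verbatim, in original order."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"\w+", text.lower()))
    # Score each sentence by the summed corpus frequency of its words.
    ranked = sorted(
        sentences,
        key=lambda s: sum(freq[w] for w in re.findall(r"\w+", s.lower())),
        reverse=True,
    )
    chosen = set(ranked[:n])
    return " ".join(s for s in sentences if s in chosen)

text = "The cat sat. The cat ran fast. Dogs bark loudly sometimes."
print(extractive_summary(text, 1))  # -> The cat ran fast.
```

Every word of the output is copied from the input, which is exactly why such a summary can lose attribution context that an LLM could paraphrase back in.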

    This project https://github.com/goniszewski/grimoire has on its roadmap a way to connect to an AI to summarize the bookmarks you make and generate 3 tags for them.
    I’ve seen the code, but I don’t remember the exact status of the integration.


    Also, I have a few models dedicated to coding, so I’ve also asked for a few pieces of code and configurations to just get started on a project, nothing too complicated.


    In that case I’d recommend you use immich-go to upload them and still back up only immich instead of your original folder, since if something happens to your immich library you’d have to manually recreate it: immich doesn’t update its DB from the file system.
    There was a discussion on GitHub about worries of data being compressed by immich, but it was clarified that uploaded files are saved as they are and only copies are modified, so you can safely back up its library.

    I’m not familiar with RAID, but yeah, I’ve also read it’s mostly about uptime.

    I’d also recommend you look at restic and duplicati.
    Both are backup tools; restic is a CLI and duplicati is a service with a UI.
    So if you want to set up the cron jobs yourself, go for restic.
    Though if you want to be able to read your backups manually, check how the data is stored, because I’m using duplicati and it saves the data in files that can only be read back through duplicati; I’m not sure I could just go and open them, unlike data copied with rsync.
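For the restic-plus-cron route, the cron side is a one-liner. A sketch, where the repository path and password file are made up for illustration:

```
# crontab entry (hypothetical paths): nightly restic backup at 02:00
0 2 * * * restic -r /mnt/backup/repo --password-file /home/user/.restic-pass backup /home/user/data
```

Note restic also stores data in its own deduplicated format, so like duplicati you read it back through the tool (restic restore, or restic mount to browse) rather than by opening the files directly.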


  • For local backups I use this command

    $ rsync --update -ahr --no-i-r --info=progress2 /source /dest
    

    You could compress them first, but since I have the space for the important stuff, this is the only command I need.
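The --update flag is what makes re-running that command cheap: files whose destination copy is at least as new get skipped. Roughly, in Python terms (a toy illustration of the flag's behavior, not a substitute for rsync):

```python
import os
import shutil
import tempfile

def copy_if_newer(src, dest):
    """Copy src over dest only when dest is missing or older (like rsync --update)."""
    if not os.path.exists(dest) or os.path.getmtime(src) > os.path.getmtime(dest):
        shutil.copy2(src, dest)
        return True
    return False

# Demo with temporary files.
d = tempfile.mkdtemp()
src = os.path.join(d, "a.txt")
dst = os.path.join(d, "b.txt")
with open(src, "w") as f:
    f.write("new data")

copied = copy_if_newer(src, dst)   # dest missing -> copies
skipped = copy_if_newer(src, dst)  # dest mtime now equal -> skipped
```

copy2 preserves the modification time, which is why the second run is a no-op.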

    Recently I also made a migration similar to yours.

    I’ve read jellyfin is hard to migrate, so I just reinstalled it and manually recreated the libraries; I didn’t mind losing the watch history and other stuff.
    IIRC there’s a post or GitHub repo with a script that tries to migrate jellyfin.

    For immich you just have to copy its database files with the same command above and that’s it (of course with the stack down; you don’t want to copy DB files while the database is running).
    For the library I already had it on an external drive with a symlink, so I just had to mount it in the new machine and create a similar symlink.

    I don’t run any *arr so I don’t know how they’d be handled.
    But I did do the migration of syncthing and duplicati.
    For syncthing I just had to find the config path and I copied it with the same command above.
    (You might need to run chown in the new machine).

    For duplicati it was easier since it provides a way to export and import the configurations.

    So depending on how the *arr programs handle their files it can be as easy as finding their root directory and rsyncing it.
    Maybe this could also be done for jellyfin.
    Of course be sure to look for all the config folders they need; some programs might split them between their working directory, ~/.config, ~/.local, /etc, or any other custom path.

    EDIT: for jellyfin data, evaluate how hard it would be to find again. It might be difficult, but if it’s replaceable it doesn’t require the same level of backups as your immich data, because immich normally holds data you created which can’t be found anywhere else.

    Most of my series live on just the main jellyfin drive.
    But immich is backed up 3-2-1: 3 copies of the data (I actually have 4), on at least 2 types of media (HDD and SSD), with 1 offsite (rclone-encrypted into an e2 drive).


    Just tried it and it seems too complicated haha. With traccar I just had to deploy a single service and use either the official app or, previously, GPSLogger sending the data to an endpoint.

    With owntracks the main documentation seems to assume you deploy it onto the base system; docker is kind of hidden.
    And with docker you need to deploy at least 3 services: the recorder, Mosquitto, and the front end.
    The app doesn’t tell you what’s expected in the fields for connecting to the backend. I tried with HTTPS but haven’t been able to make it work.

    To be fair, this has been just today. But as long as a service has a docker compose I’ve always been able to deploy it in less than 10 minutes, and the rest of the day is just customizing the service.
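For reference, the three-service docker setup mentioned above looks roughly like this; the image names and ports are from memory, so treat the whole file as an assumption to verify against the owntracks docs:

```yaml
# docker-compose.yml sketch for owntracks (images/ports from memory, verify before use)
services:
  mosquitto:
    image: eclipse-mosquitto
    ports:
      - "1883:1883"
  recorder:
    image: owntracks/recorder
    depends_on:
      - mosquitto
    ports:
      - "8083:8083"
  frontend:
    image: owntracks/frontend
    depends_on:
      - recorder
    ports:
      - "8080:80"
```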



    I can share a bit of my journey and setups so maybe you can make a better decision.

    About point 1:

    In vultr with the second smallest shared CPU (1vCPU, 2GB RAM) several of my services have been running fine for years now:
    invidious, squid proxy, TODO app (vikunja), bookmarks (grimoire), key-value storage (kinto), git forge (forgejo) with CI/CD (forgejo actions), freshrss, archival (archive-box), GPS tracker (traccar), notes (trilium), authentication (authelia), monitoring (munin).
    The thing is, since I’m the only one using them, usually only one or two services receive considerable usage, and I’m kind of patient, so if something takes 1 minute instead of 10 seconds I’m fine with it. That rarely happens anyway, maybe only with forgejo actions or the archival.

    In my main pc I was hosting some stuff too: immich, jellyfin, syncthing, and duplicati.

    Just recently bought this minipc https://aoostar.com/products/aoostar-r7-2-bay-nas-amd-ryzen-7-5700u-mini-pc8c-16t-up-to-4-3ghz-with-w11-pro-ddr4-16gb-ram-512gb-nvme-ssd
    (Although I bought it from amazon so I didn’t have to handle the import.)

    I haven’t moved anything off of the VPS yet, but given the VPS’s specs I think the minipc will easily handle a lot of what’s there.
    The ones I’ve moved are the ones from my main PC.
    Transcoding for jellyfin is not an issue since I already preprocessed my library to the formats my devices accept, so only immich could cause issues when uploading my photos.

    Right now the VPS is around 0.3 CPU, 1.1/1.92GB RAM, 2.26/4.8GB swap.
    The minipc is around 2.0CPU (most likely because duplicati is running right now), 3/16GB RAM, no swap.

    There are several minipc options, even with the potential to upgrade RAM and storage like the one I bought.
    Here’s a spreadsheet I found with very good data on different options so you can easily compare them and find something that matches your needs https://docs.google.com/spreadsheets/d/1SWqLJ6tGmYHzqGaa4RZs54iw7C1uLcTU_rLTRHTOzaA/edit
    (Here’s the original post where I found it https://www.reddit.com/r/MiniPCs/comments/1afzkt5/2024_general_mini_pc_guide_usa/ )

    For storage I don’t have any comments, since I’m still using a 512GB nvme and a 1TB external HDD; the minipc is basically my starter setup for a NAS, which I plan to fill with drives when I find any on sale (I even bought it without RAM and storage since I had spares).

    But I do have some huge files around, they are in https://www.idrive.com/s3-storage-e2/
    Using rclone I can easily have it mounted like any other drive and there’s no need to worry of being on the cloud since rclone has an encrypt option.
    Of course this is a temporary solution, since it’s cheaper to buy a drive for the long term (though I also use it for my backups).

    About point 2:

    If you go the route of using only linux, sshfs is very easy to use: I can connect from the files app or mount it via fstab. And for permissions you can easily manage everything with a new user and ACLs.
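A sketch of the fstab route (host, paths, and user are all made up for illustration):

```
# /etc/fstab — mount a remote directory over sshfs (hypothetical host and paths)
user@nas.local:/srv/data  /mnt/nas  fuse.sshfs  noauto,x-systemd.automount,_netdev,IdentityFile=/home/me/.ssh/id_ed25519  0  0
```

On the permissions side, something like setfacl -R -m u:media:rx /srv/data would grant a dedicated user read access without touching ownership (again, the user and path are hypothetical).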

    If you need to access it from windows I think your best bet will be samba. There are several services for this; I was using OpenMediaVault since it was the only one compatible with ARM back when I used a raspberry pi, but when you install it, it takes over all your network interfaces and disables wifi, so you have to connect via ethernet to re-enable it.

    About point 3:

    In the VPS I also had pihole and searxng, but I had to move those to a separate instance, since if something was eating up the resources, browsing the internet was a pain hehe.

    Probably my most critical services will remain on the VPS (pihole, searxng, authelia, squid proxy, GPS tracker), since there I don’t have to worry about my power or internet going down, or about my minipc being so overloaded with tasks that browsing the internet comes to a crawl (especially since I also run stuff like whispercpp and llamacpp, which basically make the CPU unusable for a bit :P ).

    About point 4:

    To access everything I use tailscale; I was able to close all my ports while still easily accessing everything on my main or mini pc, without changing anything in my router.

    If you need to give access to someone, I’d advise sharing your pihole node and the machine running the service.
    And in their account, split DNS can be set up so only your domains are resolved by your pihole; everything else can still go through their own DNS.

    If this is not possible and you need your service open on the internet, I’d suggest a VPS with a reverse proxy running tailscale, so it can communicate with your service when it receives requests while still not opening your lan to the internet.
    Another option is tailscale funnel, but I think you’re bound to the domain they give you. I haven’t tried it so you’d need to confirm.


    Ah, then no. Last I knew, you can’t migrate accounts from one server to another, which is what you’re trying to do here.
    As I mentioned, if you were able to move the keys which identify your account, it would be easy for someone to impersonate you.
    Also, your public keys are shared among all the instances you’ve interacted with, so this might break your interactions there.


    Do you still have the old database? You should be able to move your instance around as long as you have a dump of your DB; that’s where all the keys of each community and user in your instance are. Those are what tell other instances you’re actually you. If you lose those, I don’t know whether anything can be done to make other instances flush your old content and treat you as a new account. But I would count on this being intentional, since otherwise it could let people impersonate someone else if they got hold of the domain without the DB.

    EDIT: hmm, maybe I didn’t understand correctly, are you trying to move to a new domain, or to a new server with the same domain?
    What’s re-home?


    A note taking app can be turned into a diary app if you just create a note for each day.
    Even better if you then want to expand a section of a diary entry without actually modifying it or jumping between apps.

    Obsidian can easily help you tag and link each note and theme/topic in each of them.
    There are several plugins for creating daily notes which will be your diary entries.
    Also it’s local-only; you can pair it with any sync service: the obsidian-provided one, git, any cloud storage, or ones which work directly with the files, like syncthing.

    Just curious, what are the special features you expect from a diary service/app which a note taking one doesn’t have?