Maybe they are illuminating their living room with the front end of a BMW.
Better yet, it’s a Pimp My Ride style makeover that replaces those unused turn signals with a projection system for an instant drive-in movie experience.
Like every system? What’s the actual distinction you’re trying to point out?
I think the more nuanced take is that we should be making “piracy” legal by expanding and protecting fair use and rights to make personal copies. There are lots of things that are called piracy now that really shouldn’t be. Making “piracy” legal still leaves plenty of room for artists to get paid.
I’m just curious how much RAM you think that is.
Docker Compose is just a settings file for a container. It’s the same advantage you get from using an ssh config file instead of typing out the user, IP, port, and private key to use each time. What’s the advantage of putting all my containers into one compose file? It’s not like I’m running docker commands from the terminal manually to start and stop them unless something goes wrong; I let systemd handle that. And I’d much rather have systemd be able to individually start, stop, and monitor containers than have to bring them all down if one fails.
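For anyone wondering what that looks like, here’s a minimal sketch of the kind of per-container unit I mean. It assumes the container was already created with docker create or docker run (the name "jellyfin" and the paths are just placeholders):

```
# /etc/systemd/system/jellyfin-container.service  (example name)
sudo tee /etc/systemd/system/jellyfin-container.service >/dev/null <<'EOF'
[Unit]
Description=Jellyfin container
Requires=docker.service
After=docker.service

[Service]
# the container must already exist (docker create --name jellyfin ...)
ExecStart=/usr/bin/docker start -a jellyfin
ExecStop=/usr/bin/docker stop -t 10 jellyfin
Restart=on-failure

[Install]
WantedBy=multi-user.target
EOF

sudo systemctl daemon-reload
sudo systemctl enable --now jellyfin-container.service
```

Each container gets its own unit, so systemctl can restart or stop just the one that’s misbehaving.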
You don’t need to get too complicated with scripts if you let Picard do all the tagging and renaming. In my experience it works pretty well with the default out-of-the-box configuration. Just don’t try to do your whole library at once; go album by album and check that each one matches the correct release. I was in the same boat about a decade ago and did exactly that, a few albums a day getting tagged and renamed into a fresh music directory. And of course, make a backup first, just in case.
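That backup doesn’t need to be anything fancy; a plain rsync copy onto another disk is enough, with the paths here as placeholders:

```
# one-way copy of the library before letting Picard loose on it
rsync -a --progress /srv/music/ /mnt/backup/music-pre-picard/
```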
Lately I’ve been going through this process again because I messed up configuring Lidarr and many files got improperly renamed. Since they were all still properly tagged, fixing them has been easy, especially with Picard. I haven’t really bothered to find all the stray files yet (they’re still roughly in the right location) because Plex ignores the paths and just reads the tags, so the misnamed files aren’t even noticeable in Plex.
When you say you use the Plex interface remotely, are you referring to the Plex app or the PlexAmp app? I feel like PlexAmp fixed all of my complaints about listening to music through Plex (the same app I use for videos).
EasyTag works pretty well for me on Linux when I’m not just using Picard. I use EasyTag mostly for fixing and normalizing the tags on audiobooks these days.
Jack of all trades, master of none. Forcing a router reboot to get the home Internet working again has become a thing of the past since I set up a UniFi router and APs.
I’d had router/WiFi combos before, running DD-WRT, OpenWrt, or Tomato. None of them were stable, but I suspect that was because the hardware just couldn’t keep up, not because the open source software was faulty.
Why? If the power has gone out, there are very few situations (I can’t actually think of any except brownouts or other transient power loss) where it would be useful to power my server for much longer than it takes to shut down safely.
They are pretty similar. It’s hard to judge because they’re different sizes, with different boundaries and different brightness. If VLC is playing a bit dimmer, it makes sense that some artifacts would be less visible.
None of what you’ve just said here is true. They don’t work like house keys. Your system and my system are VERY different because I’m not making copies of my private keys anywhere. They never leave the safe place I created them. I only ever transfer the public keys. I could post my public keys here and there would be no security compromise for me. You came here asking for help. I tried to help you. I’m sorry it wasn’t what you wanted to hear. Your attitude sucks.
No, it is inherently bad to copy around private keys. You have some fundamental misunderstandings of how key authentication security works. RTFM.
No, you’re missing the point and creating a false choice here. You’re supposed to generate new keys for each client device and load their various public keys into the authorized_keys file in your server user’s home folder. Copying around your private key like that is just BAD security and not how public key authentication is designed to work. It’s not as if the only two options are your bad way or passwords.
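Concretely, each client generates its own pair and only the public half ever travels. A minimal sketch, where the key comment "laptop" and user@server are just placeholders:

```
# run on each client device (every device gets its own key pair)
ssh-keygen -t ed25519 -C "laptop" -f ~/.ssh/id_ed25519

# send only the public key; it ends up as a single line in
# ~/.ssh/authorized_keys on the server
ssh-copy-id -i ~/.ssh/id_ed25519.pub user@server
```

The private key never leaves the device it was generated on.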
As an example: you copy your single private key to various devices and even carry a (probably unencrypted) copy around with you on a thumb drive, while I generate a fresh key pair on each client that I use to connect. When your private key is compromised (when, NOT if), you must remove that public key from your server to lock out the bad actor, but that also completely locks you out unless you have physical or password access to the machine at the moment it’s compromised. When one of my keys is compromised, I can just remove that machine’s key from the authorized_keys list on the server and continue accessing my machine remotely via any of the other, uncompromised clients.
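Revoking one client is a one-line change on the server. This assumes the key was generated with an identifying comment like "laptop" as in the sketch above:

```
# on the server: drop only the compromised client's public key
sed -i '/laptop/d' ~/.ssh/authorized_keys

# or just open ~/.ssh/authorized_keys in an editor and delete that one line
```

Every other client keeps working, and nothing else has to be rotated.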
Why are you trying to reuse an ssh key? That seems like a really bad practice. It’s just not the way key pair authentication is supposed to work. Passing around and sharing private keys is BAD. Client devices create their own private keys and only share public keys. Just create a new key from ConnectBot and get it to your server via other methods. If you’re already away from home without any other means of connecting, that last part is admittedly tricky and you may be SOL.
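By "other methods" I mean something like exporting the public key from ConnectBot, getting it onto any machine that already has access (email it to yourself, sync it, whatever), and appending it from there. A sketch, with the filename and user@server as placeholders:

```
# from a machine that can already reach the server
cat connectbot.pub | ssh user@server 'cat >> ~/.ssh/authorized_keys'
```

The private key stays on the phone the whole time.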
Isn’t ConnectBot a dead project anyway? Last I checked, it hadn’t been updated in years. Well, I guess I was wrong here. I can’t find a simple, full list of all the past updates, but I seem to remember moving away from ConnectBot because it lacked some feature I wanted and no longer worked on my new Android device. I’ve been satisfied with JuiceSSH, but I’m happy that ConnectBot is still alive, since it was one of the first apps I installed on the first-generation Android phone.
Yeah, I remember. My favorite DOS game was Scorched Earth, which fit on a high-density 1.44MB floppy disk. But that was the point of the article. Space used to be at a premium, and a terabyte used to feel like more space than I’d ever need. Now a terabyte is only just enough for portable devices, because capacity was growing and cost was dropping so fast that developing space-saving tech seemed like a waste of time, but that trend of increasing capacity and decreasing cost has significantly plateaued (as shown by the graph in the article).
If by nuts you mean a very modest, low-single-digit terabyte range. Which, according to the game sizes cited in the article, could only hold around 6-10 games per terabyte. Given the way games tend to disappear from online sources over time, that doesn’t seem like enough space to really keep all those digital purchases. I guess if most of them will become abandonware eventually anyway when the companies shut down their servers, it hardly matters.
Combination WiFi & router devices are notoriously unstable, and those provided by ISPs are particularly bad. If you have the ability and the funds, spend a little more to get a prosumer router and wireless AP as separate devices that connect to your modem in bridge mode. In the long run, for me anyway, the stability and reliability of this kind of setup paid for itself quickly in less of my time wasted. My setup: my own cable modem per the specs my ISP provided, a Ubiquiti EdgeRouter X, and a UniFi AP. I already had a server, so I installed the AP management software on it, but Ubiquiti also sells a single-board device to run that. Everything except the AP lives in a little electronics cabinet tucked away. The AP gets its power over Ethernet, so it can be mounted on a wall or ceiling, placed well with respect to walls, doors, pipes, etc., with only a length of Ethernet cable running to the router. The AP itself just looks like a hand-sized bump of white on the wall. I turned off the AP’s status lights once it was set up so it stays as discreet as possible. Adding a UniFi WiFi repeater near the one room I still had a little trouble with was almost as easy as plugging it into the wall outlet.
Not everyone can or should go this route, and it was a learning experience for me with some growing pains, but in the end it was worth it to me. UniFi isn’t the only game in town either. Either way, separating your network devices so that each one does only one job (the modem connects, the router routes, and the AP does WiFi) means one underpowered chip isn’t being crushed under the weight of too many tasks at once.
You could spend a little for a prosumer router and AP. I have a very similar setup: a cable modem, an EdgeRouter X (Ubiquiti), a single UniFi AP, and a service running on my server (this could be replaced with a separate hardware device or a Raspberry Pi, but the server is going to be running anyway). It’s been rock solid since I set it up, compared to the WiFi/router combo running OpenWrt I had before, which struggled and needed restarting regularly.
In my experience, two devices will ultimately save you effort and frustration. Anything you choose as a good NAS/seedbox is unlikely to have a good from-the-couch interface or handle Netflix reliably and easily. A small Android TV box will have a much better interface and simpler app setup and will support all the streaming services, but it probably won’t be very powerful or convenient to use as a NAS. The NAS is always on, plugged directly into the Internet access point, and tucked away out of sight and sound. The Android TV or Apple TV box is silent, small, and can be mounted directly to the projector.
Yes, Kodi exists and its add-ons can bridge this gap. But I still think an SBC NAS running Jellyfin or Plex, plus an Nvidia Shield with Jellyfin, Plex, Netflix, Spotify, YouTube, Amazon, etc., will be much easier to set up, manage, find support for, and upgrade.
I have a similar setup, even though my server has a direct HDMI link to my TV. I’m not a fan of using the server for viewing from the couch. Setting up IR remotes always sucks, and it’s confusing for anyone but me to use. But if my Nvidia Shield dies or I’m having network trouble, VLC is a pretty good backup.