• 4 Posts
  • 653 Comments
Joined 1 year ago
Cake day: October 4th, 2023


  • locking down the Windows kernel in order to prevent similar issues from arising in the future. Now, according to a Microsoft blog post about the recent Windows Endpoint Security Ecosystem Summit, the company is committing to providing “more security capabilities to solution providers outside of kernel mode.”

    So first off, from a purely-technical standpoint, I think that that makes a lot of sense for Microsoft. Jamming all sorts of anti-cheat stuff into the Windows kernel is a great way to create security and stability problems for Windows users.

    However.

    I don’t know if my immediate take would be that it would permit improving Linux compatibility.

    So, from a purely-technical standpoint, sure. Having out-of-kernel anti-cheat systems could make it easier to support Linux compatibility.

    But it also doesn’t have to do so.

    First, Microsoft may very well patent aspects of this system, and in fact, probably has some good reasons to do so. A patent-encumbered anti-cheat system solves their problem. But that doesn’t mean that it’s possible for other platforms to go out and implement it, not for another 20 years, at least.

    Second, it may very well rely on trusted hardware, which may create issues for Linux. The fundamental premise of a traditional open-source Linux system is that anyone can run whatever they want and modify the software. That does not work well with anti-cheat systems, which require not letting users modify their local software in ways that are problematic for other users. My Linux systems don’t have ties up and down the software stack to trusted hardware. Microsoft is probably fine with doing that, on both Xbox and newer trusted-hardware-enabled Windows systems.


  • tal@lemmy.today to Gaming@beehaw.org · Shmup suggestions
    3 days ago

    Do you have any examples of shmups that you like and any you dislike? That might help give a better idea of what you like.

    I mean, if you just want “good shmups”, it’s easy to go to Steam, search for games with the “Shoot 'em Up” tag, and sort by user reviews.

    But if you’re looking for something in particular, a list like that might help.




  • tal@lemmy.today to Selfhosted@lemmy.world · Programmatic access to discord
    7 days ago

    I get that.

    Honestly, though, I’m still a little puzzled as to why people initially got into Discord; I never did.

    I can understand why people wanted to use some systems. Twitter does massive-scale real-time indexing. That was a huge feature, really changed what one could do on the platform.

    Reddit provided a good syntax (Markdown), a low barrier to entry (no email verification at a time when that was common), and third-party client access. It solved the spam problem that was killing Usenet and permitted more-reasonable moderation.

    There were a whole host of services that aimed to lower the complexity bar for getting a web page and some content online associated with someone’s identity; it was clear that the technical knowledge required to get stuff up was a real limiting factor for many people.

    But I just didn’t really get where Discord provided much of a win over stuff like IRC. I mean, I guess maybe it bundled a couple services into one, which maybe lowered the bar to use a bit. IRC really seemed pretty fine to me. Reddit bundling image-hosting seems to have lowered the bar, been something that people wanted. Maybe Discord doing images and file-hosting made it more-accessible.

    I have no idea why a number of people who liked Cataclysm: Dark Days Ahead used Discord rather than Reddit; it seemed like a dramatically-worse system if one was aiming to create material for others to look back at and refer to.

    kagis

    https://old.reddit.com/r/RedditForGrownups/comments/t417q1/can_someone_please_explain_discord_to_me_like_im/

    It’s just modern day IRC with video.

    Ahaha, thanks. This is indeed an ELI60 response, although it doesn’t really explain how Discord suddenly got so popular. But if I couple this with /u/Healthy-Car-1860’s response, I’m kind of getting the picture.

    Got popular because it spread through the entire gamer/twitch community like wildfire due to actually being a more complete package and easier to use than anything prior. Online gamers have been struggling with voip software forever (Roger Wilco, Teamspeak, Ventrilo, Skype, and many others).

    Once it was rooted in the people who are on their computers all day every day it was bound to spread because the UX is incredibly easy compared to previous options for both chat and voip.

    Maybe that’s it. I never had a lot of interest in VoIP, especially group VoIP. When I was playing online games much, people used keyboards to communicate, not mics. There was definitely a period where people needed the ability to collaborate in games and games didn’t always provide that functionality. I remember people complaining about Teamspeak and Ventrilo. I briefly poked at Mumble – nice to have an open-source option – but I just had no reason to want to do VoIP with groups of people.

    But I suppose for a video game clan or something, that might be important functionality. And if it’s also a one-stop shop for some other things that you might want to do anyway, it maybe makes sense to just use that rather than multiple services.



  • If I need to do an emergency boot from a USB stick to repair something that can’t boot, which it sounds like is what you’re after, pretty much any Linux distro will do. I’d probably rather have a single, mainstream bootable OS than a handful.

    I’d use Debian, just because that’s what I use normally, so I’m most familiar with it. But it really doesn’t matter all that much.

    And honestly, while having an emergency bootable medium with a functioning system can simplify things, if you’re familiar with the boot process, you very rarely actually need emergency boot media on a Linux system. You have a pretty flexible bootloader in grub, and the Linux kernel can run and be usable enough to fix things on a pretty broken system if you pass it something like init=/bin/sh (maybe busybox instead for a really broken system). From there, you just need to be able to remount root read-write (mount -o rw,remount /) and know how to force syncs (echo s > /proc/sysrq-trigger) and reboots (echo b > /proc/sysrq-trigger).
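
    As a rough sketch of that flow on a grub-based system (the exact grub keystrokes, and whether sysrq is enabled, can vary a bit by distro):

        # At the grub menu, press 'e' on the boot entry, append init=/bin/sh to the
        # line that starts with "linux", then boot with Ctrl-x. At the resulting shell:
        mount -o rw,remount /            # root comes up read-only; make it writable
        # ...fix whatever is broken: restore a config file, put back a library, etc...
        echo s > /proc/sysrq-trigger     # force a sync of dirty buffers to disk
        echo b > /proc/sysrq-trigger     # force an immediate reboot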

    I’ve killed ld.so and libc before and brought back systems without alternate boot media. The only time I think you’d likely really get into trouble truly requiring alternate boot media is (a) installing a new kernel that doesn’t work for some reason and removing all the old, working kernels before checking to see that your new one works, or (b) killing grub. Maybe if you hork up your partition table or root filesystem enough that grub can’t bring the kernel up, but in most of those cases, I’m not sure that you’re likely gonna be bringing things back up with rescue tools – you’re probably gonna need to reinstall your OS anyway.

    EDIT: Well, okay, if you wipe the partition table, I guess that you might be able to find the beginning of a filesystem partition based on magic strings or something and either manually reconstruct the partition table or at least extract a copy of the filesystem to somewhere else.



  • Internet Archive creates digital copies of print books and posts those copies on its website where users may access them in full, for free, in a service it calls the “Free Digital Library.” Other than a period in 2020, Internet Archive has maintained a one-to-one owned-to-loaned ratio for its digital books: Initially, it allowed only as many concurrent “checkouts” of a digital book as it has physical copies in its possession. Subsequently, Internet Archive expanded its Free Digital Library to include other libraries, thereby counting the number of physical copies of a book possessed by those libraries toward the total number of digital copies it makes available at any given time.

    This appeal presents the following question: Is it “fair use” for a nonprofit organization to scan copyright-protected print books in their entirety, and distribute those digital copies online, in full, for free, subject to a one-to-one owned-to-loaned ratio between its print copies and the digital copies it makes available at any given time, all without authorization from the copyright-holding publishers or authors? Applying the relevant provisions of the Copyright Act as well as binding Supreme Court and Second Circuit precedent, we conclude the answer is no. We therefore AFFIRM.

    Basically, there isn’t an intrinsic right under US fair use doctrine to take a print book, scan it, and then lend digital copies of the print book.

    My impression, from what little I’ve read in the past on this, is that this was probably going to be the expected outcome.

    And while I haven’t closely monitored the case, and there are probably precedent issues that are interesting for various parties, my gut reaction is that I kind of wish that archive.org weren’t doing these fights. The problem I have is that they’re basically an indispensable, one-of-a-kind resource for recording the state of webpages at some point in time via their Wayback Machine service. They are pretty widely used as the way to cite a page on the Web.

    What I worry about is that they’re going to get into some huge fight over copyright on some not-directly-related issue, like print books or something, and then someone is going to sue them and get a ton of damages and it’s going to wipe out that other, critical aspect of their operations…like, some random publisher will get ownership of archive.org and all of their data and logs and services and whatnot.





  • released for public testing

    I mean, it’s not publicly-available either; it’s just available to a select group of testers.

    I haven’t been following the game’s development. But my guess is that the devs are going to prioritize targeting the machines that they’re using to do development of the thing. They won’t be using a Deck to develop the thing. This probably won’t be the only tradeoff made, either – I’d guess that performance optimizations aimed at the Deck or other lower-end machines might be something that would be further down on the list. I’d guess that any kind of tutorial or whatever probably won’t go in until late in the development – not that that isn’t important for bringing new users up to speed, but it’s just not something that the devs themselves need in order to work on the game. Probably not an issue for this game, which looks like it’s multiplayer, but I’d guess that breaking save or progress compatibility is something that they’d be fine with. That’s frustrating for a player, but it can make development a lot easier.

    Doesn’t mean that those don’t matter, just that they won’t be top of the priority list to get working. What they’re gonna prioritize is stuff that unblocks other things that they need.

    I worked on a product in the past that had a more “customer-friendly” interface and a command line interface. When a feature gets implemented, the first thing that a dev puts in is the CLI support – it’s low-effort, and it’s all that the dev needs to get the internal feature into a testable state for the internal people. The more-customer-friendly stuff, documentation, etc all happens later in the development cycle. Doesn’t mean that we didn’t care about getting that out, just that we didn’t need it to unblock other parts of the development process. Sometimes we’d give access to development builds to customers who urgently needed a feature early on and were willing to accept the drawbacks of using stuff that just isn’t done, but they were inevitably gonna be getting something that’s only half-baked.

    I mean, if it bugs you, I’d just wait. Like, they aren’t gonna be trying to provide an ideal customer experience at this point in the development cycle. They’re just gonna want to be using it as a testbed to see what works. It’s gonna inevitably be a subpar experience in various ways for users. The folks who are using the thing at this point are volunteering to do unpaid testing work in exchange for getting to play the thing very early and maybe doing so at a point where they can still alter the gameplay substantially. There are some people who really enjoy that, but it depends on the person. It’s not really my cup of tea. I dunno about you, but I’ve got a Steam games backlog that goes on forever; it’s not like I’ve got a lack of finished games to get through.



  • I haven’t played it, but it sounds like the situation may be in flux:

    https://www.oneesports.gg/gaming/does-deadlock-have-controller-support/

    At the time of writing, the action game is in closed beta, and it doesn’t offer native controller support. However, it does have an option that players can use to play the game with a controller.

    With that in mind, the game is likely to feature controller support when it releases on PC, as it is expected to be Steam Deck compatible.

    However, you must keep in mind that since the game is still in early development, it doesn’t offer any key binding or customization feature.

    Additionally, even with a controller on default settings, some key actions in the game may not be mapped, so you might encounter limitations during gameplay.

    In the near term, if a keyboard can do what you want and you can dig up macro software for your platform that can look for specific gamepad combinations and send keystrokes in response, I imagine that you could make it work that way.
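
    As a crude illustration of that idea on Linux/X11 – the evtest and xdotool tools are real, but the device path here is made up, so check yours with evtest first – this sends an “m” keypress whenever the right shoulder button goes down:

        # needs read access to the input device (root or membership in the input group)
        evtest /dev/input/event5 \
          | grep --line-buffered 'BTN_TR).*value 1' \
          | while read -r _; do xdotool key m; done

    A real remapper tracks button state so that it can handle combinations and key releases properly; this just shows the shape of the trick.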


  • CIFS supports leases. That is, hosts will try to ask for exclusive access to a file, so that they can assume that it hasn’t changed.

    IIRC sshfs just doesn’t care much about cache coherency across hosts; it kind of assumes that things haven’t changed underfoot and uses a timer to expire the cache.

    considers

    Honestly, with inotify, it’d probably be possible to make a newer sshfs that does support leases.

    I suspect that the Unixy thing to do is to use NFSv4, which also does cache coherency correctly.

    It is easy to deploy sshfs, though, so I do appreciate why people use it; I do so myself.

    kagis to see if anyone has benchmarks

    https://blog.ja-ke.tech/2019/08/27/nas-performance-sshfs-nfs-smb.html

    Here are some 2019 benchmarks that show NFSv4 to generally be the most-performant.

    The really obnoxious thing about NFSv4, IMHO, is that ssh is pretty trivial to set up, and sshfs just requires a working ssh connection and sshfs software installed, whereas if you want secure NFSv4, you need to set up Kerberos. Setting up Kerberos is a pain. It’s great for large organizations, but for “I have three computers that I want to make talk together”, it’s just overkill.
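
    To make that gap concrete (hostnames and paths here are just placeholders): the sshfs mount is a one-liner on top of an existing ssh login, while the Kerberized NFSv4 mount is short too – but sec=krb5p only works once a Kerberos realm (KDC, nfs service principals, keytabs on both ends) is already in place:

        # sshfs: needs only sshd on the far end and an account you can already log into
        sshfs alice@fileserver:/srv/data /mnt/data

        # secure NFSv4: presupposes working Kerberos infrastructure
        mount -t nfs4 -o sec=krb5p fileserver:/srv/data /mnt/data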



  • While I generally agree, I think that there are some other ways that one could make games:

    • One is to just do games incrementally. Like, you buy a game that doesn’t have a whole lot of content, and then you buy DLC. That’s not necessarily a terrible way for things to work – it maybe means that games having trouble get cut off earlier and don’t do a Star Citizen. But it means that it’s harder to do a lot of engine development for the first release. Paradox’s games tend to look like this – they just keep putting out hundreds of dollars in expansion content for games, as long as players keep buying it. It also de-risks the game for the publisher – they don’t have so much riding on any one release. I think that that works better for some genres than others.

    • Another is live service games. I think that there are certain niches that that works for, but that that has drawbacks and on the whole, too many studios are already fighting for too few live service game players.

    • Another is just to scale down the ambition of games. I mean, maybe people don’t want really-high-production-cost games. There are good games out there that some guy made on his lonesome. Maybe people don’t want video cutscenes and such. Balatro’s a pretty good game, IMHO, and it didn’t have a huge budget.

    I do think, though, that there are always going to be at least some high-budget games out there. There’s just some stuff that you can’t do as well otherwise. If you want to create a big, open-world game with a lot of human labor involved in production, it’s just going to have a lot of content, and it’s going to be expensive to make that content. Even if we figure out how to automate some of that work, do it more-cheaply, there’ll be something new that requires human labor.


  • The price is reasonable

    I was going to say “$60 for re-releases of several older DS games doesn’t seem that amazing”, but then I realized that I was looking at the price for the bundle below it, which apparently includes all of:

    The games in this bundle:

    • Castlevania: Dawn of Sorrow
    • Castlevania: Portrait of Ruin
    • Castlevania: Order of Ecclesia
    • Haunted Castle Revisited
    • Haunted Castle

    Plus:

    • Castlevania: Circle of the Moon
    • Castlevania: Harmony of Dissonance
    • Castlevania: Aria of Sorrow
    • Castlevania: Dracula X
    • Castlevania
    • Castlevania II Simon’s Quest
    • Castlevania III Dracula’s Curse
    • Super Castlevania IV
    • Castlevania The Adventure
    • Castlevania II Belmont’s Revenge
    • Castlevania Bloodlines
    • Kid Dracula

    That’s…actually pretty darn good too, if you enjoy Metroidvanias.

    I do kind of wish that they’d figured out some sort of way to do higher-resolution graphics, but the games themselves ain’t bad.