• 0 Posts
  • 20 Comments
Joined 1 year ago
Cake day: August 20th, 2023

  • Personally, if I already had a 5800X I probably wouldn’t upgrade to the 3D version, though there would likely be some gains, especially if you’re CPU bound in a game.

    Here’s a comparison across roughly 40 games on an Nvidia RTX 3090:

    https://www.techspot.com/review/2451-ryzen-5800x3d-vs-ryzen-5800x/

    I upgraded from a 3700X to a 5800X3D, so there was a big boost.

    But a 5800X3D isn’t even much cheaper than a 7800X3D, and the socket has changed since then. So if I already had a 5800X, I’d probably just wait and switch to a 3D chip in the future when I was ready to upgrade my motherboard. If cost is no object and you’re not gonna swap the motherboard for a long time, then yes, it’s the best gaming CPU you’re going to be able to use with that board, and it likely always will be.


  • That the eye can only perceive 24 fps is a myth. Vision perception is very complicated, with many different processes, and your eyes and brain don’t strictly perceive things in frames per second. 24 fps is a relatively arbitrary number picked by the early movie industry to stay a good amount above 16 fps (below which you lose the persistence-of-motion illusion) without wasting too much more film, and it’s a nice, easily divisible number.

    The difference with higher frame rates is quite obvious. Just grab any older PC game so you can hit a high frame rate, then cap it so it doesn’t go higher than 24, and the difference is night and day. The many people complaining about how much they hated the look of the Hobbit movies at 48 fps can attest to this as well. You certainly do start to get diminishing returns the higher the fps gets, though.

    Movies can be shot to deliberately avoid quick camera movements and other things that wouldn’t do well at 24 fps, but video games don’t always have that luxury. For an RPG or something, sure, 30 fps is probably fine. But fighting, action, racing, anything with a lot of movement or especially quick movements of the camera starts to feel pretty bad at 30 compared to 60 (rough numbers in the sketch below).
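
    As a rough illustration (the camera pan speed below is an invented number, not from any benchmark), here’s a small Python sketch of how far the image jumps between frames at different frame rates:

        # Illustrative frame-time / motion-step arithmetic (made-up pan speed).
        # Higher frame rates mean each frame advances the image by a smaller step,
        # which is why fast camera motion looks so choppy at 24-30 fps.

        pan_speed_deg_per_s = 90.0  # hypothetical pan: a quarter turn per second

        for fps in (24, 30, 60, 120):
            frame_time_ms = 1000.0 / fps
            step_deg = pan_speed_deg_per_s / fps
            print(f"{fps:>3} fps: {frame_time_ms:5.1f} ms per frame, "
                  f"image jumps {step_deg:.2f} degrees between frames")

    At 24 fps that pan skips almost 4 degrees per frame, versus under 1 degree at 120 fps, which lines up with why fast movement feels so much worse at low frame rates.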








  • There are a lot of problems with this. For one, it’s a blog post that doesn’t link to the actual study, so it’s impossible to see what’s going on behind this report. They also never explain what this “reliability score” even means or what goes into it. Then they use the scores to claim one brand is some percent more reliable than another, but since we still don’t know what the score is, a percentage comparison may not make any sense depending on what the scores measure and how they’re calculated (quick example below). Unfortunately you can’t really draw any conclusions from what’s in this article.
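
    As a toy example (all numbers invented for illustration, nothing here is from the report), the same pair of scores can produce very different “percent” headlines depending on what the score actually measures:

        # Toy example: why "X% more reliable" is meaningless without knowing
        # what the score measures. All numbers below are invented.

        score_a, score_b = 9.5, 9.0  # hypothetical scores on an arbitrary 0-10 scale

        # Headline-style comparison of the raw scores:
        pct_diff = (score_a - score_b) / score_b * 100
        print(f"Raw scores: brand A is {pct_diff:.1f}% 'more reliable'")  # ~5.6%

        # But if the scale happens to encode a failure rate
        # (say 10 - score = % of units that failed), the same two numbers
        # tell a very different story:
        fail_a, fail_b = 10 - score_a, 10 - score_b  # 0.5% vs 1.0% of units failing
        print(f"Failure rates: brand B fails {fail_b / fail_a:.0f}x as often")  # 2x

    Same two scores, but one framing sounds like a rounding error and the other sounds dramatic, which is why the definition of the score matters so much.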




  • It’s a tough call. Many forums have a rule against changing the title at all, and people posting are often used to this and submit the article’s title as is. The idea is to help prevent editorializing and clickbait on the part of the poster. Every headline these days, though, seems to be some variation of blatant clickbait, or so-and-so “slams” this or “destroys” that. At this point I probably trust randos on the internet to write headlines more than publishers.




  • I hope this works out and becomes a viable competitor to DLSS 3, especially with this most recent generation of games getting so demanding spec-wise. I also appreciate that they make it available for any graphics card from any company. Nvidia certainly has an edge in proprietary features that AMD is having trouble matching at the moment, but Nvidia becoming even more dominant is bad news; a lack of competition will only encourage them to stagnate and raise prices even higher. I’ll probably be looking to upgrade my own GPU soon, so I’m very interested in how the just-announced AMD 7800 XT compares against the Nvidia 4070.