Consoles made sense when they required specialized hardware. The $700 for a PS5 is probably better spent on a much better GPU for a PC, IMHO.
It’s cool if you like consoles! They still have a specific allure, so I get it.
I am curious as to why they would offload any AI tasks to another chip? I just did a super quick search for upscaling models on GitHub (https://github.com/marcan/cl-waifu2x/tree/master/models) and they are tiny as far as AI models go.
It's the rendering bit that takes all the complex maths, and if that is reduced, that would leave plenty of room for running a baby AI. Granted, the method I linked to was only doing 29k pixels per second, but they said they weren't GPU optimized. (FSR4 is going to be fully GPU optimized, I am sure of it.)
If the rendered image covers only 85% of a 4K frame (3840×2160 ≈ 8.3 million pixels), that leaves ~1.2 million pixels that need to be computed, and it still seems plausible to keep everything on the GPU.
With all of that blurted out, is FSR4 AI going to be offloaded to something else? It seems like there would be significant technical challenges in creating another data bus that would also have to sync with memory and the GPU for offloading AI compute at speeds that didn't risk creating additional lag. (I am just hypothesizing, btw.)
It seems like it would be extremely fast to me. Take a 50x50 block of pixels and expand those across a 100x100 pixel grid, leaving blank pixels where you have missing data. If a blank pixel is surrounded by blue pixels, the probability of the missing pixel being blue is fairly high, I would assume.
That is a problem that is perfect for AI, actually. There is an actual algorithm that can be used for upscaling, but at its core, it's likely boiled down to a single function, and AIs are excellent at replicating the output of basic functions. It's not a perfect result, but it's tolerable.
Whether or not this example is correct for FSR, I have no clue. However, having AI shit out data based on a probability is mostly what they do.
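To be clear, this is just my toy version of the "fill blanks from neighbors" idea, sketched in plain Python. It is NOT how FSR works; the grid, colors, and the most-common-neighbor rule are all made up for illustration.

```python
# Toy sketch: expand a grid 2x, then guess each missing pixel
# from the most common known neighbor. Purely illustrative.

def upscale_2x(img):
    """Expand an HxW grid to 2Hx2W, leaving None where data is missing."""
    h, w = len(img), len(img[0])
    big = [[None] * (2 * w) for _ in range(2 * h)]
    for y in range(h):
        for x in range(w):
            big[2 * y][2 * x] = img[y][x]  # known pixels land on even coords
    return big

def fill_blanks(big):
    """Fill each blank with the most frequent known neighbor color."""
    h, w = len(big), len(big[0])
    for y in range(h):
        for x in range(w):
            if big[y][x] is not None:
                continue
            neighbors = [
                big[ny][nx]
                for ny in (y - 1, y, y + 1)
                for nx in (x - 1, x, x + 1)
                if 0 <= ny < h and 0 <= nx < w and big[ny][nx] is not None
            ]
            if neighbors:  # "surrounded by blue -> probably blue"
                big[y][x] = max(set(neighbors), key=neighbors.count)
    return big

src = [["blue", "blue"], ["blue", "red"]]
out = fill_blanks(upscale_2x(src))
```

The fill pass mutates the grid as it scans, so later blanks can borrow from earlier guesses, which is sloppy but fine for making the point: the per-pixel work is trivial, it's just a lot of pixels.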
No Starfield? Oh noes!
Future chips not affected by THIS CPU bug yet.
Same. I support AI completely as a tool to solve specific problems and that is about it. What is really cool is that AI libraries and such got a massive boost of needed development so plebs like me can code simple ANN apps in Python with little skill. Documentation has improved 100x and hardware support is fairly good.
LinkedIn seems to be an interesting indicator of where tech is in its hype cycle. It went from 100% AI-awesome-everything about 2 months ago to almost zero posts and ads about it. I suppose most of the vaporware AI products are imploding now…
Of course, algorithmic feeds are a thing, so your experience might be different.
Reptiles are interesting creatures, for sure. (I still have a female chameleon and bearded dragon. My male cham died of old age last year.) I never got into breeding feeders, but that is a really good route to avoid the excessive supplementation that is super common.
I was a mod of /r/chameleons and still technically the owner of /r/chameleonholdingstuff, but I haven’t been on Reddit for a long while. Strangely enough, some mods tried to pull me back into the fold a few months ago, but Reddit is just a shit show and I don’t want any part of it. The /r/reptiles mods seemed a little more on the sane side though.
Most other moderators I met or talked with in person from Reddit were raging alcoholics, myself included. The subs I was involved with had a community that was… eh… very unique. (Reptile owners themselves tend to be a special breed, to say the least.)
Mental health is super important so do what you need to do and find a good headspace. Modding is a good time suck when you aren’t busy but is a serious drain when other things are more important in life.
When I was in a much darker place, I would tend to make decisions that would isolate me from everyone and everything else. In some ways that is nice but in other ways… not so much. If you believe your decision is with specific purpose, awesome. If this decision is more of a trend across your life, try and only make positive decisions. (This is just something I share when people express decisions in the context that you used.)
Wishing you the best! Be happy and stay healthy, friend.
FYI, you can download your photos in bulk with Google Takeout, but you need to have enough space in Google Drive to do it. (Takeout zips up all your photos and will drop 10GB chunks in Drive.)
I was doing something similar to you recently. I downloaded all my photos and de-duped by generating MD5 hashes for all the pictures that were downloaded. (I was moving all of my photos to a local NAS, so it wasn’t quite what you are doing.)
If your dups have consistent MD5 hashes, that might work for you but it’s hard to say.
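For what it's worth, the de-dupe I described boils down to something like this. The directory layout is hypothetical; point it at wherever your photos landed.

```python
# Rough sketch of hash-based de-duping: hash every file, then any
# MD5 that maps to more than one path is a set of exact duplicates.
import hashlib
from pathlib import Path

def md5_of(path, chunk_size=1 << 20):
    """Hash a file in chunks so big photos don't eat RAM."""
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_dupes(root):
    """Map each MD5 to the files sharing it; len > 1 means duplicates."""
    seen = {}
    for p in Path(root).rglob("*"):
        if p.is_file():
            seen.setdefault(md5_of(p), []).append(p)
    return {digest: paths for digest, paths in seen.items() if len(paths) > 1}
```

Keep in mind this only catches byte-for-byte duplicates. If Google re-encoded or stripped metadata from one copy, the hashes won't match even though the photos look identical, which is why I said it's hard to say whether it will work for you.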
I am going to need your 50 point summary of those obvious points in the longest form possible by this afternoon so I can be completely convinced that I have already made up my mind in the correct way. Thanks.
It was on old 3.5" drives a long time ago, before anything fancy was ever built into the drives. It was a seriously rough working environment anyway, so we saw a lot of failed drives. If strange experiments (mainly for lulz) didn't get the things working, the next option was to see if a sledgehammer would fix the problem. Funny thing… that never worked either.
I used to take failed drives while they were powered on and kinda snap them really fast with a twisting motion in an attempt to get the arm to move or get the platters spinning.
It never worked.
Did you get bad sectors? Weird things can absolutely happen but having sectors marked as bad is on the exceptional side of weird.
Maybe? Bad cables are a thing, so it’s something to be aware of. USB latency, in rare cases, can cause problems but not so much in this application.
I haven’t looked into the exact ways that bad sectors are detected, but it probably hasn’t changed too much over the years. Needless to say, info here is just approximate.
However, marking a sector as bad generally happens at the firmware/controller level. I am guessing that a write is quickly followed by a verification, and if the controller sees an error, it will just remap that particular sector. If HDDs use any kind of parity checks per sector, a write test may not be needed.
Tools like CHKDSK likely step through each sector manually and perform read tests, or just tell the controller to perform whatever test it does on each sector.
OS-level interference or bad cables are unlikely to cause the controller to mark a sector as bad, is my point. Now, if bad data gets written to disk because of a bad cable, the controller shouldn't care. It just sees data and writes data. (That would be rare as well, but possible.)
What you will see is latency. USB can be orders of magnitude slower than SATA. Buffers and wait states cause this because of the speed differences. This latency isn't going to cause physical problems though.
My overall point is that there are several independent software and firmware layers that need to be completely broken for a SATA drive to erroneously mark a sector as bad due to a slow conversion cable. Sure, it could happen and that is why we have software that can attempt to repair bad sectors.
It’s been around for a while. It’s the fluff and the parlor tricks that need to die. AI has never been magic and it’s still a long way off before it’s actually intelligent.
That is a very peculiar rant.
I wonder if they are sticking with NVIDIA for the GPU? With NV being swamped with AI and PC GPU demand, it would be interesting to see if they switch teams to AMD. The price would probably be much better…
(The rumor is that an announcement of the 5-series GPUs is coming soon from NVIDIA. I heard that from Gamers Nexus, I think.)
Blizzard was (still is?) part of Activision for a number of years as well, so that didn't help. There is much more to blame and I can't even begin to pretend like I know all of it, though.
My own complaint is similar though. When “profit at all costs” takes over, and knowing how to make a quality game is lost, there is usually little hope left.
Employers figured out years ago that caffeine has excellent ROI for productivity. (Amphetamines are probably a close second, but we won’t talk about that right now.)
For Intel to cut basic morale boosters was just pure silliness.