I think this is a case where an engine was built for the graphics cards that Epic assumed would be common by the time UE5 shipped. They expected RT performance to increase not only in the highest pricing tier but also in the mid-range (the ~$500 bracket), to the point where all the Lumen work would be trivial. But the Steam hardware survey shows that most people are on x060, x050, and then some x070-class cards, where RT performance isn't that great.
https://store.steampowered.com/hwsurvey/videocard/
So IMO there are several issues:
There are obviously optimization issues with UE5, I’m not going to claim otherwise.
Nvidia catering to the AI crowd, which is obviously more lucrative for them than improving RT performance on mid- and low-range cards.
Epic being bad at forecasting where consumer graphics performance would actually be by now.
Epic shifting cost from development to the end user's hardware (e.g. moving from pre-baked lighting to realtime lighting).
The end result stays the same: we now play games that run so badly on current hardware that everything has to be AI-upscaled with frame generation, and they have this weird soft look that will seem super dated at some point, because RT performance just isn't there yet.