How many games are using Unreal 5.1 right now?
Fortnite at native 4K, Epic settings: 60-70 FPS.
This Unreal 5.1 city sample averages around 90 FPS:
https://youtu.be/E-KFjYu637k
Far Cry 6 on Dunia (a derivative of the same CryEngine lineage that powered Crysis): 90-100 FPS at native 4K.
https://youtu.be/rHAwoKEDH54
Explain to me how this is not 4K-ready.
Or explain this: an average of 145 FPS across 13 games...
https://youtu.be/aQklDR8nv8U
How many frames per second do you need? If I give you a 360 Hz monitor, you will want all 360 of those frames, but do you even know how big the difference between 100, 120 and 360 actually is?
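To put rough numbers on that gap, here is a minimal frame-time sketch; the frame rates are just the ones named above, and the arithmetic is the standard 1000/fps conversion, not a measurement of any real game:

```python
# Frame time in milliseconds for a given frame rate: 1000 / fps.
# The FPS values below are the ones from the post above.
for fps in (100, 120, 360):
    print(f"{fps:>3} FPS -> {1000 / fps:.2f} ms per frame")

# Output:
# 100 FPS -> 10.00 ms per frame
# 120 FPS -> 8.33 ms per frame
# 360 FPS -> 2.78 ms per frame
```

Going from 100 to 120 FPS shaves about 1.7 ms off each frame; going from 120 to 360 shaves another 5.6 ms, but every additional frame per second buys less absolute time than the one before it.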
https://steamcommunity.com/discussions/forum/11/3761102779889269403/?ctp=8#c3761103203680789085
And this ↑↑↑ post is also spot on: either OP has unrealistic expectations, or misunderstands how things have worked for the last 30 years of PC hardware releases and marketing. Truth be told, when a product such as a GPU gets marketed alongside a game, it's not wise to ignore details like the game's settings, its version, and its current state before future changes, because those are exactly the details such marketing is built around. Fortnite, for example, ALWAYS keeps getting updates and added content, and it's a given that it won't stay the same.
I agree, and this is why I am not linking a Fortnite video: the content, possible optimizations, and API and engine changes are very dynamic. But these results reflect my view from the tests I saw with Unreal 5.1.
As I said, enjoy what you want. I'm not trying to tell you you're wrong for having enthusiasm for whatever you enjoy.
My contention was more with how you called it hypocritical to enjoy one thing (basically, enjoying games without ray tracing) while not being a raving fan of another (DLSS 3 and the like). That rubbed me the wrong way, because there's nothing hypocritical about it. Many of us are just fine appreciating what we can get. Heck, I'm still in awe at Minecraft, of all things, with shaders (not RTX, but shaders). Bolting a shader pipeline onto an already inefficiently performing game that lacks one hurts performance further, and you get those "fake results" as you say, but I'm still in awe at the results sometimes.
Now if you're in a position like OP's, where "I want to play at 4K, get ray tracing, and maintain hundreds of FPS in everything", then yeah, maybe one needs to consider DLSS 3, as performance growth hasn't kept pace with demands. I would agree there, as I've made that same point to OP earlier in the thread.
Things aren't always linear or driven by just one factor, though I'm sure you know this.
If ONLY pixel fill rate (I think that's the right term?) is the limitation, and that ALSO remains the limitation all the way up to quadrupling the pixel count, then sure, you might need four times the performance to quadruple the pixel count and keep the same frame rate. But that's not always the case. Oftentimes you might need less than four times the processing speed to drive 4K over 1080p (and maybe other times, more?); the toy model below sketches why.
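Here is a minimal, hypothetical model of that idea. The assumption (mine, not anything measured) is that only a fraction p of each frame's cost scales with pixel count, while the rest (CPU work, geometry, fixed overhead) stays flat:

```python
# 4K has exactly 4x the pixels of 1080p.
pixels_1080p = 1920 * 1080        # 2,073,600
pixels_4k    = 3840 * 2160        # 8,294,400
ratio = pixels_4k / pixels_1080p  # 4.0

# Toy model: only a fraction p of the frame cost is resolution-bound.
for p in (1.0, 0.8, 0.6):
    cost = (1 - p) + p * ratio    # relative frame-cost multiplier at 4K
    print(f"p = {p:.1f}: a 4K frame costs ~{cost:.1f}x a 1080p frame")

# Output:
# p = 1.0: a 4K frame costs ~4.0x a 1080p frame
# p = 0.8: a 4K frame costs ~3.4x a 1080p frame
# p = 0.6: a 4K frame costs ~2.8x a 1080p frame
```

Under a pure fill-rate limit (p = 1.0) you need the full 4x, but with a mixed bottleneck the multiplier lands below 4x, which matches the intuition that you don't always need four times the GPU to quadruple the pixels.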
And the inverse can be true. Sometimes you need more than you might think if you only look at how one factor changes. In Minecraft, people often assume that doubling the render distance doubles the demands, but it can increase them by more than double, because you're actually asking roughly four times as many chunks to be handled (the real number might be more or even less due to things like culling, but still); see the sketch below.
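A minimal sketch of that chunk arithmetic, under the simplifying assumption that the loaded area is a flat square of chunks centred on the player (real chunk loading, culling, and simulation distance complicate this):

```python
# Render distance r loads a (2r + 1) x (2r + 1) square of chunks
# around the player in this simplified model.
def loaded_chunks(render_distance: int) -> int:
    side = 2 * render_distance + 1
    return side * side

for r in (8, 16, 32):
    print(f"render distance {r:>2}: {loaded_chunks(r):>4} chunks")

# Output:
# render distance  8:  289 chunks
# render distance 16: 1089 chunks
# render distance 32: 4225 chunks
```

Doubling from 8 to 16 takes you from 289 to 1089 chunks, nearly a 4x jump in work for a 2x change in the setting.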
But it's also partly on gamers themselves. Because some games are out there ready to do 4K, ray tracing, and whatnot, some gamers have a mindset of "it's my civil right to play this game on max settings, but I don't have the hardware to do it, WTF! Heads will roll!" GPU prices are outrageous, but mid-tier cards should still be able to run games looking halfway decent. If it takes ray tracing at 4K to visually please you, you should look inward to see if there's an issue there while you're at it.
Edit: Dang, I just watched some of that 4K Fortnite video and it does look impressive. That much detail would need a strong system; I can't be mad at that requiring a 4090. It looks like every single blade of grass and every leaf has its own dynamically rendered shadow. I don't think that's really happening, but the fact that it appears that way at first glance is cool. The only thing that comes to mind that looks like that is The Witcher 3. Man, those colors are vibrant too.
These games are going to do to me what 4K video did to DVD and regular Blu-ray. I can't appreciate them like I used to.
If so, it doesn't seem to be working.
What would be the point of buying a 4090 to game on a 1080p monitor, though? If you're satisfied with 1080p, that's fine, but I guarantee that if you gamed on a 4K monitor for six months and then switched back to 1080p, you'd notice the difference right away and probably wouldn't like it.
This is the point of a 4090 and 1080p :)
Now, if you're after a crazy refresh rate like 240 Hz or higher, you'd be looking at 1080p or 1440p, since 4K is barely an option at refresh rates that high, at least for now. A few such monitors exist, but their prices are way above your average 4K monitor: a 240 Hz 1440p panel costs roughly a third of a 240 Hz 4K one, and a 240 Hz 1080p panel roughly a fifth, so prices scale pretty steeply.
I prefer the better visuals, and if I can get higher frame rates I consider that a bonus. My first 4K monitor is four years old, and when it went down I used a 1440p monitor temporarily; the difference in visual quality was quite noticeable to me.