What res are you trying to play?
One big difference is that, despite being newer, you do have one of the budget-tier cards from that generation. Both the 060 and 070 were heavily criticised for having criminally low VRAM for a next-gen card at the time.
I think the 070 only has 12 GB, and the 060 is even worse with 8, which is just really poor on Nvidia's part.
For comparison, the 3090 has 24 GB.
Modern games and higher resolutions are really dependent on VRAM, so I'd always go for the largest option you can each gen, as it will serve you better.
Not sure if this is the reason without knowing more about the rest of your system, but maybe try playing at 1440p (or 1080p) first and see if that makes a difference. If it does, the issue is that you have too little VRAM. If it doesn't, it's somewhere else: storage, RAM, or even your CPU can all have an impact.
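To see why dropping the resolution is a useful test, here's some back-of-the-envelope arithmetic. This is a toy sketch, not a measurement of any game (real VRAM use is dominated by textures and G-buffers, and the buffer count and formats here are assumptions), but it shows how raw pixel count scales the baseline cost:

```python
# Rough framebuffer math: why resolution drives VRAM use.
# bytes_per_pixel=4 (RGBA8) and buffers=3 are illustrative assumptions.
def framebuffer_mb(width, height, bytes_per_pixel=4, buffers=3):
    """Approximate VRAM for a set of basic color render targets, in MB."""
    return width * height * bytes_per_pixel * buffers / 1024**2

for name, (w, h) in {"1080p": (1920, 1080),
                     "1440p": (2560, 1440),
                     "4K": (3840, 2160)}.items():
    print(f"{name}: ~{framebuffer_mb(w, h):.0f} MB for basic color buffers")
```

4K has exactly four times the pixels of 1080p, so every per-pixel buffer (and most per-pixel GPU work) costs roughly 4x as much, which is why the same card can be fine at 1080p and choke at 4K.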
You provided absolutely zero info on anything. There could be a ton of different causes for this issue. Like your CPU. Yes, guess what: your GPU isn't the only thing that matters when you play a game; you need good parts across the board for the best performance. If you have an outdated CPU, you'll run into bottlenecking, which in turn causes lag. If your drive is too slow, your game will stutter. If you turn on ray tracing at 4K, your game might stutter. If you didn't update your drivers, your game might stutter.
And so on and so forth. So if you're actually looking for help, provide actual info, or nobody will help you. I never understood why people make a thread saying "game lags" and then give no info, as if people are supposed to guess what the issue is, or ask a crystal ball.
False. The GPU IS ACTUALLY the most important thing. The CPU comes in a heavy second, then RAM third. Additionally, how much the CPU matters depends very much on the type of game you're playing: some very high-fidelity games don't require strong CPUs, and some games that aren't graphically impressive require monstrous CPU output. You can even reduce the bottleneck on your CPU depending on what resolution you're playing at, which is, again, heavily dependent on your GPU. Unless the OP is running something like a 4th-generation Intel i7 (while using an RTX 4070, mind you), I highly doubt the CPU is the root cause of their issues.

They aren't using DLSS or Frame Gen, but DLSS and Frame Generation are poor excuses for optimization that a lot of developers use as a crutch these days, so I don't blame the OP for avoiding those options to get a truer picture of the game's actual performance. One thing the OP still hasn't mentioned that would greatly affect performance is what resolution they're playing at. A 4070 playing this game at 4K at around 50 fps without frame generation sounds normal, for instance.
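The resolution point can be made concrete with a toy bottleneck model. All the numbers below are hypothetical, not benchmarks of any real CPU or GPU; the model only assumes that CPU time per frame is roughly resolution-independent while GPU time scales with pixel count:

```python
# Toy bottleneck model: frame rate is limited by whichever of CPU or GPU
# takes longer per frame. The 8 ms CPU cost and 3 ms-per-megapixel GPU
# cost are made-up illustrative numbers.
def fps(cpu_ms, gpu_ms_per_mpixel, width, height):
    mpixels = width * height / 1e6
    frame_ms = max(cpu_ms, gpu_ms_per_mpixel * mpixels)
    return 1000 / frame_ms

for name, (w, h) in {"1080p": (1920, 1080), "4K": (3840, 2160)}.items():
    print(name, round(fps(8, 3, w, h), 1))
```

With these made-up costs the rig is CPU-bound at 1080p (the GPU finishes its ~2.1 megapixels before the CPU's 8 ms are up) but GPU-bound at 4K, which is why a CPU bottleneck shows up at low resolutions and disappears at high ones.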
Hate to tell you, but everything is going to be using DLSS and AI in the future.
But for some reason playing in windowed fullscreen drops the frame rate down to like 50fps.
OP, keep in mind that all the shiny numbers Nvidia gives you are for DLSS and whatever fancy name they've got next. They even fudge TFLOPS now: they quote Tensor FLOPS but still shorten them to TFLOPS. Without all that mumbo-jumbo, 30-60 FPS for a modern game on ultra settings at high res is indeed a correct number.
Just booted up the demo, and with everything on Ultra and DLSS Quality + FG, I'm getting 100-120 fps @ 1440p and the game is buttery smooth. I also swapped out the DLSS and FG DLLs for the newest versions. i9-10850K + 4070.
DLSS and FG are not faster calculations. Those are fancy names for generating nonexistent data to fill in the holes between pixels of a low-res image (DLSS) and between frames produced at 30 FPS natively (FG). It isn't even true upscaling, since you simply can't reconstruct that data correctly at such low sampling rates (see the sampling theorem).
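The sampling-theorem point can be demonstrated in a few lines. This is a 1-D sketch rather than anything about images specifically: a signal above the Nyquist limit produces exactly the same samples as a much lower-frequency one, so no upscaler can tell which was really there; it can only guess.

```python
import math

def sample(freq_hz, rate_hz, n):
    """Sample a sine wave of freq_hz at rate_hz, n samples."""
    return [math.sin(2 * math.pi * freq_hz * i / rate_hz) for i in range(n)]

# A 9 Hz sine sampled at only 10 Hz aliases: its samples are identical
# to those of a 1 Hz sine (flipped in sign), so the original detail is
# unrecoverable from the samples alone.
hi = sample(9, 10, 8)
alias = [-s for s in sample(1, 10, 8)]
print(all(abs(a - b) < 1e-9 for a, b in zip(hi, alias)))  # prints True
```

The same logic applies per spatial frequency in an image: detail finer than the render resolution is gone, and any "restored" detail is synthesized, not recovered.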
Your words are pure advertising, implying that everything non-DLSS/FG is from the dinosaur age while DLSS and FG are the only things even worth looking at. It's nasty.
You're probably not very familiar with the technology, because Nvidia has hit the limits of 3 nm lithography (electrons tunnel across features that small, hence the "calculator" stops working reliably). But they still need to sell those new GPUs, so they pad the numbers with "AI", and it's not the AI from Philip K. Dick or Isaac Asimov books. It's algorithms known since the 1970s, with a lot of inherent problems that are becoming apparent now; you'd know about those unless you're living under a rock.