Are you using the hotfix driver?
Yeah, I'm sure all the other games you've tried so far that weren't made recently do work fine, for roughly the same reason I can get the original version of Witcher 3 to run on high settings on some GTX 700 series laptop. That doesn't mean a GTX 770 is going to run Cyberpunk, though. I know for a fact a huge portion of your library is older games, so all of those are completely irrelevant. All that counts is NEWER games. And of those titles, I guarantee that if you tried to run even a slightly older but still new game like Doom Eternal at 1440p ultra, your graphics card would start to struggle: not because the GPU die itself is too slow, but because your VRAM configuration was deliberately designed to be as corner-cuttingly low end as possible while still being unnoticeable as a problem until the next-gen launch. Which, again, is literally what nVidia has always done and will keep doing, stretching back decades to their 1GB cards that clearly weren't made with the future in mind.
If someone misled you into believing you are "futureproofing" by going with nVidia, then I feel sorry for you for being that gullible, because that person was either wrong or a shill. Current-gen RDNA3 raytracing was always going to be comparable to or better than last-gen cards like Ampere, so whatever advantage Ampere had in RT in the past is erased now. More importantly, Ampere's limited VRAM makes it perform worse than RX 6000 cards, because RT is VRAM intensive, so altogether it was always clear the 30 series was going to age badly. I and many others on this forum were involved in those exact tech discussions three years ago, when Ampere was announced and we found out about its VRAM as it neared release, and in tech circles we argued about this. Some fools called it "futureproofing with RT," which I always insisted was stupid simply because last-gen nVidia RT was likely to be slower than RDNAx anyway. What I personally didn't count on was the VRAM limitation being so crippling so fast that RDNA2 cards now outperform nVidia at raytracing.
The simple truth is that nVidia buyers who know what they're doing just replace their cards every 2 years and sell the old one to some sucker to make the newest purchase affordable, because nVidia can be slightly faster than AMD and at times much more expensive, but it usually ages really badly, and the 30 series is aging legendarily badly just like we all knew it would. It will age just like Kepler, to the point of total irrelevancy. People with 150-game libraries in 2016 probably could still run most of their games just fine on a GTX 780 at the time, but that wouldn't have stopped the newest 2016 games running like ♥♥♥♥♥♥♥♥ on the 780's paltry 3GB of VRAM with all settings turned up. This is why people often buy Radeon GPUs instead: Radeon tends to be much cheaper while delivering comparable performance, and it usually ages much better, which is why AMD or ATI/Radeon users typically held onto their cards for many more years without replacing them. AMD users fully expect to get 4-6 years out of their cards at very high settings minimum, depending on the tier.
I don't know how else to explain the facts to you, dude.
https://youtu.be/Rh7kFgHe21k?t=340
The midrange 30 series cards are absolutely pathetic at running this game, and that's all down to AMD having literally double the VRAM on comparable cards, like the 6800 vs the 3070 Ti. Which, btw, was an arse product anyway: at launch it already had a $100 higher MSRP than the 2070 Super, the card that pushed a lot of people to AMD in the first place because their $400 card was comparable to nVidia's $500 card, with the only real deciding factor being whether you wanted RT or not. And since you could enable RT on AMD too, the only reason to get a 3070 Ti was RT, because the 3070 Ti was slower than an RX 6800 overall but did raytracing somewhat faster at launch. So the choice was: do you want really good raytracing right now and a ticking time bomb in your computer later, or overall faster performance for $20 less and much better longevity thanks to double the VRAM.
Again, if you weren't part of those tech discussions I feel sorry for you, but those are the hardware realities. Your 3070 Ti is going to keep aging super badly, particularly at RT ultra and anything above 1080p. It's still a perfectly fine 1080p card; it just can't do high-end raytracing or any kind of non-stuttery performance going forward, thanks to having too little VRAM by design.
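And if you don't want to take my word for it, watch the VRAM counter yourself while the game is running. Here's a rough sketch, assuming you have Python and the nvidia-ml-py (pynvml) package installed; it just polls the card's raw memory counters and prints them every couple of seconds:

# rough sketch, assumes: pip install nvidia-ml-py
# run this in the background while the game is running and watch 'used' creep toward 'total'
import time
from pynvml import (nvmlInit, nvmlShutdown,
                    nvmlDeviceGetHandleByIndex, nvmlDeviceGetMemoryInfo)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)  # first GPU; change the index if you have more than one
try:
    while True:
        mem = nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM used: {mem.used / 2**30:.2f} GiB / {mem.total / 2**30:.2f} GiB")
        time.sleep(2)
except KeyboardInterrupt:
    pass
finally:
    nvmlShutdown()

If that number sits pinned at the top of an 8GB card while the game stutters, that's your VRAM wall, not the GPU die being slow.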
Seriously, you want to sit there and "flex" on people who have 250 games in their library when you have a dated 8GB potato in your PC? And that potato wasn't even high end when it launched XD
If you're going to make fun of people for being "poor," try not to do it with completely inferior hardware yourself. Also, between GOG, Epic, and Steam I've got over 700 games, kid. Why are you so poor you can only afford half as many games as me and only half the VRAM?))))))
I'm personally just here because I know the average Steam user is a complete idiot, that the average AAA-preordering mouthbreather is epically an idiot, and that this place is therefore going to be a hotbed of tech illiteracy and tech ignorance, and the Omnissiah demands that I spread technical truth. Part of that technical truth is the fact that 8GB was a standard for 1080p gaming introduced literally in the mid 2010s, so it was obvious nVidia cut way too close this time and was rolling out completely gimped potato GPUs all the way up to the high end, like the 10GB 3080.
Stupid kids did not know this, and probably bought really expensive cards that were going to age like curdled dog turds as a result. I am here because people are polluting Steam with their ignorance, and in turn not only sabotaging the working man's efforts but trying to force me to endure yet another ♥♥♥♥♥♥ E3 launch lie where everything from The Division to Vampyr to Watch Dogs ends up looking like a cartoon to make sure people's potato hardware can run it. I'm not going to back down and say yeah, let's nerf all visuals because nVidia potato owners can't handle high-end graphics settings thanks to their crippled GPUs. I have never insisted nVidia nerf Hairworks to run it on something else. I don't particularly care for it and I make fun of the performance hit, but I'm not going to demand Hairworks and RT get nerfed to work on worse hardware. Likewise, your potato nVidia GPU cannot handle full high-res texture packs, deal with it. This happened because you rejected the truth of the Omnissiah, and you are perishing in the stutters of your darkness and ignorance.
I can just see the tears rolling down your cheeks as you angrily punched this entire TLDR into your Logitech keyboard.
The crash is in the Nvidia kernel driver, it's a driver error.
That's how I know you're completely clueless; anyone with an ounce of knowledge would first try to identify the crash and the application that's crashing. It helps avoid embarrassing moments like this, where you out yourself as an idiot.
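For anyone who actually wants to do that instead of guessing: the faulting application and faulting module get logged in the Windows Application event log as Event ID 1000, and pulling the recent entries takes a few seconds. A rough sketch, assuming a Windows machine with Python on it (wevtutil ships with Windows); TDR-style driver resets usually land in the System log as Event ID 4101 from source 'Display' instead:

# rough sketch: print the five most recent Application Error (Event ID 1000) entries,
# which name the faulting application and the faulting module (e.g. an nvidia dll)
import subprocess

result = subprocess.run(
    ["wevtutil", "qe", "Application",
     "/q:*[System[(EventID=1000)]]",  # Application Error events only
     "/c:5",                          # last five entries
     "/rd:true",                      # newest first
     "/f:text"],                      # human-readable output
    capture_output=True, text=True, check=True,
)
print(result.stdout)

If the faulting module turns out to be the Nvidia driver, at least you're arguing about the right component.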
That's how I know you understand absolutely nothing about how PC hardware works
'Even on Reddit'
Is this a joke? You're a comedian right and this is a YouTube skit?
You're really not very smart.
Are you STILL going to be ignorant of the fact that my problem isn't a VRAM problem?