Maybe I've found a bug, but take a look:
https://steamcommunity.com/sharedfiles/filedetails/?id=3006382177&fileuploadsuccess=1
If I try to activate this feature (at 1440p), the game shows me "DLSS is not supported!"
I mean ... DUDE ... really?!
... However, the person deleted his answer silently ... I guess I know why.^^
Nice Dude!
It's okay, it didn't come across as rough, and yes indeed, I think I've found a bug.
I used NVCleanstall to strip all the bloatware out of the driver package.
Then I exported a custom installer from it.
Next I used DDU in Safe Mode with Windows Update blocked, to keep the OS from installing any drivers on its own, and wiped the old drivers.
After that I did a clean installation of the package; this usually results in slightly better framerates and performance.
In this case, the "DLSS unsupported" message could be caused by this type of driver installation.
I've now updated the drivers to 536.67 and hope that will fix my problem.
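In case it helps anyone doing the same kind of clean reinstall, here's a minimal sketch to confirm which driver version actually ended up active afterwards. It assumes nvidia-smi is on the PATH (it ships with the standard driver package); this is just a convenience check, not part of any official procedure.

```python
# Minimal sketch: confirm the active NVIDIA driver version after a clean reinstall.
# Assumes nvidia-smi is on the PATH (installed with the standard driver package).
import subprocess

def get_driver_version() -> str:
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return out.stdout.strip()

if __name__ == "__main__":
    print("Active driver version:", get_driver_version())  # e.g. "536.67"
```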
Thanks mate, we're good, no problems.
I'm trying everything now to get DLSS 2.0 running again, because I was getting 90+ fps on the Quality setting, if I remember it right.
I've done a fresh driver install now.
I verified the installation files with Steam.
Now I've wiped the game data and I'm reinstalling the game. 🤣
I had a different problem back then and a fresh installation of the game fixed it, so I hope it will work this time as well. There might have been a problem with the update, or something went wrong while the game was updating, who knows.
This is frustrating ... I don't like being screwed around like this. 😂
Bad 4000 series? What? The performance of the 4080 and 4090 is fantastic; the only thing that's high is the price.
DLSS 2 wasn't removed; I'm looking at it right now on my HTPC, which has a 3080.
The 40 series is another story, but that's not the issue here.
I wiped the drivers again and installed everything completely fresh, but I still can't get access to DLSS 2.0?!
This is insane... I'll deactivate the RTSS overlay and try again, but I doubt it will fix my issue.
I thought DLSS 2.0 had really been dropped in favour of DLSS 3.0 ... and it still feels like it was, purely because of this stupid bug I have to deal with.
I'll give it another try tomorrow ... calling it a day now.
Good night!
And DLSS works in other games as well, so I don't know what you're talking about.
There was recently a new update, DLSS 3.13; maybe you're using the wrong version.
To enable DLSS, all you do is change the anti-aliasing option in-game.
On my PC, DLSS Quality with max RT and Ultra+ seems to add 10-15 fps vs TAAU.
But even then it's not 60 fps, so I wonder what a 3090 would get.
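On the "wrong version" point: if you want to double-check which DLSS build your game is actually shipping, you can read the file version of the DLSS DLL in the game folder. A minimal sketch, assuming Windows with the pywin32 package installed, that the DLL is named nvngx_dlss.dll, and with a purely hypothetical path you'd need to adjust to your own install:

```python
# Minimal sketch: read the file version of the DLSS DLL shipped with a game,
# to check which DLSS build the game actually loads.
# Assumptions: Windows, pywin32 installed, DLL named nvngx_dlss.dll,
# and the path below is only an example placeholder.
import win32api

def dll_version(path: str) -> str:
    info = win32api.GetFileVersionInfo(path, "\\")
    ms, ls = info["FileVersionMS"], info["FileVersionLS"]
    return f"{ms >> 16}.{ms & 0xFFFF}.{ls >> 16}.{ls & 0xFFFF}"

if __name__ == "__main__":
    # Hypothetical path - point this at the game's own copy of the DLL.
    print(dll_version(r"C:\Games\ExampleGame\bin\x64\nvngx_dlss.dll"))
```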
No, I have a 12 GB 3080, and I've been one of the lucky ones that hasn't had 90% of the problems I've seen other people have, other than the crap performance when this first got updated. No crashes, no DLSS grey-out bugs. I feel bad for the people who have so many issues and try to help if I can. Way too much toxicity in these forums, so I try to help instead of insult.
RT obviously was off. I had Hairworks off. 2560x1440. And FSR was off because it looked WORSE and the FPS didn't improve. So those are roughly the settings. I'd get roughly 80FPS average in the first town, then I just clamped it down to 60FPS VSYNC'd and things stay pretty smooth.
RTX4070 (just got it) - I didn't change much in the game settings from the GTX1080. I tried maxing them out but ray-tracing still caused a few issues. I can't deal with that right now as the game is just supposed to be a stress reliever.
So I'm basically at a locked 60FPS VSYNC'd, ray-tracing OFF, and mostly everything else maxed out at 2560x1440 (DLSS Quality).
(My general advice is just aim for a smooth experience in terms of frame times and avoid ray-tracing for now no matter what system you have. But that's me. When I have an hour to jump into the game I don't want to have the immersion broken... also, I'm really happy with my RTX4070 although I didn't buy it to improve Witcher 3 as that already ran great for me.)
You aren't up to date - even the 4070 is tweaked by nVidia to report much lower power consumption via its internal HAL sensors!
Igor's Lab has shown what's going on, and if nVidia hadn't fudged the telemetry readings, "efficiency" wouldn't be a reason to buy these cards anymore.
The 4060 is "only" listed at ~115 W but already draws 135 W (+20 W = +17.4%). At the very low end that's already remarkable, and you can imagine what's going on further up the stack.
If you have a Thermal Grizzly power display plus a PCIe slot power tracker, you can calculate the true physical consumption, but all the HAL sensor values on the 4000 series are outright lies.
The Galax HOF 4090 with 1,000 W is the most bullish, flexing, "IDGAF"-style card ever, and where was the 3090 (Ti) with that kind of power consumption?^^
These cards can do a lot more than the 3000 series ... and in terms of frames per watt, what does that mean? 🤗 Don't ask me! 😅
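Frames per watt is easy enough to compare yourself, by the way. A tiny sketch with made-up placeholder numbers (the FPS and wattage values are illustrative, not measurements of any real card):

```python
# Tiny sketch: compare frames per watt between two cards.
# All numbers below are placeholders for illustration, not measured values.
def frames_per_watt(avg_fps: float, board_power_w: float) -> float:
    return avg_fps / board_power_w

cards = {
    "Card A (e.g. a 3000-series part)": (90.0, 350.0),   # avg FPS, watts (example values)
    "Card B (e.g. a 4000-series part)": (120.0, 285.0),  # avg FPS, watts (example values)
}

for name, (fps, watts) in cards.items():
    print(f"{name}: {frames_per_watt(fps, watts):.3f} FPS/W")
```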
Yep, it's the most consumer-unfriendly series ever launched! 😑 I feel very sorry that nVidia screwed so many people over. That doesn't mean the 4080 (Ti) or 4090 don't have their merits, and I'm not saying these cards are bad in performance, but the PROMOTION, plus how badly people were misled with false promises, is the worst thing about the 4000 series!
Not to mention the problem that "without" DLSS 3.0 + frame generation (fake framing / frame cheating), the performance increase is really poor, the smallest series-to-series uplift ever listed.
I hate nVidia ... and love some of their products ... I can't deny it sounds a bit crazy when you say it out loud, but to be clear: I hate the company, but love some of their technologies and products.
... But mate ... what was the topic?
- If the shoe doesn't fit ... don't put it on.
Just get a better card NUB
Jk.
Likely when they updated the game just recently, it got bugged for many people.
But ♥♥♥♥♥♥♥♥♥, so many tin-foil-hat people.
They're not going to remove a feature most people can somewhat afford in order to push something LITERALLY only 1-2% of people can afford; that isn't profitable in any way.
I can believe they're trying to push people to buy an Nvidia card because, well, that's how sponsorships work, but they're definitely not removing the features that actually make them profit lmao.
Instead of putting on the tin foil hat, how about you look at the likely reality: it's CDPR with the Witcher 3 next-gen port, and it's very likely just bugged. They've done a ♥♥♥♥ job with the port for the higher-tech stuff like this, and when implementing some of the mods they did a lazy job scanning for things that shouldn't be there. It isn't Nvidia shooting themselves in the dong to make less money.
1) POWER consumption:
TECHPOWERUP doesn't go by sensors. Just FYI. It tests the actual 12V current draw by the card to determine the actual power the card draws... an RTX3080 and RTX4070 basically trade blows in terms of average FPS depending on the resolution etc.
The RTX4070, however, draws about 60% of the power (and has slightly more VRAM, which also draws power, so the GPU chip itself is closer to 2x the efficiency):
https://www.techpowerup.com/review/nvidia-geforce-rtx-4070-founders-edition/39.html
I'm sure there's likely more to this story and perhaps NVidia is being deceptive but to me the Techpowerup methodology seems sufficient.
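For what it's worth, measuring at the 12 V inputs instead of trusting the on-board sensor just comes down to P = V × I summed over the rails. A trivial sketch with placeholder readings (not real measurements of any card):

```python
# Trivial sketch: board power from externally measured rail voltage and current,
# i.e. P = V * I summed over the 12 V inputs (PCIe slot + power connectors).
# The readings below are placeholders, not real measurements.
rails = [
    ("PCIe slot 12V",   12.05, 4.2),   # (name, volts, amps) - example values
    ("8-pin / 12VHPWR", 12.10, 12.8),  # example values
]

total_watts = sum(volts * amps for _, volts, amps in rails)
print(f"Measured board power: {total_watts:.1f} W")
```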
2) "The fact that a 4060 has 5x more (10x for a 4090) L2 cache than a 3090ti is a game changer."
Uh? No, it's not. The purpose of a cache is to offset insufficient bandwidth to the memory. So you're comparing apples and oranges between different designs. It reminds me of the original XBox One where they put in an eSRAM buffer to compensate for the DDR3 bandwidth limitation and then got STUTTERS when the data filled up the eSRAM.
I'm not saying the design makes no sense. Just that the purpose of the cache + memory bandwidth is simply to ensure that the GPU processor isn't left waiting for data. That's it.
Getting hung up on the technology is rather pointless at the end of the day. You run your benchmarks and the results are what they are.
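If you want to see why a big L2 is mainly there to compensate for a narrower memory bus, a back-of-the-envelope model is to treat the effective bandwidth the GPU sees as a hit-rate-weighted mix of cache and DRAM bandwidth. The numbers below are purely illustrative placeholders, not specs of any real card:

```python
# Back-of-the-envelope model: effective bandwidth as a hit-rate-weighted mix of
# L2 cache bandwidth and DRAM bandwidth. Purely illustrative numbers, not real specs.
def effective_bandwidth(hit_rate: float, l2_bw_gbs: float, dram_bw_gbs: float) -> float:
    return hit_rate * l2_bw_gbs + (1.0 - hit_rate) * dram_bw_gbs

# Hypothetical "small cache, wide bus" vs "big cache, narrow bus" designs:
design_a = effective_bandwidth(hit_rate=0.30, l2_bw_gbs=2000.0, dram_bw_gbs=900.0)
design_b = effective_bandwidth(hit_rate=0.60, l2_bw_gbs=2000.0, dram_bw_gbs=500.0)
print(f"Design A effective BW: {design_a:.0f} GB/s")
print(f"Design B effective BW: {design_b:.0f} GB/s")
```

The point being: two very different cache/bus combinations can land in the same ballpark, so the cache size on its own isn't a "game changer" - the benchmarks are.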
3) NVidia pricing:
Here's a big shock, but they aren't an altruistic company. They answer to shareholders. It would frankly be stupid AND illegal to charge less than they could. The ONLY legitimate complaint is if they lie. And NVidia does that, so I'm not giving them a pass. As does AMD, and Intel. I bought an RTX4070 and couldn't be happier. I'd love it to have been cheaper, but I only upgrade infrequently. It's been SEVEN YEARS since I bought my GTX1080.
Besides, how much of the pricing is actually the "fault" of NVidia? The answer isn't actually that simple, but the short answer is that supply and demand tend to dictate pricing. Like MANY things in a Capitalist society. (of course you could DEMAND they drop their prices. That's fine, right?)
4) "performance increase is really bad"
So what if it is? If you rename an RTX4070 to an RTX4060 but charge the same would that be better? The specs and benchmarks are there. Nobody's being forced to buy the card.
I will, again, repeat that LYING is really the only issue, such as implying that FRAME GENERATION is comparable to real FPS. On this I completely agree. But the idea that the next gen has to hit some arbitrary number, like 25% faster at the same "80" class, doesn't really make sense to me.
SUMMARY:
The VALUE of a product is determined by the consumer that purchases it. The specs are fun to talk about but benchmarks don't care about opinions.
Anyway, I shall go back to playing RDR2. I just started and it's fun. I hope nobody is too pissed off that I spilled some facts.
Yep, I went from a 3090 to a 4090 and the performance increase in some games is massive. That's before using frame generation, and if I capped frames to play at the same FPS as my 3090 (which is now in my HTPC), power usage would be a lot lower.
In RDR2, for example, I went from 70 FPS to nearly 140 FPS in the same scenario. That's pretty impressive, and in other games I've seen an FPS increase of at least 50% on the low end. IMO that's nothing to complain about considering the graphical complexity of games now.
As the screenshot shows, if I try to activate DLSS, the game responds with an error telling me that DLSS isn't supported, so I wasn't able to use it.
Surprisingly, with nothing changed on hardware or software, DLSS started to work fine again?!
WTF is going on ... it looks like there was never a problem, but I did nothing at all ... it's so confusing, I don't know what's going on?!
++Now I know!++
//Update on this!
So, on the 19th this week, the v4.04 patch was released, which I think had already fixed the "DLSS is unsupported!" issue. That's the only possible explanation I can see.
Now I can report that after the updates I get around 80-90 fps with RT maxed out, and in some rare scenarios it dropped to 48 fps at the lowest.
All other settings are maxed out as well.
The 4060 in particular is listed on the TechPowerUp homepage with a 115 W TDP.
Igor's Lab showed us that this card happily draws 135 W, and it isn't even overclocked. 🤷♂️
Igor used Cybernetics hardware to read out the true power consumption; maybe it's the test methodology that produces the different results. But mate, just for comparison, and given those wrong figures ... if TechPowerUp lists 115 W for a 4060 that is really drawing 125-135 W, can you take all the other values for the higher cards at face value?
I doubt that and I'm not pissed off. 😂😎
My own 3090 Ti will run at 490 W (by HAL sensor readings) if the card isn't optimized with clock settings and so on. After some tinkering in the first two weeks of using it, I got it running at 420 W peak while beating the average 3DMark results for equal system configurations by 2-8%.
My system is often faster than 95% of all tested systems, with undervolting + overclocking.
I pay 34 c/kWh in Germany ... I have an old contract from February 2022, active since January this year, with a 2-year price guarantee, but honestly I don't worry about it at all. Energy prices aren't my financial problem ... my PC as a hobby is my financial problem. 🤣
Let's say I play every weekend (which isn't really true, but it gives us something to calculate with) for 10 hours straight; then I play 40 hours a month.
GPU 420 W + CPU 80 W + SSDs 20 W + Samsung GQ55QN94A 165 W (HDR mode)
= 685 W total
685 W × 40 h = 27,400 Wh = 27.4 kWh
27.4 kWh × €0.35 ≈ €9.59/month of electricity for my system
I have a monthly instalment of €70 for electricity and €80/month for gas, and I get a decent refund every year, so increased energy costs don't give me any panic moments.
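The same back-of-the-envelope calculation as a tiny script, in case anyone wants to plug in their own wattages and price per kWh (the figures are just the example numbers from my calculation above):

```python
# Back-of-the-envelope monthly electricity cost for a gaming setup.
# Wattages, hours and price per kWh are just the example figures from the post.
components_w = {"GPU": 420, "CPU": 80, "SSDs": 20, "TV (HDR mode)": 165}
hours_per_month = 40
price_per_kwh_eur = 0.35

total_w = sum(components_w.values())              # 685 W
kwh = total_w * hours_per_month / 1000            # 27.4 kWh
cost = kwh * price_per_kwh_eur                    # ~9.59 EUR
print(f"{total_w} W x {hours_per_month} h = {kwh:.1f} kWh -> {cost:.2f} EUR/month")
```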
Yep, the only case where the card isn't that bad a deal is if you're upgrading from a 3070, that's true.
True facts, and some people are misled by the 3D V-Cache theme on AMD's side because it results in better 3D performance.^^
"OH! There's something we can get use of!"
On YT, "Classical Technology" named one of his videos "You Will RETURN THIS Nvidia GPU After Watching.." and what he was showing was very interesting about VRAM.
Many other Tech-Channel jumped onto this train and many customers are simply ignoring nVidia-GPU's today.
I mean, the best value for clarification of how bad Price-Per-Performance this time is, is the fact that this series is the most badest sold series in nVidia's Market-History. 😅
Yes nVidia, feel the Love that we got right now!
Yep ... we can hope for a "better-priced" 5000 series.^^
Personally, I won't be going back to the lower end.
I had a 1060 6 GB with an Intel i7-6700K and 16 GB RAM from 15.09.2016!
Over a span of six months I bought the hardware I wanted, but with the delivery delay on AMD's 7950X3D, I built my rig after seven months in total.
Today my system consists of:
Mainboard: ASUS ROG Strix X670E-E Gaming WiFi
BIOS Version: 1416
BUS: 100MHz “fixed”
FCLK: 2133MHz “fixed-OC”
MEM-CLK = MEM-CTRL-CLK: 1:1 “fixed”
Curve Optimizer: UV-adjusted (negative offsets)
Precision Boost Override: +75 MHz (5,825 MHz boost clock)
EXPO/XMP: Disabled / manual adjustments
RAM: G.Skill DDR5-6400 64 GB (4x16 GB)
MEM-CLK: 5,800 MHz
CL34-36-36-35 "fixed-OC"
1.400 V "fixed"
CPU: AMD Ryzen 9 7950X3D
CCD-0: 5,250 MHz
CCD-1: 5,825 MHz
GPU: GIGABYTE nVidia GeForce RTX 3090 Ti
OC-Plugin used
GPU-CLK: +125MHz (Curve-Optimizer +Offset)
GPU-MEM-CLK: +1150MHz
2D/3D Profile in use
Driver: v536.40
Storage:
Internal:
2x 2 TB Kingston Fury Renegade NVMe SSD
1x 1 TB Samsung HDD
1x 250 GB Intenso SATA SSD
External:
1x 8 TB Western Digital Elements (Black)
1x 3 TB NAS
OS: Windows 10 x64 Professional
Build: 19045.3208
Version: 22H2
System-Tasks:
Video-Editing
Picture-Editing
Gaming
Gaming-Resolution: 1440p 12bit 120Hz
Editing-Resolution: 4k UHD 12bit 120Hz
PSU: be quiet! Dark Power 12 1,500 W 80+ Titanium
Tower: Fractal Design Pop Air XL (RGB)
System fans:
6x Iceberg Thermal IceGALE Xtra 120 mm high performance (3,000 rpm, 214.63 m³/h)
1x Iceberg Thermal IceGALE Xtra 140 mm high performance (2,500 rpm, 285.61 m³/h)
CPU-Cooler:
Noctua NH-D15(S), customized with 3x Chromax Black 140x150 mm fans
Thermal paste:
Thermal Grizzly Kryonaut Extreme (though Arctic MX-6 performs just as well as this TG paste in my own tests)
https://steamcommunity.com/sharedfiles/filedetails/?id=3007336584&fileuploadsuccess=1
https://steamcommunity.com/sharedfiles/filedetails/?id=3007338341&fileuploadsuccess=1
For me it's roughly 8 times the performance of the old system.
I'm not a fan of the RGB scene, but hey, "indirect" RGB LED lighting is cool.
I rebuilt parts of my old system: the old tower (AeroCool DS Cube Black) fitted with the RGB fans from my new tower, my old be quiet! Dark Power Pro 550 W 80+ Titanium power supply, an ASUS TUF Gaming B550M-Plus mainboard with an AMD Ryzen 7 5800X3D, my old Noctua NH-D14 CPU cooler, a Corsair Vengeance LPX 32 GB DDR4-3600 kit, a 1 TB Kingston Fury Renegade NVMe SSD, another Windows 10 x64 Pro license I still owned, and an ASUS Dual RTX 3060 12 GB as the GPU. I sent this little bully to a friend and he's happy with it. Going from a GTX 1050 to a 3060 with that CPU was a huge upgrade, much like the one I made.
Part of the money came from a wrong delivery I found among the packages ... after unpacking everything I'd collected over a good half year, I couldn't figure out which shop had sent it, but that's how I ended up owning an Intel NUC 12, which I sold for ~€890.
Well ... there was a little investigation back then anyway because of other happenings, but my goal was to make someone happy. I was happier about this little project than about my own build.