Actually, after reading what you said, it probably isn't a monitor limit, so my bad: no need for VSR. But at that resolution you are running into CPU limits, so you need to raise the resolution to actually take advantage of the 6600 XT. Even at 1080p you will run into CPU limits, and 1280x960 is much lower than that. Upping the resolution puts the focus back on the GPU, so the GPU gets the brunt of the load instead of the CPU.
I posted this video yesterday showing CPU bottlenecks. Look at the CPU-intensive games: at 1080p the 3080 is not getting full GPU usage, while at 1440p and 4K it is. Not in all the games, but in the CPU-intensive ones. At 1080p even a 10700K bottlenecks a 3080, even an overclocked 10700K; at 1440p and 4K you don't get that problem. Just imagine how bad it is for his 10600KF and a 6600 XT at 1280x960. The bottleneck must be huge! And CS:GO is a CPU-intensive game. In fact, look at Fortnite and Flight Simulator: even at 1440p you are running into CPU bottlenecks!
https://www.youtube.com/watch?v=yH2nX4giIc4&t=1039s
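To make the CPU-bound vs GPU-bound point concrete, here's a toy calculation in Python. The per-frame costs are invented for illustration, not measured from any of these games:

```python
# Toy model (made-up costs, not benchmarks): each frame needs some CPU
# work that is nearly constant across resolutions, and some GPU work
# that scales with the number of pixels. Whichever takes longer sets
# the frame rate; the other component sits partly idle.
CPU_MS_PER_FRAME = 3.5        # hypothetical CPU cost per frame, in ms
GPU_MS_PER_MEGAPIXEL = 1.5    # hypothetical GPU cost per megapixel, in ms

resolutions = {
    "1280x960": (1280, 960),
    "1920x1080": (1920, 1080),
    "2560x1440": (2560, 1440),
    "3840x2160": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    gpu_ms = GPU_MS_PER_MEGAPIXEL * (w * h) / 1e6
    frame_ms = max(CPU_MS_PER_FRAME, gpu_ms)
    limiter = "CPU" if CPU_MS_PER_FRAME >= gpu_ms else "GPU"
    gpu_busy = gpu_ms / frame_ms   # fraction of each frame the GPU is working
    print(f"{name:>9}: ~{1000 / frame_ms:4.0f} fps, {limiter}-bound, GPU busy ~{gpu_busy:.0%}")
```

With these invented numbers, 1280x960 and 1080p come out CPU-bound with the GPU partly idle, while 1440p and 4K flip to GPU-bound, which is the same pattern the video shows.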
Thanks a lot, I am really surprised how many people wanted to help some random user.
emoticorpse I have PCIe 3.0; I know it costs about 5% in performance. Anyway, I played FACEIT 128-tick at a constant 300+ fps on average.
I was really disappointed with my fps in CS:GO, especially because I get a constant 150 fps in BF1 on ultra settings haha
Thanks one more time! I'll update the status if something changes.
How many fps are you getting in CSGO?
I'd suggest OP run DDU to clear all traces of previous nVidia and AMD drivers. If he simply uninstalled his previous nVidia driver instead of running DDU, there may be traces of the nVidia driver left behind to mess up the RX 6600 XT's performance. Always, and I mean always, run DDU when switching between cards (even if they're from the same company, and especially when switching from nVidia to AMD and vice versa).
Disable WiFi or disconnect your LAN cable, and make sure you have DDU and the latest AMD driver ready. Run DDU in 'Safe Mode', uninstall all previous GPU drivers, reboot, install the latest driver, and let's see where you land.
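If you don't want to hunt through menus to reach Safe Mode, one way is the standard Windows bcdedit/shutdown tools (a sketch assuming an elevated Command Prompt; double-check before running):

```
rem Flag the next boot as Safe Mode, then reboot (run elevated):
bcdedit /set {current} safeboot minimal
shutdown /r /t 0

rem After DDU has done its thing in Safe Mode, clear the flag
rem so the machine boots normally again:
bcdedit /deletevalue {current} safeboot
```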
I find it really 'funny' that many peeps who've tried switching from nVidia to AMD complain that performance took a dip (they simply uninstalled the nVidia driver and installed the AMD driver instead of running DDU first), and that they regret buying AMD. The simple fact of the matter is, while AMD drivers can be a tad finicky, they are relatively easy to install and great to use. I have both AMD and nVidia cards, and I find the AMD Adrenalin control panel easy to use, with plenty of options for tweaking.
Edit - The opposite is true as well: when you switch from an AMD card to nVidia and simply uninstall the AMD driver followed by an installation of the nVidia driver, the performance and/or stability of the nVidia card may be affected too.
Lol. I think you should go back and reread the posts. The OP figured out the problem. His GPU usage was way down because of the settings he was using; he changed his settings and some other stuff in the control panel and got his usage up. I do believe the low resolution he was playing at was causing problems. Simply put, his card wasn't getting much of the load, so the CPU was the bottleneck.
While I don't necessarily believe in doing clean installs of drivers when just updating, it is a good idea when you change GPUs. But I believe the OP did that, and he did solve his issue.
As far as AMD GPUs go, there is nothing wrong with having one. I know some people, and a lot of enthusiasts, just will not use an AMD GPU for any reason because of certain issues and software complications AMD has had in the past, but I feel AMD has really turned it around with the RX 6000 series. I don't know personally, but I have read a lot of complaints about the 5700 XT: it had decent initial frame rates and benchmark scores, but people complained about micro-stutter issues and degrading performance. It is stuff like this that has hurt AMD's reputation and created preconceived notions that AMD has to overcome.
I know early leaks have shown that their next generation, the RX 7000 series, will be absolute fire, and beastly, even in the ray tracing department. Hopefully they are not as expensive as everyone is predicting, but people are also predicting that the RTX 40 series will be super expensive as well. I probably won't be upgrading in the next generation, but most likely the one after. And I am definitely not loyal to Nvidia; there is a real possibility of me going AMD for my next GPU. Heck, maybe even Intel. We shall see.
I'd read an alleged leak claiming that the RX 7600 XT (a lower mid-tier GPU) would beat an RX 6900 XT. Man, if that were true, it looks like my RX 6900 XT would replace the VEGA64 in my 2nd rig, and an RX 7900 XT would find a new home in my main rig. I just hope it can do at least 3440x1440 and/or 3840x1080 maxed in-game + RT natively at 90fps or higher.
Wow. I currently have a 3070 Ti, but if the next gen is that powerful, and available, I might upgrade next gen instead of waiting. But we shall see.
But I also heard that they will be super expensive. I am talking the 7900 XT being in the $2000+ range. And if that is true, then the 40 series will be insanely expensive too. We are also going to have to see what Intel comes up with.
AMD is also releasing some refreshes; the 6950 XT is coming out with higher clocks.
Anyway, an RX 6600 XT should be about twice as fast as a GTX 1060. It's possible the card is stuck in a lower power state, which you can turn off with RadeonMod.
And enable Smart Access Memory from the BIOS if you can.
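If you want to verify the power-state theory, here's a small sketch; it reads the clock table the amdgpu driver exposes, so it only works on Linux (and the card index may differ). On Windows, watch the clock readout in Adrenalin's performance overlay instead.

```python
# List the core-clock (sclk) power levels known to the amdgpu driver;
# the level the card is currently running at is marked with a '*'.
from pathlib import Path

sclk = Path("/sys/class/drm/card0/device/pp_dpm_sclk")  # card index may vary
for line in sclk.read_text().splitlines():
    tag = "   <-- current level" if line.rstrip().endswith("*") else ""
    print(line + tag)
```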
If he's a bad player getting killed in the first twenty seconds of every round, then his experience will be the same at a low resolution anyway, won't it?
The same thing is done in competitive combat flight sims. The difference is that there it is (rightly) seen as borderline cheating, and thus highly frowned upon.
To put it as simply as possible: the more pixels presented to a player, the more discrete points of aim, and thus the more chances to miss. Also, more pixels means more detail, which sounds nice for things up close and big, but it also means something very far off can be less visible at a higher res.
In flight sims this matters for long-range spotting, where playing at a lower res makes a far-off plane become noticeable sooner.
In a shooter it means that when you aim at a head, there is less room for the game to decide you missed.
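As a back-of-envelope version of that argument (my assumptions: a distant head about 0.5 degrees wide, and CS:GO's ~106 degree horizontal FOV at 16:9), you can count how many pixel columns the target actually spans at each resolution:

```python
import math

def pixels_on_target(target_deg: float, hfov_deg: float, width_px: int) -> float:
    """On-screen width, in pixels, of a centered target of the given
    angular size, assuming a standard rectilinear projection."""
    return width_px * math.tan(math.radians(target_deg) / 2) \
                    / math.tan(math.radians(hfov_deg) / 2)

# Hypothetical far-off head, ~0.5 degrees across:
for width in (1280, 1920, 2560):
    print(f"{width:>4} px wide: head spans ~{pixels_on_target(0.5, 106.26, width):.1f} px")
```

With those assumed numbers, the same head spans roughly 4 pixel columns at 1280 wide versus 6+ at 1920: fewer, fatter pixels, each covering more of the world.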
In both cases it's a crutch that changes the game meta through manipulation of game settings. The difference is that only the FPS community sees it as OK; most other gaming communities see such behavior as silly at best or sleazy at worst.
At the end of the day, it just isn't to many people's liking. But hey, the pros make money at it, so there's that. It still doesn't make it any less objectively stupid. But sometimes stupid pays.
I had a few friends, including one who was low-level MLG back when that mattered for things like H2 and H3, who ran settings like that unless the competition specified otherwise.
It is not just about a higher or lower resolution; the idea comes from changing the aspect ratio, which makes player models look bigger, so it is easier to one-tap someone, but then spray control is harder. We can see players like Lucky, Magisk and others playing at 1920x1080.
The best solution is whatever makes the game comfortable for you.
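For reference, the stretch factor behind the "bigger model" effect is simple arithmetic: a 4:3 render (e.g. 1280x960) stretched across a 16:9 panel widens everything horizontally by (16/9)/(4/3):

```python
# Horizontal stretch when a 4:3 render is shown full-screen on a 16:9 panel.
stretch = (16 / 9) / (4 / 3)
print(f"stretch factor: {stretch:.2f} (~{stretch - 1:.0%} wider player models)")
# -> stretch factor: 1.33 (~33% wider player models)
```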