A lot of more recent games are hitting CPU bottlenecks. I had been assuming "Ghost Recon: Breakpoint" was a VRAM limitation: my old GTX 970 could run at the Medium graphics preset on 3.5 GB of VRAM while on foot, but it was completely unplayable at Medium when driving.
The Low preset is extremely bland and unpleasant to look at; Medium is the lowest I would recommend.
An 8-core Ryzen 7 with an RX 6600 was also an unpleasant experience because of the low frame rates while driving. Instead of outright pauses, I got about 40-50 FPS while driving at the default preset for this configuration, which is High. I've been meaning to try it on the Medium preset to see if I can achieve a locked 60 FPS.
I'm looking at optimizing RAM, VRAM, PCI-Express bus bandwidth, and SSD/NVMe drive performance, not so much GPU rasterization, which is typically what causes GPUs to overheat.
Basically, I still haven't gotten the game to run well on any of my computers.
I have a Ryzen 5 4000 series. So will it be unplayable for me?
Apparently, the term has come into vogue as a way of distinguishing "smear frames" from "source frames". "Frame-gen" technologies such as DLSS 3, or "motion interpolation" / "motion smoothing", can fake a higher frame rate in various situations. However, the lower the source or "rasterization" frame rate, the more inconsistent and error-prone the result.
You may have 30 FPS of "rasterization" and see up to 60 FPS on screen. If your rasterization frame rate fluctuates between 30 and 40 FPS, and frame generation doesn't engage consistently, your actual frame rate can swing anywhere from 30 FPS (no generated frames) up to 80 FPS (a doubled 40), which makes the fluctuation far more striking.
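As a rough back-of-the-envelope illustration, here is a minimal sketch of that widening swing, assuming an idealized 2x frame-gen multiplier (real DLSS 3 / interpolation behavior varies and the multiplier is my simplification):

```python
# Sketch of how a 2x frame-generation multiplier widens frame-rate swings.
# Assumes an idealized doubler; real frame-gen behavior is less consistent.

def displayed_fps(source_fps: float, frame_gen_active: bool, multiplier: float = 2.0) -> float:
    """Displayed frame rate for a given source (rasterization) frame rate."""
    return source_fps * multiplier if frame_gen_active else source_fps

# Source frame rate fluctuating between 30 and 40 FPS:
for source in (30, 40):
    for active in (False, True):
        print(f"source={source} FPS, frame-gen={'on' if active else 'off'} "
              f"-> displayed={displayed_fps(source, active):.0f} FPS")
# A 10 FPS swing at the source can show up as anything from 30 to 80 FPS on screen.
```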
I'm saying that most GPUs won't have a problem if you turn the graphics preset to Low. "Low" generally doesn't overheat GPUs.
A Ryzen 5 4000 series refers to a CPU, not a GPU.
The CPU should be fine on its own. However, if you are referring to the integrated GPU (iGPU/APU), then the iGPU will probably be too weak to play Ghost Recon: Breakpoint.
Even the most recent Ryzen 9 iGPU/APUs can barely touch a GeForce GTX 960, and are presumably just as far behind a Radeon R9 280X.
Ghost Recon: Breakpoint might indeed just melt a Ryzen 5 4000 series iGPU/APU.
I mean compared to your Ryzen 7 with the RX 6600, which had bad frame rates while driving. That's a much better CPU than mine.
The AMD Ryzen 7 is a 5000 series CPU. For your run-of-the-mill quad-core-optimized titles, it shouldn't make a huge difference.
The Radeon RX 6600 is a 6000 series GPU. If you don't have a dedicated GPU, then the iGPU on your Ryzen 5 4000 series is probably closer to an Xbox 360, i.e. "7th generation" console territory. The RX 6600 is in 9th-generation territory, almost a PlayStation 5.
I just ran a few more benchmarks now. On the Medium preset I was able to achieve a locked 60 FPS. The AMD Adrenalin utility seems to have highlighted a CPU bottleneck. The GPU was humming at 50% to 60% utilization, while the process kept hitting 13% on the CPU, which is roughly 100% of a single core on an 8-core chip (100 / 8 ≈ 12.5%).
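As a quick sanity check of that math, here's a tiny sketch. It assumes the utility reports utilization averaged over the 8 physical cores rather than the 16 SMT threads, which is an assumption on my part:

```python
# Arithmetic check: what one saturated core looks like as total CPU utilization.
# Assumes utilization is averaged over 8 physical cores; if it were averaged over
# 16 SMT threads, one saturated thread would show as ~6.25% instead.

cores = 8                          # assumed core count for the Ryzen 7 in question
one_core_saturated = 100 / cores   # percent of total CPU for one fully loaded core
print(f"One fully loaded core out of {cores} ≈ {one_core_saturated:.1f}% total CPU usage")
# -> ~12.5%, which rounds to the ~13% the process was hitting
```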