Tom Clancy's Ghost Recon® Wildlands

BigTough Feb 4, 2017 @ 1:19pm
Very badly optimized, film grain horrible!!!
It's very difficult to pull a decent frame rate out of this game, and it uses up an absurd amount of VRAM.

The film grain effect is SO bad that I can't even play the game... yes, it's that bad. They may as well have just given us a screen of white snow.

There are quite a few other things that annoy me, but these are what really stand out. I really want to like this game, but if nothing is done about them by release, I'm not buying it.

How can a game like For Honor look and run amazingly well while this game looks and runs like complete garbage? I don't get it.
Beans Feb 4, 2017 @ 1:23pm 
Specs?
MP-ONE Feb 4, 2017 @ 3:07pm 
The beta ran very laggy for me. I had to turn everything down to medium/low and switch every effect off.

I have a GTX 970, 8GB RAM, i7 3770 @ 3.4GHz, Win10 64-bit.

Ubisoft had better optimize this game big time before they launch the full version.
Mockingcrow Feb 4, 2017 @ 4:36pm 
Originally posted by MPLAY7:
The beta ran very laggy for me. I had to turn everything down to medium/low and switch every effect off.

I have a GTX 970, 8GB RAM, i7 3770 @ 3.4GHz, Win10 64-bit.

Ubisoft had better optimize this game big time before they launch the full version.

Mid-range
GeForce GTX 950M
GeForce GTX 950
GeForce GTX 960M
GeForce GTX 960
GeForce GTX 965M
High-end
GeForce GTX 970M
GeForce GTX 970
GeForce GTX 980M
GeForce GTX 980
Enthusiast
GeForce GTX 980 "Notebook"
GeForce GTX 980 Ti
GeForce GTX Titan X

The GTX 970 was the bottom of the "high-end" tier and sold for nearly $200 less than the 980 for a reason. It's been about 2 1/2 years since release, and technology keeps evolving. My 980 could handle ultra at 4K in just about any game for the first year I owned it (except GTA 5, ahhh, bad optimization, rabble rabble (not really)); a 970 could surely handle some games at ultra 4K, but otherwise "very high." Fast forward two years, with newer graphics options, DirectX versions, etc., and your card is going to be sustaining medium or high settings, while mine is sustaining a custom Very High/Ultra mix at 2K resolution.

Wasn't there also an issue with the 970 and its memory? It's effectively 3.5GB instead of the full 4GB like the 980?

That said, see my specs below... I'm running a custom very high/ultra mix at 2K resolution with very few frame rate dips. I'll check on my next playthrough though, so don't quote me :P

Honestly, tone the shadows down one notch from your baseline. There's a GPU memory estimate built into the options; people will argue you can go over it, but the moment I did is when I started seeing frame rate issues, so now I stay about 400MB under. My GTA 5 memory usage is around 95% and I have no frame rate issues, so I'm sure I could turn up one or two options in Wildlands when the game comes out and I want to min/max it.

I don't want to say it's not the developers' fault (especially given Ubisoft's track record), but like I said, the 970 was the lowest tier of "high-end" two years ago, and I think it had that memory shortfall. Games are beefing up their memory usage, and spilling over into system memory is not something you want to rely on when your GPU's VRAM runs out. My advice would be to always buy the best "high-end" GPU for the extra $200, so you can go an extra two years without needing an upgrade (barring any major GPU advancements).

I'll verify my frame rate claim in a bit before I say it's a certainty.

My specs: GTX 980 4GB, AMD FX 9590 8-core, 16GB RAM, Windows 10.
Motor Feb 4, 2017 @ 4:43pm 
Try turning vsync off, setting the frame limit to 60 and the minimum framerate to 50; that's helped a bit for me.
Last edited by Motor; Feb 4, 2017 @ 4:43pm
uR fAN <3 Feb 25, 2017 @ 9:52pm 
Originally posted by Mockingcrow:
Originally posted by MPLAY7:
The beta ran very laggy for me. [...]

The GTX 970 was the bottom of the "high-end" tier and sold for nearly $200 less than the 980 for a reason. [...] Fast forward two years, with newer graphics options, DirectX versions, etc., and your card is going to be sustaining medium or high settings, while mine is sustaining a custom Very High/Ultra mix at 2K resolution.

Wasn't there also an issue with the 970 and its memory? It's effectively 3.5GB instead of the full 4GB like the 980? [...]

My advice would be to always buy the best "high-end" GPU for the extra $200, so you can go an extra two years without needing an upgrade (barring any major GPU advancements).

My specs: GTX 980 4GB, AMD FX 9590 8-core, 16GB RAM, Windows 10.

Sorry, but that's some high-level nonsense here.
Sure, the GTX 970 was the bottom of the high-end cards, but sorry to say, your GTX 980 wasn't the top end either.
Actually, many custom GTX 970s are only about 1 or 2 percent behind a stock GTX 980 if they're factory overclocked.
So the framerates of the 980 aren't really that much better.
And even if we count that 3.5GB VRAM thing, which isn't really an issue for me because I never had any noticeable performance problems from it, you practically have about 500MB more VRAM to use, not really a difference.

The more important thing is how VRAM-demanding this game is. High VRAM usage is a problem that came along with the new consoles and their shared RAM.
Sure, the draw distance is high, etc., but that still doesn't justify such high VRAM usage.
It seems like it's just a bad console port.

The framerates are incredibly erratic and swing from 30 to 60 and back, even on rigs with a GTX 1060 or 1070.
It runs like crap. Sorry, yeah, the GTX 970 is about 2.5 years old, sure, but it still runs almost every current game fine on Ultra or Very High, and if the 970 struggles, the standard 980 struggles right alongside it, by the way, so $200 extra for a standard 980 is just a waste of money; at least buy a Ti at that point.
Ghost Recon Wildlands is one of the few exceptions, and I wouldn't complain if it ran fine on high or very high, but medium/low? Really?
These two years don't justify that, especially not when there are a whole lot of other games that run great on this card. After all, they recommend a GTX 970 alongside a 1060, and the game runs terribly on the recommended cards? Really... we are definitely talking about optimization issues here... in the open beta... 11 days before release...
R.I.P. Ghost Recon Wildlands.
ThunderMonkey Feb 25, 2017 @ 9:54pm 
Yeah, in the closed beta it wasn't all that bad... but I noticed this time in the open beta (on PS4) there's a gritty, snowy white effect on a lot of things in the dark, which I hadn't seen in a game since probably the old Fallout: New Vegas days or so... and back then it was typically fixed with a graphics tweak or some such. Thing is, this is the PS4, with uniform specs across systems and not really any graphics settings to speak of. :steamfacepalm:

(My gaming screen is a 46" Samsung 3D 1080p HDTV with high-end HDMI cables... I even tried going through all the TV settings in case it was something on my end, and swapped the HDMI cable... Ghost Recon: Wildlands is the only game doing it.)
Last edited by ThunderMonkey; Feb 25, 2017 @ 9:56pm
Danne Feb 25, 2017 @ 10:01pm 
Originally posted by MPLAY7:
The beta ran very laggy for me. I had to turn everything down to medium/low and switch every effect off.

I have a GTX 970, 8GB RAM, i7 3770 @ 3.4GHz, Win10 64-bit.

Ubisoft had better optimize this game big time before they launch the full version.

You should check your computer, something is not right. I've got similar specs to you, even an older CPU: an i7 2600 with a GTX 970, and I'm getting 60-62 fps on average on the high preset. This was checked with the built-in benchmark test, though.

//Ohforf
Drothen Feb 26, 2017 @ 1:21am 
Originally posted by Ohforf:
Originally posted by MPLAY7:
The beta ran very laggy for me. [...]

You should check your computer, something is not right. I've got similar specs to you, even an older CPU: an i7 2600 with a GTX 970, and I'm getting 60-62 fps on average on the high preset. This was checked with the built-in benchmark test, though.

//Ohforf
Stfu troll.
Wumpo-Botan Feb 26, 2017 @ 1:53am 
I had the same problem... it was unplayable until I set the resolution to the highest possible.
Now it runs acceptably.

Originally posted by BigTough:
It's very difficult to pull a decent frame rate out of this game, and it uses up an absurd amount of VRAM.

The film grain effect is SO bad that I can't even play the game... yes, it's that bad. They may as well have just given us a screen of white snow.
Shintai Feb 26, 2017 @ 3:01am 
GTX 1080, ultra settings, and everything ran great for me in the beta :)
Danne Feb 26, 2017 @ 11:49am 
Originally posted by TRON:
Originally posted by Ohforf:
You should check your computer, something is not right. [...]
Stfu troll.

Are you serious? I can post the benchmark if you believe I'm trolling. Just because you want to make the game look like it has bigger problems than it does doesn't mean you need to lie.
mac_angel Feb 15, 2018 @ 6:53pm 
Not sure if you found the answer, but I have it for you if you still need it.
It's the Sharpening effect under Video (not Graphics). Scroll down and you'll see it. By default it's set to 30%, I believe. Slide it to 0 and it should fix the grain effect (there's also a rough config-file sketch at the end of this post if you'd rather change it outside the game).
I came across this post because I had the same problem. I'm running 1080 Tis in SLI and three 1080p screens, so slightly less than 4K total. I ran it on Ultra and it actually looked fine (not much grain that I can remember), but the fps sucked. As soon as I started lowering settings, going to Very High, I got the horrible grain effect. I found the answer on a Reddit page.
I hope this helps

PS: Not sure what's with the people on here arguing about what they see or don't see. It's clearly a common problem, since Google auto-completed it for me when I searched. If something works for you and you don't have anything helpful to say, don't troll and start a fight with a stranger on the internet.
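
For anyone who'd rather flip the same setting without launching the game, here's a minimal sketch that edits the settings file directly. To be clear, the GamerProfile.xml location and the exact attribute name are assumptions on my part (the script just zeroes out any attribute with "sharp" in its name), so check your own file first; it keeps a backup before writing anything.

import xml.etree.ElementTree as ET
from pathlib import Path

# Assumed location of the Wildlands settings file -- adjust if yours lives elsewhere.
profile = Path.home() / "Documents" / "My Games" / "Ghost Recon Wildlands" / "GamerProfile.xml"

# Keep a backup copy before touching anything.
backup = profile.with_name(profile.name + ".bak")
backup.write_bytes(profile.read_bytes())

tree = ET.parse(profile)
root = tree.getroot()

# Zero out any attribute whose name mentions sharpening (the exact key is an assumption).
changed = 0
for elem in root.iter():
    for attr in list(elem.attrib):
        if "sharp" in attr.lower():
            elem.set(attr, "0")
            changed += 1

tree.write(profile, encoding="utf-8", xml_declaration=True)
print(f"Updated {changed} attribute(s) in {profile}")

Changing it through the in-game Video menu does exactly the same thing; this is just a convenience if you want to script your settings.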
White Lion Sep 29, 2023 @ 2:00am 
Originally posted by mac_angel:
It's the Sharpening effect under Video (not Graphics). Scroll down and you'll see it. By default it's set to 30%, I believe. Slide it to 0 and it should fix the grain effect. [...]
Thanks a lot. This one setting was causing so much visual irritation.

I have an RTX 4080 16GB and set everything to ultra quality, but it still gave me the "film grain" effect. Once I set "Sharpening" to 0%, it was all fixed.