The Witcher 2: Assassins of Kings Enhanced Edition

Jubi Jan 22, 2014 @ 7:15am
Help Running this Game?
So I tried starting up the game. Opening cinematic? Pretty boss; it got me excited to delve in and play. Sadly, a wagon with square wheels would run smoother than this. Whether the settings are low or high, it's extremely choppy, and with the slight bit of in-game action I got - even just moving around - it's like playing a text-based RPG over dialup through a potato.

My rig: GeForce GT 610, AMD FX-8100 eight-core processor, 8 GB RAM, Windows 7 Home.

So, is there something I can do to make sure this game is playable? Some little hidden settings I can click? I have no problem playing a lot of other games, such as Crysis 2 (on high), Skyrim (on high), or Planetside 2 (no lag in big-ass battles, on medium too!).

I don't mind playing on medium, or even at, say, 30 FPS. I'd just enjoy being able to play the game, because right now my system is choking at every attempt.

Showing 1-15 of 33 comments
WolfEisberg Jan 22, 2014 @ 9:01am 
Your video card is way below the minimum requirements for this game. The best advice I can give that doesn't involve an upgrade is to set everything, including the resolution, to the absolute lowest. Even that may not work.
Volkayno Jan 22, 2014 @ 9:22am 
Yeah, the problem is your graphics card's price is even lower than the game's price xD
FANTA DEFENDER Jan 22, 2014 @ 9:42am 
Who sold you that piece of ♥♥♥♥?
Grimminski Jan 22, 2014 @ 11:11am 
Yeah. Absolute minimum settings @ 1024x768, or else you can't play it. You really need an upgrade if you want to play anything decently demanding: nothing less than a 660 if you want things to be playable.
Last edited by Grimminski; Jan 22, 2014 @ 11:11am
Jubi Jan 22, 2014 @ 12:21pm 
Alright, thanks for letting me know!
EvilKlown Jan 22, 2014 @ 12:37pm 
I play this game just fine on my GTX 560 Ti, medium-high settings. (1920x1200 resolution, too.)

Originally posted by eisberg:
Your video card is way below minimum requirements for this game. Best I can tell you that won't include an upgrade is set everything to the absolute lowest, and set the resolution to the absolute lowest. But that still may not work.

Lol... Just... no, dude. If OP's card were a 6 series (6***, like the 8800 is an 8 series) then yes, it would be below the minimum requirements. A GT 610 is in the GT/GTX 600 series, just like my GTX 560 Ti is in the 500 series.

Min. required GPU: GeForce 8800 (512 MB) or Radeon HD 3850 (512 MB)
Recommended GPU: GeForce 260 (1 GB) or Radeon HD 4850 (1 GB)

So OP's GT 610 should fall right into the recommended specs.

I posted the following in response to a laptop user having issues. It seems to have helped them greatly...


System = i7-2600K, 16 GB RAM, GeForce GTX 560 Ti (1 GB).

For those who don't know how/where to adjust settings: you can access these through the game's launcher, under Options. Tick the "Show advanced options" check box and you will have all of these options. (See the ReadMe.txt in "..\SteamApps\common\the witcher 2" for more information on what each setting does; the information is in section "10)".)

My in-game settings are as follows, in case they help anyone. (Keep in mind I am running on a desktop, not a laptop.)

Screen Resolution: 1920x1200 <-- Your choice
Fullscreen: Enabled <-- Your choice
Aspect Ratio: 16:10 <-- Your choice
Configuration Set: This will show Custom after you begin making changes.

Texture Downscaling: None <-- Not sure, but downscaling might help with lesser GPUs
Texture Memory Size (MB): Very Large <-- Adjust according to your graphics memory
Shadow Quality: Ultra <-- A laptop should probably have this a bit lower than Ultra

Number of Shadowed Lights: Low <-- Your choice, but if GeForce Experience suggests Low, you might stick with that.

LOD Distance: Far <-- May go lower on a laptop

Bloom: Enabled <-- Should be fine unless you're running on something really old..
Light Shafts: Enabled <-- Same as with Bloom

Anti-Aliasing: Enabled <-- This can help some, but can also have a negative impact on frame rates if your GPU isn't powerful enough. Use as you wish.

Blur Effect: Enabled <-- Lesser GPUs and laptops should try without blur. It may or may not help.

Depth of Field - Gameplay: Enabled <-- For me this is an "always"; of course, check whether it has any negative impact for you.

Vignette: Enabled <-- This just darkens the edges of the screen for a filmic look; it costs next to nothing. Up to you.

Wet Surfaces Rain Effect: Enabled <-- If your GPU seems to be angry when it rains, disable this. It should help a bit by getting rid of the wet surfaces.

SSAO: Disabled <-- If you aren't running something seriously good, SSAO will only hinder performance. Disable it unless you're running SLI or CrossFire with some heavy-duty GPUs.

Motion Blur: Enabled <-- Same as with Blur Effect

Cinematic Depth of Field: Disabled <-- GeForce Experience suggested this be disabled. I went with it; I don't think it matters much, though. Try at your own risk?

Depth of Field - Cutscenes: Enabled <-- Sure, but doesn't this make the previous option a little confusing? Do as you wish with this option; if you get horrible issues with it disabled, enable it, and vice versa.

Dangling Objects Limit: Disabled <-- Unless you're running something really weak, this should likely be disabled.

UberSampling: Disabled <-- As with SSAO, keep this disabled unless you have the power..

Vertical Sync: Enabled <-- This is generally optional, but it will keep your GPU from working overtime by locking the frame rate to your display's refresh rate. As a result, the GPU's temperature will be lower, too. This is a win for laptops.

Decals: High spec <-- I don't think this one even matters; I have seen no performance increase or decrease. Again, adjust as you wish.

Unless you have some monstrous card and are using SLI/CrossFire, avoid UberSampling. Mobile cards (###M) should always run a bit lower settings than a full-blown PC, as the mobile-series cards do not have the full ability of a desktop GPU. (They can kick some serious tail, but not as much as a full-blown desktop GPU.)

Now, as I said at the beginning of this, I am on a desktop, not a laptop. But this should help you get an idea of what to adjust so you don't have to play on such low settings.

(For more information on these settings and exactly what they do and have an impact on, check the ReadMe.txt in the game's installation directory. Look for section 10.)
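If you'd rather pull section 10 out of that ReadMe without scrolling through it, here is a quick sketch. The install path and the assumption that section headers literally start with "10)" and "11)" are guesses; adjust both for your system.

```python
from pathlib import Path

def extract_section(text, start="10)", end="11)"):
    """Return the lines from the 'start' header up to (not including) the 'end' header."""
    out, active = [], False
    for line in text.splitlines():
        if line.startswith(end):
            break
        if line.startswith(start):
            active = True
        if active:
            out.append(line)
    return "\n".join(out)

# Path assumes a default Steam install; change it to match yours.
readme = Path(r"C:\Program Files (x86)\Steam\SteamApps\common\the witcher 2\ReadMe.txt")
if readme.exists():
    print(extract_section(readme.read_text(errors="ignore")))
```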
Last edited by EvilKlown; Jan 22, 2014 @ 1:36pm
LeadStarDude Jan 22, 2014 @ 8:41pm 
Originally posted by EvilKlown:
Lol... Just... no, dude. If OP's card were a 6 series (6***, like the 8800 is an 8 series) then yes, it would be below the minimum requirements. A GT 610 is in the GT/GTX 600 series, just like my GTX 560 Ti is in the 500 series.

Min. required GPU: GeForce 8800 (512 MB) or Radeon HD 3850 (512 MB)
Recommended GPU: GeForce 260 (1 GB) or Radeon HD 4850 (1 GB)

So OP's GT 610 should fall right into the recommended specs.
GT cards are way weaker than GTX cards no matter which series they belong to. His GT 610 is way weaker than your GTX 560 Ti.

Here is some info for you that proves my point.

In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce GTX 550 Ti are massively better than the Nvidia GeForce GT 610.

The GTX 550 Ti has a 90 MHz higher core clock speed and 24 more Texture Mapping Units than the GeForce GT 610. This results in the GTX 550 Ti providing 22.3 GTexel/s better texturing performance. This still holds weight but shader performance is generally more relevant, particularly since both of these GPUs support at least DirectX 10.

The GTX 550 Ti has a 90 MHz higher core clock speed and 20 more Render Output Units than the GeForce GT 610. This results in the GTX 550 Ti providing 18.4 GPixel/s better pixeling performance. However, both GPUs support DirectX 9 or above, and pixeling performance is only really relevant when comparing older cards.

The GTX 550 Ti has 84 GB/sec greater memory bandwidth than the GeForce GT 610, which means that the memory performance of the GTX 550 Ti is much better than the GeForce GT 610.

The GeForce GT 610 has 48 Shader Processing Units and the GeForce GTX 550 Ti has 192. However, the actual shader performance of the GeForce GT 610 is 96 and the actual shader performance of the GTX 550 Ti is 384. The GTX 550 Ti having 288 better shader performance and an altogether better performance when taking into account other relevant data means that the GTX 550 Ti delivers a massively smoother and more efficient experience when processing graphical data than the GeForce GT 610.
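For what it's worth, those fill-rate deltas are just core clock times unit count. A quick check, using the commonly published specs (GT 610: 810 MHz, 8 TMUs, 4 ROPs; GTX 550 Ti: 900 MHz, 32 TMUs, 24 ROPs; treat these as assumptions, though they are consistent with the deltas quoted above):

```python
def fill_rate(clock_mhz, units):
    """Fill rate in GTexel/s or GPixel/s: core clock in GHz times unit count."""
    return clock_mhz / 1000 * units

# Texture fill rate gap: TMUs x clock
tex_gap = fill_rate(900, 32) - fill_rate(810, 8)   # 28.8 - 6.48
# Pixel fill rate gap: ROPs x clock
pix_gap = fill_rate(900, 24) - fill_rate(810, 4)   # 21.6 - 3.24

print(round(tex_gap, 1))  # 22.3 GTexel/s, matching the figure quoted above
print(round(pix_gap, 1))  # 18.4 GPixel/s
```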

No one should ever buy a GT card for gaming no matter what series it belongs to. It is a huge mistake. Always buy a GTX card. :)
Last edited by LeadStarDude; Jan 22, 2014 @ 10:52pm
Warlord Jan 22, 2014 @ 10:50pm 
The GT 610 isn't a gaming GPU and is at least two tiers in performance below Intel's integrated HD 4000 graphics.

The GT 610 has lower gaming performance than an Nvidia GeForce 6800 GT, which was released in 2004.

For even more lulz, its 3DMark score is 177. Real-deal current GPUs are in the 10,000+ range.
Last edited by Warlord; Jan 22, 2014 @ 10:52pm
LeadStarDude Jan 22, 2014 @ 10:51pm 
Originally posted by Warlord:
The GT 610 isn't a gaming GPU and is at least two tiers in performance below Intel's integrated HD 4000 graphics.

The GT 610 has lower gaming performance than an Nvidia GeForce 6800 GT, which was released in 2004.
+1
Yes... LOL! My post above yours proves how weak it really is. ;)
Moffin Bovin Jan 23, 2014 @ 12:30am 
Even on my HD 7870 I get 30 FPS on medium settings with V-Sync on; 50-60 with it off.

One thing to note about this game is that Geralt's out-of-combat running animations are extremely jerky and choppy when you change direction. It's like Geralt suddenly snaps to one of eight directions; it doesn't look smooth at all. His hair, I have noticed, has a habit of spazzing out as well, adding to the "choppy" look.
WolfEisberg Jan 23, 2014 @ 5:36am 
Originally posted by EvilKlown:
Lol... Just... no, dude. If OP's card were a 6 series (6***, like the 8800 is an 8 series) then yes, it would be below the minimum requirements. A GT 610 is in the GT/GTX 600 series, just like my GTX 560 Ti is in the 500 series.

Min. required GPU: GeForce 8800 (512 MB) or Radeon HD 3850 (512 MB)
Recommended GPU: GeForce 260 (1 GB) or Radeon HD 4850 (1 GB)

So OP's GT 610 should fall right into the recommended specs.

That is not how it works at all. The GeForce 8800 is vastly superior to the 610. Don't be confused by the series: the series tells you the chip architecture, but says nothing about performance. I'll show you the difference between the 8800 GS (the lowest of the 8800s) and the GT 610. This website compares the two cards, and it shows that the GT 610's performance is way below the 8800 GS:

http://gpuboss.com/gpus/GeForce-GT-610-vs-GeForce-8800-GS
WolfEisberg Jan 23, 2014 @ 6:51am 
When it comes to PC gaming you really need a little knowledge about hardware, especially graphics cards. A graphics card made today could very well be slower than a graphics card made six years ago. Know that in the graphics-card realm there is low end to high end in every series, and the numbers in a card's name don't really mean anything by themselves. A GeForce 560 performs far better than a GeForce 610, despite the GeForce 610 having a higher number in its name and being a newer card than the GeForce 560.

So don't let the names of the cards, or the years they were made, confuse you.
EvilKlown Jan 23, 2014 @ 12:46pm 
Look, you guys know a few things here and there. But the GT 610 does fall within the recommended specs (and yes, I know, I mistakenly said 600 series; it is in fact part of the G/GT series). People told me that my laptop just could not run Metro 2033, that the GT 420M was just too weak for it. Yet I play it on the laptop just fine?? (And not even on the absolute lowest settings, at that.) If OP adjusts the game's settings appropriately, the game should be runnable.

I am aware of the differences between the cards; I know that just because a card is a higher model and newer doesn't guarantee it is going to be better than the previous model. My other PC used to run two GTS 7600s, which are both clocked higher, and perform better, than the entire 9 series, rofl. The fact remains that if OP takes my advice and lowers the settings (as well as reading section 10 of the readme for an understanding of what the options do), the game will run better than it has been.

Y'all can sit here all day nitpicking my post because I mentioned I run a 560 Ti, but you are the ones comparing my card to OP's, which I never even did. I shared what I use so OP would know that my setup is different, which means some of the options I use will likely need to be changed. (Do note that in a few other posts, people with cards way less powerful than his have gotten better performance from the game by reading exactly what I posted and adjusting accordingly for their own systems.)

LeadStarDude... I know my card has a higher clock speed; why else would I have bought it? (Aside from not having a board that supported PCIe 3.0 at the time of purchase...) :p
Last edited by EvilKlown; Jan 23, 2014 @ 1:00pm
WolfEisberg Jan 23, 2014 @ 1:05pm 
Originally posted by EvilKlown:
Look, you guys know a few things here and there. But the GT 610 does fall within the recommended specs. People told me that my laptop just could not run Metro 2033, that the GT 420M was just too weak for it. Yet I play it on the laptop just fine?? (And not even on the absolute lowest settings, at that.) If OP adjusts the game's settings appropriately, the game should be runnable.

Also, try to avoid comparing a low-profile card to a full-profile card... see the comparison link you posted, Eisberg. A low-profile card has no choice but to get dumbed down; if it didn't, it would end up overheating, as low-profile cards are almost always fanless, just like in the comparison link. (I compared a low-profile 8800 GT to my normal 8800 GT that I got from BFG (ah... those were the days), and the low-profile card was indeed a lot less powerful than my full-sized card.) Yes, there are cases where a low-profile card is just as powerful as a full-profile card, but you will need some cooling if you wish to keep it for long.

Feel free to prove your point by showing specs of a GeForce 610 matching at least the specs of any GeForce 8800. I highly doubt you will find one. The 610 is way below minimum spec; it doesn't come anywhere close to any GeForce 8800 video card ever created. It is severely weaker.
LeadStarDude Jan 23, 2014 @ 1:17pm 
Here is proof that an 8800 GT is way better than a GT 610. It also proves that the GT 610 is way below the minimum specs for this game. ;)

In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce 8800 GT 1GB are massively better than the Nvidia GeForce GT 610.


The GeForce GT 610 has a 210 MHz higher core clock speed than the 8800 GT, but the 8800 GT has 48 more Texture Mapping Units than the GeForce GT 610. As a result, the 8800 GT exhibits a 27.1 GTexel/s better Texture Fill Rate than the GeForce GT 610. This still holds weight but shader performance is generally more relevant, particularly since both of these GPUs support at least DirectX 10.

The GeForce GT 610 has a 210 MHz higher core clock speed than the 8800 GT, but the 8800 GT has 12 more Render Output Units than the GeForce GT 610. As a result, the 8800 GT exhibits a 6.4 GPixel/s better Pixel Fill Rate than the GeForce GT 610. However, both GPUs support DirectX 9 or above, and pixeling performance is only really relevant when comparing older cards.

The GeForce 8800 GT 1GB and the GeForce GT 610 have the same amount of video memory, but are likely to provide slightly different experiences when displaying game textures at high resolutions.

The 8800 GT has 43.2 GB/sec greater memory bandwidth than the GeForce GT 610, which means that the memory performance of the 8800 GT is noticeably better than the GeForce GT 610.

The GeForce 8800 GT 1GB has 112 Shader Processing Units and the GeForce GT 610 has 48. However, the actual shader performance of the GeForce GT 610 is 96. The 8800 GT having 16 better shader performance and an altogether better performance when taking into account other relevant data means that the 8800 GT delivers a significantly smoother and more efficient experience when processing graphical data than the GeForce GT 610.
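The same back-of-the-envelope check works for this comparison. Assumed published specs (consistent with the deltas quoted above): 8800 GT at 600 MHz with 56 TMUs and 16 ROPs; GT 610 at 810 MHz with 8 TMUs and 4 ROPs.

```python
def fill_rate(clock_mhz, units):
    """Fill rate in GTexel/s or GPixel/s: core clock in GHz times unit count."""
    return clock_mhz / 1000 * units

tex_gap = fill_rate(600, 56) - fill_rate(810, 8)   # 33.6 - 6.48, texture fill
pix_gap = fill_rate(600, 16) - fill_rate(810, 4)   # 9.6 - 3.24, pixel fill

print(round(tex_gap, 1))  # 27.1 GTexel/s in the 8800 GT's favor
print(round(pix_gap, 1))  # 6.4 GPixel/s
```

Note how the GT 610's higher clock doesn't save it: the 8800 GT's extra TMUs and ROPs dominate the product.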
Last edited by LeadStarDude; Jan 23, 2014 @ 1:52pm

Date Posted: Jan 22, 2014 @ 7:15am
Posts: 33