Lol... Just... No dude. If Op's card was a 6 series (6***, like the 8800 is an 8 series) then yes, it would be below the minimum requirements. A GT 610 is in the GT/GTX 600 series, just like my GTX 560TI is in the 500 series.
Min required GPU: GeForce 8800 (512 MB) or Radeon HD 3850 (512 MB)
Recommended GPU: GeForce GTX 260 (1 GB) or Radeon HD 4850 (1 GB)
So OP's GT 610 should fall right into the recommended specs.
I posted the following in response to a laptop user having issues. It seems to have helped them greatly...
System = i7-2600K, 16GB Ram, Geforce GTX 560TI (1GB)..
For those who don't know how or where to adjust settings: you can access these through the game's launcher, under Options. Tick the "Show advanced options" check box and all of these options will appear. (See the ReadMe.txt in "..\SteamApps\common\the witcher 2" for more information on what each setting does; the details are in section 10.)
My settings in game are as follows, in case it helps anyone. (Keep in mind I am running on a desktop, not a laptop.)
Screen Resolution: 1920x1200 <-- Your choice
Fullscreen: Enabled <-- Your choice
Aspect Ratio: 16:10 <-- Your choice
Configuration Set: This will show Custom after you begin making changes.
Texture Downscaling: None <-- Not sure, but this might help with lesser GPUs
Texture Memory size (MB): Very Large <-- Adjust accordingly with your GFX memory
Shadow Quality: Ultra <--Laptop should probably have this a bit lower than Ultra
Number of Shadowed Lights: Low <-- Your choice, but if GeForce Experience suggests it be Low, you might stick with that.
LOD Distance: Far <-- May go less with Laptop
Bloom: Enabled <-- Should be fine unless you're running on something really old..
Light Shafts: Enabled <-- Same as with Bloom
Anti-Aliasing: Enabled <-- This can help some, but can also have a negative impact on frame rates if your GPU isn't powerful enough. Use as you wish.
Blur Effect: Enabled <-- Lesser GPUs and laptops should try without Blur. It may or may not help.
Depth of Field - Gameplay: Enabled <-- For me this is an "always", of course, check to see if it gives you any negative impact.
Vignette: Enabled <-- This just darkens the edges of the screen for a more cinematic look. Minimal performance impact, so it's up to you.
Wet Surfaces Rain Effect: Enabled <-- If your GPU seems to be angry when it rains, disable this. It should help a bit by getting rid of the wet surfaces.
SSAO: Disabled <-- If you aren't running something seriously good, SSAO will only hinder performance. Disable it unless you're running SLI or Crossfire with some heavy-duty GPUs.
Motion Blur: Enabled <-- Same as with Blur Effect
Cinematic Depth of Field: Disabled <-- GeForce Experience suggested this be disabled. I went with it; I don't think it matters much though. Try at your own risk.
Depth of Field - Cutscenes: Enabled <-- Sure, but doesn't this make the previous option a little confusing? Do as you wish with this option, if you seem to get some horrible issues with it disabled, then enable it. Same goes the other way.
Dangling Objects Limit: Disabled <-- Unless you're running something really weak, this should likely be disabled.
UberSampling: Disabled <-- As with SSAO, keep this disabled unless you have the power..
Vertical Sync: Enabled <-- This is generally optional, but it will help keep your GPU from working overtime, as it locks the frame rate to your display's refresh rate. As such, the temperature of the GPU will be lower too. This is a win for laptops.
Decals: High spec <-- I don't think this one even matters, mostly because I have seen no performance increase or decrease. Again, adjust as you wish.
Unless you have some monstrous card and are using SLI/Crossfire, avoid UberSampling. Mobile cards (###M) should always run somewhat lower settings than a full-blown PC, as the mobile-series cards do not have the full capability of a desktop GPU. (They can kick some serious tail, but not as much as a full-blown desktop GPU.)
Now, as I said at the beginning of this, I am on a desktop, not a laptop. But this should help you get an idea of what to adjust so you don't have to play on such low settings.
(For more information on these settings and exactly what they do and have an impact on, check the ReadMe.txt in the game's installation directory. Look for section 10.)
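For reference, the settings above, together with the laptop-oriented adjustments noted in the arrows, can be condensed into a quick checklist. This is only a sketch: the keys mirror the launcher's advanced-options labels (not any config-file keys), and the exact lower-tier values ("Medium", "Near") are assumptions, not confirmed option names.

```python
# The desktop settings from the post above, plus the suggested
# lower-spec (laptop / weaker GPU) adjustments. Keys mirror the
# launcher's advanced-options labels, not actual config-file keys.

desktop = {
    "Texture Downscaling": "None",
    "Texture Memory Size (MB)": "Very Large",
    "Shadow Quality": "Ultra",
    "Number of Shadowed Lights": "Low",
    "LOD Distance": "Far",
    "Bloom": "Enabled",
    "Light Shafts": "Enabled",
    "Anti-Aliasing": "Enabled",
    "Blur Effect": "Enabled",
    "Depth of Field - Gameplay": "Enabled",
    "Vignette": "Enabled",
    "Wet Surfaces Rain Effect": "Enabled",
    "SSAO": "Disabled",
    "Motion Blur": "Enabled",
    "Cinematic Depth of Field": "Disabled",
    "Depth of Field - Cutscenes": "Enabled",
    "Dangling Objects Limit": "Disabled",
    "UberSampling": "Disabled",
    "Vertical Sync": "Enabled",
    "Decals": "High spec",
}

# Laptop / weak-GPU overrides suggested in the post; the specific
# tier names ("Medium", "Near") are guesses at the option values.
laptop = dict(desktop)
laptop.update({
    "Shadow Quality": "Medium",   # "a bit lower than Ultra"
    "LOD Distance": "Near",       # "may go less with Laptop"
    "Blur Effect": "Disabled",    # "should try without Blur"
    "Motion Blur": "Disabled",    # "same as with Blur Effect"
    "Anti-Aliasing": "Disabled",  # can hurt frame rates on weak GPUs
})

# Show only what actually changes from the desktop baseline
changed = {k: v for k, v in laptop.items() if desktop[k] != v}
print(changed)
```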
Here is some info for you that proves my point.
In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce GTX 550 Ti are massively better than the Nvidia GeForce GT 610.
The GTX 550 Ti has a 90 MHz higher core clock speed and 24 more Texture Mapping Units than the GeForce GT 610. This results in the GTX 550 Ti providing 22.3 GTexel/s better texturing performance. This still holds weight but shader performance is generally more relevant, particularly since both of these GPUs support at least DirectX 10.
The GTX 550 Ti has a 90 MHz higher core clock speed and 20 more Render Output Units than the GeForce GT 610. This results in the GTX 550 Ti providing 18.4 GPixel/s better pixeling performance. However, both GPUs support DirectX 9 or above, and pixeling performance is only really relevant when comparing older cards.
The GTX 550 Ti has 84 GB/sec greater memory bandwidth than the GeForce GT 610, which means that the memory performance of the GTX 550 Ti is much better than the GeForce GT 610.
The GeForce GT 610 has 48 Shader Processing Units and the GeForce GTX 550 Ti has 192. However, the actual shader performance of the GeForce GT 610 is 96 and the actual shader performance of the GTX 550 Ti is 384. The GTX 550 Ti having 288 better shader performance and an altogether better performance when taking into account other relevant data means that the GTX 550 Ti delivers a massively smoother and more efficient experience when processing graphical data than the GeForce GT 610.
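The fill-rate deltas quoted above follow directly from each card's clock and unit counts (fill rate = core clock × units). A quick sketch of the arithmetic, assuming the standard reference clocks of 900 MHz for the GTX 550 Ti and 810 MHz for the GT 610:

```python
# Reproducing the fill-rate figures quoted above from per-card specs.
# Fill rate = core clock (MHz) x unit count, in GTexel/s or GPixel/s.

def fill_rate(clock_mhz: int, units: int) -> float:
    """Theoretical fill rate in G(texel|pixel)/s."""
    return clock_mhz * units / 1000.0

# GTX 550 Ti: 900 MHz core, 32 TMUs, 24 ROPs
# GT 610:     810 MHz core,  8 TMUs,  4 ROPs
texel_diff = fill_rate(900, 32) - fill_rate(810, 8)  # 28.8 - 6.48
pixel_diff = fill_rate(900, 24) - fill_rate(810, 4)  # 21.6 - 3.24

print(f"{texel_diff:.1f} GTexel/s")  # 22.3, matching the figure above
print(f"{pixel_diff:.1f} GPixel/s")  # 18.4, matching the figure above
```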
No one should ever buy a GT card for gaming no matter what series it belongs to. It is a huge mistake. Always buy a GTX card. :)
The GT 610 has lower gaming performance than an Nvidia GeForce 6800 GT, which was released in 2004.
For even more lulz, its 3DMark score is 177. Real-deal current GPUs are in the 10,000+ range.
Yes... LOL! My post above yours proves how weak it really is. ;)
One thing to note about this game is that Geralt's out-of-combat running animations are extremely jerky and choppy when you change direction. It's like Geralt suddenly snaps to one of the 8 directions; it doesn't look smooth at all. His hair, I have noticed, has a habit of spazzing out as well, adding to the choppy look.
That is not how it works at all. The GeForce 8800 is vastly superior to the 610. You can't go by what series a card comes from; the series tells you the chip architecture, but says nothing about performance. I'll show you the difference between the 8800 GS (lowest of the 8800s) and the GT 610. This website compares the two cards, and it shows that the GT 610's performance is way below the 8800 GS:
http://gpuboss.com/gpus/GeForce-GT-610-vs-GeForce-8800-GS
So don't let the name of the cards, or the years they were made in confuse you.
I am aware of the differences between the cards; I know that just because a card is a higher model and newer doesn't guarantee it will be better than the previous model. My other PC used to run two GeForce 7600 GS cards, which are both clocked higher than, and perform better than, the entire 9 series, rofl. The fact remains that if OP takes my advice and lowers the settings (as well as reading section 10 of the readme for an understanding of what the options do), the game will run better than it has been.
Y'all can sit here all day nitpicking my post because I mentioned I run a 560 Ti, but you are the ones comparing my card to OP's, which I never did. I shared what I use so OP would know that my setup is different, which means some of the options I use will likely need to be changed. (Do note that in a few other posts, people with cards far less powerful than his have gotten better performance from the game by reading exactly what I posted and adjusting accordingly for their own systems.)
LeadStarDude... I know my card has a higher clock speed, why else would I have bought it? (Aside from not having a board that supports PCIE 3.0 at the time of purchase...) :p
Feel free to prove your point by showing specs of a GeForce 610 matching at least any GeForce 8800. I highly doubt you will find one. The 610 is way below minimum spec. The 610 doesn't come anywhere close to any GeForce 8800 video card ever created; it is severely lower.
In terms of overall gaming performance, the graphical capabilities of the Nvidia GeForce 8800 GT 1GB are massively better than the Nvidia GeForce GT 610.
The GeForce GT 610 has a 210 MHz higher core clock speed than the 8800 GT, but the 8800 GT has 48 more Texture Mapping Units than the GeForce GT 610. As a result, the 8800 GT exhibits a 27.1 GTexel/s better Texture Fill Rate than the GeForce GT 610. This still holds weight but shader performance is generally more relevant, particularly since both of these GPUs support at least DirectX 10.
The GeForce GT 610 has a 210 MHz higher core clock speed than the 8800 GT, but the 8800 GT has 12 more Render Output Units than the GeForce GT 610. As a result, the 8800 GT exhibits a 6.4 GPixel/s better Pixel Fill Rate than the GeForce GT 610. However, both GPUs support DirectX 9 or above, and pixeling performance is only really relevant when comparing older cards.
The GeForce 8800 GT 1GB and the GeForce GT 610 have the same amount of video memory, but are likely to provide slightly different experiences when displaying game textures at high resolutions.
The 8800 GT has 43.2 GB/sec greater memory bandwidth than the GeForce GT 610, which means that the memory performance of the 8800 GT is noticeably better than the GeForce GT 610.
The GeForce 8800 GT 1GB has 112 Shader Processing Units and the GeForce GT 610 has 48. However, the actual shader performance of the GeForce GT 610 is 96, while the actual shader performance of the 8800 GT is 112. The 8800 GT having 16 better shader performance and an altogether better performance when taking into account other relevant data means that the 8800 GT delivers a significantly smoother and more efficient experience when processing graphical data than the GeForce GT 610.
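As with the earlier comparison, the figures quoted here can be reproduced from each card's specs, assuming the standard reference clocks of 600 MHz for the 8800 GT and 810 MHz for the GT 610 (the GT 610's 48 shader units are counted at double rate, as in the quoted "actual" figure of 96):

```python
# Reproducing the 8800 GT vs GT 610 figures quoted above.
# Fill rate = core clock (MHz) x unit count, in GTexel/s or GPixel/s.

def fill_rate(clock_mhz: int, units: int) -> float:
    return clock_mhz * units / 1000.0

# 8800 GT: 600 MHz core, 56 TMUs, 16 ROPs, 112 shader units
# GT 610:  810 MHz core,  8 TMUs,  4 ROPs,  48 shader units
texel_diff = fill_rate(600, 56) - fill_rate(810, 8)  # 33.6 - 6.48
pixel_diff = fill_rate(600, 16) - fill_rate(810, 4)  #  9.6 - 3.24
shader_diff = 112 - 48 * 2  # "actual" shader figures: 112 vs 96

print(f"{texel_diff:.1f} GTexel/s")  # 27.1, matching the figure above
print(f"{pixel_diff:.1f} GPixel/s")  # 6.4, matching the figure above
print(shader_diff)                   # 16, matching the figure above
```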