I currently run two GTX 690 cards in what is effectively quad-SLI. I have zero trouble with framerates in any of the games I play, except GTA V. Its performance is nothing short of shocking, so bad that the game is simply unplayable on my machine.
If I disable SLI and run it, performance improves, but it is still very poor. Would it be better on Windows 10 with a DX12 patch applied? Who knows.
If that were actually happening, they would be testing it for Windows 10 as we speak, and it would make for good advertising. Then again, I assume there would be a lot of debugging to get through.
That makes me doubt it will happen any time soon. If GTA V were getting DX12, Rockstar and Microsoft would have been drumming it up around the Windows 10 release, and we haven't seen that, so I don't think it's coming any time soon.
Everyone remembers BF4 and Mantle. DX12 is basically the same approach as Mantle, but more efficient. It boosted high-end systems by about 15% to 30%, and low-end systems benefited even more.
The numbers are correct, but your understanding of them is not. The benchmark used to produce those numbers is the Star Swarm benchmark, which issues more draw calls than DX11 can handle, hence the large performance increase when switching to DX12.
If you would like an idea of the kind of performance increase you will actually see, load up Afterburner with the GPU usage % in the overlay. When you see your GPU running at 80%, imagine what you would get if it were running at 100%.
This is by no means going to be 100% accurate, but it should give you an idea.
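The Afterburner method above boils down to simple scaling: if the GPU is the bottleneck, fps grows roughly in proportion to GPU utilization. Here is a minimal sketch of that back-of-the-envelope estimate; the linear-scaling assumption is mine, and real games rarely scale this cleanly.

```python
# Rough headroom estimate per the Afterburner method: assume fps scales
# linearly with GPU utilization. This is a simplification, not a guarantee.

def estimated_fps_at_full_load(current_fps: float, gpu_usage_pct: float) -> float:
    """Scale the current fps up to a hypothetical 100% GPU utilization."""
    if not 0 < gpu_usage_pct <= 100:
        raise ValueError("GPU usage must be in (0, 100]")
    return current_fps * (100.0 / gpu_usage_pct)

# Example: 48 fps while the GPU sits at 80% usage
print(round(estimated_fps_at_full_load(48, 80), 1))  # 60.0
```

So a card idling at 80% usage has, at best, about 25% of headroom to recover, nowhere near the multi-fold gains the Star Swarm chart shows.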
Also, for a game to take advantage of shared VRAM, it needs to be explicitly programmed to do so. Don't expect anything already on the market to use that feature, as it would require far more work than most developers are willing to put into it.
Hope that clears up a bit of the misinformation here.
Edit
Here is a link to the actual benchmark numbers
If you look at the AMD 290X, you will see that it gets a pathetic 8.3 fps in DX11 and jumps to 42.9 fps in DX12, giving the illusion of a huge performance increase, even though that is mostly down to incredibly underwhelming DX11 performance.
Compared to Nvidia's 26.7 fps in DX11 and 66.8 fps in DX12, yes, it is a bigger increase, but only because the DX11 bar was set so low.
http://images.anandtech.com/graphs/graph8962/71450.png
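To make the comparison concrete, here is the uplift math worked through on the figures quoted above (from the AnandTech chart linked in this thread, not my own tests):

```python
# Relative DX11 -> DX12 uplift from the Star Swarm numbers quoted above.
results = {
    "AMD 290X": (8.3, 42.9),   # (DX11 fps, DX12 fps)
    "Nvidia":   (26.7, 66.8),
}

for gpu, (dx11, dx12) in results.items():
    uplift = dx12 / dx11
    print(f"{gpu}: {dx11} -> {dx12} fps, {uplift:.1f}x uplift")
```

The 290X shows roughly a 5x uplift against Nvidia's roughly 2.5x, yet Nvidia still ends up with the higher absolute framerate in both APIs, which is the point being made.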
It's still one of the best-optimised games I've tried, so I really doubt it would make much of a difference.
Maybe 5%.