That said, I have yet to see stutter on either of my i7 builds. Both are quad cores with hyperthreading. One runs at 10-15 fps (560M), the other at 30-50 depending on the environment (970M).
I mean raytracing.
I think this would look awesome for a space game.
Not to say they couldn't add it in later.
Adoption of the RTX GPUs is going to be extremely small because they're going to be extremely expensive. That means it's going to be a couple of generations of Nvidia GPUs before we really start to see it show up in games. Maybe we'll see something from Frostbite or CryEngine or whatever, but it's going to be an option and an afterthought, because building it in as core functionality would absolutely hamstring anyone who doesn't have one of the two announced GPUs that support it.
The GTX 2050 and GTX 2060 will not have this feature; more importantly, the 2060 will most likely have a $300 price tag. The RTX 2080 will have a price tag of $700 and the RTX 2070 a price tag of $500. Keep in mind, as well, that these are MSRP, which means that if crypto doesn't keep tanking like it is, we might wind up seeing prices of $900-$1100 for a 2080.
All this is to say it's just not good value for developers to focus their time on supporting RTX features, since only a tiny fraction of people can afford those GPUs.
Keep in mind that the most popular GPU right now is the 1060 at 12.9%, followed by the 1050 Ti at 9.9%, then the 1050, 750 Ti, and 960, and finally the 1070 comes in 6th place with a measly 3.66% market share. This indicates to me that there is currently no use case for the RTX features in titles due to be released before 2020.
Now, let's look at upgrade paths. The 2060 is due to be 7% slower than the 1070. That means 1070 owners will only see a benefit if they move to either the 2070 or the 2080, which are due to be 40% and 110% faster than the 1070, respectively. Will someone who already forked out $350-$450 this year be willing to pay another $500-$700 (assuming MSRP) for a GPU solely for a new feature that won't even bring about a huge improvement in the quality of the game? I vote no.
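Just to put those numbers side by side, here's a quick back-of-the-envelope sketch in Python. It uses the rumored MSRPs and the relative-performance estimates quoted above, plus an assumed $400 midpoint for what a 1070 owner already paid; these are guesses, not benchmarks:

# Back-of-the-envelope using the figures quoted above (rumored MSRPs and
# relative-performance estimates, not measured benchmarks).
cards = [
    # (name, rumored MSRP in USD, performance relative to a GTX 1070 = 1.00)
    ("RTX 2060", 300, 0.93),  # "due to be 7% slower than the 1070"
    ("RTX 2070", 500, 1.40),  # "40% faster than the 1070"
    ("RTX 2080", 700, 2.10),  # "110% faster than the 1070"
]

already_spent = 400  # assumed midpoint of the $350-$450 a 1070 owner paid this year

for name, price, rel_perf in cards:
    gain_pct = (rel_perf - 1.0) * 100
    total = already_spent + price
    print(f"{name}: ${price} MSRP, {gain_pct:+.0f}% vs. a 1070, "
          f"total spend for a 1070 owner this year: ${total}")

Even at MSRP, a 1070 owner is looking at $900-$1100 all-in this year to get one of the cards that actually carries the feature, which is the whole point above.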
Yes, it would look awesome, but this game isn't even focused on the graphical side of things. I don't mean to say it looks ugly, but it's not the nicest looking space game I've ever played. Coming here from Space Engineers, I'd have to say that it's a step down in graphical quality for a huge step up in stability.
I think the first industries to see benefits from real-time ray tracing are going to be CAD and animation.
/rant
Besides, the tech isn't really new from what I've read; it simply wasn't feasible at the home computer level. And 2020 consoles will likely have ray-tracing-capable GPUs, like the PS5.
The reality of the RTX 20 Series is this: it's a money-grab designed to get early adopters on the ray tracing hype train for the 20 or so games that will ship with the feature. It's a stopgap to 7nm cards that should arrive in 2019 and offer substantial performance gains and power efficiency improvements. And as for the price tag, Nvidia can charge whatever it desires due to lack of competition in the high-end space, period.
Thank Bitcoiners for the insane prices. Nvidia is just making money as any of us would do in their position; it's the Bitcoiners that have driven pricing this far out of control.
Yep all this 100%
Intel...
dGPU...
Those don't belong together.
Although the people who bought the Titan V should be breaking down the door. "Oh look, 2 years later and we released a stronger dGPU for half the cost with more performance and a new feature. Sucks to be you! kthxbye"
Intel has tried releasing its own GPUs before and there just was not a lot of interest (it did have an interesting design, and became part of the Xeon Phi line-up later). I'm certain it will be the same this time around unless Intel revolutionizes SOMETHING within the GPU industry or introduces a product that brings VR to the masses for dirt cheap. Yes, it would take that kind of push to get devs interested in a new graphics platform...
With DDR4 being so darn expensive right now... and GPUs being overpriced for so long thanks to crypto mining (those jerks hahahaha)... AMD seems to have inserted itself into a good position with some really good timing at the moment. And with their aggressive roadmap... Intel will be hard-pressed to keep pace, as they always command a price premium and have struggled to push 10nm for what feels like ages, with multiple setbacks. Newer Intel generations just aren't seeing the gains they used to from one gen to the next, and haven't for quite some time.
If AMD releases a potent GPU this year or early next that is affordable and can beat the 2060 at a better price point (as the 1060, and most x60 cards, usually eat the most market share)... they could kick both Intel and Nvidia in the sack hard enough to upset the apple cart. AMD wins the price/performance contest all day right now until it comes to their GPUs, as they suckle on the power jack like a newborn calf and just can't quite hit the high notes when asked to sing.