My point is, if you wanna play at 1080p, then you don't buy or look for a card that runs 1440p at the same level.
A real problem is also that benchmarks and reviews often use Ultra settings as the baseline. In my opinion that's a faulty approach, since Ultra is silly to use from a cost-benefit point of view, unless the game is five years old or you have an extravagant setup that can justify the huge resource hit for the little visual gain.
Most games run 1080p perfectly fine on higher settings, even with old GPUs with 6 GB of VRAM.
Obviously, you can't future-proof anything, but if you wanna do 1440p, then you don't buy a 4060 and build around that... you would most likely build a PC around a 3070 Ti or even a 3080, if you have the cash to burn.
For a 1080p system, however... it is just a whole other world.
I have many GPUs from both brands, but mainly use an 8 GB card (6650 XT) because it doesn't turn my room into a furnace, and I did manage to undervolt it and drop power consumption to 45-90 W at a 2990 MHz clock. No GPU is giving me the same performance per watt at that setting, so I keep it around.
Many games are requesting more VRAM because the dev teams are extremely lazy and can't optimize their own games. They just leave it for someone else to do because they are getting their money, and when the games don't run, people blame AMD and NV, or others besides the devs.
I won't be surprised if the trend gets worse. But I am not worried, because I will soon be working in the law field and not cleaning up game devs' messes in drivers. Can't wait to lean back and watch the show soon.
Most of the dire examples of games not being able to run well are either console ports or cross-platform development.
A lot of games actually do have sufficient and acceptable optimization; it is often just not the AAA titles.
The CPU and RAM requirements aren't bad at all.
What I'm concerned by is the video card.
Not so much the minimum requirements, as I guess something on the level of a GTX 1080/RTX 3060 isn't super high anymore for a next-generation game (though has a minimum requirement for the video card ever been this high before?).
But the part where the recommended requirements ask for x80/x800 GPUs from the last couple of generations, and the description says that is good for "medium settings at 60 FPS (or 30 FPS at high)" at 1080p? So you're choosing between high settings and 60 FPS even on that hardware? And just at 1080p? And needing DLSS for 4K? Yikes.
Why are some people quick to say it IS their fault?
Like with this example, VRAM needs have a history of jumping up when new console/game engine generations start coming around, and that is part of what we're seeing, no? These new games are exposing the current GPU market for its lack of VRAM, where 8 GB is the most common amount but much of the rest of the market has less (with "only" 12 GB just now starting to become more common, and mostly because of the RTX 3060). I have to wonder if the poor state of the market with regard to VRAM amounts is being exposed and is making the launch situations for many of these titles worse. The new consoles have a shared 16 GB of very fast RAM. Half of that amount might not be enough VRAM for some titles, especially with higher-resolution textures. It's a consideration.
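As a back-of-the-envelope illustration of why higher-resolution textures eat VRAM so quickly, here's a rough sketch with my own assumed numbers (standard uncompressed RGBA8 versus BC7-style block compression, with a full mip chain; not figures from any specific game):

#include <cstdio>

// Approximate VRAM footprint of one square texture with a full mip chain.
// bytesPerPixel: 4.0 for uncompressed RGBA8, 1.0 for BC7 block compression
// (16 bytes per 4x4 block). The mip chain adds roughly one third on top.
double textureMiB(int size, double bytesPerPixel) {
    double base = static_cast<double>(size) * size * bytesPerPixel;
    return base * (4.0 / 3.0) / (1024.0 * 1024.0);
}

int main() {
    std::printf("2048x2048  RGBA8: %5.1f MiB   BC7: %5.1f MiB\n",
                textureMiB(2048, 4.0), textureMiB(2048, 1.0));
    std::printf("4096x4096  RGBA8: %5.1f MiB   BC7: %5.1f MiB\n",
                textureMiB(4096, 4.0), textureMiB(4096, 1.0));
    // A few hundred unique 4K material sets, plus render targets and
    // geometry, is how an 8 GB budget disappears in a hurry.
    return 0;
}

Even compressed, a single 4K texture with mips is on the order of 20 MiB, so bumping texture resolution across a whole game multiplies the budget fast.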
You also see a lot of complaining about shader compilation times or shader stutter. I saw this video and gave it a watch, because I had always wondered about it.
https://www.youtube.com/watch?v=f7yml1y3fDE
From what I gather, modern games are just so numerous/heavy on shaders that it has caught up with us, and this is the result? Consoles get around the issue because the shaders come pre-compiled for them, and a lot of people say this could be done on PC and that it's the developer's failure if they don't, but apparently this isn't realistic on PC because it's not a fixed platform. So many different GPUs and drivers make it impossible (?), so it looks like lengthy sessions of pre-compiling them before the game starts, or stutters during play, are what we will have to deal with.
Higher frame rates will likely intensify this too.
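To make the trade-off concrete, here's a minimal sketch of the two approaches (nothing from a real engine; compileShader() is just a hypothetical stand-in for the driver's actual compile step): compile every known permutation up front, or compile on first use.

#include <chrono>
#include <string>
#include <thread>
#include <unordered_map>
#include <vector>

struct CompiledShader { std::string blob; };

// Hypothetical stand-in for the driver's compile step; deliberately slow
// to mimic the hitch it causes when it happens mid-frame.
CompiledShader compileShader(const std::string& source) {
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    return {"compiled:" + source};
}

class ShaderCache {
public:
    // Eager path: compile every known permutation up front. Long load
    // screen, no in-game hitches -- roughly what a console build gets by
    // shipping pre-baked binaries for its one fixed GPU/driver target.
    void precompileAll(const std::vector<std::string>& permutations) {
        for (const auto& p : permutations)
            cache_.emplace(p, compileShader(p));
    }

    // Lazy path: compile the first time a permutation is actually needed.
    // Short load, but a visible stutter whenever a new one shows up mid-game.
    const CompiledShader& get(const std::string& permutation) {
        auto it = cache_.find(permutation);
        if (it == cache_.end())
            it = cache_.emplace(permutation, compileShader(permutation)).first;
        return it->second;
    }

private:
    std::unordered_map<std::string, CompiledShader> cache_;
};

int main() {
    ShaderCache cache;
    const std::vector<std::string> permutations = {"opaque", "foliage", "water", "skin"};

    cache.precompileAll(permutations); // pay the cost on the loading screen...
    cache.get("water");                // ...so this lookup is instant during play
}

Either way the work has to happen somewhere; on PC, the permutation count multiplied by all the different GPU/driver combinations is what makes shipping pre-built binaries impractical, which matches what the video describes.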
I've seen a lot of people blaming this on bad developers too.
But I have to wonder how much of it is their fault, and how much of it isn't.
I imagine that a GTX 1060 6GB will be fine at 720p resolution.
It can run FH5 on ultra at 130+ FPS, but this game is low/mid at 30 FPS at the same res?
What did they optimise it with? A brick?
We are in a transition period where devs are dropping support for the potato PS4 and games won't scale down very well anymore.
It only has 8 GB of VRAM 👀 How come you guys need so much!? Even my 1660 Super has no trouble, and that only has 6 GB.
Oof!
I would blame developers if I were to blame anyone (at least give them the majority of the blame as opposed to blaming Nvidia or even gamers for buying whatever is being put out to begin with).
My simple reason is this: if a game like Bioshock Infinite could be crafted to run great on old consoles and old PCs (it was released 10 years ago), then logically I don't see why it would be so much trouble to make a game that runs great now.
From that game to now, everything has gotten worse. Games are much larger, much more performance-taxing, and in my opinion have crappier storylines, at least for AAA games. And for what reason? Negligibly better lighting, and you can see more freckles and texture detail, at a cost of 150 FPS?
I can't see how we can't blame developers. They're the ones who made the game.
Now as far as breaking down "developers" even further, I'm not sure how it works but maybe the people actually coding aren't to blame since they probably are just taking orders from whoever it is that makes the decisions on how the game will be made. But, I do think somewhere in that group is where most of the blame should go.
Unless it's possibly looked at like "it's just not viable or feasible to code a game efficiently anymore" due to all sorts of factors in the game development environment. Maybe if they did code it properly, it would take too long, cost too much, or whatever, and so it wouldn't get made to begin with.
But I would still end up coming back to Bioshock Infinite. If they made that game, I'm not sure why they wouldn't be able to make it again. IIRC, it's officially been confirmed that they are making a Bioshock 4 also.
I really would have liked to see a 3070 Ti 16GB. I know Nvidia was thinking of releasing one at the time that the 3080 12GB got released, and I was almost certain that it was going to happen but it never did.
Now, I wouldn't have replaced my 3070 Ti 8GB with a 3070 Ti 16GB, but I would have liked to see how it would have compared to the 8GB card. It is clear from what we are seeing in these games that more VRAM would have prevented the issues the 8GB card is having, and the 16GB card would have surpassed the 6800 by a great deal in recent games and given the 6800 XT a run for its money. There would have been a huge discrepancy in performance between the 8GB card and the 16GB one. Something not seen before with cards having the exact same architecture except for the VRAM amount.
I am sure Nvidia would not have liked this. Perhaps that is why they decided not to release it after all, knowing the future that the 3070 and 3070 Ti faced. And honestly, I have never seen a situation quite like this. I mean sure, a GPU 3 to 5 years after release is not going to be on the same performance tier it was at launch, but even just a couple years after release it should still be a great performer and a viable card. I am not saying that the 3070 and 3070 Ti are now NOT viable cards, but the type of problems these cards are facing are not about their rasterized performance. They simply run out of VRAM, and games begin to experience stuttering, texture pop-in and pop-out issues, and crashes, not to mention reduced FPS.
I have had a tendency over the last decade or so to start looking for my next GPU upgrade a couple years after buying the current one. But that has always been due to wanting straight-up more performance, never because I was thinking that my card was limited in VRAM. Performance-wise, the 3070 Ti is still a great card. And the problems we are seeing are from a crop of very recent games that have been criticized for being poor console ports. But still, the problems occurring do raise an alarm. And I am seeing it myself in games that require more VRAM: either I am simply not allowed to max out a couple of settings because my 3070 Ti doesn't have enough VRAM, or the game will encounter issues if I do max out those settings.
Games will only keep using more and more VRAM. So I am sure there will be some games released that I really want to play, but that I will think twice about because of the GPU I have. So, even before the 3070 Ti's overall performance comes into question, I am held back by the card's 8 GB of VRAM.
Would have loved to see a 16GB version of this card. Heck, how about a 20GB 3080!
But I'm actually planning on upgrading from it soon anyway, so I'm not so much worried about not being able to play it due to being below minimum. I just find it thought-provoking and wonder if it might be yet another sign of the sorts of asks that games will have going forward.
I want to clarify that I'm merely presenting a question for curiosity's sake (or food for thought, as some say), rather than making a declaration of where the blame does or doesn't lie.
I'm just asking those who are eager to blame developers: do you have extensive knowledge of those sorts of things to know it is their fault, or are the complaints just coming from a place of "I think it should be better"? Because I think the latter happens a lot. That's not to say developers never make mistakes, either. They're not mutually exclusive things.
That's too simplistic of a view.
The developers don't control everything about the platforms (plural) they're making the games for. They don't control their time and schedule limits.
Do you actually KNOW and not just THINK the issues could be fixed?
Do these solutions make sense in our capitalistic system? If not, at what point do you realize it's perhaps the system, and not the developers, that's more at fault?
Similar for me, I was shooting for the 3080 but had better luck on cost and was able to score the 3060 Ti at retail (can't say MSRP, because it was an AIB card for like 79 above or so).
Gotta have that 16 GB; I am not trying to buy a 500 or 600 or 700 dollar card every generation.