Do you even know what machine learning is?
You're not doing machine learning; you are playing with children's toys.
It requires over 12GB of VRAM to play because it runs the GPT-2 language model locally on your own machine. It is machine learning.
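(For context, here is a minimal sketch of what "running GPT-2 on your own machine" typically looks like, assuming the Hugging Face transformers and PyTorch packages; the game's actual stack and model size are not confirmed here, this is purely illustrative.)

```python
# Illustrative sketch only: load the base GPT-2 model locally and generate text.
# Assumes the Hugging Face "transformers" and "torch" packages are installed;
# the game's real setup (model size, framework) may differ.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")

# Rough memory footprint of the weights alone (fp32 = 4 bytes per parameter).
params = sum(p.numel() for p in model.parameters())
print(f"{params / 1e6:.0f}M parameters ~= {params * 4 / 1e9:.2f} GB in fp32")

# Generate a short continuation of a prompt.
prompt = "You enter the dungeon and"
input_ids = tokenizer.encode(prompt, return_tensors="pt")
with torch.no_grad():
    output = model.generate(input_ids, max_length=40, do_sample=True, top_k=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The base model shown here is only around 124M parameters; the larger GPT-2 variants go up to 1.5B parameters, which, together with activations during generation, is roughly where multi-gigabyte VRAM figures come from.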
That is a waste of compute power to run a game. You aren't doing machine learning.
You have no idea how the algorithm works
You aren't doing anything even remotely targeted.
You probably have no understanding of the linear algebra or the vector fields involved.
This isn't machine learning. It's just a waste of electricity and compute power. It has no applications outside of the game; it's nothing.
To excuse it as "machine learning" is a gross abuse of the word and its intended application. And again, most machine learning algorithms are glorified brute force.
They are a joke. I know how much VRAM machine learning uses. I also know the top PhD professors on machine learning don't believe it has that great a potential, and that it is heavily oversold to both technology companies and consumers as the next holy grail of technology, when in reality it is limited in its applications and capacity beyond what it's trained to do. And furthermore, "Google's" innovation in regards to machine learning is created by using randomly different activation functions (the mathematical representations used to determine if a neuron will fire or not). Because if you knew anything about machine learning, you would have at least a rudimentary level of comprehension of the biological processes involved in neural networks: that is, the human brain's neural cells fire, creating something called an action potential, dependent on the forward propagation of a depolarisation wave resulting in a neutralisation of the charges associated with the neuron's sodium-ion pump. Strictly speaking, deep learning is trying to re-create that.
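(Side note, since activation functions come up here: a toy illustration of what "the mathematical representations used to determine if a neuron will fire or not" look like in code, just a few common choices applied to the same example inputs, not anyone's actual network.)

```python
# Illustrative only: a few common activation functions, i.e. the functions
# that map a neuron's weighted input to its output "firing" level.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))   # squashes input to (0, 1)

def relu(x):
    return np.maximum(0.0, x)         # passes positive input, zeroes the rest

def tanh(x):
    return np.tanh(x)                 # squashes input to (-1, 1)

z = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])   # example pre-activation values
for name, fn in [("sigmoid", sigmoid), ("relu", relu), ("tanh", tanh)]:
    print(name, np.round(fn(z), 3))
```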
You know what's hilarious about that "text game"? You are storing the state of thousands of pieces of data. That's all your VRAM is being used for: a dictionary index.
Lmao.
All these claims sound funny now.
Yeah, the reason for that is that performance demands change and the bar is moved further, so what may be possible now at 4K will only be possible at 1080p later. But as for the 3080 at 1080p: if RDR2 uses about 5.4GB at 1080p, a 3080 would be a waste at 1080p; you'd use half of the capacity of a 3080. The only reason you'd be targeting the 3080 is if you were an avid Skyrim mod user or some other mod user.
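(The arithmetic behind "half the capacity": the RTX 3080 ships with 10GB of VRAM, and the 5.4GB RDR2 figure is the one quoted above, taken at face value.)

```python
# Rough utilization arithmetic for the claim above.
# 5.4 GB is the RDR2-at-1080p figure quoted in the thread (not verified here);
# 10 GB is the RTX 3080's VRAM capacity.
rdr2_vram_gb = 5.4
rtx3080_vram_gb = 10.0
utilization = rdr2_vram_gb / rtx3080_vram_gb
print(f"{utilization:.0%} of the card's VRAM in use")  # ~54%, i.e. about half
```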
Make sure you don't place any money on it.
FCND is the wrong game to use as a benchmark test, as it's known to be FPS-limited by the engine.
Proof: the 2070 through to the 2080 Ti all perform the same at 1080p.
"1080p
Mainstream gamers will have no problems running Far Cry New Dawn at 1920x1080, with the super-cheap Radeon RX 570 capable of 67FPS even on the Ultra preset, the newer RX 590 hits 83FPS average, the GTX 1060 is capable of 72FPS and the new RTX 2060 pushes 106FPS average. Far Cry New Dawn maxes out at around 120FPS anyway, so you can throw all the GPU power in the world at it and it's not going to change much, even our RTX 2080 Ti reached 117FPS average."
www.tweaktown.com/articles/8908/far-cry-new-dawn-benchmarked-1080p-1440p-4k-tested/index.html#Benchmarks-1080p
As product says far more eloquently and with more detail than I could muster, that isn't machine learning.
Anyway, if that's what floats your boat and you can afford it, why not just say sod it and buy a Quadro or two?
The 3080 is a 40% more powerful GPU.
It appeared only 2% faster in that particular game at 1080p because of the CPU bottleneck.
If the CPU is at 100% usage, the system won't be able to produce any more FPS, no matter how powerful the GPU is. It's not the GPU's fault. Even the RTX 3090 will give the same FPS.
Put in a more powerful CPU or play at a higher resolution, and you will notice the difference.
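(A toy way to see the bottleneck argument: the frame rate you actually get is capped by whichever of the CPU or GPU can prepare fewer frames per second. The limits below are made-up numbers, purely for illustration.)

```python
# Toy bottleneck model: effective FPS is limited by the slower of the two.
# All numbers are hypothetical, purely to illustrate the argument above.
def effective_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    return min(cpu_fps_limit, gpu_fps_limit)

cpu_limit = 120.0                      # what the CPU can feed at this resolution
for gpu, gpu_limit in [("RTX 2080 Ti", 170.0), ("RTX 3080", 240.0)]:
    print(gpu, effective_fps(cpu_limit, gpu_limit))  # both print 120.0
```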
1.3 billion.
But this includes, say, my toddler playing a silly Flash game, me playing CK3 and my old, old, old mom playing on FB (just to put it in some perspective). The actual number of people with a gamer PC is more like 150M or less. Even those with a gamer PC might just be playing Minecraft, Terraria and, well, indie games, or a mixture, without caring too much about settings.
All that being said, you could run most indie games on a non-gaming card at 144Hz 1080p and be happy, I reckon. I use 1440p; we are a minority.
If we take the SHS (Steam Hardware & Software Survey)
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam
Then 1440p is down to under 8% (including the widescreen options as well) and 4K is under 3%.
If we take a look at the GPUs used (let's take cards on par with my 2080S, or close to it), you will notice that it is 0.77% for the 2080S, but if we add them all up, then under 4% use a modern high-tier card. If we go 3 generations back, we see it a bit higher in the mid/high-tier market, but not much... if we look at budget cards such as the x60 series, then they hold the overall market (the 1060 is still around 10% here).
My point is that most of these new high-tier cards are not targeting the average user; they are meant for the 1440p and 4K crowd, that is their main customer audience. Then of course you have people without a clue about hardware building, so they buy a 3080 for their 1080p setup to play Fortnite or CS on... well, that is silly and a waste of cash.
The interesting card to look at (for gains and stuff) is the 3070. Mid-tier cards like that are where most people will buy, and even more so the future release of the 3060 (the average entry-level card tier), because that is where most users will end up buying anyway, not the 3090 and not even the 3080.
Only on rendering and ray tracing... it was the same going from Pascal to Turing. I saw 40-100% gains on rendering benchmarks and a stunning 150% on gaming (!!) in official benchmarks... the hint here was that it was with ray tracing on ^^