Peter Sep 11, 2020 @ 4:55pm
The 3080 is only around 2% faster than the 2080 Ti at 1080p.
With the majority of Steam users gaming at 1080p (65%), it looks like 1080p will be around for a long time. By far the most popular game on Steam is Counter-Strike, which is typically played at 1080p. This game had around a million players today, and indeed, every day. These two facts alone make 1080p still relevant and the most important benchmark to test for the majority of gamers.

What is the reality for the majority of players, who game at 1080p, with respect to the new 3080?

Nothing much, nothing much at all. Over at Videocardz, they've released figures for Far Cry New Dawn at 1080p. The result for the 3080 was 132fps; for the 2080 Ti, 129fps.
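As a quick sanity check, the "around 2%" figure follows directly from the two numbers quoted above:

```python
# Relative speedup of the 3080 over the 2080 Ti, computed from the
# Far Cry New Dawn figures quoted above (132fps vs 129fps at 1080p).
fps_3080 = 132
fps_2080ti = 129

speedup_pct = (fps_3080 - fps_2080ti) / fps_2080ti * 100
print(f"3080 advantage: {speedup_pct:.1f}%")  # → 3080 advantage: 2.3%
```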

While I'm not going to generalise the result, I will predict that this result will be similar in countless other games.
Showing comments 1-30 of 127
iceman1980 Sep 11, 2020 @ 11:21pm
Originally posted by Kurumi Tokisaki:
Originally posted by Product ∏:

You don't even come close to the VRAM provided by the RTX Titan. GPU utilization has nothing to do with VRAM usage. 5.4GB of VRAM out of 24GB is a real waste of VRAM. You are so naive. Ah, now that explains why you have so much money. Mummy and daddy pay for your stuff. You are still a teenager.
VRAM is fully utilized in deep machine learning apps and in games such as AI Dungeon Unleashed. You're crazy if you think no game can come close to 24GB yet. I'm disappointed the 3090 is only 24GB, and hope they still release a TITAN this year with 48GB or more VRAM to keep up with machine learning requirements.

Do you even know what machine learning is?
mimizukari Sep 11, 2020 @ 11:22pm
Originally posted by Product ∏:
Originally posted by Kurumi Tokisaki:
VRAM is fully utilized in deep machine learning apps and in games such as AI Dungeon Unleashed. You're crazy if you think no game can come close to 24GB yet. I'm disappointed the 3090 is only 24GB, and hope they still release a TITAN this year with 48GB or more VRAM to keep up with machine learning requirements.

Do you even know what machine learning is?
I run games that use machine learning all the time. They exist. The GPT-2 library requires more than a 2080 Ti can handle.
iceman1980 Sep 11, 2020 @ 11:23pm
Originally posted by Kurumi Tokisaki:
Originally posted by Product ∏:

Do you even know what machine learning is?
I run games that use machine learning all the time. They exist. The GPT-2 library requires more than a 2080 Ti can handle.

You're not doing machine learning; you're playing with children's toys.
Last edited by iceman1980; Sep 11, 2020 @ 11:23pm
mimizukari Sep 11, 2020 @ 11:24pm
Originally posted by Product ∏:
Originally posted by Kurumi Tokisaki:
I run games that use machine learning all the time. They exist. The GPT-2 library requires more than a 2080 Ti can handle.

You're not doing machine learning; you're playing with children's toys.
https://henk.tech/aid/

It requires over 12GB of VRAM to play because it runs the GPT-2 learning library on your own machine. It is machine learning.
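For scale, here is a back-of-envelope sketch of where a double-digit VRAM figure can come from. The 1.5B parameter count is GPT-2's published full-model size; the precision assumption is illustrative, and activations plus framework overhead add to the weights, so this only bounds things from below:

```python
# Rough lower bound on VRAM for the full GPT-2 model: weights alone,
# ignoring activations, intermediate buffers and framework overhead.
params = 1.5e9        # GPT-2 (XL) parameter count
bytes_per_param = 4   # fp32; roughly half that if loaded in fp16

weights_gib = params * bytes_per_param / 1024**3
print(f"weights alone: {weights_gib:.1f} GiB")  # → weights alone: 5.6 GiB
```

Inference memory on top of the weights is what pushes the real requirement well past this floor.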
iceman1980 Sep 11, 2020 @ 11:28pm
Originally posted by Kurumi Tokisaki:
Originally posted by Product ∏:

You're not doing machine learning; you're playing with children's toys.
https://henk.tech/aid/

It requires over 12GB of VRAM to play because it runs the GPT-2 learning library on your own machine. It is machine learning.

That is a waste of compute power to run a game. You aren't doing machine learning.

You have no idea how the algorithm works.
You aren't doing anything even remotely targeted.
You probably have no understanding of the linear algebra or the vector fields involved.

This isn't machine learning. It's just a waste of electricity and compute power. It has no applications outside of the game; it's nothing.

To excuse it as "machine learning" is a gross abuse of the word and its intended application. And again, most machine learning algorithms are glorified brute force.
Last edited by iceman1980; Sep 11, 2020 @ 11:29pm
mimizukari Sep 11, 2020 @ 11:33pm
Originally posted by Product ∏:
Originally posted by Kurumi Tokisaki:
https://henk.tech/aid/

It requires over 12GB of VRAM to play because it runs the GPT-2 learning library on your own machine. It is machine learning.

That is a waste of compute power to run a game. You aren't doing machine learning.

You have no idea how the algorithm works.
You aren't doing anything even remotely targeted.
You probably have no understanding of the linear algebra or the vector fields involved.

This isn't machine learning. It's just a waste of electricity and compute power. It has no applications outside of the game; it's nothing.

To excuse it as "machine learning" is a gross abuse of the word and its intended application. And again, most machine learning algorithms are glorified brute force.
I built my PC to play the games I like and for my own purposes; you build yours. I require over 12GB of VRAM, otherwise I get out-of-memory errors in AIDU and a turn takes over 5 minutes to complete, versus near-instantaneous with 24GB of GDDR6 VRAM. Again, I hope this series' TITAN, if there is one, has 48GB of VRAM; I'd like to get on the GPT-3 library someday with multiple NVLink/SLI cards, even if it takes Quadros. I have money to spend, so I get to do my hobbies how I like.
Last edited by mimizukari; Sep 11, 2020 @ 11:34pm
mimizukari Sep 11, 2020 @ 11:37pm
Originally posted by Product ∏:
Originally posted by Kurumi Tokisaki:
I built my PC to play the games I like and for my own purposes; you build yours. I require over 12GB of VRAM, otherwise I get out-of-memory errors in AIDU and a turn takes over 5 minutes to complete, versus near-instantaneous with 24GB of GDDR6 VRAM. Again, I hope this series' TITAN, if there is one, has 48GB of VRAM; I'd like to get on the GPT-3 library someday with multiple NVLink/SLI cards, even if it takes Quadros. I have money to spend, so I get to do my hobbies how I like.

Correction: you are 16 and your parents pay for your hobbies.
Transferring a Steam account is against the terms of service, and I've owned this account for 16 years. Again, check the years-of-service badge. All you're doing is showing that you had no idea VRAM could be taken to its limit in modern AI games.
iceman1980 Sep 11, 2020 @ 11:42pm
Originally posted by Kurumi Tokisaki:
Originally posted by Product ∏:

Correction: you are 16 and your parents pay for your hobbies.
Transferring a Steam account is against the terms of service, and I've owned this account for 16 years. Again, check the years-of-service badge. All you're doing is showing that you had no idea VRAM could be taken to its limit in modern AI games.

They are a joke. I know how much VRAM machine learning uses. I also know the top PhD professors in machine learning don't believe it has that great a potential; it is heavily oversold to both technology companies and consumers as the next holy grail of technology, when in reality it is limited in its applications and in its capacity beyond what it is trained to do. Furthermore, "Google's" innovation in machine learning comes from trying randomly different activation functions (the mathematical representations used to determine whether a neuron will fire or not).

If you knew anything about machine learning, you would have at least a rudimentary comprehension of the biological processes involved in neural networks: the human brain's neural cells fire off something called an action potential, dependent on the forward propagation of a depolarisation wave that neutralises the charges maintained by the neuron's sodium ion pump. Strictly speaking, deep learning is trying to re-create that.

You know what's hilarious about that "text game"? You're storing the state of thousands of pieces of data. That's all your VRAM is being used for: a dictionary index.

Lmao.
Last edited by iceman1980; Sep 12, 2020 @ 12:11am
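For readers following the activation-function point above, it is easy to sketch. This is a toy single neuron, illustrative only and not any particular library's API: a weighted sum of inputs squashed through an activation function:

```python
import math

# A single artificial neuron: a weighted sum of inputs plus a bias,
# passed through an activation function. The activation decides how
# strongly the neuron "fires" — the software analogue of the action
# potential described above.
def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

print(neuron([1.0, 0.5], [0.8, -0.2], 0.1))  # ≈ 0.69
```

Training adjusts the weights; the activation function itself is one design choice among many (sigmoid, ReLU, tanh, etc.).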
vadim Sep 11, 2020 @ 11:52pm
Originally posted by Nabster:
it is silly to use a 3080 at 1080p
I hear that every time, every new graphics generation. I remember how many times I heard that the GTX 1060 was overkill for 1080p, and even earlier (4 years ago) that the Fury X was intended for 4K gaming.
All these claims sound funny now.
iceman1980 Sep 11, 2020 @ 11:54pm
Originally posted by vadim:
Originally posted by Nabster:
it is silly to use a 3080 at 1080p
I hear that every time, every new graphics generation. I remember how many times I heard that the GTX 1060 was overkill for 1080p, and even earlier (4 years ago) that the Fury X was intended for 4K gaming.
All these claims sound funny now.

Yeah, the reason for that is that performance demands change and the bar keeps moving, so what is possible now at 4K will later only be possible at 1080p. But as for the 3080 at 1080p: if RDR2 uses about 5.4GB at 1080p, a 3080 would be a waste there, since you'd use half its capacity. The only reason to target a 3080 for 1080p is if you were an avid Skyrim modder or a heavy mod user in some other game.
Last edited by iceman1980; Sep 12, 2020 @ 12:05am
hawkeye Sep 12, 2020 @ 12:24am
Originally posted by Peter:
While I'm not going to generalise the result, I will predict that this result will be similar in countless other games.

Make sure you don't place any money on it.

FCND is the wrong game to use as a benchmark, as it's known to be fps-limited by the engine.

Proof: the 2070 through the 2080 Ti all perform the same at 1080p.

"1080p

Mainstream gamers will have no problems running Far Cry New Dawn at 1920x1080, with the super-cheap Radeon RX 570 capable of 67FPS even on the Ultra preset, the newer RX 590 hits 83FPS average, the GTX 1060 is capable of 72FPS and the new RTX 2060 pushes 106FPS average. Far Cry New Dawn maxes out at around 120FPS anyway, so you can throw all the GPU power in the world at it and it's not going to change much, even our RTX 2080 Ti reached 117FPS average."

www.tweaktown.com/articles/8908/far-cry-new-dawn-benchmarked-1080p-1440p-4k-tested/index.html#Benchmarks-1080p
Last edited by hawkeye; Sep 12, 2020 @ 12:27am
Monk Sep 12, 2020 @ 2:03am
Holy cow, so a text adventure is the 'machine learning' you go on about all the time... Jesus, no wonder you think 1080p is fine for your GPU; all you seem to play are visual novels and text games.

As Product says, far more eloquently and in more detail than I could muster, that isn't machine learning.

Anyway, if that's what floats your boat and you can afford it, why not just say sod it and buy a Quadro or two?
🌈Cloud Boy🌈 Sep 12, 2020
The RTX 3080 is not 2% faster than the 2080 Ti.
The 3080 is a 40% more powerful GPU.

It appeared only 2% faster in that particular game at 1080p because of the CPU bottleneck.
If the CPU is 100% used, the system won't be able to produce any more FPS, no matter how powerful the GPU is. It's not the GPU's fault. Even an RTX 3090 would give the same FPS.

Put in a more powerful CPU or play at a higher resolution, and you will notice the difference.
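A toy model of that bottleneck (the FPS numbers are hypothetical, just to show the shape of the argument): each frame needs both CPU and GPU work, so the delivered frame rate is capped by the slower of the two.

```python
# Delivered frame rate is limited by whichever of the CPU and the GPU
# is slower for the workload in question.
def delivered_fps(cpu_fps, gpu_fps):
    return min(cpu_fps, gpu_fps)

# At 1080p a CPU capped at ~130fps hides the gap between the cards...
print(delivered_fps(cpu_fps=130, gpu_fps=160))  # 2080 Ti-class → 130
print(delivered_fps(cpu_fps=130, gpu_fps=220))  # 3080-class   → 130

# ...at 4K the GPU becomes the limit, so the gap shows up.
print(delivered_fps(cpu_fps=130, gpu_fps=70))   # slower card  → 70
print(delivered_fps(cpu_fps=130, gpu_fps=95))   # faster card  → 95
```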
AdahnGorion Sep 12, 2020 @ 3:58am
Originally posted by sebo:
the number of CS players is only a very small percentage of the total gaming market, which is supposedly over 1 billion, so I doubt the try-hard players speak for the majority.

1.3 billion

But this includes, e.g., my toddler playing a silly flash game, me playing CK3, and my old old old mom playing on FB (just to put it in perspective); the number of actual people with a gaming PC is more like 150m or less. Even those with a gaming PC might just be playing Minecraft, Terraria, indie games, or some mixture, without caring too much about settings.

All that being said, you could run most indie games on a non-gaming card at 1080p 144Hz and be happy, I reckon. I use 1440p; we are a minority.

If we take the SHS (Steam Hardware Survey):
https://store.steampowered.com/hwsurvey/Steam-Hardware-Software-Survey-Welcome-to-Steam

Then 1440p is down to under 8% (counting the widescreen options as well), and 4K is under 3%.
If we look at the GPUs used (take cards on par with my 2080S, or close to it), you will notice the 2080S sits at 0.77%; even adding them all up, under 4% use a modern high-tier card. Going three generations back, we see a bit more in the mid/high-tier market, but not much. If we look at budget cards such as the x60 series, they own the overall market (the 1060 is still around 10% here).


My point is that most of these new high-tier cards are not targeting the average user; they are meant for the 1440p and 4K crowd, which is their main customer audience. Then of course you have people without a clue about hardware building who buy a 3080 for their 1080p setup to play Fortnite or CS on... well, that is silly and a waste of cash.

The interesting card to look at (for gains and such) is the 3070; mid-tier cards like that are where most people will buy, and even more so the future 3060 (the average entry-level tier), because that is where most users will end up, not at the 3090 and not even at the 3080.
AdahnGorion Sep 12, 2020 @ 4:00am
Originally posted by 🌈Cloud Boy🌈:
The RTX 3080 is not 2% faster than the 2080 Ti.
The 3080 is a 40% more powerful GPU.

It appeared only 2% faster in that particular game at 1080p because of the CPU bottleneck.
If the CPU is 100% used, the system won't be able to produce any more FPS, no matter how powerful the GPU is. It's not the GPU's fault. Even an RTX 3090 would give the same FPS.

Put in a more powerful CPU or play at a higher resolution, and you will notice the difference.

Only on rendering and ray tracing... it was the same from Pascal to Turing. I saw 40-100% gains on rendering benchmarks and a stunning 150% on gaming(!) in the official benchmarks... the hint there was that it was with ray tracing on ^^