Ghost of Tsushima DIRECTOR'S CUT

Juub 17 Apr 2024 @ 7:24am
Wait, an RTX 4080 for 4K60?
This is a PC port of a PS5 port of a PS4 game. Why the hell do you need such a beastly rig to max it out at 4K? The PS5 runs it at checkerboarded 2160p/60 (1920x2160 internally). You shouldn't need anything more than an RX 6800/XT or 3070 Ti to reach 4K60. Maybe an RTX 3080/RX 6800 XT at most. An RTX 4080 or RX 7900 XTX is ridiculous.
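As a rough sketch of the pixel math behind that claim (illustrative figures only):

```python
# Pixel budget of native 4K vs the PS5's checkerboarded 4K (illustrative).
native_4k = 3840 * 2160        # ~8.3M pixels shaded per frame
checkerboard = 1920 * 2160     # checkerboarding renders ~half the samples
print(f"native 4K shades {native_4k / checkerboard:.1f}x the pixels")  # -> 2.0x
```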

For reference, it has the same requirements as former PS5 exclusive Ratchet & Clank: Rift Apart... with Ultimate Ray Tracing.

https://assets-prd.ignimgs.com/2023/07/18/ratchet-1689694693726.jpg
Last edited by Juub; 19 Apr 2024 @ 9:25am
Showing 91-105 of 132 comments
Juub 28 Apr 2024 @ 7:58am 
Originally posted by M.Spengler:
Originally posted by Juub:
You claimed the 4080 is only 74% faster than a 6700. Get lost lmao.

https://technical.city/de/video/Radeon-RX-6700-XT-vs-GeForce-RTX-4080
Not even a proper review. Might as well use GPUbenchmarks lol.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4080-vs-AMD-RX-6700-XT/4138vs4109

And it's still more accurate.

You were already linked ACTUAL reviews from well-known sources but decided to ignore them and now post sites that just compile data without even reviewing. Get lost.

Originally posted by M.Spengler:
That's one of the many problems he created for himself.
He wants 4K at max settings and refuses to accept the system requirements.
Nah, I accept the system requirements for Cyberpunk or Alan Wake 2. That a PS4 game requires top-tier hardware to run at 4K60 is a farce. I just hope they overshot the requirements, because this should be maxed out at 4K60 on a 3080, a GPU that's actually around 74% faster than the 6700.

Once again, stop with the ignorant takes.
Last edited by Juub; 28 Apr 2024 @ 8:09am
Juub 28 Apr 2024 @ 8:09am 
Originally posted by SHREDDER:
That's the problem: the very high requirements of 4K. According to the benchmarks, not even an RTX 4090 can run all games maxed at 4K 60 fps without FSR or DLSS.

But at 1440p you can run everything maxed with an RX 6700 XT or better. In games that have ray tracing I use FSR to run them with max graphics and max ray tracing at 60 fps, while those that don't have ray tracing don't need FSR yet, because I run those maxed at 1440p 60 fps without it.
Yes, he got a 4K monitor when he knew that none of the current PCs are powerful enough to run all games maxed at 4K. As you said, he created the problem himself. I also had this problem when I got this 1440p monitor in 2016: at that time no PC could run all games maxed at 1440p 60 fps.
Even if you had the best CPU, the best graphics card, RAM, and SSD, it was impossible to run all of them maxed at 1440p 60 fps. Until 2021 I was running them at 30 or 30-40 fps maxed at 1440p. The exceptions were simple games like 2D platformers and fighters, which have much lower requirements than most action games. Now even a mid-range graphics card can run all games maxed at 1440p 60 fps.
But 4K? The two best graphics cards, the RTX 4090 and RX 7900 XTX, can't run all of them maxed at 4K without FSR.
Game requirements aren't random. You answered your own question.

According to the benchmarks, not even an RTX 4090 can run all games maxed at 4K 60 fps without FSR or DLSS

Yes, not ALL games. Obviously it won't run Cyberpunk, Alan Wake 2, or even most modern AAA games at max settings at 4K60, but here's the catch: Ghost of Tsushima is not a modern game. Look at the OP. The requirements for max settings are the same as Rift Apart with Ultimate Ray Tracing. How the ♥♥♥♥ does a PS4 port of a 4-year-old game have the same requirements as a PS5 game with ray tracing? This doesn't add up. Either they made HUGE enhancements for the PC port that take the visuals to a whole new level, or those requirements are nonsensical. The PS5 port of GOT looks almost identical to the PS4 version, with a few minor enhancements. The biggest jumps are the frame rate and resolution.

Last edited by Juub; 28 Apr 2024 @ 8:15am
M.Spengler 28 Apr 2024 @ 9:02am 
Originally posted by Juub:
Originally posted by M.Spengler:

https://technical.city/de/video/Radeon-RX-6700-XT-vs-GeForce-RTX-4080
Not even a proper review. Might as well use GPUbenchmarks lol.

https://gpu.userbenchmark.com/Compare/Nvidia-RTX-4080-vs-AMD-RX-6700-XT/4138vs4109

And it's still more accurate.

You were already linked ACTUAL reviews from well-known sources but decided to ignore them and now post sites that just compile data without even reviewing. Get lost.

Originally posted by M.Spengler:
That's one of the many problems he created for himself.
He wants 4K at max settings and refuses to accept the system requirements.
Nah, I accept the system requirements for Cyberpunk or Alan Wake 2. That a PS4 game requires top-tier hardware to run at 4K60 is a farce. I just hope they overshot the requirements, because this should be maxed out at 4K60 on a 3080, a GPU that's actually around 74% faster than the 6700.

Once again, stop with the ignorant takes.
Are you nuts? :D

These are the benchmarks used:

Passmark is no GPU benchmark?
3DMark Vantage is no GPU benchmark?
3DMark 11 is no GPU benchmark?
3DMark Fire Strike is no GPU benchmark?
3DMark Cloud Gate is no GPU benchmark?
3DMark Ice Storm is no GPU benchmark?

Tell me, smart kid: what is a GPU benchmark in your opinion, and what type of benchmark should I use, synthetic or real?

OK, enough of this moving-the-goalposts nonsense. Short and to the point:

You first claimed the RTX 4080 would be 257% faster, later changed it to 2.5x faster, then to 2.3x faster.
Most of your replies are inconsistent.
Some people think 200% faster simply means twice as fast, when twice as fast is mathematically 100% faster: a 100% increase over the initial value doubles it.
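A quick sketch of that conversion, with made-up scores purely to illustrate the arithmetic (not measured data):

```python
# Converting between a speed multiplier and "X% faster" (illustrative only).
def percent_faster(fast_score: float, slow_score: float) -> float:
    """100 * (fast/slow - 1): twice the score is 100% faster, not 200%."""
    return 100.0 * (fast_score / slow_score - 1.0)

print(percent_faster(250, 100))  # 2.5x the score  -> 150.0 (% faster)
print(percent_faster(174, 100))  # 1.74x the score ->  74.0 (% faster)
```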

My statement of 74% faster is based on aggregated performance scores measured by (other) technicians under controlled conditions. I've already written this, too.
These results are information I've quoted once and don't care about as much as you do. I am just the messenger.

Another user asked you to do a quick calculation based on the UserBenchmark results you linked here yourself, which you did not do.
Instead you quoted him while leaving out and ignoring the relevant question that would have proved his point (and mine too), moving the goalposts further.

I really don't care about better or worse user benchmark results measured on a variety of different setups (CPU / RAM / mainboard / chipset / manufacturer [ASUS/MSI/GIGABYTE etc., stock or OC]). Your self-created artificial "4K problem" doesn't affect me at all.

If you can't just accept the fact that this PC port of the Decima engine needs an RTX 4080 for 4K at max settings with a guaranteed (minimum) 60 FPS, then whatever floats your boat.
Reasonable explanations for this were given in this thread by multiple users already.

Possible options for you are:

(1) Ignore the facts and stomp your feet loudly on the ground
(2) Step down the quality settings
(3) Lower the rendering and/or screen resolution
(4) Buy an RTX 4090
(5) Skip this game

You are stuck in a loop, always pointing at option (1) and feeling very smart by doing so.

On the other hand, I am just looking forward to playing Ghost of Tsushima at 1440p at max settings, because I prefer a minimum framerate of 120 FPS over 4K and don't feel the need to create my own artificial problems just to rage about them in a forum like you do. A very, very simple and easy solution.
It's deplorable that YOU call other people stupid.
Juub 28 Apr 2024 @ 9:22am 
Originally posted by M.Spengler:
These are the benchmarks used:
Passmark is no GPU benchmark?
3DMark Vantage is no GPU benchmark?
3DMark 11 is no GPU benchmark?
3DMark Fire Strike is no GPU benchmark?
3DMark Cloud Gate is no GPU benchmark?
3DMark Ice Storm is no GPU benchmark?

Tell me, smart kid: what is a GPU benchmark in your opinion, and what type of benchmark should I use, synthetic or real?
Huh, real ones? Those are all synthetic benchmarks.

OK, enough of this moving-the-goalposts nonsense. Short and to the point:

You first claimed the RTX 4080 would be 257% faster, later changed it to 2.5x faster, then to 2.3x faster.
Most of your replies are inconsistent.
Some people think 200% faster simply means twice as fast, when twice as fast is mathematically 100% faster: a 100% increase over the initial value doubles it.
Ballpark. It varies depending on the set of games, duh. Unless every single game gets compiled into a review, one data set might show 2.3x, another 2.5x, another 2.6x, and so on.

Also lol, 3DMark 11 and Fire Strike? Benchmarks that are over a decade old? I was using those on my GTX 670. Cloud Gate, which is meant to test freakin' notebooks? Are you trolling? Use Time Spy or Time Spy Extreme, more recent DX12 benchmarks (which GOT will use, as it doesn't even support DX11).

Here is a Time Spy benchmark where the 3080 beats the 6700 XT by 52%.

https://www.guru3d.com/data/publish/220/1ec799879f8efedcff534b240a1c3dc686af4a/untitled-52.png

That's the XT, which is around 10-15% faster than the non-XT. The 3080 alone is around your quoted 74% faster than the 6700. The 4080 is over twice as fast lol.
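A back-of-the-envelope check of that chain of ratios, using the figures quoted above (rough arithmetic, not a measurement):

```python
# Chaining relative speedups (figures quoted in this thread; illustrative).
r3080_vs_xt = 1.52   # 3080 ~52% faster than the 6700 XT (Time Spy link above)
xt_vs_6700 = 1.12    # 6700 XT ~10-15% faster than the 6700; midpoint assumed
r3080_vs_6700 = r3080_vs_xt * xt_vs_6700
print(f"3080 vs 6700: {r3080_vs_6700:.2f}x, "
      f"{(r3080_vs_6700 - 1) * 100:.0f}% faster")  # -> ~1.70x, ~70% faster
```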

My statement of 74% faster is based on aggregated performance scores measured by (other) technicians under controlled conditions. I've already written this, too.
These results are information I've quoted once and don't care about as much as you do. I am just the messenger.
Your 74% faster is wrong on all counts.

If you can't just accept the fact that this PC port of the Decima engine needs an RTX 4080 for 4K at max settings with a guaranteed (minimum) 60 FPS, then whatever floats your boat.
Reasonable explanations for this were given in this thread by multiple users already.

Except it's not minimum fps. Requirements are never for a minimum fps but for an average. The minimum fps for Rift Apart on a 4090 is like 35 fps, yet the requirement for 4K60 is a 4080, which dips way below 60 fps at 4K. System requirements have never been quoted as a minimum but as an average.

If a 4080 averages 100 fps but gets 0.1% lows of 60 fps at 4K and max settings? That's fine. If a 4080 averages 60 fps in that game at 4K and max settings? That isn't fine.
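For clarity, here is a minimal sketch of how reviewers commonly derive "average fps" and "0.1% lows" from a capture of per-frame times (the data below is synthetic, purely illustrative):

```python
import numpy as np

# Synthetic per-frame times centered around ~10 ms (~100 fps average).
rng = np.random.default_rng(0)
frame_times_ms = rng.gamma(shape=40.0, scale=0.25, size=100_000)

avg_fps = 1000.0 / frame_times_ms.mean()
# One common definition of the "0.1% low": the fps implied by the
# 99.9th-percentile frame time, i.e. the slowest 0.1% of frames.
low_0p1_fps = 1000.0 / np.percentile(frame_times_ms, 99.9)

print(f"average: {avg_fps:.0f} fps, 0.1% low: {low_0p1_fps:.0f} fps")
```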
Last edited by Juub; 28 Apr 2024 @ 9:28am
M.Spengler 28 Apr 2024 @ 9:28am 
Oh, now these benchmarks are too old for you, but you still won't understand why a PC port of a PS5 game (also old) needs an RTX 4080 for 4K at the highest settings?

Whatever floats your boat, kid.
I'll leave it to others to tell you that you have no clue what you are talking about.
Don't skip school!
Last edited by M.Spengler; 28 Apr 2024 @ 9:29am
Juub 28 Apr 2024 @ 9:31am 
Originally posted by M.Spengler:
Oh, now these benchmarks are too old for you, but you still won't understand why a PC port of a PS5 game (also old) needs an RTX 4080 for 4K at the highest settings?

Whatever floats your boat, kid.
I'll leave it to others to tell you that you have no clue what you are talking about.
Don't skip school!
Lol, why are you using 3DMark 11 for a freakin' DX12 game when there is Time Spy, which uses DX12? You know, exactly like GOT, which doesn't support DX11.

Now leave, troll.
Last edited by Juub; 28 Apr 2024 @ 9:31am
M.Spengler 28 Apr 2024 @ 9:33am 
Originally posted by Juub:
Originally posted by M.Spengler:
Oh, now these benchmarks are too old for you, but you still won't understand why a PC port of a PS5 game (also old) needs an RTX 4080 for 4K at the highest settings?

Whatever floats your boat, kid.
I'll leave it to others to tell you that you have no clue what you are talking about.
Don't skip school!
Lol, why are you using 3DMark 11 for a freakin' DX12 game when there is Time Spy, which uses DX12? You know, exactly like GOT, which doesn't support DX11.

Now leave, troll.

You did not even understood that I didn't used these benchmarks.
Told you already - whatever floats your boat. If you want to call me stupid or command me to leave, then do it in person or remain silent, kid. ;)
Juub 28 Apr 2024 @ 9:38am 
Originally posted by M.Spengler:
Originally posted by Juub:
Lol, why are you using 3DMark 11 for a freakin' DX12 game when there is Time Spy, which uses DX12? You know, exactly like GOT, which doesn't support DX11.

Now leave, troll.

You did not even understood that I didn't used these benchmarks.
Told you already - whatever floats your boat. If you want to call me stupid or command me to leave, then do it in person or remain silent, kid. ;)

These are the benchmarks used:
3DMark 11 is no GPU benchmark?

>These are the benchmarks used
>Mentions 3DMark 11
>I didn't use those benchmarks

Maybe don't post drunk? Anyway, away with you.
M.Spengler 28 Apr 2024 @ 9:43am 
Originally posted by Juub:
Originally posted by M.Spengler:

You did not even understood that I didn't used these benchmarks.
Told you already - whatever floats your boat. If you want to call me stupid or command me to leave, then do it in person or remain silent, kid. ;)

These are the benchmarks used:
3DMark 11 is no GPU benchmark?

>These are the benchmarks used
>Mentions 3DMark 11
>I didn't use those benchmarks

Maybe don't post drunk? Anyway, away with you.

Yes, these are the benchmarks used by technicians under controlled conditions, but NOT by ME. I've quoted and linked them. See - you don't even understood this.
Already told you: if you want to give me any commands, then do it in person or remain silent.
Juub 28 Apr 2024 @ 9:49am 
Originally posted by M.Spengler:
Yes, these are the benchmarks used by technicians under controlled conditions, but NOT by ME. I've quoted and linked them. See - you don't even understood this.
Already told you: if you want to give me any commands, then do it in person or remain silent.
"You don't even understood", maybe learn your verb tenses before saying anyone doesn't understand something?

Furthermore, this changes nothing. Using a DX11 synthetic benchmark that's almost 14 years old to test a 2022 GPU running a game that only supports DX12 is moronic, as is using Cloud Gate, a notebook and home-computer benchmark. Ice Storm is for smartphones and tablets. How is this not trolling?

Time Spy and Time Spy Extreme are DX12 GAMING benchmarks, and they're conspicuously missing from your list. Strange, huh?
M.Spengler 28 Apr 2024 @ 9:59am 
Originally posted by Juub:
Originally posted by M.Spengler:
Yes, these are the benchmarks used by technicians under controlled conditions, but NOT by ME. I've quoted and linked them. See - you don't even understood this.
Already told you: if you want to give me any commands, then do it in person or remain silent.
"You don't even understood", maybe learn your verb tenses before saying anyone doesn't understand something?

Furthermore, this changes nothing. Using a DX11 synthetic benchmark that's almost 14 years old to test a 2022 GPU running a game that only supports DX12 is moronic, as is using Cloud Gate, a notebook and home-computer benchmark. Ice Storm is for smartphones and tablets. How is this not trolling?

Time Spy and Time Spy Extreme are DX12 GAMING benchmarks, and they're conspicuously missing from your list. Strange, huh?
Ah, from moving the goalposts to tone policing. Very well :D
OK, I'll bite anyway.
Passmark is one of the real GPU benchmarks used:
https://www.passmark.com/products/performancetest/pt_adv3d.php
Synthetic? No DX11?
You mentioned "accurate". What is more accurate:
measuring only common DX12 functions, or across the board, including synthetic and real benchmarks?

You proved once more that you have no clue what you are talking about.

PS: Since you criticized my foreign-language skills, I suggest that, on request, I continue this pointless discussion in my native language.
Juub 28 Apr 2024 @ 10:02am 
Originally posted by M.Spengler:
You mentioned "accurate". What is more accurate:
Actual game benchmarks, not synthetic ones, and especially not DX11 benchmarks that are almost 14 years old.
M.Spengler 28 Apr 2024 @ 10:11am 
Originally posted by Juub:
Originally posted by M.Spengler:
You mentioned "accurate". What is more accurate:
Actual game benchmarks, not synthetic ones, and especially not DX11 benchmarks that are almost 14 years old.
Wow. Passmark is actual benchmark software capable of measuring GPU performance, synthetic and real, on APIs like DX11 AND DX12.
I've already linked it, and it was also used:
https://www.passmark.com/products/performancetest/pt_adv3d.php

Go on...
Juub 28 Apr 2024 @ 10:13am 
Originally posted by M.Spengler:
Originally posted by Juub:
Actual game benchmarks, not synthetic ones, and especially not DX11 benchmarks that are almost 14 years old.
Wow. Passmark is actual benchmark software capable of measuring GPU performance, synthetic and real, on APIs like DX11 AND DX12.
I've already linked it, and it was also used:
https://www.passmark.com/products/performancetest/pt_adv3d.php

Go on...
Don't care. Post games. Not freakin' Passmark. We don't play synthetic benchmarks, do we? We play video games.
M.Spengler 28 Apr 2024 @ 10:24am 
Originally posted by Juub:
Originally posted by M.Spengler:
Wow. Passmark is actual benchmark software capable of measuring GPU performance, synthetic and real, on APIs like DX11 AND DX12.
I've already linked it, and it was also used:
https://www.passmark.com/products/performancetest/pt_adv3d.php

Go on...
Don't care. Post games. Not freakin' Passmark. We don't play synthetic benchmarks, do we? We play video games.
Another moving of the goalposts from you?
Nope, not this time. You have a problem with using DX11 benchmarks instead of DX12 benchmarks. Both GPUs were tested on the same API. Speaking of real and synthetic benchmarks: double the FPS does not mean the GPU is twice as fast in general; it depends on many things, like which API you use and your skills at coding.

Speaking of just games: between DirectX 11 and DirectX 12, the most important difference is that DirectX 11 is a high-level API, while DirectX 12 is a low-level API. There are various layers between your game and your hardware. Low-level APIs are closer to the hardware, while high-level APIs are further away and more generalized.

It’s an important distinction between DirectX 11 and DirectX 12. In short, DirectX 12 allows game developers to target optimizations closer to the hardware, reducing the overhead incurred from the API and graphics driver. In turn, it’s also more difficult for developers to work with.

In order to quantify the performance of a GPU, three tests are used: how quickly can data be sent to the GPU or read back from it? How fast can the GPU kernel read and write data? How fast can the GPU perform computations?
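Roughly, those three measurements look like the sketch below (this assumes a CUDA-enabled PyTorch build and is purely illustrative; Passmark's actual tests are native code, not this):

```python
import time
import torch  # assumes a CUDA-enabled build; illustrative only

def timed(fn, iters=10):
    fn()  # warm-up
    torch.cuda.synchronize()
    t0 = time.perf_counter()
    for _ in range(iters):
        fn()
    torch.cuda.synchronize()
    return (time.perf_counter() - t0) / iters

x_cpu = torch.randn(64_000_000)               # ~256 MB of float32
# (1) How quickly can data be sent to the GPU?
t = timed(lambda: x_cpu.to("cuda"))
print(f"host->device: {x_cpu.numel() * 4 / t / 1e9:.1f} GB/s")

x_gpu = x_cpu.to("cuda")
# (2) How fast can the GPU read and write its own memory?
#     (a plain copy reads and writes every byte once)
t = timed(lambda: x_gpu.clone())
print(f"device copy: {2 * x_gpu.numel() * 4 / t / 1e9:.1f} GB/s")

a = torch.randn(4096, 4096, device="cuda")
# (3) How fast can the GPU compute? (large matrix multiply)
t = timed(lambda: a @ a)
print(f"matmul: {2 * 4096**3 / t / 1e12:.2f} TFLOP/s")
```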

But well, proceed?
Showing 91-105 of 132 comments

Date Posted: 17 Apr 2024 @ 7:24am
Posts: 132