Koboh is more of a jungle planet, with definitely more water than you would expect to find in a desert, and there is a river right next to the spaceport. No doubt Koboh is the largest and most diverse of the planets, with swamps, jungles, caves, crashed spaceships, ancient temples and other ruins, pirate camps, and a "cloud station" mildly reminiscent of Bespin.
Well, to be pedantic, Jedi: Survivor is CPU bound, meaning it is bottlenecked by the CPU.
This is most noticeable both inside and outside of Pyloon's Saloon at Rambler's Reach on Koboh: the GPU goes idle while performance caps out on four CPU threads.
Jedi: Survivor is a victim of Unreal Engine 4 development, an engine largely optimized for four cores.
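If you want to sanity-check the CPU-vs-GPU bottleneck on your own machine, here's a rough sketch. It assumes an NVIDIA GPU with nvidia-smi on the PATH, and the 80% threshold is an arbitrary rule of thumb, nothing official: if the GPU reports plenty of headroom while a few cores are pegged, you're CPU bound.

```python
# Rough CPU-vs-GPU bottleneck check while the game is running.
# Assumes an NVIDIA GPU with nvidia-smi on the PATH;
# "utilization.gpu" is a documented nvidia-smi query field.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=utilization.gpu", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

gpu_util = int(out.strip().splitlines()[0])
print(f"GPU utilization: {gpu_util}%")
if gpu_util < 80:  # arbitrary threshold for "GPU has headroom"
    print("GPU is not the bottleneck; check per-core CPU usage instead.")
```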
Some utilities report only average frame rates, which might read "80 FPS". More detailed utilities break the numbers down further: 1% lows, charts and graphs, and so on.
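To make the distinction concrete, here is a minimal sketch with made-up frame times (not real measurements from the game): one 100 ms hitch barely moves the average but craters the 1% low. Note that some tools define "1% low" slightly differently, e.g. as the 99th-percentile frame time.

```python
# Average FPS vs. 1% low, computed from per-frame times in milliseconds.
# Frame times below are synthetic, for illustration only.
frame_times_ms = [12.0] * 99 + [100.0]  # mostly ~83 FPS, one 100 ms hitch

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))

# "1% low" here: average FPS across the slowest 1% of frames.
slowest = sorted(frame_times_ms, reverse=True)[:max(1, len(frame_times_ms) // 100)]
low_1pct_fps = 1000.0 / (sum(slowest) / len(slowest))

print(f"average: {avg_fps:.0f} FPS, 1% low: {low_1pct_fps:.0f} FPS")
# -> average: 78 FPS, 1% low: 10 FPS
```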
It seems that "tearing" shows up when frames are presented out of sync with the display's refresh, and it becomes most obvious when the frame rate drops too low for the scene being rendered.
Barring some kind of widespread thermal throttling issue (the symptoms do look similar to thermal throttling), the next suspect is the timing of the animations or the simulation. Speculation is that Respawn, Electronic Arts, or Epic Games (Unreal) has somehow tied the simulation to processor clock speed, resulting in issues when the clocks cycle up and down. The focus is on Epic being incompetent, since they blame Intel.
https://www.pcgamer.com/there-are-increased-reports-of-crashing-in-unreal-engine-games-etc-and-epic-is-blaming-intel-chips/
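For what it's worth, the standard way engines avoid exactly this failure mode is a fixed simulation timestep driven by a monotonic clock, so the simulation advances by real elapsed time no matter what the CPU clocks are doing. A generic sketch of that pattern follows; it is illustrative only and in no way Respawn's or Epic's actual code.

```python
# Generic fixed-timestep game loop ("Fix Your Timestep" pattern).
# Illustrative only; not how Jedi: Survivor or Unreal actually runs.
import time

SIM_DT = 1.0 / 60.0  # simulate at a fixed 60 Hz regardless of frame rate

def simulate(dt: float) -> None:
    pass  # advance physics/animation by exactly dt seconds

def render() -> None:
    pass  # draw the current state

accumulator = 0.0
previous = time.monotonic()  # monotonic wall clock, unaffected by CPU clock speed

for _ in range(300):  # stand-in for "while the game is running"
    now = time.monotonic()
    accumulator += now - previous
    previous = now

    # Run as many fixed steps as real elapsed time allows; animations stay
    # consistent even when individual frames run fast or slow.
    while accumulator >= SIM_DT:
        simulate(SIM_DT)
        accumulator -= SIM_DT

    render()
```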
Intel is working on updates to their PresentMon utility to provide clearer insight into performance issues.
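In the meantime, current PresentMon already logs per-frame data to CSV that you can mine for stutter yourself. A sketch, assuming a capture from PresentMon 1.x where per-frame time lives in the "MsBetweenPresents" column (column names may differ in newer releases, and the file name here is hypothetical):

```python
# Count stutter "hitches" in a PresentMon capture.
import csv
import statistics

with open("presentmon_capture.csv", newline="") as f:  # hypothetical file name
    frame_times_ms = [float(row["MsBetweenPresents"]) for row in csv.DictReader(f)]

median = statistics.median(frame_times_ms)
hitches = [t for t in frame_times_ms if t > 2 * median]  # arbitrary threshold
print(f"{len(frame_times_ms)} frames, median {median:.1f} ms, "
      f"{len(hitches)} hitches over {2 * median:.1f} ms")
```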
If they optimized the second planet a bit more, I would even say the port is good, but of course I don't know how it runs at 2K or 4K.
There have also been fairly recent reports that an NVIDIA driver causes more crashing in Jedi: Survivor, with the recommendation being to roll back to a previous driver.
The higher the resolution, the more VRAM is required. I would expect the RTX 4070 12GB to be capable of native 1440p, which is generally what the PlayStation upscales to 4K from. So I would expect 4K/60 with DLSS Quality to be viable.
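The arithmetic behind that expectation, assuming the commonly published DLSS Quality scale of 1/1.5 per axis (exact ratios can vary per game):

```python
# DLSS Quality internal render resolution for a 4K output target.
# Assumes the commonly cited Quality-mode scale of 1/1.5 per axis.
output_w, output_h = 3840, 2160
scale = 1 / 1.5  # ~66.7% per axis

render_w, render_h = round(output_w * scale), round(output_h * scale)
print(f"4K DLSS Quality renders internally at {render_w}x{render_h}")  # 2560x1440
```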
TechPowerUp ran some benchmarks on Coruscant, which isn't quite Koboh or Jedha, but on Coruscant they did find that the GPU is ideally suited for 4K.
https://www.techpowerup.com/review/star-wars-jedi-survivor-benchmark-test-performance-analysis/4.html
Don't be too hard on your i7-7700. Jedi: Survivor began development on Unreal Engine 4, which seems to be a mostly 4-core-optimized engine. Newer CPUs might have more cache, but that doesn't relegate the i7-7700 to "toaster" status, performing on par with a Raspberry Pi Pico.
Funny, I got this at launch and just started playing again; not much has changed performance-wise...
The game is very boring; it's a typical action RPG: go there, save this; go there, take this; go there, steal this, all while struggling to be a souls-like. Just like those games, it bases its "fun" on granting satisfaction after a lot of frustration, and the frustration comes not only from enemies but also from the puzzle and level design. I can barely play for an hour before I'm already bored.
So if you can afford it, may Force Brute be with you!
It'll either run decently and look bad, or look good and run terribly; there is no happy middle ground to find. I wouldn't recommend buying it.
Also, side note: I've never seen a non-emulated game that requires you to compile shaders in a menu before you can play, every single time. It's super weird.
It also depends a LOT on how long your game sessions are; many players, even with 4090s, say that after playing for a while more and more memory gets used and performance starts dropping.
I have found there are ways to minimize stability and performance issues: limiting session time, quitting and restarting every couple of hours, not minimizing the game, and using NVIDIA's Studio drivers.
It's still going to have its problems here and there, but much less so, especially on high-spec hardware.
I could see that being said of a console "port", as developers can pre-compile the shaders and bake them into a console build to avoid irritating console gamers.
Modern console games are built on the x86 platform, and the Xbox Series X in particular shares "X" code between console and PC: the "DirectX-Box". Unless you are implying that ports to the Xbox series of consoles are also "emulated"? Or PlayStation 5 native releases? PlayStation 5 exclusives?
Switch games run on the NVIDIA Tegra, an ARM architecture. They must be emulated on PC because the Switch uses an ARM CPU and not x86; the x86 Intel/AMD CPU has to emulate how the ARM chip in the Nintendo Switch functions.
So which Virtual Console games on the Nintendo Switch require shader compilation before you can play? Ocarina of Time?
I don't even have THAT powerful of a PC.