It has to be noted that tweaking such code requires prior knowledge, from people who have actively worked on it or who have been working with it for years. I am not sure whether those people are still around at Bethesda, which creates risks a dev may not be willing to take. Comprehension of a codebase is crucial if you want to alter it. There are devs and devs. In my experience, those who take the time to understand the code are in the minority. The evil lies in the measurement of time: deadlines. Usually there is not enough time to truly understand why something was written a certain way, because the delivery of a feature, whether added (by retrofitting) or refactored (an improvement, such as supporting larger textures), is subject to a deadline set by project management.
Your mention of Cyberpunk 2077 is a fair comparison, because that game has also shown the limitations of the REDengine it runs on (limitations which started to become obvious in The Witcher 3, especially with texture streaming). According to the latest information I have, CD Projekt RED (the company behind Cyberpunk 2077) is moving to Unreal Engine 5 for its next projects. That engine is developed and updated by devs who know it very well, ensuring that the newest technology (such as lighting effects) is implemented properly. There are plenty of videos on YouTube (just consult their channel) showcasing their work. It is possible this step is necessary for CD Projekt RED to ensure that their next projects do not run into the kind of trouble (which has an impact on the budget) they may have had during the development of their previous games.
Other companies have their own engines as well, either retrofitted or refactored to ensure their continuity is guaranteed. However, that requires both money and the right people (developers with the knowledge), which is not always feasible.
Awesome response. I do remember the god rays in Fallout 4; the first thing I configured was how many cascades the god rays used, because they had too much of a performance impact at default settings.
And I did think of that, and I consider it an inevitable factor in development: knowledge and comprehension of engine functionality. Especially if the prior devs are no longer present, new devs may not alter the code to properly support new features where needed. Deadlines are indeed another factor that I think affects development behind the scenes, and one I am quite sure most AAA studios will never admit to or elaborate on when it potentially affects a game's stability or performance. Which is why I hate what Todd said in my OP.
Regarding REDengine, I am aware they're moving to Unreal Engine 5, and I am excited to see what the dev team is capable of for the next iteration of Cyberpunk.
Sadly, it does seem some unfortunate factors have to be considered when a game is a performance outlier. I can only hope they fix or improve performance down the road, though I fear that hope is in vain. Moreover, while it may affect their stock and shareholders, I think some of these AAA studios should be more transparent about internal issues (those that don't expose sensitive information they want to retain) so consumers can develop a better understanding of what is going on, instead of being left to guess why a game's performance is so underwhelming compared to fair comparisons.
I am a professional software engineer, and the loading behavior has all the signs of race conditions. The outpost data structures are not stable. The loader is not stable.
I've tested outposts quite a bit. On one outpost I have a floating crate about 30 meters in the air ... apparently from a nearby PG base, because I never created any crate.
The supply missions can bork an entire outpost.
Deleting certain items can crash an outpost.
Certain combinations of builds will cause an outpost to crash.
Adding crew to an outpost will make existing robots disappear.
Outposts can hang when entering build mode and overhead view.
******************
The HDD to RAM loader is not stable.
The RAM to GPU loader is not stable.
The engine itself is beautiful and the lighting model is first class.
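To make the race-condition point concrete, here is a minimal, purely illustrative sketch (my own toy example, not Bethesda's code) of a two-stage streaming hand-off: a disk thread fills an asset record, then publishes a "ready" flag that a consumer thread waits on. The `AssetRecord` type and field names are invented for the example. The sketch shows the correct release/acquire publication; if the flag were a plain `bool` with no synchronization, the consumer could observe `ready == true` before the payload writes are visible, which is exactly the kind of intermittent, hard-to-reproduce corruption (objects at the wrong coordinates, half-initialized entities) described above.

```cpp
// Toy two-stage streaming hand-off: disk thread -> consumer thread.
// Illustration of why an unsynchronized "ready" flag is a data race.
#include <atomic>
#include <chrono>
#include <iostream>
#include <string>
#include <thread>
#include <vector>

struct AssetRecord {
    std::string name;
    std::vector<float> position{0, 0, 0};  // e.g. where a crate gets placed
    std::atomic<bool> ready{false};        // a plain `bool` here would be a data race
};

void disk_loader(AssetRecord& rec) {
    // Simulate pulling the record off disk into RAM.
    std::this_thread::sleep_for(std::chrono::milliseconds(50));
    rec.name = "crate_01";
    rec.position = {10.0f, 0.0f, 2.0f};
    // Release: all writes above become visible before `ready` is seen as true.
    rec.ready.store(true, std::memory_order_release);
}

void consumer(const AssetRecord& rec) {
    // Acquire: pairs with the release store, so the payload is fully visible.
    while (!rec.ready.load(std::memory_order_acquire)) {
        std::this_thread::yield();
    }
    std::cout << rec.name << " at (" << rec.position[0] << ", "
              << rec.position[1] << ", " << rec.position[2] << ")\n";
}

int main() {
    AssetRecord rec;
    std::thread producer(disk_loader, std::ref(rec));
    std::thread reader(consumer, std::cref(rec));
    producer.join();
    reader.join();
}
```

A real streaming pipeline has many more stages (HDD to RAM, RAM to GPU, placement into the world), and every hand-off needs this kind of publication discipline; miss it at one stage and you get symptoms that only appear under certain timing, like the ones listed above.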
Which is really just a long way of saying that there will be trade-offs going on, and it would be useful to have clearer insight into those rather than just pointing at the tools being used, especially when the tailoring of custom engines doesn't lead to each having the same priorities.
https://youtu.be/jqymg_prARI?si=7d4aQgTYpng-JIRp
Why are there two shops on the Neon concourse that require a zone change? What's special about those two?
But it isn't modern and obviously can't handle what modern engines can, engines which also leverage all the hardware capabilities on consoles and PCs. Any fanboys who want to try to argue, I refer you back to the sheer number of load screens despite the game REQUIRING an SSD installation. That's pure copium on Bethesda's part, and not a great bit of copium either. It shows serious issues with their asset pipeline, or just sheer laziness and designing only around consoles.