2nd: GTA V also doesn't load the entire map and everything at once, so it has time to compress and decompress things. Ark does not compress anything because we don't want loading screens when we go into a cave...
If you'd lurked and partaken in the development of the game, you would know the answers to things like daily patches, the large game size and game instability.
tl;dr: It's Early Access.
That's one of the big points of EA: even very small devs (like Wildcard) can make big games without needing a large number of testers and hardware to test on.
This is also a common conflict among people out there:
"I want this fixed ASAP! I want this patched ASAP!"
versus
"I don't want to download a patch every day."
Regarding this, just take a short read about how EA works.
For the second thing:
If you want to compare games at all, at least choose games with the same game engine, and even then it is very difficult to do.
GTA V, for example, was developed for consoles and not the PC. If you bring GTA V up to PC standards it goes way over 100GB and even kills a 1080 Ti at 1080p (yes, even worse than ARK!!).
Another game that uses the UE4 engine is Hellblade. It is just a small game, but it also takes ~40GB.
And if you go out and look at other modern games you will see that they all range from 40-120GB.
That's a common size these days.
But you know what?
Games get bigger all the time. Wait two or three years and you will be looking at games over 200GB in size.
PS
Storage is one of the cheapest parts of PC hardware...
1TB costs not even 40 bucks...
We are looking at different types of testing, then. There is gameplay testing, which is testing a new mechanic or any additions to the gameplay, and then there is QA testing, which is playing the game or a scenario over and over again to see if there is a bug causing a problem with the game. We're not getting paid to do the latter; it is a legitimate profession, and using people who paid for a game as testers under the name of Early Access is far from professional. You don't upload every single change you make to the game's existing mechanics as part of "yeah, but we're Early Access"; you handle that stuff on the company side, either with professional testers or by sparing a portion of developers for testing if you're an indie company without a lot of funding. This game is literally the only Early Access title I know of that does this sort of amateur nonsense.
Subnautica is an Early Access game about an open world as well, but there have been 5 major updates and 8 minor updates, the game is about to be released in a few months, and it is nearly done. That's because they design iterations in their development cycle, deciding what is in their backlog and what to tackle next. After they're done with their iterations (mind you, this includes TESTING THEIR OWN CHANGES ON THEIR SIDE, as that's what developers are for) they present it as a patch. If there is something that doesn't work, we report back what works and what doesn't (which is our job as Early Access players: testing the new game mechanics). Finally, they add those changes to the backlog, change it on their side, test it and then add it to the next patch. This is how a development cycle is handled around the world by all game dev companies, which is why I'm questioning this company's choices on how to handle Early Access. Dead Cells is a smaller team that follows the same logic as well. Even No Man's Sky, which is regarded as a failure, just released a HUGE patch containing the complete data.
GTA V doesn't have loading screens either, so that is a completely unrelated example. It might divide the map into chunks and handle loading in the background while you're moving around, but that is not a technical obstacle; that's just being smart with finite resources. GTA V is just one example; there are plenty of games with great graphics that are below that size. GTA V is a finished game, by the way, while this is still an Early Access title, so by that logic I'm expecting the final size to be much bigger than 90 GB of data.
Finally, I don't mean in-game compression; that doesn't change the data size at all. I mean using compressed assets that take a long time to compress but result in nearly lossless new assets. I don't know what's wrong, but I can definitely tell that something is wrong. I mean, the game looks great, sure, but this size is ridiculous. There HAS to be something that can be done about it.
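For what it's worth, the trade-off being described (spend CPU time once when packaging assets so they are smaller on disk, and pay only a small decompression cost at load time) can be sketched in a few lines. This is a minimal illustration using Python's zlib on a fake asset blob; the sizes, the blob contents and the choice of compressor are assumptions for the example, not how UE4 or ARK actually packages data:

```python
import time
import zlib

# Illustrative only: a fake, repetitive "texture" blob standing in for a game asset.
raw_asset = (b"\x00\x01\x02\x03" * 1024) * 1024  # ~4 MiB

# Packaging step (done once, offline): highest compression level, slow but thorough.
t0 = time.perf_counter()
packed = zlib.compress(raw_asset, level=9)
pack_time = time.perf_counter() - t0

# Load step (done on the player's machine): decompression is comparatively cheap.
t0 = time.perf_counter()
unpacked = zlib.decompress(packed)
load_time = time.perf_counter() - t0

assert unpacked == raw_asset  # lossless round trip, nothing is thrown away
print(f"raw: {len(raw_asset) / 2**20:.1f} MiB, packed: {len(packed) / 2**20:.2f} MiB")
print(f"compress (offline): {pack_time * 1000:.1f} ms, decompress (at load): {load_time * 1000:.1f} ms")
```

The only point of the sketch is that "compressed assets" in this sense are a one-time packaging cost rather than a per-frame cost, which is why it is a separate question from in-game streaming and loading screens.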
I know I'm giving the same example, but no one ever had problems like "I want this fixed ASAP" with Subnautica. It is understandable to have frequent patches at the beginning of a project, but at this point I'd expect to have something playable, which we do. So what's the point of pushing more updates each day without a title or an explanation? What I'm trying to say is that just because something non-critical is changed doesn't mean they should immediately push it to Steam.
I'm not against games with 60 or 100 GB of data. However, this game isn't even done yet, and with every update (since Steam doesn't just download a diff patch, it re-downloads whatever is changed), there is something like 4 GB of raw data for 500 MB of changes. I don't have any limits on my internet plan, but there are a lot of people who do, and because of that they can't even play the game they paid for to support the developers.
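The patch-size complaint is really about how chunk-based patching behaves when a small change lands inside a large packed file: once the archive is rebuilt, the data after the change shifts, and every chunk from that point on no longer matches what the client already has. A rough sketch of that amplification effect follows; the 1 MiB chunk size, the fake archive and the SHA-1 hashing are assumptions for illustration, not Steam's actual depot format:

```python
import hashlib
import os

CHUNK_SIZE = 1 << 20  # 1 MiB chunks; an assumed value for illustration


def chunk_hashes(data: bytes) -> list[str]:
    """Split a packed file into fixed-size chunks and hash each one."""
    return [
        hashlib.sha1(data[i:i + CHUNK_SIZE]).hexdigest()
        for i in range(0, len(data), CHUNK_SIZE)
    ]


# A fake 256 MiB "pak" archive standing in for one of the game's packed files.
old_pak = os.urandom(256 * CHUNK_SIZE)

# A "small" patch: insert 1 MiB of new asset data a quarter of the way in.
# Because the archive is rebuilt, everything after the insertion point shifts.
insert_at = 64 * CHUNK_SIZE
new_pak = old_pak[:insert_at] + os.urandom(CHUNK_SIZE) + old_pak[insert_at:]

old_h, new_h = chunk_hashes(old_pak), chunk_hashes(new_pak)
changed = sum(1 for a, b in zip(old_h, new_h) if a != b) + (len(new_h) - len(old_h))
print(f"logical change: 1 MiB; chunks that no longer match: {changed} (~{changed} MiB to re-download)")
```

Under those assumptions, a 1 MiB addition dirties every chunk after the insertion point, which is the same shape of effect as 500 MB of changes turning into a multi-gigabyte download.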
I don't agree that GTA V isn't up to PC standards. Even today I'm happy with its graphics and its texture quality. It is not as good looking as Ark, but Ark's beauty comes from the new material system UE4 is using, as well as the new lighting shaders. Anyway, the size is an understandable problem; that's why I was asking them to just have a look and see if there is something wrong with the packaging configuration or their assets. Patches, though, are a potentially big problem for a lot of users, and I don't think the amount of data we download each day is justified by the changes we get.
The downside of this is a big increase in needed CPU power and a big increase in needed RAM size and RAM speed (with putting ARK on a SATA SSD as a minimum). This would also mean that most things currently handled by the server must be pushed back to the client, increasing the CPU and RAM load further.
This would also make it easier to cheat.
All those things go against most people's available hardware.
Also, the game is not finished, so it could still get bigger or smaller at the end, since such things are usually done at the end of development.
But also, like I said before:
It is almost impossible to compare games, even ones with the same game engine.
Your wants do not match the reality of the situation. It's a game in development.
1. They're working on many things for both PC and console for the 29th release date and well beyond.
2. Per point 1, this is nothing new, and those of us who have been with the project for 2 years or so know this.
3. Comparing a AAA game in a completely different situation to Ark is completely inane and pointless.
Look, we get it: this is just a long-winded way of saying you don't like the updates, the manner in which the game is being updated, or the way the game is unoptimized as it progresses toward optimization.
Best advice for you, then:
Either lower your expectations to match the reality of the game's situation, or shelve it and come back when it matches your expectations. They're not stopping updates (expect 4 semi-major ones in the next 7 days alone, if you take what Jeremy has said on Twitter: https://twitter.com/arkjeremy/status/896122813353938945 ), so decide now whether to weather it or not.
Now, I have zero issues with ARK as I have a high-end system too: a 1080 Ti from Palit, Samsung 960 Pro, Evo and 850 Pro SSDs, 16GB of Trident Z RGB RAM and an i7-7700K.
500MB every day or two isn't that much; if you are complaining about that, you may want to invest in some decent storage. Even the 108GB my ARK install is taking up can be solved by buying cheap storage; this is the reality of high-end PC gaming.
As Acandri said, the game is not finished, so that 108GB may become 70-80GB, maybe less, in the future; it's just a matter of time.
In conclusion, I think we should all just sit back and enjoy it until the next big update or launch. No one is doing anyone else a favour by ranting; not saying the OP is, just saying generally.
I have the core game, a mod map and 3 mods, and it's only 59GB.
If I add the 3 DLC maps and the total conversion DLC to the mods I have, it jumps to 109GB.
The core game with the original Island map is about 40GB, as per the store page.
So with one map it's much smaller than GTA 5; with the core game, 4 additional maps and a total conversion, it's bigger than GTA 5.
So no surprise it's bigger: way more map content.
Some people just moan for the sake of it...
EA is a marketing tool, of course, but that doesn't mean it isn't about testing either. Ark in EA has gone from pre-alpha to what is now, for all intents and purposes, beta over 2 years of progression.
2GB of VRAM plus whatever other aspects of their rig can contribute to the subzone transition freeze/lag, but not exclusively. Some of us have i7-7700Ks with 1080 Tis and 64GB of RAM, game on an SSD, and can still encounter the blips on cave entry. It's the game's optimization issues, which are being worked on as we speak, more than the rig itself in these regards.
Just another example of size:
Fallout 4 is known for its bad textures, so Bethesda released a texture pack with only 4K textures for the game.
Only 50GB big...