Claus Nov 30, 2017 @ 5:44am
Steam 2, please; a better download and file system is much needed.
When a game downloads faster than Steam can allocate it, you get fed up with it even on an SSD, especially when I can head over to Origin/Uplay and download a game at 120 Mbps without ever allocating anything.

I've never seen 100 Mbps on Steam in 12 years.

It's an ancient system that needs to be replaced. Games would need to carry over to the new platform, of course, or it will never take off.

Also, Steam takes more space than it needs simply to update or unpack files. That's just not acceptable in 2017, which is why I suggest a Steam 2 platform to remedy it.

Last edited by Claus; Nov 30, 2017 @ 5:49am
Mu6o Nov 30, 2017 @ 6:07am 
I download at 12 MB/s (almost 100 Mbps), which is the limit of my network, and I have no problems with that. I download most games in under 20 minutes. Besides, what's the difference between 20 minutes and 17?
Spawn of Totoro Nov 30, 2017 @ 6:29am 
I download at 15 MB/s (120 Mb/s) on Steam. Origin gets about 10-12 MB/s, so I download faster on Steam.

My ISP claims 130 Mb/s, so the theoretical maximum would be 16.25 MB/s, but there is always extra data moving between the computer and the internet, as well as between the modem and router, which can account for the difference.

As for unpacking and updating, that is on the developer, as they make the patch. Path of Exile, for example, takes a long time to unpack, update, and then repack the game, due to how it was made and how the developers chose to implement patches.
The_Driver Nov 30, 2017 @ 7:23am 
Originally posted by Claus:
Also, Steam takes more space than it needs simply to update or unpack files. That's just not acceptable in 2017, which is why I suggest a Steam 2 platform to remedy it.
https://www.reddit.com/r/Steam/comments/650wim/19gb_required_for_200mb_patch_okay_steam/dg7w4ya/
We do this because we have no idea if your internet or wifi will cut out during the download, or if your PC will shut down, or if Steam will crash for some reason. If any of those things happen and the patch is only halfway done, and we didn't have the backup files, then the game would be in a totally broken state and wouldn't be playable until you got online again and Steam could do a full re-download of the broken files.

Which in itself is of course unacceptably backwards as well: no partial redownload of files is possible, no detection of synced and unsynced chunks, just a complete redownload. Even old synchronization tools like rsync support partial transfers: https://rsync.samba.org/tech_report/ (note the 1998 date on it).
Gwarsbane Nov 30, 2017 @ 7:30am 
I've seen Steam downloading to someone's system at over 1 gigabyte per second on YouTube. If Steam is downloading slowly, it's because your ISP is limiting it, there is an issue somewhere between Steam and your ISP, there is an issue between your ISP and you, or it's your system.

Try changing your download region in Steam's settings; that should change the route from Steam's servers to your ISP if they lie in two different directions, at least for the most part. At some point they might share the same routes anyway.


As to the size of the updates.... the updates are compressed. To be installed, they need to be uncompressed. There is no way around that.

Some games are also compressed; I know for a fact that Payday 2, for example, is. To install a patch for that game you have to have at least 30 gigs free. Why? Because not only do you have to uncompress the update to install it, you also have to uncompress the game files the update is being installed into, and then recompress them afterwards. All of that takes space.

Again with the Payday 2 example: the game requires 45 gigs of space to install, plus 30 gigs free for updates. If the game were not compressed it would not need that extra 30 gigs, but the install itself would balloon to roughly twice its current size, making the supposed saving pretty much null and void.
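Rough numbers, taking the figures above at face value: a 45 GB compressed install that unpacks to roughly twice its size would be about 90 GB stored uncompressed. The compressed install plus the 30 GB of temporary patching space peaks at about 75 GB, so even at its worst the compressed setup uses less disk than permanently storing the game uncompressed.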

And of course it takes a long time for a compressed game and a compressed update to install: the slower your computer, the slower the update installs. If you try using your system during an update of a compressed game, it will slow things down further. So tell it to install the update, walk away, and go watch a movie or a TV show or something. Or better yet, do it overnight while you sleep.
aiusepsi Nov 30, 2017 @ 8:17am 
Originally posted by Claus:
When a game downloads faster than Steam can allocate it, you get fed up with it even on an SSD, especially when I can head over to Origin/Uplay and download a game at 120 Mbps without ever allocating anything.
I'm not quite sure what you mean here. Exactly what behaviour are you seeing? If Steam's taking a really long time to allocate files as a separate step in the download, well, that's not a thing that's supposed to be happening! Just allocating a file should be very fast unless something funky is going on.

Originally posted by Claus:
It's an ancient system that needs to be replaced. Games would need to carry over to the new platform, of course, or it will never take off.
The current download system is at most about six years old. It's not particularly ancient!

Originally posted by The_Driver:
Which in itself is of course unacceptably backwards as well: no partial redownload of files is possible, no detection of synced and unsynced chunks, just a complete redownload. Even old synchronization tools like rsync support partial transfers: https://rsync.samba.org/tech_report/ (note the 1998 date on it).
Well, the dev you quoted there was describing a hypothetical!

What Steam does is a two-phase stage and commit: when doing a download, it builds a new updated version in a staging area, and this process is absolutely resumable. Once that staging process is done, it commits it over the top of the old version in a single step.

This has the disadvantage of needing temporary working space, but it avoids the hypothetical situation the developer is describing where the game's install is modified in place, and it'll end up trashed if the process is interrupted. It's possible that (again, hypothetical) situation could be recovered from, but you wouldn't be able to do it offline.
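Very roughly, the pattern looks something like this (a toy Python sketch of the general stage-and-commit idea, not Valve's actual code; build_new_version here is just a made-up stand-in for the resumable staging download):

import os
import shutil

def stage_and_commit_update(install_dir, build_new_version):
    """Build the updated game in a staging area, then swap it in.

    build_new_version(staging_dir) is assumed to be resumable: if it gets
    interrupted, running it again continues from where it left off, and the
    old install is never touched while it runs.
    """
    staging_dir = install_dir + ".staging"
    os.makedirs(staging_dir, exist_ok=True)

    # Phase 1: stage. Interruptions here only affect the staging copy.
    build_new_version(staging_dir)

    # Phase 2: commit. Move the old install aside, promote the staged copy,
    # then clean up. The game is only ever "old and complete" or "new and
    # complete", never half-patched.
    backup_dir = install_dir + ".old"
    os.rename(install_dir, backup_dir)
    os.rename(staging_dir, install_dir)
    shutil.rmtree(backup_dir)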

It should also be noted that Steam's download system (Steampipe) and rsync have very different goals; rsync has a very chatty conversation with the server on the other side to work out what to sync. This is a lot less scalable than what Steampipe does, which does the work to calculate the deltas once up front, and then serves chunks over HTTP from dumb servers.
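To make the contrast concrete, the client side of a Steampipe-like scheme could be sketched roughly like this (hypothetical names and URL layout, not the real protocol): the server just hosts a precomputed manifest of chunk hashes plus the chunk blobs, and each client works out locally which chunks it is missing.

import hashlib
import urllib.request

def chunks_to_fetch(local_hashes, remote_manifest):
    """remote_manifest maps chunk_hash -> chunk length, precomputed once
    server-side when the build is published. The comparison happens
    entirely on the client, so the servers stay dumb and cacheable."""
    return [h for h in remote_manifest if h not in local_hashes]

def fetch_chunk(base_url, chunk_hash):
    # Plain HTTP GET of a content-addressed blob, e.g. from a CDN.
    with urllib.request.urlopen(f"{base_url}/chunk/{chunk_hash}") as resp:
        data = resp.read()
    # Verify the blob against the hash it was addressed by.
    assert hashlib.sha256(data).hexdigest() == chunk_hash
    return data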
The_Driver Nov 30, 2017 @ 8:34am 
Originally posted by aiusepsi:
It should also be noted that Steam's download system (Steampipe) and rsync have very different goals; rsync has a very chatty conversation with the server on the other side to work out what to sync. This is a lot less scalable than what Steampipe does,
It's also 20 years older and only an example.

You'd probably use something like hashed chunks of fixed size instead, like in a torrent. The new version could then just update the changed chunks in-place. In case of an error (as detectable by the update lock still being in place), just rehash the file's chunks, find the unsynced ones (the up-to-date state of the hashes is precomputable, so you'd just compare against that) and redownload those selectively.
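Roughly like this, as a toy Python sketch (the 1 MiB chunk size and the function names are just for illustration):

import hashlib

CHUNK_SIZE = 1024 * 1024  # 1 MiB, arbitrary

def hash_chunks(path, chunk_size=CHUNK_SIZE):
    """Hash a file in fixed-size chunks, torrent-style."""
    hashes = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes

def chunks_to_redownload(path, expected_hashes):
    """Compare against the precomputed up-to-date hashes and return the
    indices of the chunks that are out of sync."""
    actual = hash_chunks(path)
    return [i for i, expected in enumerate(expected_hashes)
            if i >= len(actual) or actual[i] != expected]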

Yes, if an error occurs you may not be in a good state, but like the Valve employee said... optional cowboy mode. If one wanted to be fancy, the old chunks could be kept as a backup as well to undo the patch, eliminating the need to be online to repair (although playing un-updated games is at least a little fiddly in Steam, but possible AFAIK).

Plus with that system in place, the existing verify-integrity check could also selectively download only parts of a file, so a minor defect could mean downloading a megabyte instead of several gigabytes.
aiusepsi Nov 30, 2017 @ 9:21am 
Originally posted by The_Driver:
It's also 20 years older and only an example.
Yeah, I'm just pointing out that if you're evaluating solutions to a problem, you've got to be looking at the intended use cases. The algorithm used by Steampipe is pretty similar to what Rsync does, on a conceptual level. It's not like the Steam devs aren't aware of Rsync! It's just that it's solving a different problem, which is why the form of the solution is different.

Originally posted by The_Driver:
You'd probably use something like hashed chunks of fixed size instead, like in a torrent.
Steam actually does use hashed chunks of nearly-fixed size, as it happens.

Originally posted by The_Driver:
The new version could then just update the changed chunks in-place.
This is where it starts getting a little trickier. Say file version 1 is AAABBBBB and version 2 is AAACCBBBBB, i.e. the file has a new bit of data inserted in the middle. Assuming a chunk length of 4, the original file decomposes into the chunks AAAB, BBBB. You've then got a few ways to chunk version 2: as AAAC, CBBB, BB, at which point nothing matches and you're redownloading everything; or as AAACB, BBBB, in which case your chunks are no longer fixed size, and the one chunk you do redownload (AAACB) is mostly the AAA and B you already had; or as AAA, CCB, BBBB, where you can reuse part of your first chunk and the whole of your second. Anyway: not fixed size.
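The fixed-size case is easy to see in a tiny demo (plain Python, nothing Steam-specific): inserting two bytes near the front shifts every chunk boundary after it, so none of the old chunk hashes match any more.

import hashlib

def fixed_chunks(data, size=4):
    return [data[i:i + size] for i in range(0, len(data), size)]

def chunk_hashes(data, size=4):
    return [hashlib.sha256(c).hexdigest()[:8] for c in fixed_chunks(data, size)]

v1 = b"AAABBBBB"    # old file
v2 = b"AAACCBBBBB"  # two bytes inserted in the middle

print(fixed_chunks(v1))   # [b'AAAB', b'BBBB']
print(fixed_chunks(v2))   # [b'AAAC', b'CBBB', b'BB']  -- every boundary shifted
print(set(chunk_hashes(v1)) & set(chunk_hashes(v2)))  # set(): no chunk is reusable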

Another problem here is that you can't just insert data into the middle of an existing file (at least, not via any method I know of; you might be able to abuse a defragmentation API to deliberately fragment a file, but... not a good idea!). You'd have to rewrite the file from the point of insertion onwards, which is not ideal if you're looking for update-in-place.
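A quick illustration of why (just ordinary file I/O, nothing platform-specific): writing at an offset overwrites bytes, it doesn't insert them, so a true insert forces you to rewrite the tail of the file.

with open("demo.bin", "wb") as f:
    f.write(b"AAABBBBB")

with open("demo.bin", "r+b") as f:
    f.seek(3)
    f.write(b"CC")  # overwrites the first two Bs in place

with open("demo.bin", "rb") as f:
    print(f.read())  # b'AAACCBBB', not the inserted b'AAACCBBBBB'

# To really insert, you'd have to read everything from offset 3 onwards,
# write the new bytes, then write the old tail back after them.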

The point I'm getting at here is that in general, the Steam download system is relatively sanely designed, with sensible trade-offs. There are probably situations it could handle better, but it's very arguable that doing so would increase complexity, which can often lead to reduced robustness.

Originally posted by The_Driver:
Plus with that system in place, the existing verify-integrity check could also selectively download only parts of a file, so a minor defect could mean downloading a megabyte instead of several gigabytes.
I haven't recently deliberately corrupted a large file to check, but I think Steam can already do this; I think it just depends on the type of corruption. If you just mutate a byte, I think it'd be fine. If you're inserting or removing, you'd have issues.
The_Driver Nov 30, 2017 @ 10:08am 
Originally posted by aiusepsi:
The point I'm getting at here is that in general, the Steam download system is relatively sanely designed, with sensible trade-offs. There are probably situations it could handle better, but it's very arguable that doing so would increase complexity, which can often lead to reduced robustness.
If we assume the current complexity is the maximum tolerable, then it's hard to argue for any improvement... or any flaw ^^

Originally posted by aiusepsi:
I haven't recently deliberately corrupted a large file to check, but I think Steam can already do this; I think it just depends on the type of corruption. If you just mutate a byte, I think it'd be fine. If you're inserting or removing, you'd have issues.
I just tried it: I edited a single byte in Path of Exile's content file. The validation went over the whole file (the error was in the second byte... so if it were using the chunks to decide what to redownload, Steam could have reached that conclusion FAST) and then decided to download 7.8 GB (i.e. the file, but compressed). (Note: it deleted the file in the process; I had thought I might be able to just validate again...)
So no. It appears to be both checking and redownloading the whole file.
aiusepsi Nov 30, 2017 @ 5:17pm 
Originally posted by The_Driver:
If we assume the current complexity is the maximum tolerable, then it's hard to argue for any improvement... or any flaw ^^
I see what you’re getting at, but it really amounts to a judgement call: is the potential gain worth the detrimental effects on reliability or code maintainability? People will have different opinions on where the right balance is, which is fine.

My personal theory is that one of the reasons the current Steam content/download system favours robustness and simplicity is bitter experience with the old system being far too complex (Steam's local files were a cache for a distributed virtual file system!), which led to maintainability issues (there's a reason that Steam only got features like a choice of library folders on other drives and bandwidth limiting after they dropped the old system) and caused all sorts of offline-mode problems.

Originally posted by The_Driver:
So no. It appears to be both checking and redownloading the whole file.
Interesting. That doesn’t seem like ideal behaviour; that could definitely be better, but it’s not a rip-it-up-and-start-again thing.