Downloader feedback



Annihilatron
10-10-2013, 05:16 AM
Downloader is agonizingly slow. My hypothesis: the files are so tiny, and every download is an additional web request, so no consistent connection can be maintained. I'm seeing barely any network traffic (less than 50 kilobits/s, fluctuating to and from zero) while this sucker is running. Checking some of the downloaded files, they're all single-digit kilobytes. There are 5400 lines in filelist.txt; that's at least 100x too many files.

Each web request incurs a delay connecting from the point of origin to the server (connection time per request). Pack your files so one big, consistent connection can be maintained, and unpack after download. This should also cut down on the download time itself. Chunk them into appropriately sized pieces; there are plugin packs in almost every language for 7z, gz, etc.
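The pack-then-unpack idea can be sketched with Python's standard library. Everything here (directory layout, chunk size, function names) is made up for illustration, not taken from the actual downloader:

```python
import tarfile
from pathlib import Path

CHUNK_SIZE = 500  # files per archive; a tuning knob, not a magic number

def pack_chunks(src_dir: str, out_dir: str) -> list[str]:
    """Pack everything under src_dir into fixed-count .tar.gz chunks,
    so each archive is one HTTP request instead of hundreds."""
    files = sorted(p for p in Path(src_dir).rglob("*") if p.is_file())
    archives = []
    for i in range(0, len(files), CHUNK_SIZE):
        name = str(Path(out_dir) / f"chunk_{i // CHUNK_SIZE:04d}.tar.gz")
        with tarfile.open(name, "w:gz") as tar:
            for f in files[i:i + CHUNK_SIZE]:
                tar.add(f, arcname=str(f.relative_to(src_dir)))
        archives.append(name)
    return archives

def unpack(archive: str, dest: str) -> None:
    """Client side: extract one downloaded chunk into the install dir."""
    with tarfile.open(archive, "r:gz") as tar:
        tar.extractall(dest)
```

Gzip also gets a compression win across many small similar files, on top of saving the per-request round trips.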

I'd also recommend opening multiple web requests at the same time, if the server can handle it.

Ideally you could plug into a P2P solution (a while ago many games ran with Pando Media Booster, but that software's dead now), or configure the webserver for FTP (FTP will download faster as the connection is maintained)

Good feedback part: you keep track of what I've already downloaded (I think), so it can be started and stopped as I please. Maybe 10-15 files at a time and I'll get the completed client eventually.

- edit -

Lots of talk about authentication; personally I don't really support that, as the number of people who will attempt to download this alpha version is fixed and not growing. Everyone will eventually get these files. The download process itself, however, is what can be improved.

- end edit -

Cheers
Chris
CTO buzzbuzzhome.com

Erde
10-10-2013, 05:32 AM
I just had to register for this.


or configure the webserver for FTP (FTP will download faster as the connection is maintained)

Ahahahahahahahahahaha *inhales* ahahahahahaha.

Seriously. Downloading many small files over FTP, faster? Hardly. I dare you to actually try it.
For a few larger files? Might be faster. Not by much, though.

More info: http://www.isi.edu/lsam/publications/http-perf/

Edit: I won't even bother going into details about how ridiculous some other parts of your feedback are. My advice? Don't make suggestions unless you know what you're talking about.

kobisjeruk
10-10-2013, 05:59 AM
So you've registered just to mock someone who's trying to give his opinions regarding the installer issue? It's good that you point out and give a helpful link about what he said, but come on man, there's no reason to be condescending to your fellow members.

Erde
10-10-2013, 06:07 AM
So you've registered just to mock someone who's trying to give his opinions regarding the installer issue? It's good that you point out and give a helpful link about what he said, but come on man, there's no reason to be condescending to your fellow members.

I just KNEW someone would come forward saying something like this.

Annihilatron isn't really giving any feedback. He is telling/suggesting Cryptonic how they should handle serving the game client files.

This wouldn't be a problem if he actually knew what he was talking about. So yes, you could say I registered to diss a stupid opinion. Otherwise someone might actually think that Annihilatron is a legit source of networking information and experience. He sure made it sound like he is.

Kualtek
10-10-2013, 06:08 AM
It would likely be much easier on their servers to have a torrent link that contains an installer that includes all the files required for the game. Then the patcher simply needs to update the files that have changed since the last time the torrent was updated. Use everyone's bandwidth instead of incurring all of that cost on yourselves.

keroko
10-10-2013, 08:12 AM
GW2 does it well, in the style of the old-school WAD file from the Doom days.

As the OP says, download as compressed archive blocks where possible.

Calculate a hash of the local fileset as a whole, then hash individual items only when the whole set fails the current-version check. On the server side, determine which files are needed and deliver them in an archive.

With the WAD-file style you have a file system of your own inside one large file. Done this way, you can treat it either as the files inside or as the disk image directly - two perspectives for diff checking.

I need to see this running myself..
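The hash-then-diff scheme might look something like this sketch (the function names and the choice of SHA-256 are mine for illustration, not anything from GW2 or the actual patcher):

```python
import hashlib
from pathlib import Path

def manifest(root: str) -> dict[str, str]:
    """Map each relative path under root to its SHA-256 hex digest."""
    out = {}
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            out[str(p.relative_to(root))] = hashlib.sha256(p.read_bytes()).hexdigest()
    return out

def stale_files(local: dict[str, str], server: dict[str, str]) -> list[str]:
    """Files the client must re-download: missing locally or hash mismatch.
    The server would pack exactly these into one archive and ship it."""
    return [path for path, h in server.items() if local.get(path) != h]
```

The whole-fileset check is just a hash over the manifest itself: if that matches, skip the per-file comparison entirely.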

Gremio
10-10-2013, 08:28 AM
You still have to hash the individual files locally to determine if they match the current version.

I was surprised they used HTTP, but that is not the biggest problem, and any slowdown from it is negligible. We're talking about an extra couple hundred bytes per file to request the file and start receiving it, and maybe an extra 0.06 seconds of latency for establishing a new connection for each file instead of keeping a constant data stream. I don't think that 5400 number is correct, by the way (it might be right for filelist.txt, but not for what it actually downloads); when you recheck the files it's somewhere around 3370, so call it 3500. The site we're downloading from was actually averaging about 200 KB/s.

3500 files * 0.06 s = 210 seconds
3500 files * 200 bytes / 200,000 bytes/s = 3.5 seconds

That isn't necessarily exact math, or even additive; it's the worst-case additional time from the "large" file count.
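Plugging the post's figures in (3500 files, 0.06 s of setup latency per connection, roughly 200 bytes of request/response overhead at 200 KB/s), the two costs work out as:

```python
# Worst-case overhead from making one request per file,
# using the estimates quoted above.
FILES = 3500
LATENCY_S = 0.06        # extra connection-setup latency per file
OVERHEAD_BYTES = 200    # extra request/response bytes per file
SPEED_BPS = 200_000     # observed ~200 KB/s

latency_cost = FILES * LATENCY_S                    # ~210 seconds
transfer_cost = FILES * OVERHEAD_BYTES / SPEED_BPS  # ~3.5 seconds
```

Latency dominates by two orders of magnitude, which is why parallel connections (or packing) matter far more here than the extra header bytes.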

No. The problem is files failing mid-transfer and being saved incomplete. The same thing happens when you manually download files from their source: either the file fails immediately at 104-109 KB, or it starts and then fails (timeouts, maybe?) somewhere in the middle of the download, which causes the oddities people have been experiencing.