The mod update story

So have you thought about it? I’m not asking for save game compatibility, but a few measures to minimize downloads would be nice.

For instance, I notice you use zips. However, I dimly recall reading that tar.gz is better for minimizing changes between versions, since it doesn’t store dates, or because the compression is per block, or something like that.
It would also be nice if you used something like zsync for downloads even if you don’t settle on a compressed container format that minimizes those changes (but obviously better if you do).

edit: apparently zsync has built-in support specifically for handling gzip downloads. These are the requirements:
zsync.moria.org.uk/server

Implementing updates as uninstall + install is much easier.

  • What to do if the player has missed an update? E.g. the player has 1.2.3 and the available version is 1.2.5. Keeping diffs for each version may increase download size considerably.
  • What about other platforms? The zsync page mentions neither Windows nor Mac.
  • Uploading updates will be more complex - right now all I need to do is copy the archive to the server and update repository.json. I don’t need any extra tools for this.
  • What if the player has manually modified something in the mod?

The idea is not bad, but it is much more complex to implement than it looks.

zsync doesn’t keep diffs for 2+ versions. It keeps a file of block hashes for the current file on the server (the new one), and the client uses those to match the parts it already has against the new version: no additional work for you (except that the mod container has to be a gzip created by zsync, and the metadata has to be generated, of course). It’s not strictly sequential-incremental.
Unless you plan to keep older versions around and allow the client to download them, of course.
It’s better than rsync -z in that it doesn’t require running any extra program on the Apache server.
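To make the moving parts concrete, here is a rough sketch of the round trip, assuming the stock zsyncmake/zsync command-line tools; the -z, -u, -i and -o flags are from memory of the man pages, so double-check them before relying on this:

```python
import subprocess

def publish(mod_archive: str, public_url: str) -> None:
    # Server side: -z makes zsyncmake compress the file itself, producing
    # a gzip it fully understands plus the .zsync metadata; -u records the
    # URL clients should fetch the data from.
    subprocess.run(["zsyncmake", "-z", "-u", public_url, mod_archive], check=True)

def update(zsync_url: str, old_copy: str, out_path: str) -> None:
    # Client side: -i seeds the transfer with the old local copy, so only
    # the changed byte ranges are fetched over plain HTTP.
    subprocess.run(["zsync", "-i", old_copy, "-o", out_path, zsync_url], check=True)
```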

The server requirements are straightforward (see the Distribution section on the server page):
zsync.moria.org.uk/server

As for Windows, there is a client port here: assembla.com/spaces/zsync-windows/documents

I confess that manual modification is a risk - I don’t know if zsync detects that case.
edit: probably not. If the file was modified, zsync probably just treats it as a deviation from the server file, as normal, and has no way of distinguishing it from an old version. Warning the user that his modifications are about to be overwritten, and exactly what will be overwritten, would need support in your GUI. This would be the case with normal downloads too, though.
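If you wanted that warning, a sketch of what the launcher could do is below: record a checksum at install time and compare it before updating (the manifest layout is invented for illustration):

```python
import hashlib
import json

def sha256_of(path: str) -> str:
    # Stream the file so large mod archives don't need to fit in memory.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def is_locally_modified(archive_path: str, manifest_path: str) -> bool:
    # The manifest is assumed to hold the hash saved when the mod was
    # installed; a mismatch means the player changed something.
    with open(manifest_path) as f:
        recorded = json.load(f)["sha256"]
    return sha256_of(archive_path) != recorded
```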

If for some reason it doesn’t work, you can just fall back to normal HTTP downloads.

BTW, this is the program that Ubuntu uses to let users update CD images.

Maybe the problem is that you’re currently unzipping the downloaded mods and deleting the original, so you can’t easily compare against the file on the server? No support for loading compressed filesystems in VCMI? You could just keep the original mod file in a cache instead of deleting it, no?

VCMI can load compressed filesystems - the .lod archives from H3 are almost identical to .zip: optional per-file compression using deflate.
But there is no way for VCMI to work with .tar.gz archives, because gzip compresses all the data at once - to extract one file you have to decompress everything before it. So we’d have to introduce yet another library for yet another archive format. And that is the road to bloatware…
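The access-pattern difference is easy to see in a toy example (file names invented): a .zip member can be read directly, while a .tar.gz has to be streamed from the start:

```python
import tarfile
import zipfile

# Random access: only the one member is located and inflated.
with zipfile.ZipFile("mod.zip") as z:
    data = z.read("config/creatures.json")

# Sequential access: the gzip stream must be decompressed from the
# beginning until the wanted member is reached, even though the API
# hides that cost.
with tarfile.open("mod.tar.gz", "r:gz") as t:
    data = t.extractfile("config/creatures.json").read()
```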

Maybe allow downloading data via some kind of proxy instead of using the Qt routines? The majority of users don’t care whether a download is 1 MB or 20 MB in size, while the rest will be able to use something like zsync.

But even in this case - you’ll need somebody to write support for it. Features like this will be very low priority for me.

This is a very bad idea.

Yes, VCMI supporters can make a few mods work like this (through gz and other diff stuff).
But the complexity of this will stop individual modders from making mod updates, because they won’t have the knowledge or the means to produce diffs etc.

What is that supposed to mean?

As the doubled extension implies, .tar.gz means that the files are first all tarred together into a single file, and then compressed with gzip.

Also, the .tar archive format (which has no compression and is just meant to put several files into a single one) does store dates, as well as a lot of other Unix metadata. It also stores a lot of empty space since metadata headers and file content are both aligned to 512 bytes (meaning that there’s always a lot of padding with 00 to reach the next multiple of 512). It’s not a problem when gzipped of course, since null padding compresses rather well.
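The alignment is easy to inspect with Python’s tarfile module, which exposes each member’s header and data offsets (the archive name is a placeholder):

```python
import tarfile

with tarfile.open("mod.tar") as t:
    for m in t.getmembers():
        # File data is padded with zero bytes up to the next 512-byte boundary.
        padding = (512 - m.size % 512) % 512
        print(f"{m.name}: header at {m.offset}, data at {m.offset_data}, "
              f"{m.size} bytes + {padding} bytes of padding")
```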

Anyway, the point is that .tar.gz is not a good format to work with when you expect to change the content of the archive. All it’s good for is archiving stuff so that it takes up less space when storing it on backup discs or transferring it over the Internet – and for that purpose, you have many choices that are a lot more efficient at making it small, such as bzip2, 7z, or even plain old .zip with LZMA compression. (Note: most zip utilities only handle the STORE and DEFLATE algorithms. LZMA, PPMD, and some other algorithms allowed by the ZIP standard are less commonly supported, unfortunately.)

Anyway, good old DEFLATE still offers what is possibly the best compromise between compression efficiency and decompression overhead, which makes it well suited for a game data archive format.
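A short zlib round trip shows the trade-off in miniature (the input file is a placeholder):

```python
import zlib

raw = open("creatures.json", "rb").read()  # any game data file
packed = zlib.compress(raw, 6)             # DEFLATE at a middling level
assert zlib.decompress(packed) == raw      # decompression is cheap and fast
print(f"{len(raw)} -> {len(packed)} bytes")
```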

If you read the rest of the post, zsync uses gzip.

What complexity? If you actually read the page, you’d realize that the only requirements are for the mod container format to be gzip and for the server to support HTTP resume and multi-byte ranges (Apache does). The zsync file doesn’t even need to be on the same server as the file it points to! The container doesn’t even need to be gzip (though it’s better to have zsync create the gzip files, since downloading is more efficient then).
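Whether a given host meets the byte-range requirement is trivial to probe (the URL is a placeholder):

```python
import urllib.request

req = urllib.request.Request("http://example.com/mods/mod.gz",
                             headers={"Range": "bytes=0-99"})
with urllib.request.urlopen(req) as resp:
    # 206 Partial Content means ranges work; a 200 means the server
    # ignored the Range header and sent the whole file instead.
    print(resp.status, resp.headers.get("Content-Range"))
```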

The whole point of the utility is to amortize downloads between versions, so of course the compression format used is not ‘solid’. However, you’d win that back on the next mod update! Assuming that most VCMI users who download mods are going to be regulars, the server would be hammered far, far less on mod updates (especially for mods with huge archives of images that don’t change, which I expect will be common).

  1. You forgot that the majority of users use Windows, where there is no gzip/tar/anything like that.
  2. I don’t think the server will suffer from constant continuous downloads - there are not enough HMM3/VCMI users to DDoS it.
  3. Gzip/Zip/Rar etc. are good for distribution, but bad for maintenance and corrections. If I want to correct/balance something in a mod, I still have to uncompress the files and work on them locally, so I was against the zip thing from the start. All mods that don’t come from the VCMI server already arrive archived in one file, so it ends up as a zip file inside a rar file and so on (and as you know, once a file has been compressed by one archiver, another compression algorithm will not make it much lighter).
    And if you’re talking about saving disk space, Windows can already make compressed folders at the filesystem layer (I don’t know if Linux can do the same).
  1. GZip is a standard format that rar or zip tools understand, and besides, in this use case the users wouldn’t need to uncompress or recompress files (unless they wanted to ‘mod the mod’).
  2. Obviously the advantages are not only for the server, but also for the users, who avoid downloading a 100 MB file multiple times when it is 96% the same.
  3. You need a package format for distribution anyway. Are you expecting users to download multiple files for your stuff!? If you tell me you want autoshield crap I will scream internally, because the same mistake was made in BG modding and it resulted in things like ‘BGTrilogy’, which takes 5+ hours to install on a modern computer for no reason at all.

Actual objections to the idea (which I disagree with, but whatever) could be:
1: we don’t want to change the compression format (for some reason - remember, zsync is less effective for compressed files in formats it doesn’t understand, because it can’t inspect which files inside are the same)
2: we don’t want to maintain a centralized download server, so there is no point to this (there goes the downloader idea)
3: we don’t want to create zsync files on the server for the submitted mods (could they be created automatically from a mod submission form?)
4: we expect that mods will not be downloaded more than once per user and will not be huge anyway (possible, but considering most mods are new factions, I expect 10 MB+ to be standard)

i30817,
1: This. Using .tar.gz as the primary format won’t work - as Turnam said, it is suited for archiving, not for constant access during gameplay.
2: Not this. The idea is to have one or several centralized repositories.
3: This. For now I’d rather keep this as simple as possible. If there is automated mod submission at some point then this may change, but for now let’s keep it simple.
4: I’d rephrase this: we don’t expect that users will create more traffic than vcmi.eu can handle. If Tow says that VCMI creates a huge volume of traffic, we can reconsider. But for now I’d rather spend my coding time on something else.

  1. tar.gz was my mistake - zsync uses only gzip, no tar. If that’s not suitable for the engine, you could recompress it into a suitable form after downloading.
  2. So there is no reason not to consider it.
  3. OK.
  4. I’m more worried about my own traffic, or the users’ in general. But it’s also good for you. As an extreme example of a possible future, take the 1pp project for Baldur’s Gate. It replaces the avatars and adds monsters to BG.
    1333+27913+5209+1573+2402+3653 = 42083 downloads of about 1 GB each, or about 41.0967 terabytes in aggregate; and this is only the latest version.
  1. But gzip can’t be used without some container format, because it can only compress one file. And any container brings the above-mentioned problems.
  2. In that case, some proxy as I described above may be a better idea:

And by “proxy” I mean a command that the VCMI launcher will execute to download the file, instead of using the built-in downloader.
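A minimal sketch of that idea, with the command template and the fallback invented for illustration:

```python
import shlex
import subprocess
import urllib.request

def download(url: str, dest: str, proxy_command: str = "") -> None:
    if proxy_command:
        # e.g. proxy_command = "zsync -o {dest} {url}" from the settings.
        cmd = [part.format(url=url, dest=dest)
               for part in shlex.split(proxy_command)]
        subprocess.run(cmd, check=True)
    else:
        # Stand-in for the launcher's built-in (Qt) downloader.
        urllib.request.urlretrieve(url, dest)
```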

UPD:
As for size - let’s take VCMI’s numbers. The Windows package is ~25 MB in size, with ~5000 downloads in ~3 months - 125 GB total, 1.4 GB daily. Not much.
For other platforms download counts are much smaller.

Actually you don’t need to change the archiver/compressor; rsync’s delta algorithm should work well (for gzip there is the --rsyncable switch). VCMI might consider using http://librsync.sourcefrog.net/ to speed up downloads.
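For completeness, the preparation step that approach implies, assuming a gzip build that has --rsyncable (GNU gzip added the flag in 1.7; many distributions patched it in earlier):

```python
import subprocess

# --rsyncable resets the compressor at regular intervals, so a small
# change in the input no longer shifts all of the compressed output;
# --keep preserves the original mod.tar alongside mod.tar.gz.
subprocess.run(["gzip", "--rsyncable", "--keep", "mod.tar"], check=True)
# Clients would then fetch deltas with something like:
#   rsync -z rsync://server/mods/mod.tar.gz mod.tar.gz
```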

The difference between the two is that:
‘zsync downloads the checksums and then downloads the data, whereas rsync uploads the checksums and then downloads the data’

That is, rsync requires special server software running behind Apache to recognize an incoming connection and work out, from the metadata the client sends, which parts of the file should be transmitted. For compressed files (which is what we are talking about), it also requires - for real efficiency instead of a waste of time - that gzip be invoked with the ‘--rsyncable’ option during file creation, so a repack is always in the cards given the unpredictability of the mods uploaded to the server (the archive probably can’t be solid, and it works best if it is ‘specially compressed’ by the tool itself).

I believe the best solution would be a mod-upload form that unzips the submission, repacks it with zsync, and creates the metadata, plus a client that downloads the whole file the first time but only partial updates afterwards (using zsync).
edit: although if you don’t mind installing whatever rsync needs, I believe it would be more efficient to do the gzip --rsyncable dance and use rsync -z, compressing the delta update download itself; but then you lose the option of hosting the metadata files somewhere other than the main download (e.g. for mods whose authors don’t allow you to host their files), although presumably those would have horrid efficiency for the reasons in the previous paragraph anyway.
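A sketch of that upload step, under the same assumptions about the zsyncmake flags as before: repack the submission as an uncompressed (STORE) zip so that the gzip layer zsyncmake adds is the only compression zsync has to see through:

```python
import os
import subprocess
import tempfile
import zipfile

def process_upload(uploaded_zip: str, public_url: str) -> None:
    repacked = uploaded_zip + ".store.zip"  # illustrative naming
    with tempfile.TemporaryDirectory() as tmp:
        with zipfile.ZipFile(uploaded_zip) as src:
            src.extractall(tmp)
        # STORE = no per-file deflate, so unchanged files produce
        # identical bytes across versions of the archive.
        with zipfile.ZipFile(repacked, "w", zipfile.ZIP_STORED) as dst:
            for root, _, files in os.walk(tmp):
                for name in files:
                    full = os.path.join(root, name)
                    dst.write(full, os.path.relpath(full, tmp))
    # Emits the zsync-friendly gzip plus the .zsync metadata to publish.
    subprocess.run(["zsyncmake", "-z", "-u", public_url, repacked], check=True)
```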