As someone with a low download speed, it is always a pain to have to redownload the game files whenever I join a different version of FA. For instance, I wanted to try the FAF develop games that were hosted today, but I simply can't be bothered to wait ages for the download every time. Something like this could help get people to help with beta / development testing.
Is it possible to cache FAF versions?
Pretty sure when you join a dev game, it only downloads the most recent version. Caching wouldn't help, because they're constantly updating it right now.
All files are cached, but fafdevelop changes way too often for the cache to be useful. We tried generating better patches (and even had a working implementation), but it had other drawbacks (it needs more disk writes and CPU), so we dropped it.
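For context, "better patches" here means binary deltas: shipping only the bytes that changed between two versions instead of whole files. A minimal sketch of that idea in Python, assuming the bsdiff4 library (the file names are hypothetical, and this is not the implementation that was tried):

```python
# Hypothetical sketch of binary-delta patching with bsdiff4
# (pip install bsdiff4). File names are made up for illustration.
import bsdiff4

# Server side: produce a small binary delta between two file versions.
bsdiff4.file_diff("sim_v1.lua", "sim_v2.lua", "sim.patch")

# Client side: rebuild the new file from the old one plus the delta.
# This trades network transfer for extra disk writes and CPU time,
# which is exactly the drawback mentioned above.
bsdiff4.file_patch("sim_v1.lua", "sim_v2_rebuilt.lua", "sim.patch")
```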
He said, "I've been to the year 3000
Not much has changed, but they live underwater
And your great-great-great-granddaughter
Is playin' FAF, playin' FAF"
And since the files are packed into zip archives, if a single line in a single file changes you have to redownload the entire archive again...
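To make the granularity point concrete: the zip format itself stores a per-member CRC-32, so telling which files changed between two archives is cheap once you have both locally; it's only the download that is all-or-nothing. A sketch (archive names hypothetical):

```python
# Sketch: zip metadata already identifies which members changed.
# Archive names are hypothetical; this is not what the client does today.
import zipfile

def member_crcs(path):
    """Map each member name in a zip archive to its stored CRC-32."""
    with zipfile.ZipFile(path) as zf:
        return {info.filename: info.CRC for info in zf.infolist()}

old = member_crcs("faf_old.zip")
new = member_crcs("faf_new.zip")

changed = [name for name in new if old.get(name) != new[name]]
print(f"{len(changed)} of {len(new)} members changed:", changed)
```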
> @askaholic said in Is it possible to cache FAF versions?:
>
> And since the files are packed into zip archives, if a single line in a single file changes you have to redownload the entire archive again...
There's actually no need for them to be in .zip files, as the game can mount folders.
I suspect the (original) game does it to avoid fetching scattered files on disk, i.e. lots of small random reads/writes.
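You can get a feel for that concern with a rough micro-benchmark (purely illustrative; the sizes are made up and this is not FAF's actual I/O pattern):

```python
# Rough micro-benchmark: 3000 tiny reads vs one read of the same total
# size. Numbers vary wildly by disk and OS cache; this only shows the
# shape of the argument.
import os
import pathlib
import tempfile
import time

with tempfile.TemporaryDirectory() as tmp:
    root = pathlib.Path(tmp)
    payload = os.urandom(4096)

    # 3000 scattered 4 KiB files vs one 12 MiB blob.
    for i in range(3000):
        (root / f"part_{i}.dat").write_bytes(payload)
    (root / "blob.dat").write_bytes(payload * 3000)

    t0 = time.perf_counter()
    for i in range(3000):
        (root / f"part_{i}.dat").read_bytes()
    t1 = time.perf_counter()
    (root / "blob.dat").read_bytes()
    t2 = time.perf_counter()

    print(f"3000 small reads: {t1 - t0:.3f}s, one big read: {t2 - t1:.3f}s")
```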
@jip Yes, though with most modern machines equipped with NVMe SSDs, that should be less of a concern.
Tell that to our players with their 10-year-old notebooks.
He said, "I've been to the year 3000
Not much has changed, but they live underwater
And your great-great-great-granddaughter
Is playin' FAF, playin' FAF"
It's more of a web issue: we don't want every new client making 3000 web requests to download all the files. There could probably be some combination of the two, like getting a zip the first time and then switching to fetching individual files, but that adds a lot of complexity, and then of course you have to hash all 3000 files on disk separately to check for changes.
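To show the bookkeeping that implies, here is a minimal sketch of such a manifest-based check; the manifest format and the per-file fetch are hypothetical, and nothing like this exists in the client:

```python
# Sketch of the hybrid idea: compare local per-file hashes against a
# server-provided manifest and fetch only the files that differ.
# The manifest layout and fetch_file() are hypothetical.
import hashlib
import pathlib

def sha256_of(path: pathlib.Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            h.update(chunk)
    return h.hexdigest()

def files_to_update(game_dir: str, remote_manifest: dict[str, str]) -> list[str]:
    """Return relative paths whose local hash differs from the manifest.

    This is the "hash all 3000 files on disk separately" step: every
    file must be read end to end just to decide whether it needs a
    download at all.
    """
    root = pathlib.Path(game_dir)
    stale = []
    for rel_path, remote_hash in remote_manifest.items():
        local = root / rel_path
        if not local.exists() or sha256_of(local) != remote_hash:
            stale.append(rel_path)
    return stale

# Usage (manifest would come from the server, e.g. as JSON):
# manifest = {"lua/sim.lua": "ab12...", "lua/ui.lua": "cd34...", ...}
# for rel in files_to_update("C:/ProgramData/FAForever/bin", manifest):
#     fetch_file(rel)  # hypothetical per-file download
```

The trade-off is exactly as described: per-file fetching makes incremental updates cheap, but first installs turn into thousands of requests, and every update check pays the cost of hashing the whole game directory.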