I have a couple of static websites that I generate with Jekyll and upload to a GCS bucket.
Today, for deployment, I have a shell script that does some validation, parses a bucket name out of _config.yml, and essentially runs:
  JEKYLL_ENV=production jekyll build --incremental
  gsutil -m rsync -d -r ./_site gs://$GCS_BUCKET
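For concreteness, the whole script might look roughly like this minimal sketch. The `gcs_bucket:` key name in `_config.yml` is an assumption (my actual config may differ), and the parsing assumes a plain unquoted scalar value:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Pull the bucket name out of a Jekyll config file. Assumes a line like
# "gcs_bucket: my-bucket" with an unquoted value (hypothetical key name).
bucket_from_config() {
  awk '$1 == "gcs_bucket:" { print $2 }' "$1"
}

if [ -f _config.yml ]; then
  GCS_BUCKET=$(bucket_from_config _config.yml)
  JEKYLL_ENV=production jekyll build --incremental
  gsutil -m rsync -d -r ./_site "gs://$GCS_BUCKET"
else
  echo "run from the site root (no _config.yml found)" >&2
fi
```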
This works great, but an interesting question recently sprang to mind:
What happens if I have a large site, and I want to make a change to it from a different computer, or if my ‘_site’ directory gets nuked for any number of reasons?
In the happy case (_site just got nuked), files that get straight-up copied (e.g. images) will retain their mtimes, but files that get generated by Jekyll (e.g. posts) will be new files with new mtimes. As a result, even if nothing changed, or if I changed just a single post, rsync would want to re-upload all of the generated HTML.
In the unhappy case (separate machine, have to git clone my repo from scratch), pretty much every file will have a new mtime, so the entire site has to be re-uploaded, even if it hasn’t changed.
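For the fresh-clone case, one workaround (not something my script does today) would be to reset each tracked file's mtime to the time of the last commit that touched it, so unchanged static files rsync cleanly. A rough sketch, assuming GNU `touch`:

```shell
# Reset every git-tracked file's mtime to the timestamp of the last
# commit that touched it. Intended to run once after `git clone`,
# from the repo root, before building the site.
restore_git_mtimes() {
  git ls-files -z | while IFS= read -r -d '' f; do
    ts=$(git log -1 --format=%ct -- "$f")      # unix time of last commit
    [ -n "$ts" ] && touch -d "@$ts" -- "$f"    # GNU touch date syntax
  done
}
```

This only helps files that are committed as-is; generated output still gets fresh mtimes from the build.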
Is this a problem someone has solved, or are people just living with it because it’s not that big of a deal in practice?
I suppose the git clone thing is kind of out of scope for Jekyll, but it would be neat if there were a plugin or setting where all generated content has its mtime set to the latest mtime of its dependencies as part of the build process.
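Absent such a plugin, the same effect could be approximated with a post-build step that copies a source file's mtime onto its generated output with `touch -r`. The hard part a real plugin would handle is mapping each output back to its dependencies (permalinks, layouts, includes); this sketch just takes the mapping as given:

```shell
# Hypothetical post-build helper: give a generated file the same mtime
# as the source it was built from, so unchanged pages rsync cleanly.
#   stamp_output <source-file> <generated-file>
stamp_output() {
  [ -f "$1" ] && [ -f "$2" ] && touch -r "$1" -- "$2"
}
```

Usage would be something like `stamp_output _posts/2024-01-01-hello.md _site/2024/01/01/hello/index.html` (both paths hypothetical, and dependent on the permalink config).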