How to have different robots.txt files in development and production environments?

Hi,
I have a Jekyll website hosted on my own SiteGround server, and I use GitHub Pages as a development environment.

I’m trying to automate the process so I can easily choose between syncing with GitHub Pages for development and building the _site folder for production.

The only thing that changes between dev and production is the robots.txt file, which contains Disallow: / for the GitHub Pages dev site and Disallow: for the production site.

I would like to know if there is a way to use a specific robots.txt file when I run the jekyll build command, in order to block Google from indexing my staging website.

I’m also looking for a way to push to production via FTP without having to drag and drop the _site folder in FileZilla.

Thanks in advance for your help.

Etienne

My first thought would be to use a <meta name="robots" content=""> tag in the head of your pages and put it behind a conditional like this:

{% unless jekyll.environment == "production" %}
   <meta name="robots" content="noindex, nofollow">
{% endunless %}

Then when you build for production you trigger that environment with JEKYLL_ENV=production jekyll build, and the noindex tag is left out. But since you’re hosting the staging site with GitHub Pages this won’t quite work, since I’m pretty sure they automatically set the production flag for you.
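If it helps, the environment defaults to development when you don’t set anything, so the two builds would look roughly like this (just a sketch of the commands):

# staging/dev build — jekyll.environment is "development" by default
jekyll build

# production build — jekyll.environment becomes "production"
JEKYLL_ENV=production jekyll build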

Though perhaps you could use a different env name.
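For instance (just a sketch, with live as a made-up environment name for your SiteGround build), you could even generate robots.txt itself with Liquid by giving it front matter:

---
layout: null
---
# generated by Jekyll because of the front matter above
User-agent: *
{% if jekyll.environment == "live" %}
Disallow:
{% else %}
Disallow: /
{% endif %}

Building the real site with JEKYLL_ENV=live jekyll build would then output the open robots.txt, while GitHub Pages, which forces production, would keep the Disallow: / version for staging.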

As for FTPing the contents of _site to your host: personally I like rsync. There’s a little configuration (namely getting SSH set up), but you’ll find it’s way faster than FTP.
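Something along these lines, for example (a sketch — the user, host and path are placeholders for your SiteGround details):

# push the built site over SSH; -a keeps permissions/times, -z compresses
# --delete removes remote files that no longer exist in _site (use with care)
rsync -avz --delete _site/ user@example.com:~/public_html/

Note the trailing slash on _site/ — it tells rsync to sync the folder’s contents rather than the folder itself.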

It can be configured to push only the files that have changed, which can dramatically speed things up if you have a lot of files or large assets. You can also use a CI service to do the deploy for you. Check out Jekyll’s docs, as there are several deployment methods outlined there.

Thanks for your reply. I will test all of this and see what works best for me.