In my `~/website/_posts` directory I have 1,212 blog posts converted from 2,500 Stack Exchange posts. Only the good posts made the cut.
In my `~/website2/_posts` directory I have 4 blog posts "pulled" from my current GitHub Pages Pippim website repository.
Reading these instructions, my understanding is that I need to write a bash script to compare the two directories and issue these commands for each new file:

```
cd ~/website2
cp -a ~/website/_posts/filename.md ./_posts/
git add ./_posts/filename.md
git status
```
- Are the instructions correct?
- Is there an existing bash script somewhere that already does this?
- Is there an easier way of pushing an entire directory to GitHub Pages?
Note: the first command (`cd`) and the last command (`git status`) will not be part of the loop iterating over the 1,212 blog post files.
I'm a little concerned about how long GitHub Pages will take to rebuild with twelve hundred blog posts. Currently it takes about 20 seconds. Will it take much longer, meaning I'll have to learn how to develop the website locally?
Obviously, after populating the blog post landing page with 1,212 entries, I'll be doing a lot of development: lookup by date, by tag, by Stack Exchange site (perhaps a future use for category), etc. I wouldn't want GitHub Pages commits to go over 30 seconds or so while developing online. If they do, perhaps I should consider developing pages locally?
As a side note, a couple of posters recently asked about changing the date or directory name in all posts. It took 1,182 lines of Python code to convert the Stack Exchange posts to Jekyll blog posts. Documenting the program seemed to take just as long; the documentation can be found on my website's Home Page ← link valid as of December 5, 2021, but its position on the page may change as the Home Page expands.