Automated Web Deployment
Automated Website Deployment with Scripts⌗
This is an odd topic for a site that mostly focuses on systems and embedded programming, but it's interesting nonetheless.
Without further ado, on to the subject.
The Problem⌗
Deploying updates to the website is a somewhat slow and cumbersome process. The whole process is further complicated by my excessive use of jails on the VPS this page is running on.
Typical workflow⌗
- Check out a branch from the repo.
- Make changes to the Hugo files (Markdown).
- Commit the changes.
- Use sftp to push the generated static files to the server.
- ssh into the server.
- Manually cp -r ./public into the nginx jail/container.
- Maybe restart the nginx server inside the jail; most of the time it isn't an issue.
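Spelled out as raw commands, the manual version looks roughly like this; the host alias and paths here are placeholders, not my actual setup:

hugo                                          # regenerate ./public
sftp vps                                      # then: put -r public Downloads/
ssh vps
cp -r ~/Downloads/public /path/to/jail/usr/local/www/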
Why it’s an issue⌗
- It takes time.
- It's error-prone.
- Writing shell scripts is more fun than doing it all manually.
- I get to play around with more CLI tools.
Solution⌗
To deal with this issue I first made sure I had everything correct in my local .ssh/config file. This allows me to use public key authentication, which makes scripting a ton easier.
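For reference, the relevant entry in ~/.ssh/config looks something like this; the host name, user, and key path are placeholders for my real values:

Host vps
    HostName example.com
    User jake
    IdentityFile ~/.ssh/id_ed25519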
I extracted the different sections of code into bash functions and, wouldn't you know it, it works great.
#!/usr/local/bin/bash
#Author: Jake Goodwin
#Description: A program to deploy my static site onto my vps
#
VER="0.2"
ARCHIVE="public.tar.gz"
SERVER="vps"
FOLDER="public"
WWW=<MYPATHHERE>
BATCH_FILE="sftp_batch.txt"
The section above just sets up the variables, including the name of the batch file I use to push the local archive.
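The batch file itself isn't shown here; it's just a list of sftp commands run non-interactively. Mine amounts to something like the following (the remote Downloads directory matches what the later functions assume):

put public.tar.gz Downloads/
bye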
function compress_public () {
    chown -R jake:wheel ./public
    chmod -R 0744 ./public
    if [ -e "${ARCHIVE}" ]; then
        echo "removing old archive"
        rm "./${ARCHIVE}"
    fi
    echo "Archiving: ${ARCHIVE}"
    tar -caf "${ARCHIVE}" "${FOLDER}"
    if [ -e "${ARCHIVE}" ]; then
        echo "Archive built"
    else
        echo "Error, missing ${ARCHIVE} file."
        exit 1
    fi
}
You can see that here I set the owner, group, and UNIX permissions recursively on the folder's contents.
function install_website () {
    echo "Installing the website over ssh"
    #ssh server 'cmds here' captured into RESULTS
    RESULTS=$(ssh "${SERVER}" "cd Downloads; tar -xf ./${ARCHIVE}")
    echo "SERVER(SSH) --> ${RESULTS}"
    RESULTS=$(ssh "${SERVER}" "cd Downloads; cp -r ./${FOLDER} ${WWW}/${FOLDER}")
    echo "SERVER(SSH) --> ${RESULTS}"
}
function send_archive () {
    if [ -e "${ARCHIVE}" ]; then
        echo "Sending..."
        sftp -b "${BATCH_FILE}" "${SERVER}"
        echo "Data sent..."
    else
        echo "Error, missing ${ARCHIVE} file."
        exit 1
    fi
}
function clean () {
    if [ -e "${ARCHIVE}" ]; then
        rm "./${ARCHIVE}"
    fi
    RESULTS=$(ssh "${SERVER}" "cd Downloads; rm -rf ./public*")
    echo "SERVER(SSH) --> ${RESULTS}"
}
function main () {
    echo "#####################"
    echo "DEPLOY SCRIPT VER: ${VER}"
    hugo                # regenerate the static site into ./public
    compress_public
    send_archive
    install_website
    clean
}

main
The rest of the script is pretty simple: main calls each function in order and then cleans up the archives.
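Assuming the script is saved as deploy.sh (the file name here is mine, not part of the original), deploying from the repo root becomes:

chmod +x ./deploy.sh
./deploy.sh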
Conclusion⌗
Now I’m sure someone out there is looking at this wondering why I didn’t just use rsync.
To be honest, it would probably be a better tool for most situations, but with the way my BSD jails are set up I don't want to redirect the ports on them for an rsync daemon. I'm also attempting to minimize the CPU footprint of the server, so I've stripped many utilities out of the BSD jails.
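For anyone without those constraints, the whole deployment collapses to roughly one rsync-over-ssh invocation; no daemon is needed, though it does require the rsync binary on the remote side, which is exactly what my stripped-down jails lack. The remote path is a placeholder:

hugo
rsync -az --delete ./public/ vps:/path/to/www/public/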