In this quick post, we will see how to quickly set up a CI/CD pipeline using GitLab in-house runners to automatically build and deploy our Hugo static site in a self-hosted environment.

Assumptions

My instructions assume that you have already deployed a VM somewhere to host your Hugo site. If not, please do! My easy suggestion is to use a DigitalOcean droplet with your favorite Linux flavor installed, install Nginx, and deploy the Hugo site following the relevant documentation. Also, we are going to use SSH keys to log in, so please copy your public key to the VM before continuing.
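
If you have not copied the key to the VM yet, a minimal sketch looks like this, assuming your key pair lives at ~/.ssh/id_rsa and substituting your own user and address for the placeholders:

ssh-copy-id -i ~/.ssh/id_rsa.pub {username}@{ip-address}    # copy the public key to the VM
ssh {username}@{ip-address}                                 # verify that key-based login now works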

General info

As you may already know, you first have to run the hugo command to generate the files needed to deploy the site, and then rsync the “public” directory to the path your web server serves requests from. As a one-liner, the command looks like this:

hugo && rsync -avz --delete public/ {username}@{ip-address}:{dir-path}
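
If you find yourself deploying by hand often, you can wrap the one-liner in a small script. This is just a convenience sketch; the deploy.sh name and the variable values are placeholders for your own setup:

#!/usr/bin/env bash
# deploy.sh - build the site and push the generated files to the web server
set -euo pipefail

USERNAME="admin"             # user on the deployment VM (placeholder)
HOSTNAME="203.0.113.10"      # IP address/hostname of the VM (placeholder)
DIR_PATH="~/www/public/"     # destination directory served by the web server (placeholder)

hugo                                                            # generate the site into ./public
rsync -avz --delete public/ "${USERNAME}@${HOSTNAME}:${DIR_PATH}"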

Building the pipeline

We will create two stages in the GitLab pipeline: a build stage to generate the files/website, and a deploy stage for the actual deployment. First, let's create a job named “site-pages” in the build stage. You can use the template from the GitLab editor to create the .gitlab-ci.yml file and edit it as follows:

variables:
    GIT_SUBMODULE_STRATEGY: recursive

stages:
    - build
    - deploy

site-pages:
    stage: build
    image: monachus/hugo
    script:
        - hugo
    artifacts:
        paths:
        - public
    only:
        - master

As you can see, we run the hugo command and export the public directory as an artifact of this job, which is then imported by the next job automatically.

Next, we need a Docker container to rsync the files to the web server. For this, we first have to make use of GitLab CI/CD variables for our SSH key and connection details. Go to “Settings –> CI/CD –> Variables” in the GitLab project and create the variables listed below (a command-line alternative is sketched right after the list):

  1. USERNAME, the username for the deployment VM (e.g. admin)
  2. DIR_PATH, the destination directory on the deployment VM (e.g. ~/www/public/)
  3. HOSTNAME, the IP address/hostname of the deployment VM
  4. SSH_PRIVATE_KEY, the private key that you are going to use for SSH, in base64-encoded format
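
If you prefer not to click through the UI, the same variables can also be created through the GitLab API; the project ID and token below are placeholders and a personal access token with api scope is assumed:

# create one CI/CD variable via the GitLab API (repeat for each variable)
curl --request POST \
     --header "PRIVATE-TOKEN: <your-access-token>" \
     --form "key=HOSTNAME" \
     --form "value=203.0.113.10" \
     "https://gitlab.com/api/v4/projects/<project-id>/variables"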

If you don’t know how to encode your private key, just use the command below:

cat {private-key}.rsa | base64
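
Note that GNU base64 wraps its output every 76 characters by default; if the multi-line value gives you trouble in the variable field, you can disable wrapping and verify the round trip with a quick sketch like this (assuming GNU coreutils):

base64 -w 0 {private-key}.rsa > key.b64          # encode without line wrapping
base64 -d key.b64 | diff - {private-key}.rsa     # prints nothing if the round trip is clean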

As you have already guessed, these variables are exposed as environment variables inside the containers of your pipeline. So, when a Docker container is spun up inside your pipeline, it can read the values of these variables by default.

Now we are going to create a job named “deploy” as part of the deploy stage of the same name. To do that, append the necessary lines, so that the .gitlab-ci.yml file looks something like this:

variables:
    GIT_SUBMODULE_STRATEGY: recursive

stages:
    - build
    - deploy

site-pages:
    stage: build
    image: monachus/hugo
    script:
        - hugo
    artifacts:
        paths:
        - public
    only:
        - master

deploy:
    stage: deploy
    image: ubuntu:latest
    before_script:
        - apt update -y
        - apt install -y ssh rsync
        - mkdir -p ~/.ssh && chmod 700 ~/.ssh
        - echo "$SSH_PRIVATE_KEY" | base64 -d > ~/.ssh/id_rsa
        - chmod 600 ~/.ssh/id_rsa
        - ssh-keyscan -t rsa ${HOSTNAME} >> ~/.ssh/known_hosts
    script:
        - rsync -avz --delete public/ ${USERNAME}@${HOSTNAME}:${DIR_PATH}
    only:
        - master

Based on the ubuntu:latest Docker image, we update and install the dependencies and decode the base64 private key into the proper file/path in the before_script commands. The script command just rsyncs the files from the public directory (the artifact of the previous build-stage job) using the GitLab CI/CD variables (USERNAME, HOSTNAME, DIR_PATH) as environment variables.
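
Because --delete removes anything on the server that is not present in public/, it can be worth doing a dry run from your workstation before trusting the pipeline with it; the command below only lists what would change, using the same placeholders as earlier:

rsync -avzn --delete public/ {username}@{ip-address}:{dir-path}    # -n = dry run, nothing is transferred or deleted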

Job Done

You should now see both stages running on every git push event and succeeding, deploying the Hugo website to your server.

Troubleshooting a CI/CD pipeline is a burden, in my honest opinion. The reason is that you are forced to read container logs in the GitLab UI and then reproduce the problem in a dev environment in order to understand what went wrong. But it is worth the trouble when you succeed, because you don’t have to troubleshoot or try to remember the steps every time you commit something.
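
One trick that makes the reproduction step less painful is to run the same image locally and replay the job's commands by hand; a rough sketch, assuming Docker is installed on your machine:

# start a throwaway container with the repository mounted, similar to what the deploy job sees
docker run --rm -it -v "$PWD":/repo -w /repo ubuntu:latest bash

# inside the container, replay the before_script/script commands one by one
apt update -y && apt install -y ssh rsync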

I hope you enjoyed it as much as I did, and please bear in mind that this is my first post ever! :)