Hugo Site Automation

This site is built with a static site generator called Hugo. It works well for what I’m trying to do, and a static site is faster and more secure than a database-backed CMS. Since I’m a big proponent of continuous delivery and project automation, it seemed only right to get site updates running in a pipeline on every commit to master. This post outlines that process.


This isn’t a Hugo tutorial; there’s plenty of information on the Hugo website that I’m not going to cover here. Having said that, I do have some customizations in my workflow that aren’t Hugo-standard: SCSS compilation, HTML minification, and so on. That means some npm and Gulp usage, which I cover but don’t dive into in much detail. The site publishing step assumes you have SSH access to your web hosting provider. If that’s not the case, you will have to automate the deployment differently.


This site doesn’t use any custom themes (and it probably shows right now, since we’re in the early days of it). I have custom layouts of various sorts, and I use partials for the sidebar, header, and footer. For the most part this is a standard Hugo setup; where it differs is in how some of the static content is produced.
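For orientation, here is a rough sketch of the directory layout this post works with. The names come from the sections below; only static and public are Hugo conventions, the rest is my own arrangement.

```
.
├── src/
│   ├── scss/    # Sass sources, compiled into static/
│   └── js/      # JavaScript sources, copied into static/
├── static/      # generated CSS/JS; Hugo serves this as-is
├── layouts/     # templates and partials
└── public/      # Hugo build output, deployed via scp
```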

CSS and JavaScript

I’m using Sass for CSS generation. I admit this is a bit of an experiment, but so far it works. Hugo expects your CSS and JavaScript in a folder named static, but since I want to put some tooling in place to create those files, I’m keeping the SCSS and JS sources in their own directories under a root src folder. I use Gulp, installed via npm, to make this work. Here’s the package.json you need to get the build working.


  "private": true,
  "scripts": {
    "build": "gulp scss; gulp js; gulp dist"
  "devDependencies": {
    "gulp": "^3.9.1",
    "gulp-autoprefixer": "^4.0.0",
    "gulp-htmlmin": "^3.0.0",
    "gulp-sass": "^3.1.0",
    "gulp-shell": "^0.6.3",
    "run-sequence": "^1.2.2"

With this package.json in place, you are an npm install away from having Gulp available to compile your CSS and add it to the static folder. Note that I put the output in that location so that hugo serve can pick it up from there. Mixing generated and non-generated content in the same folder isn’t a good idea, but it’s a tradeoff I opted into for now.
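Given the package.json above and the gulpfile shown next, the whole local build comes down to two commands (run from the project root):

```shell
# Install the dev dependencies listed in package.json (gulp and plugins)
npm install

# Run the "build" script: gulp scss, gulp js, then gulp dist (hugo + minify)
npm run-script build
```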


var gulp = require('gulp');
var sass = require('gulp-sass');
var autoprefix = require('gulp-autoprefixer');
var shell = require('gulp-shell');
var runseq = require('run-sequence');
var htmlmin = require('gulp-htmlmin');

// Compile SCSS from src/scss into the static folder Hugo serves from.
gulp.task('scss', function() {
    return gulp.src('src/scss/**/*.scss')
        .pipe(sass().on('error', sass.logError))
        .pipe(autoprefix({
            browsers: ['last 20 versions']
        }))
        .pipe(gulp.dest('static'));
});

// Copy JavaScript from src/js into static.
gulp.task('js', function() {
    return gulp.src('src/js/**/*.js')
        .pipe(gulp.dest('static'));
});

// Keep static up to date while developing.
gulp.task('watch', ['scss', 'js'], function() {
    gulp.watch('src/scss/**/*', ['scss']);
    gulp.watch('src/js/**/*', ['js']);
});

gulp.task('hugo-build', ['scss', 'js'], shell.task(['hugo']));

// Minify the HTML that Hugo writes to public.
gulp.task('minify-html', function() {
    return gulp.src('public/**/*.html')
        .pipe(htmlmin({
            collapseWhitespace: true,
            minifyCSS: true,
            minifyJS: true,
            removeComments: true,
            useShortDoctype: true
        }))
        .pipe(gulp.dest('public'));
});

gulp.task('dist', ['hugo-build'], (callback) => {
    runseq('minify-html', callback);
});

gulp.task('default', ['watch']);

Okay, that’s a screen of text. The important part is that I can compile my SCSS from src/scss to static, and I’ve done the same thing with my JavaScript code in src/js. There’s a watch task that keeps the static folder up to date with changes in the scss and js folders. This is useful in the development workflow when combined with Hugo’s serve feature for seeing live reloads of every change (most of the time). There are also Gulp tasks for running Hugo and for minifying the generated HTML. These tasks are wired into package.json so that a single npm run-script call will create the entire web site.
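A typical development session, assuming the layout and tasks above, runs the watch task alongside Hugo’s built-in server in two terminals:

```shell
# Terminal 1: recompile SCSS/JS into static/ on every change
./node_modules/.bin/gulp watch   # or just "gulp watch" if installed globally

# Terminal 2: serve the site locally with live reload
hugo serve
```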


The build and deployment happens inside a Docker container. I created a custom image that includes npm, git, Hugo, ssh, and Pygments (for server-side syntax highlighting with Hugo).


FROM alpine:3.6
MAINTAINER Benjamin Pack <>

# HUGO_VER must name the release whose tarball matches HUGO_SHA;
# pass it with --build-arg HUGO_VER=<version>
ARG HUGO_VER
ARG HUGO_SHA=67e4ba5ec2a02c8164b6846e30a17cc765b0165a5b183d5e480149baf54e1a50
ARG HUGO_URL=https://github.com/gohugoio/hugo/releases/download
ARG HUGO_TGZ=hugo_${HUGO_VER}_Linux-64bit.tar.gz

RUN apk update && apk upgrade
RUN apk add --update --no-cache \
    bash \
    ca-certificates \
    curl \
    git \
    openssh-client \
    nodejs \
    nodejs-npm \
    python \
    py2-pip

RUN pip install --upgrade pip
RUN pip install Pygments

RUN curl -Ls ${HUGO_URL}/v${HUGO_VER}/${HUGO_TGZ} -o /tmp/hugo.tar.gz \
    && echo "${HUGO_SHA}  /tmp/hugo.tar.gz" | sha256sum -c - \
    && tar xf /tmp/hugo.tar.gz -C /tmp \
    && mv /tmp/hugo /usr/bin/hugo \
    && rm -rf /tmp/hugo*

This image can be found on Docker Hub if you’d rather not create your own. If you do decide to build your own, you may want to check what the latest version and SHA for Hugo might be so you can replace those values. An updated version of this Dockerfile is likely available in my GitHub repo.
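Either way works with the pipeline below; these commands show both routes (the local tag name is my choice, and the version/SHA values are placeholders to replace):

```shell
# Pull the prebuilt image from Docker Hub
docker pull bpack/hugo-npm

# Or build it locally from the Dockerfile above, supplying a Hugo version
# whose release tarball matches the HUGO_SHA baked into the Dockerfile
docker build -t hugo-npm --build-arg HUGO_VER=<version> .
</imports>
```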

Bitbucket Pipeline

The source code for this site is in Bitbucket for a few reasons. It’s our SCM at work, they don’t charge you to have private individual repositories, and (most importantly for our purposes) they provide an integrated pipeline solution. We use Jenkins to drive our project automation at the office, but for something as simple as keeping this site up to date I thought maintaining my own Jenkins server somewhere was more complicated than it needed to be.

For a Bitbucket Pipeline, you need a Docker image to use for the build and a script to execute within a running container. The script is defined in a special file named bitbucket-pipelines.yml that lives in the root of the project. Pipelines can be enabled in the settings for a repository; there you can also define environment variables for use within the scripting. In the pipeline below, I have two such variables defined: one for the SSH port and another that concatenates the deployment server and path into a single string.

I use SSH keys to authenticate the copy operation. You can bring your own key, but since single-use keys are easier to revoke, I went through the straightforward process of creating one in Bitbucket. The public key must be registered with the hosting platform in whatever way they support (or you support, if you’re running the infrastructure yourself). While we’re on the subject, you’ll want to configure the SSH server as a known host. Under the Settings > SSH Keys configuration for your repository there is a Known hosts section. Add the host address for your SSH connection and fetch its fingerprint for the pipeline to work. If your hosting provider uses a non-standard SSH port (as mine does), you can include that at the end following a colon, e.g., example.com:2222.
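If you want to inspect the host key yourself rather than relying on Bitbucket’s fingerprint lookup, ssh-keyscan can fetch it; the host and port here are placeholders:

```shell
# Fetch the public host key from a server on a non-standard SSH port
ssh-keyscan -p 2222 example.com
```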


image: bpack/hugo-npm

pipelines:
  branches:
    master:
      - step:
          caches:
            - node
          script:
            - npm install
            - npm run-script build
            - scp -P $SSH_PORT -r public/* $SSH_DEPLOY_URL

This pipeline takes any commit to master and runs a build inside the previously defined Docker container. The npm install gets the various JavaScript libraries needed for CSS compilation, HTML minification, and so on. The run-script invokes the Gulp tasks for CSS and JavaScript and runs the Hugo build of the site. The result is copied via scp from the Hugo-created public folder to the root site directory from which it will be served.

That’s it. I’m sure there will be enhancements to the process over time. If so, I’m sure there will be an update here at some point. Now let’s commit this post to master and see what happens…
