I previously used AWS CodeBuild to build and deploy Hugo to S3, and that worked, but I recently started using GitHub Actions at work and it looked much easier than CodeBuild. I migrated to GitHub Actions in about an hour, with a couple of failed runs that were easy enough to fix.
Here’s how to do that yourself.
Create IAM user for S3 access
Go to AWS IAM (Identity and Access Management), open Users in the left-hand menu, and click the Create user button.
Give the user a descriptive name and click Next.
On the Set permissions page, choose Attach policies directly and find AmazonS3FullAccess and CloudFrontFullAccess. If you're more skilled with IAM, I'd also recommend reducing the full access to something more limited, but I don't work as an admin/devops, so I'll figure that out later.
Next, click Create user.
Then, on the IAM page, click the user, scroll down to Access keys, and click the Create access key button. Choose the first use case, Command Line Interface (CLI). Click Next twice and copy both keys. We'll now add them to the GitHub repository.
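I did all of this in the console, but the same setup could probably be scripted with the AWS CLI, roughly like this (the user name hugo-deployer is just an example, and this assumes you have an admin profile configured locally):

# Create the deploy user and attach the two managed policies
aws iam create-user --user-name hugo-deployer
aws iam attach-user-policy --user-name hugo-deployer --policy-arn arn:aws:iam::aws:policy/AmazonS3FullAccess
aws iam attach-user-policy --user-name hugo-deployer --policy-arn arn:aws:iam::aws:policy/CloudFrontFullAccess
# Generate the access key pair that we'll store in GitHub
aws iam create-access-key --user-name hugo-deployer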
Adding AWS user credentials to GitHub repository
Go to your repository, then Settings. On the left side you'll see Secrets and variables; expand it and click Actions.
Click New repository secret and paste the values. I've named them AWS_ACCESS_KEY_ID and AWS_SECRET_ACCESS_KEY.
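If you'd rather not paste secrets through the browser, the GitHub CLI can set them too; a quick sketch, assuming you're inside a clone of the repository and logged in with gh auth login:

# Each command prompts you to paste the corresponding value
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY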
One more thing we'll need is the AWS CloudFront distribution ID.
Getting AWS CloudFront Distribution ID
Go to CloudFront, find the distribution you're using, and copy its ID. Add it to the repository secrets from the previous step and name it AWS_CLOUDFRONT_DISTRIBUTION_ID.
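If you have more than one distribution and aren't sure which ID belongs to your site, the AWS CLI can list them alongside their domain names; a small sketch:

# Lists each distribution's ID next to its CloudFront domain name
aws cloudfront list-distributions --query "DistributionList.Items[].{Id: Id, Domain: DomainName}" --output table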
Setting up GitHub Actions
name: Deploy Hugo site to AWS S3

on:
  # Runs on pushes targeting the default branch
  push:
    branches: ["master"]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# Allow only one concurrent deployment, skipping runs queued between the run in-progress and latest queued.
# However, do NOT cancel in-progress runs as we want to allow these production deployments to complete.
concurrency:
  group: "pages"
  cancel-in-progress: false

# Default to bash
defaults:
  run:
    shell: bash

jobs:
  build:
    runs-on: ubuntu-latest
    env:
      HUGO_VERSION: 0.122.0
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_DEFAULT_REGION: your_aws_bucket_region
      AWS_BUCKET_NAME: your_bucket_name
      AWS_CLOUDFRONT_DISTRIBUTION_ID: ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }}
    steps:
      - name: Install Hugo CLI
        run: |
          wget -O ${{ runner.temp }}/hugo.deb https://github.com/gohugoio/hugo/releases/download/v${HUGO_VERSION}/hugo_extended_${HUGO_VERSION}_linux-amd64.deb \
          && sudo dpkg -i ${{ runner.temp }}/hugo.deb
      - name: Install AWS CLI
        uses: unfor19/install-aws-cli-action@v1
        with:
          version: 2     # default
          verbose: false # default
          arch: amd64    # allowed values: amd64, arm64
      - name: Checkout repository
        uses: actions/checkout@v4
        with:
          submodules: recursive
      - name: Build with Hugo
        env:
          # For maximum backward compatibility with Hugo modules
          HUGO_ENVIRONMENT: production
          HUGO_ENV: production
        run: hugo --minify --logLevel info
      - name: Upload files to S3
        run: aws s3 sync public/ s3://${{ env.AWS_BUCKET_NAME }}/ --exact-timestamps --delete
      - name: Invalidate AWS CloudFront cache
        run: aws cloudfront create-invalidation --distribution-id ${{ secrets.AWS_CLOUDFRONT_DISTRIBUTION_ID }} --paths "/*"
You can copy and paste this workflow as-is; just fill in your bucket name and region in the env block, and it should work without a problem.
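If you want to see what the sync would do before trusting the workflow with your bucket, you can run the same commands locally; aws s3 sync supports a dry run (the bucket name is a placeholder):

# Build locally, then show what would be uploaded/deleted without touching the bucket
hugo --minify
aws s3 sync public/ s3://your_bucket_name/ --exact-timestamps --delete --dryrun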
Problem
I can think of only one catch, and that is Git submodules. Hugo themes are often pulled in as submodules, but I don't do that; my theme lives directly in my repository. If you want submodules to be downloaded as well, configure the Checkout repository step like this:
- name: Checkout repository
  uses: actions/checkout@v3
  with:
    submodules: 'true'
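For reference, adding a theme as a submodule usually looks something like this (the Ananke theme from the Hugo quickstart is just an example):

git submodule add https://github.com/theNewDynamic/gohugo-theme-ananke.git themes/ananke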
Wrapping up
I find this GitHub Actions workflow fairly self-descriptive, and if you're not sure about the flags, just google them; most of them make sense. This is the most straightforward way I've found: we're not dependent on other people's Actions, and the only real dependency is the AWS CLI install Action and trusting its developer. And it does everything we need.
Additionally, you can add a cron schedule to the workflow if you publish once a week at the same time. Personally, I don't care about timing; I just publish whenever a post is ready.
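If you do want scheduled publishing, only the on: block changes; for example, to rebuild and deploy every Monday at 09:00 UTC (the time here is arbitrary):

on:
  push:
    branches: ["master"]
  schedule:
    # Rebuild and deploy every Monday at 09:00 UTC
    - cron: "0 9 * * 1"
  workflow_dispatch: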