How I Back Up My Ghost Blog to Cloud Storage

Do it in 5 steps

Photo by Siyuan Hu on Unsplash

I’ve been writing on my self-hosted Ghost blog for a while now. In case you’re wondering, this site is hosted on a DigitalOcean Droplet.

For the most part, I felt like I was doing something inconsequential that only meant so much to myself. Today, the site has grown to a size where it would feel like a hat-flying slap to my face if I were to lose all my content.

If you’re searching for the answer to “how do I back up my Ghost blog?” for your self-hosted Ghost blog, you’ve come to the right place.

TL;DR: How to back up a self-hosted Ghost blog to cloud storage like Google Drive

Getting started with Ghost is easy. You’ll typically pick between:

  • Ghost(Pro), the official managed hosting
  • Self-hosting it on your own server

I’d recommend anyone (especially non-developers) go with the managed version.

Yes, it’s comparatively more expensive; so is every managed service. However, it would most likely save you a bunch of headaches (and time) that come along with self-hosting any site:

  • Backups
  • Maintenance
  • Downtime recovery
  • Security, and so on.

In short, you’d sleep better at night.

On top of that, 100% of the revenue goes to funding the development of the open source project itself, which is a win-win.

“Uh, why are you self-hosting Ghost then?”

  1. Price: nothing beats the affordability of hosting on your own dedicated server
  2. Knowledge gain: I’ve learned a lot from hosting and managing my own VPS

Other perks of self-hosting include customizability, control, privacy, and so on, which are great, albeit not my primary reasons.

Most importantly, all of the self-hosting hassles above came to me as fun.

Until it isn’t, I suppose.

The pain of backing up Ghost

Setting up Ghost on DigitalOcean is as easy as a click of a button. Yet, there isn’t a proper built-in solution to back up your Ghost site.

From Ghost’s documentation, you can manually back up your Ghost site through Ghost Admin. Alternatively, you can use the ghost backup command.
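
For reference, the CLI route looks roughly like this (the exact output depends on your Ghost and Ghost-CLI versions); like other ghost commands, it has to be run from the directory where Ghost is installed:

$ cd /var/www/ghost
$ ghost backup

Note that this produces a backup file locally on the server; copying it somewhere safe is still up to you.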

Even so, there was no mention of database backups as of the time of writing this. I really wish they’d talk about this more.

Why Bash

Simplicity. Plus, Bash is great for command line interaction.

What are we backing up

Two things:

  • The Ghost content/ directory, which includes your site/blog content in JSON, member CSV exports, themes, images, and some configuration files
  • The MySQL database

In this article, we’re going to write a simple Bash script that does all of the following steps for us.

Assuming that we already have Rclone set up (there’s a short setup sketch right after the list below), here’s an overview of what our Bash script should cover:

  1. Optional: run requirement checks to make sure that the CLIs we need are installed, e.g. mysqldump, rclone, and so on.
  2. Back up the content/ folder where the Ghost posts are stored
  3. Back up our MySQL database
  4. Copy the backup files over to our cloud storage (e.g. Google Drive) using Rclone
  5. Optional: clean up the generated backup files
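
Since everything below assumes a working Rclone remote, here’s a minimal one-time setup sketch. The Google Drive backend and the remote name remote are assumptions that simply match what Step 4 uses:

$ rclone config       # interactive: choose "n" for a new remote, name it "remote", pick the "drive" backend
$ rclone lsd remote:  # sanity check: list the top-level directories of the remote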

Let’s create util.sh, which contains a set of helper functions for our backup script.

I like having timestamps printed on my logs, so:

#!/bin/bash
log() {
    echo "$(date -u): $1"
}

With this, we can now use log instead of echo to print text with a timestamp:

$ log 'Hola Jerry!'
Sun Jul 22 03:01:52 UTC 2022: Hola Jerry!

Next, we’ll create a utility function that helps to check if a command is installed:

# util.sh
# ...
check_command_installation() {
    if ! command -v "$1" &>/dev/null; then
        log "$1 is not installed"
        exit 0
    fi
}

We can use this function in Step 1 to make sure that we have ghost, mysqldump, and so on installed before we start our backup process. If a CLI is not installed, we simply log and exit.

The backup script

In this section, we’ll create a backup.sh file as our main backup Bash script.

To keep our code organized, we break the steps in the overview into individual functions.

Before we begin, we’ll need to declare some variables and source our util.sh so that we can use the utility functions that we defined earlier:

#!/bin/bash
# backup.sh
set -e

source util.sh

GHOST_DIR="/var/www/ghost/"
REMOTE_BACKUP_LOCATION="ghost_backups/"
TIMESTAMP=$(date +%Y_%m_%d_%H%M)
GHOST_CONTENT_BACKUP_FILENAME="ghost_content_$TIMESTAMP.tar.gz"
GHOST_MYSQL_BACKUP_FILENAME="ghost_mysql_$TIMESTAMP.sql.gz"
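
As a rough illustration (the timestamp below is made up), the date format string produces names like this:

$ date +%Y_%m_%d_%H%M
2022_07_22_0301

which gives backup files such as ghost_content_2022_07_22_0301.tar.gz and ghost_mysql_2022_07_22_0301.sql.gz.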

Step 1: Run checks

  • Check if the default /var/www/ghost directory exists. The ghost CLI can only be invoked inside a folder where Ghost was installed
  • Check if the CLIs required to run our backup are installed
# backup.sh
# ...
pre_backup_checks() {
    if [ ! -d "$GHOST_DIR" ]; then
        log "Ghost directory does not exist"
        exit 0
    fi
    log "Running pre-backup checks"
    cd $GHOST_DIR
    cli=("tar" "gzip" "mysql" "mysqldump" "ghost" "rclone")
    for c in "${cli[@]}"; do
        check_command_installation "$c"
    done
}

Step 2: Back up the content directory

  • Compress the content/ directory into a .tar.gz file
# backup.sh
# ...
backup_ghost_content() {
    log "Dumping Ghost content..."
    cd $GHOST_DIR
    tar -czf "$GHOST_CONTENT_BACKUP_FILENAME" content/
}
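
If you want to be sure the archive actually contains your posts, themes, and images, an optional spot check could look like this:

$ tar -tzf ghost_content_*.tar.gz | head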

Step 3: Back up the MySQL database

  • Fetch all the required database credentials (username, password, DB name) from the Ghost CLI
  • Run a check to make sure that we can connect to our MySQL database using the credentials above
  • Create a MySQL dump and compress it into a .gz file
# backup.sh
# ...
check_mysql_connection() {
    log "Checking MySQL connection..."
    if ! mysql -u"$mysql_user" -p"$mysql_password" -e ";" &>/dev/null; then
        log "Could not connect to MySQL"
        exit 0
    fi
    log "MySQL connection OK"
}
backup_mysql() {
    cd $GHOST_DIR
    mysql_user=$(ghost config get database.connection.user | tail -n1)
    mysql_password=$(ghost config get database.connection.password | tail -n1)
    mysql_database=$(ghost config get database.connection.database | tail -n1)
    check_mysql_connection
    mysqldump -u"$mysql_user" -p"$mysql_password" "$mysql_database" | gzip >"$GHOST_MYSQL_BACKUP_FILENAME"
}
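
Similarly, an optional way to confirm the dump isn’t empty before it gets shipped off:

$ zcat ghost_mysql_*.sql.gz | head -n 5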

Step 4: Copy the compressed backup files to cloud storage

# backup.sh
# ...
rclone_to_cloud_storage() {
    log "Rclone backup..."
    cd $GHOST_DIR
    rclone_remote_name="remote"
    rclone copy "$GHOST_DIR/$GHOST_CONTENT_BACKUP_FILENAME" "$rclone_remote_name:$REMOTE_BACKUP_LOCATION"
    rclone copy "$GHOST_DIR/$GHOST_MYSQL_BACKUP_FILENAME" "$rclone_remote_name:$REMOTE_BACKUP_LOCATION"
}

Step 5: Clean up the backup files

# backup.sh
# ...
clean_up() {
    log "Cleaning up old backups..."
    cd $GHOST_DIR
    rm -r "$GHOST_CONTENT_BACKUP_FILENAME"
    rm -r "$GHOST_MYSQL_BACKUP_FILENAME"
}

Lastly, we invoke all of the functions defined for Steps 1–5.

# At the end of backup.sh
# ...
log "Welcome to Wraith"
pre_backup_checks
backup_ghost_content
backup_mysql
rclone_to_cloud_storage
clean_up
log "Accomplished backup to $REMOTE_BACKUP_LOCATION"

And… we’re done!

The final code

You can find the code at github.com/ngshiheng/wraith.

To use this project directly:

  1. SSH into the VPS where you host your Ghost site
  2. Install Rclone (important)
  3. Clone this repository
  4. Run ./backup.sh from the wraith directory
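
Put together, it could look something like this on the VPS (assuming git is installed and your Rclone remote is already configured):

$ git clone https://github.com/ngshiheng/wraith.git
$ cd wraith
$ ./backup.sh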

I despise doing manual maintenance and administrative tasks. Let’s schedule a recurring backup for our Ghost site to ease our pain using Crontab:

  1. Run crontab -e
  2. For example, you can run a backup at 5 a.m. every Monday with:
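
A crontab entry for that schedule could look like this (the path to your wraith checkout is a placeholder):

0 5 * * 1 cd /path/to/wraith && ./backup.sh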

Do take the timezone into account when you set your Cron schedule.
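
If you’re unsure, on a systemd-based droplet (e.g. stock Ubuntu) you can check which timezone cron will use with:

$ timedatectl | grep "Time zone"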
