10 Cool Tips for Writing Robust Bash Scripts | by André Müller | Apr, 2022

Photograph by Kristian Strand on Unsplash

Whether you are automating things locally on your own computer running Linux, macOS, or even Windows, or remotely, like build processes, cron jobs, and Docker builds, most developers will need scripts. However, it is often not easy to find out that a script failed, or where it failed. Therefore, it is of utmost importance to build robustness into scripts. Often, scripts have to be moved from one system, user, or container to another location and then suddenly fail.

When talking about scripts, we should distinguish two use cases:

  1. An interactive command-line interface (known as a REPL, a read-evaluate-print loop) to the operating system. See also [1].
  2. An automation script for compiling, deploying, updating, or doing other things.

For an interactive REPL, I would recommend zsh and the oh-my-zsh configuration framework. For automating things, I recommend using Bash, because you can get it working on many different systems and even inside containers. This article focuses on the latter use case.

The following Bash variants can be used: on Windows, git-bash, which effectively uses parts of MSYS2; on macOS, the built-in Bash (beware: it is usually an ancient release) or, better, an up-to-date version installed via Homebrew. All Linux variants usually come with Bash, or it can simply be installed.

The shebang is the first line of a script file, beginning with #!, which is read by the calling interpreter to know which executable to invoke. Many shell scripts use the shebang header #!/bin/sh. But beware: /bin/sh may be a symbolic link to a POSIX shell, Bash, Dash, or another shell. That depends on the current system's shell. Therefore, it is not a good idea to make your own scripts depend on /bin/sh unless they are part of the system! So place #!/bin/bash at the head of your file. However, some systems have Bash in /usr/bin/bash, and the above shebang will not work there. Another possibility is to use the /usr/bin/env tool, which searches for a binary along the current search $PATH and returns the first match. The shebang would then be #!/usr/bin/env bash. There is a heated discussion about whether this is a good idea or not. If you write system scripts, write the absolute path to the interpreter in the shebang line and do not use the env command.
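
As a small sketch, the first lines of a script using the env variant could look like this:

#!/usr/bin/env bash
# env resolves bash via $PATH (on macOS this can be the Homebrew Bash);
# for system scripts, prefer an absolute path such as #!/bin/bash instead
echo "Running with Bash $BASH_VERSION"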

Usually, you place scripts together with other files and further dependent scripts in a folder structure. I would expect the script to work with relative paths in order to find its dependent data relative to its own location. Therefore, you have to find out the current script's path. The first possibility is to use the readlink utility:

SCRIPT_PATH=$(dirname "$(readlink -f "${BASH_SOURCE[0]}")")

The -f switch canonicalizes the path and follows symbolic links.

Unfortunately, macOS does not ship with a GNU-compatible readlink supporting the -f switch, so you can use the following workaround (it works on all systems):

SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd -P)"

Alternatively, you can install coreutils with Homebrew on macOS. But then you need to add aliases to your ~/.profile that map readlink, id, uname, and env to the corresponding greadlink, gid, guname, and genv tools, like so:

# install with 'brew install coreutils'
if which greadlink > /dev/null ; then
    echo "Installing coreutils aliases"
    alias readlink=greadlink
    alias id=gid
    alias uname=guname
    alias env=genv
fi

There is a really heated Stack Overflow discussion with many crazy workarounds for this, using, among others, Python scripts and C programs. However, I would recommend the Homebrew or the system-independent solution from this section.

Bad things can happen when a command exits with a non-zero code and further commands are executed after it! Error checking is always essential. You want to know when your cron job failed!

Abort on Error

For that, activate the option set -o errexit (equivalent to set -e). This mode exits the Bash script whenever a command fails and is not explicitly checked with an if statement.

set -o errexit
ls /not/existent # --> exits the script with an error code

set -o errexit
# does not abort
if ls /not/existent ; then
    echo "exists"
else
    echo "does not exist"
fi

Show the Error Location

But now we only know from the return code that the script failed. It would be nice to find out where it failed. Therefore, we install a trap handler that shows the offending line:

#!/usr/bin/env bash
set -o errexit
set -o nounset

SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd -P)"

function die() {
    echo "ERROR $? IN ${BASH_SOURCE[0]} AT LINE ${BASH_LINENO[0]}" 1>&2
    exit 1
}

trap die ERR

Create a Stack Dump in Bash

On top of that, you can even create a stack dump in Bash. This can be interesting if you build more complex function hierarchies.

function log_stack() {
    local i=0
    local FRAMES=${#BASH_LINENO[@]}
    # FRAMES-2 skips main, the last entry in the arrays
    for ((i=FRAMES-2; i>=0; i--)); do
        echo "  File \"${BASH_SOURCE[i+1]}\", line ${BASH_LINENO[i]}, in ${FUNCNAME[i+1]}"
        # Grab the source code of the line
        #sed -n "${BASH_LINENO[i]}{s/^/    /;p}" "${BASH_SOURCE[i+1]}"
    done
    return 0
}

You can add a call to the log_stack() function inside the die() function like so:

function die() {
    echo "ERROR $? IN ${BASH_SOURCE[0]} AT LINE ${BASH_LINENO[0]}" 1>&2
    log_stack
    exit 1
}
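
A small usage sketch (the function names are made up), assuming die() and log_stack() are defined as above; adding set -o errtrace should let the ERR trap also fire inside functions, so the dump shows the whole call chain:

set -o errexit
set -o errtrace   # let functions inherit the ERR trap
trap die ERR

function inner() { ls /not/existent; }
function outer() { inner; }
outer   # die() reports the failing location, log_stack() prints the chain down to inner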

Disaster can strike when an undefined variable is used. For this, you can activate the set -o nounset option. Whenever you consume an undefined variable, the script terminates with an error. Sometimes you want to use environment variables as input to your script. In that case you certainly won't want a failure just because the variable is missing. To specify a default value, the following syntax can be used:

set -o nounset

# here we set VERBOSE to "0" if it is not defined!
VERBOSE="${VERBOSE:-0}"

if [[ "$VERBOSE" = "1" ]] ; then
    echo "Oh, verbose is on!"
fi

A common problem in scripts is passing arguments and paths that contain whitespace. So I recommend: always quote paths with " like:

inputFile="$1"
ls "$inputFile"

If you have an rm -rf command, it can be absolutely disastrous not to quote the argument. Compare rm -rf "/ my /file" and rm -rf / my /file. The latter will remove everything from your system (the unquoted arguments are /, my, and /file), while the former expression only removes the file from the my directory.
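
A short illustration with a variable holding a made-up path that contains a space:

target="/tmp/my dir/file.txt"
rm -rf "$target"    # removes exactly this one file
# rm -rf $target    # unquoted, this expands to two arguments: /tmp/my and dir/file.txt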

For more complex commands with many arguments, I recommend using Bash arrays to assemble the options, as in the following Docker example:

set -o errexit
set -o nounset

SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd -P)"
version="$(git -C "$SCRIPT_PATH" describe --tags --always)"
version="${version#v}"
commit="$(git -C "$SCRIPT_PATH" rev-parse HEAD)"
container="my-app"

## HERE we create an array of options
opts=()
opts+=(--tag "$container":latest)
opts+=(--tag "$container":"$version")
opts+=(--build-arg VERSION="$version")
opts+=(--build-arg COMMIT="$commit")
opts+=(--file "$SCRIPT_PATH/$container/Dockerfile")

docker build "${opts[@]}" .

Note: the call to docker requires the opts array to be quoted as "${opts[@]}".

When a for loop iterates over a globbing expression that matches nothing, the loop receives the literal pattern and, with errexit, the whole script can abort with an error. To make it more robust, you can activate null globbing. Then the for loop is simply not executed when the expression matches nothing (in this case, when no txt file exists in /tmp).

shopt -s nullglob
for f in /tmp/*.txt ; do
    echo "$f"
done

I recommend the following structure for simple command-line parsing in Bash scripts.

function usage() {
    cat <<EOF
Usage: $0 OPTIONS
Options:
  --version=N   sets the version
  -d            start in daemon mode
  --help, -h    this help
EOF
}

# parse arguments
version=""
daemon=0
args=()
while [[ $# -gt 0 ]] ; do
    case "$1" in
        -v=*|--version=*)
            version="${1#*=}"
            ;;
        -d|--daemon)
            daemon=1
            ;;
        -h|--help)
            usage
            exit 0
            ;;
        --)
            # forward all arguments after `--` to the command
            shift
            args+=("$@")
            break
            ;;
        *)
            echo "ERROR: unknown option $1"
            exit 1
            ;;
    esac
    shift
done

# OK, do something here with "${args[@]}"
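
For example, a call like the following (the script name is made up) sets the version, switches on daemon mode, and forwards everything after -- into the args array:

./my-script.sh --version=1.2.3 -d -- build --target release
# version=1.2.3, daemon=1, args=(build --target release)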

Before reaching for awk or tr, you can do simple string processing with Bash out of the box. A lesser known feature is the upper and lower case syntax.

Upper and Lower Case

Using the ^ character you can uppercase the first letter of a variable x with ${x^}, with ^^ all characters are uppercased as ${x^^}, and the same works for lower case using , and ,, (so don't forget that Bash is intuitive and beautiful when using that).

Lowercase Transform

x="HELLO"
echo "${x,,}"
# result: hello

Uppercase Transform

x="hello"
echo "${x^^}"
# result: HELLO

Obviously, a lot of problems can be avoided by using the ShellCheck tool, which can be integrated into your favorite IDE, e.g. Visual Studio Code.
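
If ShellCheck is installed on the command line as well, a quick check looks like this (the file names are just examples):

shellcheck deploy.sh    # reports issues such as unquoted variables
shellcheck ./*.sh       # check all scripts in the current directory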

I have also put together a small template for creating Bash scripts:
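
A minimal sketch assembling the snippets from the sections above (the work at the end is only a placeholder):

#!/usr/bin/env bash
set -o errexit    # abort when a command fails
set -o nounset    # abort when an undefined variable is used
shopt -s nullglob

SCRIPT_PATH="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd -P)"

function die() {
    echo "ERROR $? IN ${BASH_SOURCE[0]} AT LINE ${BASH_LINENO[0]}" 1>&2
    exit 1
}
trap die ERR

# ... your script logic goes here ...
echo "Running from $SCRIPT_PATH"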

I hope this article was useful for you.

Recommended Reading

[1] A. Müller, “Reasons why you should learn to use the command line,” Mar. 10, 2022. https://medium.com/@jumpingkiwi/reasons-why-you-should-learn-to-use-the-command-line-91823128c0ad (accessed Apr. 02, 2022).

[2] dylan, Pure bash bible. 2022.

[3] Why should I care about POSIX if I’m writing bash scripts? — Unix & Linux Stack Exchange
