Bash scripting

Shell scripting and system administration

Some useful programs for use in shell scripts and general system administration tasks:

* grep & egrep
* expand
* cut
* strings
* file
* col
* uniq
* dirname, basename
* seq
* user{add,mod,del}
* sed & awk
* sudo
* touch
* ldd
* rsync
* screen
* valgrind & cachegrind

Bash scripting idioms

* Do something to all matching files in a directory:

    for i in *.txt; do
        echo "This is file ${i}"
    done

* Do something to all matching files in a directory structure:

    for i in $( find ./dir1 -type f -name "*.txt" ); do
        echo "This is file ${i}"
    done

* Iterate over a number sequence:

    for i in $( seq 1 9 ); do
        echo "Count ${i}"
    done

* Strip prefixes and substitute suffixes on filenames:

    filename="img_foo.JPG"
    # Remove the "img_" prefix
    newname=${filename#img_}
    # Remove the ".JPG" suffix and replace it with ".jpg"
    newname=${newname%.JPG}.jpg

  Sadly, I don't know a way to strip both the prefix and the suffix in one
  command without using sed or some similar tool. If you know how to do it
  purely within the shell, then I'd like to know!
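For the record, the sed route mentioned above might look like the following sketch (using the example `img_`/`.JPG` names from above), with both rewrites done in a single sed invocation:

```shell
# Rewrite prefix and suffix in one command using sed, since plain shell
# parameter expansion can't nest the two operations in a single step.
filename="img_foo.JPG"
newname=$(printf '%s\n' "$filename" | sed 's/^img_//; s/\.JPG$/.jpg/')
echo "$newname"    # → foo.jpg
```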

General sh-type script tips

* Use $( foo ) notation for command substitution rather than backticks
  (`foo`): it's more robust against nesting and, not unimportantly, emacs
  does better syntax highlighting with the bracket notation.

* Use #! /usr/bin/env bash rather than a direct #! /bin/bash "shebang"
  incantation at the top of your script files. This goes for any
  interpreter, actually: the env program runs the script in the user's
  environment, thus avoiding the issues involved with specifying an
  absolute path when the system has an inadequate copy of the interpreter
  at that location. Sweet :-)

* Always quote dereferenced variables if there's a chance they might be
  null, otherwise the resulting behaviour may be very different. For
  example:

    $ test -n ""; echo $?
    1
    $ test -n "fds"; echo $?
    0
    $ test -n ; echo $?
    0

  So you can see that the test command, used here to test for non-zero
  length strings, returns the same value (0) for a real non-zero length
  string ("fds") as for a missing argument, even though you would probably
  expect a null argument to be treated as having zero length. So be
  careful!

* Use code blocks, test statements and &&/|| conditional operators.
  Remember, code blocks are defined with curly braces, e.g.

    dosomething || { echo "Oops, it's all gone wrong" 1>&2; exit 1; }

  The last semicolon before the right-hand curly brace is essential if you
  want to have the last statement and the closing brace on the same line.
  Note that if you want to implement binary conditionals (i.e. "do this if
  the return value = 0, or do that if the return value != 0"
  constructions) using these operators, you may run into trouble. If the
  "or" block (as above) doesn't explicitly exit or otherwise return false,
  the return value of the last statement in the "or" block will feed into
  the corresponding "and" block, e.g.

    $ echo foo | grep baz || { echo "Oops" 1>&2; } && { echo "This probably isn't what you want"; }
    Oops
    This probably isn't what you want
    $ echo foo | grep baz || { echo "Oops" 1>&2; false; } && { echo "This probably isn't what you want"; }
    Oops

  where the second statement alleviates the problem by forcing the final
  return value of the "or" block to be 1 (false), ensuring that the "and"
  block will not be executed.

* Design the options passed to scripts carefully: provide a mechanism to
  negate any option, and process the options in order, so that options can
  be enabled and disabled on the same command line (this allows users to
  override shell aliases). If batch processing of e.g. many files is
  possible or likely, the CLI should be designed so that it is done by
  simply specifying all the files to be operated on as arguments to the
  script. That way the operation can be easily batched by looping over the
  arguments after option removal, and the files to process can be easily
  specified by command substitution on the command line or in a shell
  script, e.g.

    imgresize --maxheight=800 $(ls *.jpg)
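A minimal sketch of that option style (the --verbose/--noverbose flag names and the simulated command line are hypothetical, not from any real tool): each option has a negated form, options are processed in order so later ones win, and everything left over is treated as a file to operate on.

```shell
#!/usr/bin/env bash
# Hypothetical option-processing sketch: --verbose/--noverbose toggle a
# flag (later occurrences override earlier ones, so a user can negate an
# option baked into a shell alias), and all remaining arguments are
# collected as files to process.
set -- --noverbose --verbose img1.jpg img2.jpg   # simulated command line

verbose=0
files=()
for arg in "$@"; do
    case $arg in
        --verbose)   verbose=1 ;;
        --noverbose) verbose=0 ;;
        *)           files+=("$arg") ;;
    esac
done

for f in "${files[@]}"; do
    [ "$verbose" -eq 1 ] && echo "Processing ${f}"
    # ... real work on "$f" would go here ...
done
```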

Portable sh-type scripts

A while ago, I was responsible for a rather complex script which had to be sourced (rather than just executed) by bash, sh, ksh and zsh shells, and it turned out to be a bit of a nightmare to make it compliant. I'll put some pointers here as I remember them. More recently, I have been doing a lot of work using the GNU automake and autoconf packages. For maximum portability across systems, again only a very portable base of sh constructs is allowed: the 1977 subset of sh, if I remember rightly! The autoconf documentation has a comprehensive section on writing portable sh scripts in the context of autoconf M4 macros.

* If the script is to be sourced then you can fall victim to personal
  aliases. For example, if the user's .bashrc defines an alias ls="ls -h"
  (which makes the file sizes reported by ls "human-readable"), then a
  sourced script which tries to use ls and cut to extract file size
  information will fail. You can avoid this by only ever calling utility
  programs with their full path: $( which ls ) -l (or using backticks
  instead of the $(foo) construction) will do the trick.

* Don't use set to set positional parameters if you can help it: use
  arrays or some other method instead. Why? Well, set has massively
  different behaviour depending on whether or not it is followed by
  arguments. If it is, then that set of space-separated arguments is
  assigned to the positional parameters (i.e. $1, $2, ...) of the current
  context. But if there are no arguments, then set spews out the whole
  environment definition of the shell. The problem is that there's no
  difference between the case where you genuinely call set with no
  arguments and the case where an automatically generated argument list is
  empty: the latter results in a dramatic and confusing failure of the
  script!

* Use functions to break up large scripts: they are portable! Be careful
  with the syntax used in for [...]; do [...]; done loops: it's quite
  fragile.

* Remember to unset any variables or functions which are just used
  internally. Unless adding to the environment is intentional, try to
  leave it as it was when you started. A handy check goes like so:

    $ env | wc -l
    $ source thescript.sh
    $ env | wc -l

  Does the number of lines change after sourcing the script? If so, you're
  affecting the shell environment.
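To make the set pitfall concrete, here's a small demonstration (assuming a bash-style shell): with an empty word list, a bare set would dump the shell's variable definitions, whereas set -- explicitly assigns the positional parameters, even when the list is empty.

```shell
words=""
# A bare `set $words` with an empty list would print the shell's whole
# variable set instead of clearing the positional parameters; `set --`
# makes the intent (assign positional parameters) unambiguous.
set -- $words
echo "argument count after empty list: $#"    # → 0

words="one two three"
set -- $words    # deliberate word-splitting into $1, $2, $3
echo "first argument: $1"                     # → one
```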