Bash Pitfalls

This page shows common errors that Bash programmers make. The following examples are all flawed in some way:

1. for i in `ls *.mp3`

One of the most common mistakes Bash programmers make is to write a loop like this:

  •  for i in `ls *.mp3`; do     # Wrong!
        some command $i          # Wrong!
     done

This breaks when the user has a file with a space in its name. Why? Because the output of the ls *.mp3 command substitution undergoes word splitting. Assuming we have a file named 01 - Don't Eat the Yellow Snow.mp3 in the current directory, the for loop will iterate over each word in the resulting file name (namely: "01", "-", "Don't", "Eat", and so on).

You can't double-quote the substitution either:

  •  for i in "`ls *.mp3`"; do   # Wrong!
     ...

This causes the entire output of the ls command to be treated as a single word, and instead of iterating over each file name in the output list, the loop will only execute once, with i taking on a single value that contains all the file names at once (separated by newlines, since ls prints one name per line when its output is not a terminal).

In addition to this, the use of ls is just plain unnecessary. It's an external command, which simply isn't needed to do the job. So, what's the right way to do it?

  •  for i in *.mp3; do         # Better!  But...
       some command "$i"        # ... see Pitfall #2 for more info.
     done

Let Bash expand the list of filenames for you. The expansion will not be subject to word splitting. Each filename that's matched by the *.mp3 pattern will be treated as a separate word, and the loop will iterate once per file name.
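
One thing to watch out for: if no *.mp3 files exist, the pattern is not removed, and the loop runs once with i set to the literal string *.mp3. A simple guard (just one option; bash's nullglob setting is another) is to skip names that don't actually exist:

  •  for i in *.mp3; do
       [ -e "$i" ] || continue   # skip the unexpanded pattern when nothing matched
       some command "$i"
     done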

For more details on this question, please see [:BashFAQ#faq20:Bash FAQ #20].

The astute reader will notice the double quotes in the second line. This leads to our second common pitfall.

2. cp $file $target

What's wrong with the command shown above? Well, nothing, if you happen to know in advance that $file and $target have no white space or wildcards in them.

But if you don't know that in advance, or if you're paranoid, or if you're just trying to develop good habits, then you should quote your variable references to avoid having them undergo word splitting.

  •  cp "$file" "$target"

Without the double quotes, you'll get a command like cp 01 - Don't Eat the Yellow Snow.mp3 /mnt/usb and then you'll get errors like cp: cannot stat `01': No such file or directory. If $file has wildcards in it (* or ? or [...]), they will be expanded if there are files that match them. With the double quotes, all's well, unless "$file" happens to start with a -, in which case cp thinks you're trying to feed it command line options. This isn't really a shell problem, but it often occurs with shell variables.

One solution is to insert -- between cp and its arguments. That tells it to stop scanning for options, and all is well:

  •  cp -- "$file" "$target"

(There may be some incredibly ancient systems in existence on which the -- trick doesn't work. For those, read on....)

Another is to ensure that your filenames always begin with a directory (including . for the current directory, if appropriate). For example, if we're in some sort of loop:

  •  for i in ./*.mp3; do
       cp "$i" /target
       ...

In this case, even if we have a file whose name begins with -, the glob will ensure that the variable always contains something like ./-foo.mp3, which is perfectly safe as far as cp is concerned.

3. [ $foo = "bar" ]

This is very similar to the first part of the previous pitfall, but I repeat it because it's so important. In the example above, the quotes are in the wrong place. You do not need to quote a string literal in bash. But you should quote your variables if you aren't sure whether they could contain white space or wildcards.

This breaks for two reasons:

  • If a variable referenced in [ does not exist, or is blank, then the [ command would see the line:

    •   [ $foo = "bar" ]

    .. as:

    •   [ = "bar" ]

    .. and throw the error unary operator expected. (The = operator is binary, not unary, so the [ command is rather shocked to see it there.)

  • If the variable contains internal whitespace, then it's split into separate words, before the [ command sees it. Thus:

    •   [ multiple words here = "bar" ]

    While that may look OK to you, it's a syntax error as far as [ is concerned.

A more correct way to write this would be:

  •  [ "$foo" = bar ]       # Pretty close!

But this still breaks if $foo begins with a -.

In bash, the [[ keyword, which embraces and extends the old test command (also known as [), can be used to solve the problem:

  •  [[ $foo = bar ]]       # Right!

You don't need to quote variable references within [[ ]] because they don't undergo word splitting, and even blank variables will be handled correctly. On the other hand, quoting them won't hurt anything either.

You may have seen code like this:

  •   [ x"$foo" = xbar ]    # Also right!

The x"$foo" hack is required for code that must run on ancient shells which lack [[, because if $foo begins with a -, then the [ command may become confused. But you'll get really tired of having to explain that to everyone else.

If the right hand side is a constant, you could just do it this way:

  •   [ bar = "$foo" ]      # Also right!

[ doesn't care whether the token on the right hand side of the = begins with a -. It just uses it literally.

4. cd `dirname "$f"`

This is mostly the same problem. Like variable expansion, the result of backtick expansion undergoes word splitting and filename expansion. So you should quote it:

  •   cd "`dirname "$f"`" 

What's not obvious here is how the quotes nest. A C programmer reading this would expect the first and second double-quotes to be grouped together; and then the third and fourth. But that's not the case in Bash. Bash treats the double-quotes inside the command substitution as one pair; and the double-quotes outside the substitution as another pair.

Another way of writing this: the parser treats the backticks as a "nesting level", and the quotes inside it are separate from the quotes outside it.

The same thing works if we use the [:BashFAQ#faq82:preferred] $() syntax, too:

  •   cd "$(dirname "$f")"

Quotes inside $() are grouped together.

5. [ "$foo" = bar && "$bar" = foo ]

You can't use && inside the old test (or [) command. The Bash parser sees && outside of [[ ]] or (( )) and breaks your command into two commands, before and after the &&. Use one of these instead:

  •  [ bar = "$foo" -a foo = "$bar" ]       # Right!
     [ bar = "$foo" ] && [ foo = "$bar" ]   # Also right!
     [[ $foo = bar && $bar = foo ]]         # Also right!

(Note that we reversed the constant and the variable inside [ for the reasons discussed in the previous pitfall.)

The same thing applies to ||. Use [[, or use -o, or use two [ commands.
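
For example, the || equivalents of the three forms above:

  •  [ bar = "$foo" -o foo = "$bar" ]        # Right!
     [ bar = "$foo" ] || [ foo = "$bar" ]    # Also right!
     [[ $foo = bar || $bar = foo ]]          # Also right!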

6. [[ $foo > 7 ]]

The [[ ]] operator is not used for an ArithmeticExpression. It's used for strings only. If you want to do a numeric comparison using > or <, you must use (( )) instead:

  •  ((foo > 7))                            # Right!

If you use the > operator inside [[ ]], it's treated as a string comparison, not an integer comparison. This may work sometimes, but it will fail when you least expect it. If you use > inside [ ], it's even worse: it's an output redirection. You'll get a file named 7 in your directory, and the test will succeed as long as $foo is not empty.

If you're developing for a BourneShell instead of bash, this is the historically correct version:

  •  [ $foo -gt 7 ]                          # Also right!

Note that the test ... -gt command will fail in interesting ways if $foo is not an integer. Therefore, there's not much point in quoting it properly -- if it's got white space, or is empty, or is anything other than an integer, we're probably going to crash anyway. You'll need to sanitize your input aggressively.

The double brackets support this syntax too:

  •  [[ $foo -gt 7 ]]                        # Also right!
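
One way to do that sanitizing, sketched here with a bash pattern match (other checks are certainly possible), is to reject anything that is empty or contains a non-digit before comparing:

  •  if [[ $foo && $foo != *[!0-9]* ]]; then   # non-empty, digits only
       ((foo > 7)) && echo "foo is greater than 7"
     else
       echo "foo is not a non-negative integer" >&2
     fi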

7. grep foo bar | while read line; do ((count++)); done

The code above looks OK at first glance, doesn't it? Sure, it's just a poor implementation of grep -c, but it's intended as a simplistic example. So why doesn't it work? The variable count will be unchanged after the loop terminates, much to the surprise of Bash developers everywhere.

The reason this code does not work as expected is because each command in a pipeline is executed in a separate subshell. The changes to the count variable within the loop's subshell aren't reflected within the parent shell (the script in which the code occurs).

For solutions to this, please see [:BashFAQ#faq24:Bash FAQ #24].
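
One common workaround is to avoid the pipe entirely. In bash, process substitution feeds the loop through a redirection instead, so the while loop runs in the current shell and count keeps its value; a minimal sketch:

  •  count=0
     while read -r line; do
       ((count++))
     done < <(grep foo bar)
     echo "$count"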

8. if [grep foo myfile]

Many people are confused by the common practice of putting the [ command after an if. They see this and convince themselves that the [ is part of the if statement's syntax, just like parentheses are used in C's if statement.

However, that is not the case! [ is a command, not a syntax marker for the if statement. It's equivalent to the test command, except for the requirement that the final argument must be a ].

The syntax of the if statement is as follows:

  •  if COMMANDS
     then
       COMMANDS
     elif COMMANDS     # optional
     then
       COMMANDS
     else              # optional
       COMMANDS
     fi

There may be zero or more optional elif sections, and one optional else section. Note: there is no [ in the syntax!

Once again, [ is a command. It takes arguments, and it produces an exit code. It may produce error messages. It does not, however, produce any standard output.

The if statement evaluates the first set of COMMANDS that are given to it (everything up to the then keyword, which must appear as the first word of a command). The exit code of the last command from that set determines whether the if statement will execute the COMMANDS in the then section, or move on.

If you want to make a decision based on the output of a grep command, you do not need to enclose it in parentheses, brackets, backticks, or any other syntax mark-up! Just use grep as the COMMANDS after the if, like this:

  •  if grep foo myfile >/dev/null; then
     ...
     fi

Note that we discard the standard output of the grep (which would normally include the matching line, if any), because we don't want to see it -- we just want to know whether it's there. If the grep matches a line from myfile, then the exit code will be 0 (true), and the then clause will be executed. Otherwise, if there is no matching line, the grep should return a non-zero exit code.
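
If your grep supports the -q option (it is specified by POSIX, though some very old implementations may lack it), you can use it instead of the redirection; it suppresses the output for you:

  •  if grep -q foo myfile; then
     ...
     fi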

9. if [bar="$foo"]

As with the previous example, [ is a command. Just like with any other command, Bash expects the command to be followed by a space, then the first argument, then another space, etc. You can't just run things all together without putting the spaces in! Here is the correct way:

  •  if [ bar = "$foo" ]

Each of bar, =, "$foo" (after substitution, but without word splitting) and ] is a separate argument to the [ command. There must be whitespace between each pair of arguments, so the shell knows where each argument begins and ends.

10. if [ [ a = b ] && [ c = d ] ]

Here we go again. [ is a command. It is not a syntactic marker that sits between if and some sort of C-like "condition". Nor is it used for grouping. You cannot take C-like if commands and translate them into Bash commands just by replacing parentheses with square brackets!

If you want to express a compound conditional, do this:

  •  if [ a = b ] && [ c = d ]

Note that here we have two commands after the if, joined by an && operator (see the documentation if you don't know what that does). It's precisely the same as:

  •  if test a = b && test c = d

If the first test command returns false, then the body of the if statement is not entered. If it returns true, then the second test command is run; and if that one also returns true, then the body of the if statement will be entered.

11. cat file | sed s/foo/bar/ > file

You cannot read from a file and write to it in the same pipeline. Depending on what your pipeline does, the file may be clobbered (to 0 bytes, or possibly to a number of bytes equal to the size of your operating system's pipeline buffer), or it may grow until it fills the available disk space, or reaches your operating system's file size limitation, or your quota, etc.

If you want to make a change to a file, other than appending to the end of it, there must be a temporary file created at some point. For example, the following is completely portable:

  •  sed 's/foo/bar/g' file > tmpfile && mv tmpfile file
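
If you'd rather not hard-code the temporary file's name, mktemp can generate one for you (mktemp is widely available on GNU and BSD systems, though it is not part of strict POSIX); a sketch:

  •  tmpfile=$(mktemp "${TMPDIR:-/tmp}/sedtmp.XXXXXX") &&
     sed 's/foo/bar/g' file > "$tmpfile" &&
     mv "$tmpfile" file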

The following will only work on GNU sed 4.x:

  •  sed -i 's/foo/bar/g' file(s)

Note that this also creates a temporary file, and does the same sort of renaming trickery -- it just handles it transparently.

And the following equivalent command requires perl 5.x (which is probably more widely available than GNU sed 4.x):

  •  perl -pi -e 's/foo/bar/g' file(s)

For more details, please see [:BashFAQ#faq21:Bash FAQ #21].

12. echo $foo

This relatively innocent-looking command causes massive confusion. Because the $foo isn't quoted, it will not only be subject to word splitting, but also file globbing. This misleads Bash programmers into thinking their variables contain the wrong values, when in fact the variables are OK -- it's just the echo that's messing up their view of what's happening.

  •  MSG="Please enter a file name of the form *.zip"
     echo $MSG

This message is split into words and any globs are expanded, such as the *.zip. What will your users think when they see this message:

  •  Please enter a file name of the form freenfss.zip lw35nfss.zip

To demonstrate:

  •  VAR=*.zip       # VAR contains an asterisk, a period, and the word "zip"
     echo "$VAR"     # writes *.zip
     echo $VAR       # writes the list of files which end with .zip

13. $foo=bar

No, you don't assign a variable by putting a $ in front of the variable name. This isn't perl.
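
The $ is used only when reading a variable's value, never when assigning one:

  •  foo=bar          # Right: no $, and no spaces around the =
     echo "$foo"      # The $ appears when the value is read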

14. foo = bar

No, you can't put spaces around the = when assigning to a variable. This isn't C. When you write foo = bar the shell splits it into three words. The first word, foo, is taken as the command name. The second and third become the arguments to that command.

Likewise, the following are also wrong:

  •   foo= bar    # WRONG!
      foo =bar    # WRONG!
      $foo = bar; # COMPLETELY WRONG!
    
      foo=bar     # Right.

15. echo <<EOF

A here document is a useful tool for embedding large blocks of textual data in a script. It causes a redirection of the lines of text in the script to the standard input of a command. Unfortunately, echo is not a command which reads from stdin.

  •   # This is wrong:
      echo <<EOF
      Hello world
      EOF
    
      # This is right:
      cat <<EOF
      Hello world
      EOF

16. su -c 'some command'

This syntax is almost correct. The problem is, su takes a -c argument, but it's not the one you want. You want to pass -c 'some command' to a shell, which means you need a username before the -c.

  •   su root -c 'some command'   # Now it's right.

su assumes a username of root when you omit one, but this falls on its face when you want to pass a command to the shell afterward. You must supply the username in this case.

17. cd /foo; bar

If you don't check for errors from the cd command, you might end up executing bar in the wrong place. This could be a major disaster, if for example bar happens to be rm *.

You must always check for errors from a cd command. The simplest way to do that is:

  •   cd /foo && bar

If there's more than just one command after the cd, you might prefer this:

  •   cd /foo || exit 1
      bar
      baz
      bat ... # Lots of commands.

cd will report the failure to change directories, with a stderr message such as "bash: cd: /foo: No such file or directory". If you want to add your own message in stdout, however, you could use command grouping:

  •   cd /net || { echo "Can't read /net.  Make sure you've logged in to the Samba network, and try again."; exit 1; }
      do_stuff
      more_stuff

Note there's a required space between "{" and "echo".

Some people also like to enable set -e to make their scripts abort on any command that returns non-zero, but this can be rather tricky to use correctly (since many common commands may return a non-zero for a warning condition, which you may not want to treat as fatal).

By the way, if you're changing directories a lot in a Bash script, be sure to read the Bash manual page on pushd, popd, and dirs. Perhaps all that code you wrote to manage cd's and pwd's is completely unnecessary.
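
As a quick taste of what those builtins do (just a sketch; see the manual for details), pushd saves the current directory on a stack and changes to a new one, and popd returns to it:

  •  pushd /tmp > /dev/null || exit 1   # remember where we were, then cd to /tmp
     do_stuff
     popd > /dev/null                   # go back to the remembered directory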

Speaking of which, compare this:

  •   find ... -type d | while read subdir; do
        cd "$subdir" && whatever && ... && cd -
      done

With this:

  •   find ... -type d | while read subdir; do
        (cd "$subdir" && whatever && ...)
      done

Forcing a subshell here causes the cd to occur only in the subshell; for the next iteration of the loop, we're back to our normal location, regardless of whether the cd succeeded or failed. We don't have to change back manually. In fact, the version using cd - isn't even correct -- if one of the whatever commands fails, we might not cd back to where we need to be. To correct it without using the subshell, we'd have to arrange to execute some sort of cd "$ORIGINAL_DIR" command within each loop iteration. It would be frightfully messy.

The subshell version is much simpler and cleaner.

18. [ bar == "$foo" ]

The == operator is not valid for the [ command. Use = instead, or use the [[ keyword instead.

  •   [ bar = "$foo" ] && echo yes
      [[ bar == $foo ]] && echo yes

19. for i in {1..10}; do ./something &; done

You cannot put a ; immediately after an &. Just remove the extraneous ; entirely.

  •   for i in {1..10}; do ./something & done

& already functions as a command terminator, just like ; does. And you cannot mix the two.

20. On UTF-8 and Byte-Order Marks (BOM)

In general: UTF-8 has only one possible byte order, since its code unit is a single byte. While the presence of a BOM would not normally damage a UTF-8 document, it is entirely superfluous.

In shell scripting: 'Where UTF-8 is used transparently in 8-bit environments, the use of a BOM will interfere with any protocol or file format that expects specific ASCII characters at the beginning, such as the use of "#!" at the beginning of Unix shell scripts.' http://unicode.org/faq/utf_bom.html#29
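
If a script begins with a UTF-8 BOM (the three bytes EF BB BF), the "#!" line won't be recognized and the script may fail with confusing errors. One way to check for and remove the BOM from your script file (called script here) -- this sketch assumes GNU coreutils and GNU sed with its \xNN escapes:

  •  head -c 3 script | od -An -tx1           # prints " ef bb bf" if a BOM is present
     sed -i '1s/^\xEF\xBB\xBF//' script       # GNU sed: strip the BOM in place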
