27

I have a shell script with set -x to have verbose/debug output:

#!/bin/bash

set -x
command1
command2
...

The output looks like this:

+ command1
whatever output from command1
+ command2
whatever output from command2

My problem is that the shell trace (caused by set -x) goes to stderr, mixed with the stderr of the commands (command1, command2, ...). I would be happy to have the "normal" output on the screen (as if the script ran without set -x) and the "extra" output of bash separately in a file.

So I would like to have this on the screen:

whatever output from command1
whatever output from command2

and this in a log file:

+ command1
+ command2

(also fine if the log file has everything together)

The set -x 2> file obviously doesn't have the desired effect, because the trace is not the output of the set command itself; set -x just changes the behaviour of bash.

Using bash 2> file for the entire script doesn't do the right thing either, because it also redirects the stderr of every command run in this shell, so I no longer see the error messages of the commands.
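A minimal reproduction of the problem described above (the script and file names are illustrative):

```shell
# Write a tiny demo script (demo.sh is an illustrative name):
cat > demo.sh <<'EOF'
set -x
echo "normal output"
ls /nonexistent-directory || true   # produces a "real" error on stderr
EOF

# Redirecting the whole script's stderr does capture the trace,
# but the error message from ls lands in logfile as well:
bash demo.sh > screen.txt 2> logfile
```

screen.txt ends up with only the normal output, while logfile receives both the trace lines and the ls error message - which is exactly what makes real errors hard to spot.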

redseven
    My google-fu seems to be strong this morning: [Send bash -x output to logfile without interupting standard output](http://serverfault.com/a/579078) – steeldriver Aug 12 '16 at 12:52

4 Answers

33

Based on this ServerFault answer, Send bash -x output to logfile without interupting standard output, modern versions of bash provide a BASH_XTRACEFD variable specifically for specifying an alternate file descriptor for the output of set -x.

So for example you can do

#!/bin/bash

exec 19>logfile
BASH_XTRACEFD=19

set -x
command1
command2
...

to send the output of set -x to file logfile while preserving regular standard output and standard error streams for the following commands.

Note that the choice of fd 19 is arbitrary - it just needs to be an available descriptor (i.e. not 0, 1, 2, or another number you have already allocated).
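A quick way to verify that BASH_XTRACEFD really separates the three streams (trace-demo.sh and the file names are illustrative):

```shell
# Write a script that uses BASH_XTRACEFD, then check where
# each stream ends up.
cat > trace-demo.sh <<'EOF'
#!/bin/bash
exec 19> xtrace.log
BASH_XTRACEFD=19
set -x
echo "to stdout"
echo "to stderr" >&2
EOF

bash trace-demo.sh > out.txt 2> err.txt
```

out.txt contains only "to stdout", err.txt only "to stderr", and xtrace.log the '+ echo ...' trace lines.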

steeldriver
  • It indeed saves the bash trace log separately; however, it makes it really hard to read the two outputs (stdout+stderr on the screen and the bash trace in the log file), as they are completely out of sync. See [the solution I've just posted](https://askubuntu.com/a/1001404/313243). – redseven Jan 30 '18 at 15:28
8

After more than a year I've found the right solution to have both the "normal" output (stdout + stderr, without the bash trace) on the screen and everything together (stdout + stderr + bash trace) in a file (bash.log):

exec   > >(tee -ia bash.log)
exec  2> >(tee -ia bash.log >& 2)
exec 19> bash.log

export BASH_XTRACEFD="19"
set -x

command1
command2
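A sketch to check that the combination above behaves as described (script and file names are illustrative; the sleep gives the background tee processes time to flush):

```shell
cat > combo-demo.sh <<'EOF'
#!/bin/bash
exec   > >(tee -ia bash.log)
exec  2> >(tee -ia bash.log >& 2)
exec 19> bash.log

export BASH_XTRACEFD="19"
set -x

echo "stdout line"
echo "stderr line" >&2
EOF

rm -f bash.log
bash combo-demo.sh > out.txt 2> err.txt
sleep 1   # let the background tee processes flush
```

out.txt and err.txt carry only the commands' own stdout and stderr, while bash.log also receives the '+ echo ...' trace lines via fd 19.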
redseven
  • That is just the combination of steeldriver's answer and [this one](https://stackoverflow.com/a/11886837/4414935). – jarno Oct 27 '19 at 03:27
  • Doesn't work for me; the bash trace is not displayed on the screen, but the log file contains everything needed. – Liso Mar 08 '21 at 02:44
  • @Liso That's exactly how it has to work. On the screen you have the stdout and stderr just like before and in the log file you have both of them plus the trace log. – redseven Mar 13 '21 at 19:25
  • @redseven Well I originally wanted them all to appear on the screen and on the log. – Liso Mar 14 '21 at 09:06
  • @Liso That's very easy and doesn't require any of the tricks discussed here. You simply turn on tracing, then redirect all the outputs of your script to a `tee`... – redseven Mar 15 '21 at 09:21
6

Steeldriver gave you one approach. Alternatively, you can simply redirect STDERR to a file:

script.sh 2> logfile

That, however, means that both the output created by the set -x option and any other error messages produced will go to the file. Steeldriver's solution redirects only the set -x output, which is probably what you want.

terdon
  • "...both the output created by the set -x option and any other error messages produced will go to the file." And that's why it doesn't work for me. My main issue is that I don't easily see "real errors", because all this bash output goes to stderr. Redirecting the commands' error messages would also hide the "real errors" in a different way. – redseven Oct 25 '16 at 15:09
  • @redseven I'm afraid what you're asking for isn't very clear then. Could you [edit] your question and clarify? Try avoiding the use of the term "output" for anything that isn't going to stdout. Do you want to separate i) normal output; ii) any errors thrown by your command and iii) set -x? Isn't steeldriver's answer enough then? If not, show us a simple script that we can copy and tell us how you'd want it to behave. – terdon Oct 25 '16 at 15:42
  • @steeldriver has already perfectly answered the question. – redseven Oct 28 '16 at 12:38
  • @redseven ah, cool then. Since your comment came so much later, I thought you still needed something and I couldn't see how steeldriver's answer failed to solve your issue. Glad it's all sorted then. – terdon Oct 28 '16 at 12:41
3

Automatic File Descriptor

To improve slightly on the accepted answer (which I'll otherwise keep intact): Bash 4.1+ supports automatic file descriptor allocation, exec {any_var}>file, which assigns the lowest available file descriptor to the named variable - no need to hard-code one.

exec  1> >(tee -ia bash.log)
exec  2> >(tee -ia bash.log >& 2)

# Notice no leading $
exec {FD}> bash.log

# If you want to append instead of wiping previous logs
exec {FD}>> bash.log

export BASH_XTRACEFD="$FD"
set -x

# Just for fun, add this to show the file descriptors in this context
# and see the FD to your bash.log file
filan -s

command1
command2
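A quick sketch to confirm the automatic allocation (script and file names are illustrative; bash allocates these descriptors starting at 10):

```shell
cat > autofd-demo.sh <<'EOF'
#!/bin/bash
exec {FD}>> trace-auto.log    # bash stores the chosen descriptor number in FD
export BASH_XTRACEFD="$FD"
set -x
echo "fd is $FD"
EOF

rm -f trace-auto.log
bash autofd-demo.sh > out-auto.txt
```

out-auto.txt shows the allocated descriptor number, and trace-auto.log collects the '+ echo ...' trace line.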
Drakes