
I am writing a bash script and would like the very last command to start as a separate process. That last command sends all of its output to a file, but I also want the output to appear on the console. What I have so far is:

$ command > "file" &

This sends the output to "file" and also starts the command as its own process. However, I also want to view the output in the console at the same time (but if I hit ctrl+c or whatever, the command doesn't stop). This is a lot like this question, but with the caveat that it needs to run as its own process.

I have tried:

$ command | tee "file" &

but the problem is that tee is then part of the background process, and the output doesn't actually appear.

So, just to clarify, I want to have command on its own process, sending output to a file, but still have the output appear in the console (until I hit q, enter, ctrl+c, or something). Since this is in a bash script, two separate lines would be acceptable.

Humdinger

1 Answer


It sounds like you want command to finish writing to the file, but you want to be able to interrupt the display to the console. I would take a different approach to the solution. In your script:

> "file"
command > "file" &
tail -n +1 -F "file"

Correction:

The original answer used -n 0, which initially outputs no lines of "file" but outputs any lines added to "file" after tail is started. This was not my intention--it was a mistake. I intended to use an option that would list all lines of "file" even if command had written some before tail was started. The correct option for that behavior is -n +1.

From the tail(1) man page:

   -n, --lines=K   output the last K lines, instead of the last 10;
                   or use -n +K to output lines starting with the Kth
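
As a quick illustration of the difference (a made-up example; "file" already contains two lines here, and the two tail invocations are alternatives to try separately):

printf 'old line 1\nold line 2\n' > "file"
tail -n 0  -F "file"     # with -n 0: prints nothing at first; only lines appended later appear
tail -n +1 -F "file"     # with -n +1: prints both existing lines immediately, then follows new ones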

The first line clears the contents of the file, guarding against the race condition where tail gets to the file before command does.
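
Put together, a minimal sketch of how this could look inside the script (the `2>&1` redirect and the `$!` PID capture are illustrative additions, not part of the original answer):

#!/bin/bash
> "file"                   # create/empty the file so tail can open it right away
command > "file" 2>&1 &    # run command in its own background process, logging to "file"
cmd_pid=$!                 # background command's PID, in case you want to wait on or kill it later
tail -n +1 -F "file"       # mirror the file to the console until interrupted

Because command is backgrounded from a script, it typically has SIGINT set to ignore, so pressing Ctrl+C stops the foreground tail without killing command.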

garyjohn
  • Could you explain what the `-n 0` flag does? – Humdinger Mar 07 '14 at 21:29
  • Yes, the wrong thing! See my correction above. I'm glad you asked. – garyjohn Mar 07 '14 at 21:47
  • If the file doesn’t already exist, you might want to do something like `> file; command > file & tail -n +1 -F file` or `command > file & sleep 1; tail -n +1 -F file`, to protect against the race condition in which the `tail` process starts before the `file` is created. In the first option you might need to use `command >> file` or `command >| file` to allow writing to a file that already exists. – Scott - Слава Україні Mar 07 '14 at 21:52
  • I think a better solution to the race condition is to just add `touch file` on the line before `command`. However, in my case, the file will always exist. – Humdinger Mar 07 '14 at 21:56
  • Scratch that, a better solution is to echo nothing into the file. This clears the contents of the file before `tail` reads it. – Humdinger Mar 07 '14 at 22:07
  • Note that I deleted the echo command in favor of just `> "file"` which actually makes the file empty instead of writing a newline to it. – garyjohn Mar 07 '14 at 23:39
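
For reference, a quick sketch comparing the file-preparation alternatives mentioned in these comments (plain shell behavior, not part of the original answer):

touch "file"     # creates "file" if it is missing, but keeps any existing contents
echo > "file"    # truncates "file", but writes a single newline into it
> "file"         # truncates "file" to empty; the form the answer settled on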