
Problem:

I want to pipe the output of docker logs -f (for all containers) into another command and keep it running on the remote server. I already found Run a nohup command over SSH, then disconnect, but I had no success adapting that solution to this use case.

What have I tried so far:

ssh server <<'EOF'
for cont in $(docker ps -q); do
        docker logs -f $cont 2>&1 | bunyan | /var/opt/go-logsink/go-logsink connect 2>&1 &
done
EOF

nohup version: nohup "docker logs -f $cont 2>&1 | bunyan | /var/opt/go-logsink/go-logsink connect"

Running that loop within a screen session works, but I want to automate the call after a deployment, and that does not work. I also tried calling nohup "<command>" within the loop; I can see the processes being created, but they are killed as soon as the SSH session disconnects.
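For reference, a minimal sketch of the nohup variant that can survive disconnect, assuming the same container loop and paths as above: nohup takes a command, not a quoted shell string, so the pipeline has to be wrapped in sh -c, and all three standard streams must be detached from the SSH session so sshd can close it.

```shell
# Quoting the heredoc delimiter ('EOF') keeps $cont and $(docker ps -q)
# from being expanded on the local machine before ssh runs.
ssh server 'bash -s' <<'EOF'
for cont in $(docker ps -q); do
    # nohup expects a single command, so wrap the pipeline in sh -c;
    # redirect stdin/stdout/stderr so the process no longer holds the
    # SSH session open and ignores the HUP sent on disconnect.
    nohup sh -c "docker logs -f $cont 2>&1 \
        | bunyan \
        | /var/opt/go-logsink/go-logsink connect" \
        >/dev/null 2>&1 </dev/null &
done
EOF
```

The key difference from nohup "<command>" is that nohup never interprets pipes itself; without the sh -c wrapper it tries to execute the whole quoted string as one program name.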

Question:

How can I run that loop so that it creates processes which stay alive after the SSH session closes?

Sascha
1 Answer


screen or tmux may be what you're looking for; both keep sessions alive independently of the SSH connection.
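Since the goal is to automate this after a deployment, a sketch of starting the loop in a detached screen session non-interactively, assuming screen is installed on the remote host (the session name logsink is arbitrary):

```shell
# -d -m starts screen detached without attaching a terminal,
# -S names the session so it can be reattached later.
# The \$ escapes keep the local shell from expanding the loop variables
# before the command reaches the remote host.
ssh server "screen -dmS logsink bash -c '
  for cont in \$(docker ps -q); do
    docker logs -f \$cont 2>&1 | bunyan | /var/opt/go-logsink/go-logsink connect &
  done
  wait'"
```

To inspect the logs later: ssh -t server screen -r logsink. The same idea works with tmux via tmux new-session -d -s logsink '<command>'.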

SYN