
I just noticed that the bash I'm using (4.2.25(1)) isn't protected against infinite function recursion: in that case a segfault happens (and the bash process terminates). To check this in your bash, just type:

$ bash
$ f() { f; }
$ f

(Starting a subshell first (the first line) gives us a sacrificial shell we can experiment with without endangering the terminal's own shell; without the first line, your terminal window would probably close too quickly for you to see anything interesting.)
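The same experiment can be run non-interactively, which also lets us inspect *how* the process died. By convention, a process killed by a signal exits with status 128 + the signal number, so death by SIGSEGV (signal 11) shows up as exit status 139 (a sketch; the exact status assumes a typical Linux/Unix environment):

```shell
# Run the infinite recursion in a throwaway child shell and
# inspect its exit status afterwards. 128 + SIGSEGV(11) = 139.
bash -c 'f() { f; }; f' 2>/dev/null
echo "exit status: $?"   # expect 139 if the child was killed by SIGSEGV
```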

I think I understand the reason for this phenomenon: it is probably a stack overflow, which results in bash trying to write into regions of memory that are not assigned to its process.
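The stack-overflow hypothesis can be probed with a rough experiment (not a proof): if the C call stack is what overflows, lowering the process stack limit with `ulimit -s` should make the recursion die after correspondingly fewer calls. The counter file and the helper function below are just scaffolding I made up for the measurement:

```shell
# Rough probe: count how deep the recursion gets before the crash
# at two different stack limits. The counter is written to a temp
# file on every call so the last value survives the subshell's death.
depth_at_limit() {
    local tmp
    tmp=$(mktemp)
    ( ulimit -s "$1"   # stack limit in KB, lowered only in this subshell
      exec bash -c "n=0; f() { echo \$((++n)) > '$tmp'; f; }; f" ) 2>/dev/null
    cat "$tmp"
    rm -f "$tmp"
}
echo "depth with 1 MB stack: $(depth_at_limit 1024)"
echo "depth with 4 MB stack: $(depth_at_limit 4096)"
```

On my understanding, the second number should come out noticeably larger than the first, which is consistent with the crash being a stack overflow rather than, say, an internal counter wrapping.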

What I'm wondering about is two things:

  • Shouldn't there be a check in bash to protect it against such situations? A more helpful error message like "stack overflow in shell function" would certainly be better than a plain, uninformative segfault.

  • Could this be a security issue? Before this method writes into memory not assigned to the process (which produces the segfault), it might overwrite other parts of the process's memory which were not meant to be used for bash's internal stack.
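Regarding the first point: bash 4.2 (the very version in question) already ships an opt-in guard. Setting the `FUNCNEST` shell variable makes bash refuse function calls beyond that nesting depth with a proper error message instead of crashing; it is just not enabled by default. A minimal sketch:

```shell
# With FUNCNEST set (bash >= 4.2), exceeding the nesting limit
# aborts the call with an error message instead of segfaulting.
bash -c 'FUNCNEST=100; f() { f; }; f'
echo "exit status: $?"
```

The error printed should be along the lines of "maximum function nesting level exceeded (100)", and the shell survives, which is exactly the kind of behaviour the question asks for.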

Alfe
  • Can confirm this for bash 3.2.57 (OS X 10.10.2 Yosemite) and 4.3.11 (Ubuntu 14.04 LTS). This should not happen (from `man bash`): `Functions may be recursive. No limit is imposed on the number of recursive calls.` Though that may just mean that `bash` doesn't limit it, only available memory :-) – jaume Feb 19 '15 at 13:43
  • It seems to be an old, known behaviour, going back to at least 2003. Maybe you'll [find this interesting](http://lists.gnu.org/archive/html/bug-bash/2003-12/msg00007.html). – Hastur Feb 20 '15 at 11:16
  • Thank you @Hastur, that was enlightening about the history of the "bug". The question remains whether this is a security issue. Probably the behaviour is too chaotic to exploit, I guess. – Alfe Feb 23 '15 at 00:39
