Whenever we call a recursive function, it feels like something is happening in an encapsulated form, but what actually happens with a recursive call? Is it linear memory storage, or is it encapsulated?
When you call *any* function, conceptually a few things happen:
* the return address (RA) is stored on the stack
* a new stack frame is allocated, in which all automatic (non-static) variables live
When you exit *any* function:
* the stack frame is popped off (destroying all locally declared automatic variables), then the stored return address is popped off the stack and execution branches back to it.
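A minimal sketch of both halves in C (the function name is illustrative):

```c
#include <stdio.h>

/* Illustrative function: `result` is an automatic variable living in
 * the stack frame created for this particular call. */
static int square(int x)
{
    int result = x * x;  /* allocated in this call's stack frame */
    return result;       /* value copied out; frame and RA popped next */
}

int main(void)
{
    /* Calling square() pushes a return address and a fresh frame;
     * returning pops them and resumes execution here. */
    printf("%d\n", square(7));  /* prints 49 */
    return 0;
}
```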
Note that any dynamically allocated space (from the heap) that was pointed to only by a local variable becomes unreachable at that point, and is considered a memory leak.
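A sketch of that leak, assuming nothing else holds the pointer:

```c
#include <stdlib.h>

/* The automatic pointer `p` is destroyed when this frame is popped,
 * but the heap block it pointed to is not freed, and nothing else
 * references it -- a memory leak. */
static void leaky(void)
{
    int *p = malloc(100 * sizeof *p);  /* lives on the heap */
    if (p == NULL)
        return;
    p[0] = 42;  /* ... use the memory ... */
    /* missing free(p): the block is unreachable once we return */
}

int main(void)
{
    leaky();  /* the 100-int block is now leaked */
    return 0;
}
```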
Now, in regard to your recursion question: until the return condition (the base case) is met, each subsequent call pushes the RA and a new stack frame onto the stack. Something that *never* hits its return condition will eventually fail with a stack overflow. Depending on the size of the process/thread stack and the size of each stack frame, that could happen quickly (after a few calls), or it could take many thousands of calls.
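For example, a recursive factorial (a hypothetical `fact()`) pushes one frame per call until the base case stops the descent:

```c
#include <stdio.h>

/* Each call gets its own frame holding its own copy of `n`; the
 * frames unwind once the base case (n == 0) is reached. Remove the
 * base case and this recurses until the stack overflows. */
static unsigned long fact(unsigned n)
{
    if (n == 0)              /* base case: stop pushing frames */
        return 1;
    return n * fact(n - 1);  /* recursive call: push RA + new frame */
}

int main(void)
{
    printf("%lu\n", fact(10));  /* prints 3628800 */
    return 0;
}
```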
Is it linear memory storage? If no dynamic memory allocations are done, then I would think yes: each call's frame sits next to its caller's in one contiguous stack region. Traditionally, the stack and the heap grow toward each other from opposite ends of the process's address space, so the more dynamically allocated memory there is, the less room remains for stack frames. Even in today's 64-bit world, default stack sizes are generally only a few megabytes (for example, 8 MB on Linux and 1 MB on Windows); pthread stacks are often smaller still, and can be set explicitly with pthread_attr_setstacksize().
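One way to observe the linear layout (a sketch; the exact addresses, frame size, and growth direction are platform-specific, and optimizing compilers may collapse the frames):

```c
#include <stdio.h>
#include <stdlib.h>

/* Hypothetical probe: prints the address of a local at each recursion
 * depth. On typical platforms, compiled without optimization, the
 * addresses step by a constant frame size -- i.e. the frames occupy
 * one linear region -- and sit far away from the heap block. */
static void probe(int depth)
{
    int local;
    printf("depth %d: stack local at %p\n", depth, (void *)&local);
    if (depth > 0)
        probe(depth - 1);
}

int main(void)
{
    void *heap = malloc(16);
    printf("heap block at %p\n", heap);
    probe(3);
    free(heap);
    return 0;
}
```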
Using dynamic memory allocation (an explicit stack managed on the heap) to overcome stack overflow, or converting the recursive function into a simple loop, may help.
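For instance, the recursive factorial sketched earlier becomes a loop that runs in a single stack frame:

```c
#include <stdio.h>

/* Iterative factorial: one frame, O(1) stack space, no overflow risk
 * from recursion depth (though the *result* still overflows an
 * unsigned long for large n). */
static unsigned long fact_iter(unsigned n)
{
    unsigned long acc = 1;
    while (n > 1) {
        acc *= n;
        n--;
    }
    return acc;
}

int main(void)
{
    printf("%lu\n", fact_iter(10));  /* prints 3628800, same as the recursive version */
    return 0;
}
```

For algorithms that are not tail-recursive (tree traversals, for example), the same idea works by managing an explicit stack allocated with malloc(), trading limited stack depth for heap capacity.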