I have a recursive algorithm (in three different forms) that computes the sum of the first n odd positive integers. These algorithms are for learning only; I'm aware there are better ways of solving this problem. I've written the three variants in both Lua and Java, which I'll call Lua 1, Lua 2, Lua 3 and Java 1, Java 2, Java 3. The first two are very similar, just rearranged.
The Lua programs are here, and the Java programs are here.
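For reference, a minimal sketch of what such variants typically look like is below. This is only an illustration of the shapes I'm describing, not the exact linked code: it assumes variants 1 and 2 carry an accumulator and return the recursive call directly, while variant 3 adds to the result of the recursive call after it returns.

    -- Illustrative sketch only; the linked programs may differ in details.

    -- Variant 1: accumulator passed along, recursive call is returned directly
    local function sum1(n, acc)
      acc = acc or 0
      if n == 0 then return acc end
      return sum1(n - 1, acc + (2 * n - 1))
    end

    -- Variant 2: same idea, counting upward instead of downward
    local function sum2(n, i, acc)
      i, acc = i or 1, acc or 0
      if i > n then return acc end
      return sum2(n, i + 1, acc + (2 * i - 1))
    end

    -- Variant 3: the addition happens after the recursive call returns
    local function sum3(n)
      if n == 0 then return 0 end
      return (2 * n - 1) + sum3(n - 1)
    end

    print(sum1(10), sum2(10), sum3(10))  --> 100  100  100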
Lua 1 and 2 perform extremely well and could easily reach n = 100,000,000. Lua 3 hits a stack overflow when n > 16,000.
Java 1 could only reach n = 4,000 before hitting a stack overflow, while Java 2 reached 9,000. Java 3 managed to get to n = 15,000 before also hitting a stack overflow.
Can anyone explain these results? Why do Java 1, 2 and 3 (and Lua 3) hit a stack overflow so quickly, while Lua 1 and 2 handle enormous values of n?