Background
I am trying to debug a C++ program that uses some dynamic libraries of my own. I am on Mac OS X, but I am not using llvm/clang to compile either my code or the libraries. At the moment, I am using the GNU g++ compiler (4.7) provided by Homebrew.
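For reference, the build looks roughly like this (a minimal sketch; the file and library names are placeholders, and the flags are the ones listed further down):

# build the personal dynamic library with Homebrew's g++-4.7
g++-4.7 -O0 -g -ggdb -dynamiclib mylib.cpp -o libmylib.dylib
# build the executable against it
g++-4.7 -O0 -g -ggdb mwe.cpp -L. -lmylib -o mwe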
The problem
I have two debuggers to choose from in this environment: the gdb provided by the Apple Developer Tools (GNU gdb 6.3.50-20050815 (Apple version gdb-1824)) and the gdb installed with Homebrew (GNU gdb (GDB) 7.5.1). I would prefer to use the latter, but it shows many important variables as &lt;optimized out&gt;.
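Both are run against the same executable; the paths below are the usual install locations and may differ on other setups:

/usr/bin/gdb ./mwe          # Apple's gdb 6.3.50 from the Developer Tools
/usr/local/bin/gdb ./mwe    # Homebrew's gdb 7.5.1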
For example, here is an extract of a debugging session under gdb 7.5.1:
Breakpoint 1, MWE::Outputs (this=<optimized out>, time=<optimized out>)
at /Users/ynet/temp/mwe.cpp:203
203 cout << "example" << endl;
(gdb) p this
$1 = <optimized out>
whereas gdb 6.3.50 shows:
Breakpoint 1, MWE::Outputs (this=0x100601080, time=0.64300000000000046) at /Users/ynet/temp/mwe.cpp:203
203 cout << "example" << endl;
(gdb) p this
$1 = (MWE * const) 0x100601080
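For context, the method in question looks roughly like this (a trimmed, hypothetical reconstruction; only the signature and the line at the breakpoint are taken from the sessions above):

#include <iostream>
using namespace std;

class MWE {
public:
    void Outputs(double time);
};

void MWE::Outputs(double time) {
    cout << "example" << endl;   // line 203, where the breakpoint fires
}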
In both cases the program is identical (it is the same executable), compiled with Homebrew's g++-4.7 and not with the llvm/clang compiler provided by the Apple Developer Tools. Unlike most questions about &lt;optimized out&gt; values in gdb, I have checked that I am compiling with -O0 (my current flags are -O0 -g -ggdb).
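One diagnostic I can run in gdb 7.5.1 (a sketch; MWE::Outputs is the function from the sessions above) is to ask where the debugger thinks the parameters are stored:

(gdb) info scope MWE::Outputs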
Question
Why do these two versions of gdb behave differently in this case, and what should I do to use the more recent gdb without the &lt;optimized out&gt; values?