I am implementing event detection in an ODE solver suite. The following piece of code, which I am currently debugging, shows strange behavior. Once the root-finding routine returns successfully with a root, the first IF block (ROOTFINDING_ERR == 0), lines 1 to 14, executes, which is the expected behavior. However, once that block finishes, execution jumps unexpectedly to the last statement of the corresponding ELSE block, line 20, and the statement EVENT_OUT(j) = .FALSE. is executed before the ENDIF on line 21.
1 IF (ROOTFINDING_ERR == 0) THEN
2 IF ((ABS(EVENT_TIMES(1, j) - T_STAR) > &
3 NEARBY_ROOTS_ABSTOL) &
4 .AND. (EVENT_ITER < EVENT_ITER_MAX)) THEN
5 EVENT_TIMES(1, j) = T_STAR
6 EVENT_TIMES(2, j) = DBLE(j)
7 EVENT_TEST(j) = .TRUE.
8 EVENT_OUT(j) = .TRUE.
9 ELSE
10 EVENT_TIMES(1, j) = T_STAR
11 EVENT_TIMES(2, j) = DBLE(j)
12 EVENT_TEST(j) = .FALSE.
13 EVENT_OUT(j) = .TRUE.
14 ENDIF
15 ELSE
16 EVENT_TIMES(1, j) = T_STAR
17 EVENT_TIMES(2, j) = DBLE(j)
18 ! Don't test this event further
19 EVENT_TEST(j) = .FALSE.
20 EVENT_OUT(j) = .FALSE.
21 ENDIF
I assumed this had to do with some memory access/allocation problem and ran DrMemory (a tool like Valgrind) on the executable. DrMemory did report some uninitialized reads, which I will look into further.
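Since DrMemory flags uninitialized reads, one thing I plan to try is default-initializing the event bookkeeping arrays before the event loop runs, so no branch can ever read an undefined value. A minimal sketch of what I mean (NEQ and the declarations here are stand-ins for whatever the real code uses; only the array names match the snippet above):

```
PROGRAM INIT_SKETCH
  IMPLICIT NONE
  INTEGER, PARAMETER :: NEQ = 4          ! stand-in for the real event count
  DOUBLE PRECISION :: EVENT_TIMES(2, NEQ)
  LOGICAL :: EVENT_TEST(NEQ), EVENT_OUT(NEQ)

  ! Default-initialize all event bookkeeping before the event loop,
  ! so every element has a defined value whichever branch runs.
  EVENT_TIMES = 0.0D0    ! no root recorded yet
  EVENT_TEST  = .FALSE.  ! nothing scheduled for re-testing
  EVENT_OUT   = .FALSE.  ! no events reported
END PROGRAM INIT_SKETCH
```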
I was wondering where to go from here. I stepped through the code using gdb but had difficulty isolating the "uninitialized read" that DrMemory complained about. I turned on -Wall, -Wextra, -fbounds-check, etc., to see whether the compiler could catch it. I also tried to disassemble the executable with x32dbg, but the debug symbols are not read, as x32dbg doesn't support DWARF symbols.