Re: debugging "break point" in optimized code?

Reza Roboubi wrote:
What does a breakpoint mean anyway when code is optimized?


A breakpoint is still a breakpoint. Even with optimization, compiling and linking with -g still puts in line number information that maps each machine language instruction to a source file and line number. That information just isn't as accurate as it would be in unoptimized code.
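For example (just an illustrative sketch; the file and function names here are mine, not from the original question), a small file can be built with optimization on and its line table dumped afterwards:

    /* scale.c -- illustrative only.
     * Build:    gcc -g -O2 -c scale.c
     * Inspect:  objdump --dwarf=decodedline scale.o
     * The line table still maps every instruction back to a line of
     * scale.c; it just tracks the source less faithfully than an -O0
     * build of the same file would. */
    int scale(int x)
    {
        int y = x * 3;
        return y + x;
    }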

1) After optimization, a single machine language instruction might be part of several source lines, so no one source line can correctly be identified as THE source line of that instruction. As far as I understand, the process doesn't try to identify multiple source lines for one machine language instruction, and I don't know whether the file formats would even support that. If they did, that would present an interesting set of challenges and opportunities for debuggers and profilers.
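For instance (my own contrived example), when something like the following is compiled with gcc -g -O2, the a * b is typically computed only once, so the single resulting multiply really belongs to both assignments, yet the debug information will attribute it to just one of them:

    int f(int a, int b)
    {
        int x = a * b + 1;   /* the a * b here ...                 */
        int y = a * b + 2;   /* ... and here is computed only once */
        return x + y;
    }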

2) Sometimes GCC confuses itself during optimization, so a machine language instruction that unambiguously belongs to a single source line ends up identified as belonging to a different source line.

In unoptimized code, there tends to be a small contiguous set of instructions associated with each source line. So setting a breakpoint on a source line obviously means setting a breakpoint on the first instruction of that contiguous set.

In optimized code, the source line might be instantiated multiple times, and each instantiation might be a set of discontiguous instructions. What should setting a breakpoint there mean? (If you really want the answer, this discussion belongs on a gdb mailing list, not a gcc mailing list.) Someone doing source level debugging would probably like it to mean something that breaks on any transition from code that is not part of that line to code that is part of it. After that, things get even more complex. What does it mean to proceed from such a breakpoint? Certainly not stopping at the next instruction if that is still part of the same line. The one source breakpoint should really be many instruction breakpoints; when you proceed, which of them are turned off, and for how long?
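To make that concrete (again my own sketch): if a small helper is inlined into two callers at -O2, its one source line becomes two separate groups of instructions, and gdb will typically report a breakpoint on that line as a single breakpoint with multiple locations:

    /* Built with gcc -g -O2.  With twice() inlined into both callers,
     * asking gdb to break on the "return t * 2;" line typically
     * produces a breakpoint with two locations, one per inlined copy. */
    static inline int twice(int t)
    {
        return t * 2;
    }

    int user1(int a) { return twice(a) + 1; }
    int user2(int b) { return twice(b) - 1; }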

The Microsoft Visual Studio debugger comes closer than GDB to meeting naive expectations about source level debugging as discussed above. But it doesn't come close enough to make such debugging practical.

I debug only optimized code. (Not by intent; it just works out that way.) I always use a split view of source and disassembly, and I almost never set breakpoints on source lines: there is too much risk that the breakpoint ends up somewhere distant from where one would reasonably expect it. I look at the source code, then set breakpoints in the disassembly.


