Malcolm McLean said:
I don't use debuggers much. The problem is that I write code for a
wide variety of different platforms, some of which, being parallel,
are inherently hard to provide debugging tools for. If you use a
debugger routinely then that becomes the way of working. It is then
hard to switch to something else - a bit like moving between automatic
and manual cars. So I prefer to debug with diagnostic printfs.
Diagnostic printfs are amateurish and, in my opinion, a cause of bugs. It
might seem harsh, but the very cycle of writing them, compiling them in,
reading the logs and removing them again is itself a source of error and a
waste of time and effort. If there is NO debugger for your system then
fine. But gdb works with most, if not all (well, obviously not all...).
But it's not just me, of course. Richard Stallman wrote gdb for a
reason.
Here:
Why Not Use Printf?
===================
http://dirac.org/linux/gdb/01-Introduction.php#whynotuseprintf
I know I go on about it, but I have rarely, if ever, seen anyone debug a
system with printf faster than someone who has learned to use a proper
debugger properly. It is simply ludicrous NOT to use one in this day and
age on large code bases, where hardware breakpoints and watchpoints make
it trivial to catch the complex state that triggers a bug.
When you catch a bug you can move up and down the stack to see the
variables that contributed to it. No wads of printf logs to wade
through. You can often rewind the execution. You can change locals to
re-trigger the bug, and so on. Only people who do NOT know what a
debugger is, or how to use one properly, argue against these benefits.
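For concreteness, the workflow above looks something like this in a gdb
session. (A sketch only: the variable `count` and function `suspect_func`
are hypothetical stand-ins; the gdb commands themselves are standard.)

    (gdb) break suspect_func    # stop when a suspect function is entered
    (gdb) watch count           # hardware watchpoint: stop when count changes
    (gdb) run
    (gdb) backtrace             # see the call stack at the stopping point
    (gdb) frame 1               # move up the stack to inspect the caller
    (gdb) info locals           # the variables that fed into the bug
    (gdb) set var count = 0     # change a local to re-trigger the bug
    (gdb) continue

On targets that support it, `record` followed by `reverse-step` or
`reverse-continue` is what lets you rewind execution back toward the cause.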
But we've discussed this a million times before. If someone wants to
blinker themselves and NOT learn how to use something as potentially
powerful as GDB then that's their decision. A silly one.