Markus Dehmann
I have a strange problem. My program segfaults, and when I use the
debugger I find that a function I call with a parameter int n=1 is
actually executed with a garbled parameter, int n=136857236. So the
function does not receive the value I actually pass in, but works
with some messed-up value.
What might be the reason for this? The called function is defined in
another .o file, so I thought the cause might be that the .o file is
compiled with different settings, but that doesn't seem to be the
case. Are there other possible reasons for such strange behavior?
Thanks!
Markus