Why is it dumping core?


Zach

I told gcc to compile with "-g" yet when I run the debugger it says no
symbols are found and the very brief backtrace does not help me at
all.

[email protected]:~/$ gcc -o main5 main5.c -g
main5.c: In function ‘main’:
main5.c:27: warning: passing argument 1 of ‘fstat’ makes integer from
pointer without a cast

What precisely does this warning mean? I don't understand why casting
would be needed in the fstat call. The file test.log is plain ASCII
text.

[email protected]:~/$ ./main5 test.log
Segmentation fault (core dumped)
[email protected]:~/$ gdb --core=./core
GNU gdb 6.4.90-debian
Copyright (C) 2006 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License, and
you are
welcome to change it and/or distribute copies of it under certain
conditions.
Type "show copying" to see the conditions.
There is absolutely no warranty for GDB. Type "show warranty" for
details.
This GDB was configured as "i486-linux-gnu".
(no debugging symbols found)
Using host libthread_db library "/lib/tls/i686/cmov/libthread_db.so.1".
Core was generated by `./main5 test.log'.
Program terminated with signal 11, Segmentation fault.
#0 0xb7ea7158 in ?? ()
(gdb) bt
#0 0xb7ea7158 in ?? ()
(gdb) quit

Here is the program source:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>

#define MAXLINE 90

int main(int argc, char *argv[])
{
FILE *fp;
char buf[MAXLINE];
int *psz;

struct stat fileStat;

if((fp = fopen(argv[1], "r")) == NULL)
{
printf("Cannot open file.\n");
exit(1);
}

while((fp = fopen(argv[1], "r")))
{
if(fstat(fp,&fileStat) < 0)
{
psz = malloc(fileStat.st_size);
}
}


while(fgets(buf, MAXLINE - 2, fp) != NULL)
{

char sub_string1[MAXLINE];
char sub_string2[MAXLINE];
char sub_string3[MAXLINE];

sscanf (buf,"%s %s %s",sub_string1,sub_string2,sub_string3);
printf ("%s\n %s\n %s\n",sub_string1,sub_string2,sub_string3);

}

fclose(fp);
free(psz);
return 0;

}

Is there a better way to check whether the file is readable and non-zero? I have:
while((fp = fopen(argv[1], "r")))
I just needed a way to check if the file is legit and if so to allocate
memory. Is there a better way to do this, or a way using a standard
library function?

Zach
 

Ian Collins

Zach said:
I told gcc to compile with "-g" yet when I run the debugger it says no
symbols are found and the very brief backtrace does not help me at
all.

[email protected]:~/$ gcc -o main5 main5.c -g
main5.c: In function ‘main’:
main5.c:27: warning: passing argument 1 of ‘fstat’ makes integer from
pointer without a cast
You are mixing C standard file operations, which use a FILE*, with Unix
file operations, which use an integer file descriptor. Not a very
helpful error message!
 
Zach

You are mixing C standard file operations, which use a FILE*, with Unix
file operations, which use an integer file descriptor. Not a very
helpful error message!


Hi Ian,

I changed it to be UNIX friendly and use open() instead of fopen(), and
I cast to (FILE *) as expected by fgets() and fclose(), and when I
compile now I get no errors or warnings, but when I run it nothing
happens; it just sits there idling. Here is my revised program source:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>

#define MAXLINE 90

int main(int argc, char *argv[])
{
char buf[MAXLINE];
int *psz;
int file=0;

struct stat fileStat;

if((file = open(argv[1], O_RDONLY)) == -1)
{
printf("Cannot open file.\n");
exit(1);
}

while((file = open(argv[1], O_RDONLY)))
{
if(fstat(file,&fileStat) < 0)
{
psz = malloc(fileStat.st_size);
}
}


while(fgets(buf, MAXLINE - 2, (FILE *)file) != NULL)
{

char sub_string1[MAXLINE];
char sub_string2[MAXLINE];
char sub_string3[MAXLINE];

sscanf (buf,"%s %s %s",sub_string1,sub_string2,sub_string3);
printf ("%s\n %s\n %s\n",sub_string1,sub_string2,sub_string3);

}

fclose((FILE *)file);
free(psz);
return 0;

}

Zach
 
Joachim Schmitz

Obnoxious said:
A file descriptor (int) and a FILE* are two different things and
cannot be cast from one to the other.

Of course they can, but the result isn't useful...
I guess you meant that ;-)

Bye, Jojo
 
Joachim Schmitz

Zach said:
I told gcc to compile with "-g" yet when I run the debugger it says no
symbols are found and the very brief backtrace does not help me at
all.

[email protected]:~/$ gcc -o main5 main5.c -g

Not really sure, but maybe gcc doesn't like the -g at the end of the command
line?
Try gcc -g -o main5 main5.c

Bye, Jojo
 
Bjarni Juliusson

Zach said:
I told gcc to compile with "-g" yet when I run the debugger it says no
symbols are found and the very brief backtrace does not help me at
all.

[email protected]:~/$ gcc -o main5 main5.c -g
main5.c: In function ‘main’:
main5.c:27: warning: passing argument 1 of ‘fstat’ makes integer from
pointer without a cast

What precisely does this warning mean? I don't understand why casting
would be needed in the fstat call. The file test.log is plain ASCII
text.

As others have pointed out, fstat() takes a file descriptor, which is
what is returned by open(), but fgets() takes a FILE pointer, which is
what fopen() returns, and which represents a layer on top of the file
descriptor level I/O and provides some additional convenient
functionality, such as buffering.

Do not mix these. Decide whether you need direct file descriptor I/O or
the stdio functions (fopen, fgets, etc), and use the same kind all the
way. Casting the number returned by open() to be a pointer to a FILE
structure or vice versa will not work.
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>

#define MAXLINE 90

int main(int argc, char *argv[])
{
FILE *fp;
char buf[MAXLINE];
int *psz;

struct stat fileStat;

if((fp = fopen(argv[1], "r")) == NULL)
{
printf("Cannot open file.\n");
exit(1);
}

A style issue: Error messages should go to stderr, so use fprintf. This
is not important here.
while((fp = fopen(argv[1], "r")))
{

First of all, this will open the same file again even though you already
opened it with the first fopen() above. If execution gets here, fp
already refers to the opened file.

Second, why "while"? This will keep opening the same file over and over
until it fails, and then it will try to do fgets() below with fp set to
0, which will not work.
if(fstat(fp,&fileStat) < 0)
{

If fstat() returns something less than zero, it failed...
psz = malloc(fileStat.st_size);

.... so this gets done if fstat() failed, and then fileStat is bogus.
}
}


while(fgets(buf, MAXLINE - 2, fp) != NULL)
{

Why MAXLINE-2? Shouldn't that be just MAXLINE?

Also, fp is zero here, as I mentioned above.
char sub_string1[MAXLINE];
char sub_string2[MAXLINE];
char sub_string3[MAXLINE];

sscanf (buf,"%s %s %s",sub_string1,sub_string2,sub_string3);
printf ("%s\n %s\n %s\n",sub_string1,sub_string2,sub_string3);

}

fclose(fp);

Fix it so it only opens the file once and this will work.
free(psz);

Same here.
return 0;

}

Is there a better way to check whether the file is readable and non-zero? I have:
while((fp = fopen(argv[1], "r")))
I just needed a way to check if the file is legit and if so to allocate
memory. Is there a better way to do this, or a way using a standard
library function?

Open it and stat it is about as easy as it gets, but another method if
you just want to read an entire file into memory is to use mmap().


Bjarni
 

Guest

    if(fstat(fp,&fileStat) < 0)
    {
      psz = malloc(fileStat.st_size);
    }

this isn't your problem, but as a matter of good practice you should
check the return value of malloc()

<snip>
 
Mark Wooding

Bjarni Juliusson said:
If fstat() returns something less than zero, it failed...


... so this gets done if fstat() failed, and then fileStat is bogus.

Indeed. It's not actually clear what this is for anyway, since the only
thing that's actually done with psz is that it's freed at the end.

Maybe once upon a time the program attempted to fread the whole file
into the block. This leads to an exciting time-of-check/time-of-use
race: the file may be completely different by the time you actually read
it in. There are two failure modes, depending on how you do the slurp.

* You slurp the file up to EOF into the buffer, and overrun it if the
file grew while you weren't looking.

* You slurp the file up to the size you read, and missed stuff off the
end. You now have an incomplete file, and could easily be tempted
to read (or write!) off the end; but it's almost certainly garbled.

Really, I think it's better to read the file up to EOF, reallocing the
buffer (according to some geometric progression) as necessary.

-- [mdw]
 
Kenny McCormack

this isn't your problem, but as a matter of good practice you should
check the return value of malloc()

So not the point.

OBFlamebait: In a hobby/testing/beginner program, malloc never fails.
 
Nate Eldredge

Zach said:
I told gcc to compile with "-g" yet when I run the debugger it says no
symbols are found and the very brief backtrace does not help me at
all.

[email protected]:~/$ gcc -o main5 main5.c -g
main5.c: In function ‘main’:
main5.c:27: warning: passing argument 1 of ‘fstat’ makes integer from
pointer without a cast

I'd suggest using the options -Wall -W as well, since they will give you
more warnings about things you may be doing wrong.
What precisely does this warning mean? I don't understand why casting
would be needed in the fstat call. The file test.log is plain ASCII
text.

[email protected]:~/$ ./main5 test.log
Segmentation fault (core dumped)
[email protected]:~/$ gdb --core=./core

gdb needs to be told where to find the binary. It didn't find any
debugging symbols because it didn't find the binary! Try running

$ gdb ./main5 ./core

Better still, if the crash is easy to reproduce, don't bother with a
core dump. Just do

$ gdb ./main5
(gdb) run test.log
GNU gdb 6.4.90-debian
Copyright (C) 2006 Free Software Foundation, Inc.
GDB is free software, covered by the GNU General Public License, and
you are
welcome to change it and/or distribute copies of it under certain
conditions.
Type "show copying" to see the conditions.
There is absolutely no warranty for GDB. Type "show warranty" for
details.
This GDB was configured as "i486-linux-gnu".
(no debugging symbols found)
Using host libthread_db library "/lib/tls/i686/cmov/libthread_db.so.1".
Core was generated by `./main5 test.log'.
Program terminated with signal 11, Segmentation fault.
#0 0xb7ea7158 in ?? ()
(gdb) bt
#0 0xb7ea7158 in ?? ()
(gdb) quit

Here is the program source:

#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <unistd.h>
#include <fcntl.h>
#include <sys/stat.h>
#include <sys/types.h>

#define MAXLINE 90

int main(int argc, char *argv[])
{
FILE *fp;
char buf[MAXLINE];
int *psz;

struct stat fileStat;

if((fp = fopen(argv[1], "r")) == NULL)
{
printf("Cannot open file.\n");
exit(1);
}

while((fp = fopen(argv[1], "r")))
{
if(fstat(fp,&fileStat) < 0)
{
psz = malloc(fileStat.st_size);
}
}


while(fgets(buf, MAXLINE - 2, fp) != NULL)
{

char sub_string1[MAXLINE];
char sub_string2[MAXLINE];
char sub_string3[MAXLINE];

sscanf (buf,"%s %s %s",sub_string1,sub_string2,sub_string3);
printf ("%s\n %s\n %s\n",sub_string1,sub_string2,sub_string3);

}

fclose(fp);
free(psz);
return 0;

}

Is there a better way to check whether the file is readable and non-zero? I have:
while((fp = fopen(argv[1], "r")))
I just needed a way to check if the file is legit and if so to allocate
memory. Is there a better way to do this, or a way using a standard
library function?

If the file wasn't readable, fopen wouldn't have succeeded in the first
place. I am not sure what you mean by "nonzero".

It sounds like what you are trying to do is determine the size of the
file, and malloc a buffer large enough to contain all of it. (It's not
clear why you want to do this, because you don't use that buffer in the
rest of your program.) This is not really ideal, because (a) standard C
doesn't provide a way to determine the size of a file, forcing you to
rely on fstat which isn't portable beyond Unix, and (b) the size of the
file could change after you test it.

A better approach, if you want to read the whole file into a buffer, is
to have the buffer grow to the size of the file as you read it. For
instance (untested):

size_t size, i;
char *buf;
FILE *fp;
int c;
fp = fopen(filename, "r");
if (!fp) { /* error */ }
size = 100; /* arbitrary number */
buf = malloc(size);
i = 0;
if (!buf) { /* error */ }
while ((c = getc(fp)) != EOF) {
if (i == size) {
size *= 2;
char *t = realloc(buf, size);
if (!t) {
free(buf);
/* error */
}
buf = t;
}
buf[i++] = c;
}

If you must determine the size of a file that you've opened, the way to
do it using fstat is like this:

FILE *fp; /* as returned from fopen */
struct stat st;
off_t size;
if (fstat(fileno(fp), &st) < 0) {
perror("fstat error");
/* maybe exit here */
} else {
/* all is well */
size = st.st_size;
/* rest of program */
}

There isn't any need to reopen the file, just use the original handle
returned from fopen. There's no need for a loop either. And a negative
value returned from fstat indicates that an error occurred and the
contents of the struct stat are *not* valid (in fact, most likely the
struct stat would be unchanged from whatever was in it beforehand).

I'll also recommend the use of the perror function whenever you find
that a standard library function or system call has failed, since it
will give you a message describing what went wrong. Also, it writes the
message to stderr, which is preferred, rather than stdout as printf would.
 
Kaz Kylheku

["Followup-To:" header set to comp.lang.c.]
You are mixing C standard file operations, which use a FILE*, with Unix
file operations, which use an integer file descriptor. Not a very
helpful error message!

How is that not a helpful error message?

In line 27 you're calling fstat, and there is something wrong with the
expression for argument one; passing argument one calls for a pointer to
integer conversion, which is a diagnosable error.

That could only be because the function expects an integer.

Why else would such a conversion be called for in the passing of an argument?
 

Kaz Kylheku

Hi Ian,

I changed it to be UNIX friendly and use open() instead of fopen() and
I cast to (FILE *) as expected by fgets() and fclose() and when I

Just what do you think that you achieve by casting an integer value to FILE *?

Don't you think that FILE * actually has to point to something?

The Unix integer file descriptors are /not/ the result of converting
the address of a FILE object to an integer. A big clue is that they are small
integers, assigned consecutively. The values 0, 1 and 2 are your standard
input, output and error file handles; when you open additional files, the
new descriptors are typically 3, 4, ...

Do you really think there is a FILE object at address 3, so that
casting (FILE *) 3 will produce a meaningful pointer?
 
Ian Collins

Kaz said:
["Followup-To:" header set to comp.lang.c.]
You are mixing C standard file operations, which use a FILE*, with Unix
file operations, which use an integer file descriptor. Not a very
helpful error message!

How is that not a helpful error message?

Compared to the g++ error:

27:error: invalid conversion from 'FILE*' to 'int'

or Sun CC:

line 27: Error: Formal argument _fd of type int in call to fstat(int,
stat*) is being passed __FILE*.

Now that's a helpful error message!
 
CBFalconer

Zach said:
I changed it to be UNIX friendly and use open() instead of fopen()
and I cast to (FILE *) as expected by fgets() and fclose() and when
I compile now I get no errors or warning but when I run it nothing
happens, it just sits there idling. Here is my revised program
source:

And in the process you made it off-topic on c.l.c. Try some gnu
groups, or comp.unix.programmer. Debuggers are outside the scope
of c.l.c. anyhow.
 
Richard

CBFalconer said:
And in the process you made it off-topic on c.l.c. Try some gnu
groups, or comp.unix.programmer. Debuggers are outside the scope
of c.l.c. anyhow.

Don't be so silly. Real programmers (not home busybodies like you) use
debuggers all the time. Writing C in a debugger-friendly manner is also
a plus. Garbage like your stuff with multiple statements on a line is
NOT debugger friendly.
 

David Resnick

Or on Linux.....

I don't think putting malloc checking in a snippet illustrating
something else is necessary, though I'd always point it out myself if
reviewing what I thought was someone's real code. However, IMHO your
comment adds no value to the discussion and anyway is wrong, at least
on my flavor of Linux (RHEL5).

temp(545)$ cat foo.c
#include <stdlib.h>
#include <stdio.h>
#include <limits.h>
int main(void)
{
int i = 0;
for (i = 0; i < INT_MAX; ++i)
{
char *p = malloc(1000000);
if (p == NULL)
{
printf("malloc failed on iteration %d\n", i);
exit(EXIT_FAILURE);
}
}
exit(1);
}
temp(546)$ foo
malloc failed on iteration 3205
temp(547)$ uname -a
Linux daytona 2.6.18-8.el5 #1 SMP Fri Jan 26 14:15:21 EST 2007 i686
i686 i386 GNU/Linux

-David
 
Kenny McCormack

Or on Linux.....

Good point, that.

Incidentally, when I was composing my previous post, I tried,
unsuccessfully, to come up with a good analogy for this particular
flavor of "CLC-think" - that of trying to protect yourself against some
(for all practical purposes) mythical 0.000000001% risk, while losing
sight of the real issues in front of you. Since then, I've realized
that a bridge analogy is in order. There are stories in the Menagerie
series about how some players try so hard to protect themselves against
some mythical 8-0 break or whatever, while losing sight of the more
imminent risks. Something about looking up the odds of a player holding
16 cards...
 
Kaz Kylheku

Or on Linux.....

Malloc can definitely return null on a Linux system configured for strict
overcommit accounting.

E.g. if the overcommit ratio is 50%, and the allocation request would cause the
total amount of virtual memory in the system to exceed 50% of
physical memory, then mmap will fail and malloc will return null.

Problem is that the kernel can potentially eat more than the other 50% (or
whatever percentage that has been configured); those allocations do not count
toward virtual memory. Moreover, strict accounting doesn't cause anonymous mmap
requests to actually allocate the pages. Assignment of pages to virtual memory
maps is still lazy. So even under strict overcommit accounting, it's possible
that an access to a memory map will fail, since the kernel can't allocate a
page, and can't swap anything to make room.

Thus strict overcommit accounting is not a reliable way to ensure that malloc
returns null. It's a way of getting that behavior under some ``typical''
conditions, like when nothing in the system is misbehaving.
 

Guest

Incidentally, when I was composing my previous post, I tried,
unsuccessfully, to come up with a good analogy for this particular
flavor of "CLC-think" - that of trying to protect yourself against some
(for all practical purposes) mythical 0.000000001% risk, while losing
sight of the real issues in front of you.

Other posters had addressed the OP's actual problem.
Since then, I've realized
that a bridge analogy is in order.  There are stories in the Menagerie
series

up to here I thought the Menagerie series was some sort of civil
engineering publication.
about how some players try so hard to protect themselves against
some mythical 8-0 break or whatever, while losing sight of the more
imminent risks.  Something about looking up the odds of a player holding
16 cards

but then I saw it must be about a card game
 
