Segmentation fault (core dumped)

Discussion in 'Perl Misc' started by ekilada, Aug 8, 2006.

  1. ekilada

    ekilada Guest

    Hi,
    Using Perl, when I try to open a 1.8G file for reading, I get a
    'Segmentation fault (core dumped)' error.
    Please, is there any way to segment huge files for reading ?

    Thanks And Best Regards,
    Eliyah
    ekilada, Aug 8, 2006
    #1

  2. David Filmer

    David Filmer Guest

    ekilada wrote:
    > Using Perl, when I try to open a 1.8G file for reading, I get a
    > 'Segmentation fault (core dumped)' error.


    Oops. You have an error on line 42. You need to fix that.

    > Please, is there any way to segment huge files for reading ?


    perldoc perlop (/while)

    --
    David Filmer (http://DavidFilmer.com)
    David Filmer, Aug 8, 2006
    #2

  3. Brian Wakem

    Brian Wakem Guest

    ekilada wrote:

    > Hi,
    > Using Perl, when I try to open a 1.8G file for reading, I get a
    > 'Segmentation fault (core dumped)' error.
    > Please, is there any way to segment huge files for reading ?



    If the error occurs when *opening* the file then the filesize is irrelevant.


    --
    Brian Wakem
    Email: http://homepage.ntlworld.com/b.wakem/myemail.png
    Brian Wakem, Aug 8, 2006
    #3
  4. Mumia W.

    Mumia W. Guest

    On 08/08/2006 03:18 AM, ekilada wrote:
    > Hi,
    > Using Perl, when I try to open a 1.8G file for reading, I get a
    > 'Segmentation fault (core dumped)' error.
    > Please, is there any way to segment huge files for reading ?
    >


    Yes, read it line by line using the "while(<FH>) { ... }"
    syntax.
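    Something like this, for instance. (A rough sketch, not the OP's actual code: the filename is made up, and the sample-file setup at the top is only there so the snippet runs as-is; with a real 1.8G file you'd skip it.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a tiny sample file so the example runs as-is.
# 'huge_file.log' is a placeholder name, not a real path.
open my $out, '>', 'huge_file.log' or die "Cannot write: $!";
print {$out} "line $_\n" for 1 .. 5;
close $out;

# Read line by line: only one line is held in memory at a time,
# so the total size of the file does not matter.
open my $fh, '<', 'huge_file.log' or die "Cannot open: $!";
my $count = 0;
while (my $line = <$fh>) {
    chomp $line;
    $count++;        # process $line here
}
close $fh;
unlink 'huge_file.log';
print "Read $count lines\n";   # prints "Read 5 lines"
```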

    > Thanks And Best Regards,
    > Eliyah
    >


    You're welcome.
    Mumia W., Aug 8, 2006
    #4
  5. ekilada

    ekilada Guest

    Hi Brian,
    No, it occurs while reading line number 13457941
    FYI: I use while (<>) to read line by line.

    Regards,
    Eliyah

    Brian Wakem wrote:
    > ekilada wrote:
    >
    > > Hi,
    > > Using Perl, when I try to open a 1.8G file for reading, I get a
    > > 'Segmentation fault (core dumped)' error.
    > > Please, is there any way to segment huge files for reading ?

    >
    >
    > If the error occurs when *opening* the file then the filesize is irrelevant.
    >
    >
    > --
    > Brian Wakem
    > Email: http://homepage.ntlworld.com/b.wakem/myemail.png
    ekilada, Aug 8, 2006
    #5
  6. Dave

    Dave Guest

    "ekilada" <> wrote in message
    news:...
    > Hi Brian,
    > No, it occurs while reading line number 13457941
    > FYI: I use while (<>) to read line by line.
    >
    > Regards,
    > Eliyah
    >
    > Brian Wakem wrote:
    >> ekilada wrote:
    >>
    >> > Hi,
    >> > Using Perl, when I try to open a 1.8G file for reading, I get a
    >> > 'Segmentation fault (core dumped)' error.
    >> > Please, is there any way to segment huge files for reading ?

    >>
    >>
    >> If the error occurs when *opening* the file then the filesize is
    >> irrelevant.
    >>
    >>
    >> --
    >> Brian Wakem
    >> Email: http://homepage.ntlworld.com/b.wakem/myemail.png

    >


    Questions:
    Which version of Perl?
    On what platform?
    Does it happen on any huge file, or is there something special about this one
    (like a huge line)?
    Can you post a short but complete script that demonstrates the problem (if
    run on a huge file)?
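    (If the answer to the "huge line" question turns out to be yes: while (<>) has to buffer an entire line at once, however long it is, so a file with no newlines can exhaust memory. A sketch of reading fixed-size chunks instead, which keeps memory bounded; the filename, chunk size, and sample-file setup are all made up so the snippet runs as-is.)

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build a tiny newline-free sample file so the example runs as-is.
# 'sample.bin' and the 64KB chunk size are placeholder choices.
open my $out, '>', 'sample.bin' or die "Cannot write: $!";
print {$out} 'x' x 100;     # 100 bytes, no newline anywhere
close $out;

# read() fills a buffer of at most the requested size per call,
# so memory use stays bounded even if the file has no newlines.
open my $fh, '<', 'sample.bin' or die "Cannot open: $!";
binmode $fh;
my $total = 0;
while (my $n = read $fh, my $chunk, 64 * 1024) {
    $total += $n;           # process $chunk here
}
close $fh;
unlink 'sample.bin';
print "Read $total bytes\n";   # prints "Read 100 bytes"
```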
    Dave, Aug 8, 2006
    #6
  7. Brian Wakem

    Brian Wakem Guest

    ekilada wrote:

    > Hi Brian,
    > No, it occurs while reading line number 13457941
    > FYI: I use while (<>) to read line by line.
    >


    Is that the last line?

    Perhaps the file is not the size the filesystem thinks it is.

    Can you open the file and seek to the end with something like vi?


    --
    Brian Wakem
    Email: http://homepage.ntlworld.com/b.wakem/myemail.png
    Brian Wakem, Aug 8, 2006
    #7
  8. Xho

    Xho Guest

    "ekilada" <> wrote:

    Please don't top post! Re-arranged.

    > Brian Wakem wrote:
    > > ekilada wrote:
    > >
    > > > Hi,
    > > > Using Perl, when I try to open a 1.8G file for reading, I get a
    > > > 'Segmentation fault (core dumped)' error.
    > > > Please, is there any way to segment huge files for reading ?

    > >
    > >
    > > If the error occurs when *opening* the file then the filesize is
    > > irrelevant.
    > >


    > Hi Brian,
    > No, it occurs while reading line number 13457941
    > FYI: I use while (<>) to read line by line.


    Well, in that case the problem is clearly on line 23, not 42.

    Xho

    --
    -------------------- http://NewsReader.Com/ --------------------
    Usenet Newsgroup Service $9.95/Month 30GB
    Xho, Aug 8, 2006
    #8
  9. Joe Smith

    Joe Smith Guest

    ekilada wrote:
    > Hi,
    > Using Perl, when I try to open a 1.8G file for reading, I get a
    > 'Segmentation fault (core dumped)' error.


    Are you using a version of perl that was compiled for large file support?

    perl -V | egrep '64|large'
    osname=cygwin, osvers=1.5.18(0.13242), archname=cygwin-thread-multi-64int
    config_args='-de -Dmksymlinks -Duse64bitint -Dusethreads -Uusemymalloc -Doptimize=-O3 -Dman3ext=3pm -Dusesitecustomize'
    useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
    use64bitint=define use64bitall=undef uselongdouble=undef
    Compile-time options: MULTIPLICITY USE_ITHREADS USE_64_BIT_INT

    Programs that are not aware of large files (such as 'wget') tend to
    get a segmentation fault in the STDIO library after outputting
    more than 2 or 4GB.
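    You can also check the same build flags from inside a script via the core Config module, rather than grepping perl -V by hand (a sketch; the keys are the ones perl -V prints):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Config;   # core module exposing perl's build-time configuration

# 'define' here means this perl was built with large-file support.
print "uselargefiles = ",
    (defined $Config{uselargefiles} ? $Config{uselargefiles} : 'undef'), "\n";

# 8-byte seek offsets are needed to address positions past 2GB.
print "lseeksize     = ",
    (defined $Config{lseeksize} ? $Config{lseeksize} : 'undef'), "\n";
```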
    -Joe
    Joe Smith, Aug 13, 2006
    #9
  10. Ben Morrow

    Ben Morrow Guest

    Quoth Joe Smith <>:
    >
    > perl -V | egrep '64|large'
    > osname=cygwin, osvers=1.5.18(0.13242), archname=cygwin-thread-multi-64int
    > config_args='-de -Dmksymlinks -Duse64bitint -Dusethreads -Uusemymalloc
    > -Doptimize=-O3 -Dman3ext=3pm -Dusesitecustomize'
    > useperlio=define d_sfio=undef uselargefiles=define usesocks=undef
    > use64bitint=define use64bitall=undef uselongdouble=undef
    > Compile-time options: MULTIPLICITY USE_ITHREADS USE_64_BIT_INT


    Are you aware you can write that (arguably) more simply as

    perl -V:'.*large.*|.*64.*'

    ? Also, I get 34 lines from that, rather than your 5, which is a little
    weird... I guess you must be using something earlier than 5.8.8.

    Ben

    --
    Joy and Woe are woven fine,
    A Clothing for the Soul divine,
    Under every grief and pine
    Runs a joy with silken twine.
        -- William Blake, 'Auguries of Innocence'
    Ben Morrow, Aug 13, 2006
    #10
