Large number of files to parse/organize, tips on algorithm?

Discussion in 'Python' started by cnb, Sep 2, 2008.

  1. cnb

    cnb Guest

    I have a bunch of files consisting of movie reviews.

    For each file I construct a list of reviews, and then for each new file
    I merge the reviews so that in the end I have a list of reviewers and,
    for each reviewer, all their reviews.

    What is the fastest way to do this?

    1. Create one file with reviews, open the next file, and for each review
    see if the reviewer exists; if so, add the review, else create a new reviewer.

    2. Create all the separate files with reviews, then mergesort them?
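    A minimal sketch of option 1, merging everything into a dict keyed by
    reviewer in one pass. The comma-separated `date,rating,customerid` line
    format and the `parse_line` helper are assumptions; adapt them to the
    real file layout.

    ```python
    # Option 1 as a single pass over all files: a dict maps each
    # reviewer id to the list of that reviewer's reviews.
    from collections import defaultdict

    def parse_line(line):
        # assumed line format: date,rating,customerid
        date, rating, customerid = line.strip().split(",")
        return customerid, (date, int(rating))

    def merge_files(paths):
        reviews_by_reviewer = defaultdict(list)
        for path in paths:
            with open(path) as f:
                for line in f:
                    reviewer, review = parse_line(line)
                    reviews_by_reviewer[reviewer].append(review)
        return reviews_by_reviewer
    ```

    Dict lookup is O(1) on average, so this stays linear in the total number
    of reviews regardless of how many files they are spread across.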
    cnb, Sep 2, 2008

  2. Use the timeit module to find out.

    The answer will depend on whether you have three reviews or three
    million, whether each review is twenty words or twenty thousand words,
    and whether you have to do the merging once only or over and over again.
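    A sketch of how timeit could be used here; `merge_v1` and the sample
    data are placeholders standing in for one of the candidate approaches.

    ```python
    # Time one candidate merge strategy with the stdlib timeit module.
    import timeit

    def merge_v1(records):
        # dict-based merge: reviewer -> list of reviews
        merged = {}
        for reviewer, review in records:
            merged.setdefault(reviewer, []).append(review)
        return merged

    # synthetic sample: 1000 reviews spread over 100 reviewers
    sample = [("reviewer%d" % (i % 100), "review %d" % i) for i in range(1000)]

    elapsed = timeit.timeit(lambda: merge_v1(sample), number=100)
    print("merge_v1: %.4f s for 100 runs" % elapsed)
    ```

    Running the same harness against each candidate on a realistic sample
    answers the "which is fastest" question directly.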
    Steven D'Aprano, Sep 2, 2008

  3. cnb

    cnb Guest

    I merge once. Each review has 3 fields: date, rating, customerid. In
    total I'll be parsing between 10K and 100K, eventually 450K reviews.
    cnb, Sep 2, 2008
  4. cnb

    cnb Guest

    over 17000 files...

    cnb, Sep 2, 2008
  5. Eric Wertman

    Eric Wertman Guest

    I think you really want to use a relational database of some sort for this.
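    A sketch of this suggestion using the stdlib sqlite3 module, which needs
    no separate server. The table and column names are assumptions.

    ```python
    # Load reviews into SQLite and let SQL do the grouping per reviewer.
    import sqlite3

    conn = sqlite3.connect(":memory:")  # use a file path for a persistent db
    conn.execute(
        "CREATE TABLE reviews (customerid TEXT, date TEXT, rating INTEGER)")

    rows = [("123", "2005-01-01", 4),
            ("456", "2005-01-02", 5),
            ("123", "2005-01-03", 3)]
    conn.executemany("INSERT INTO reviews VALUES (?, ?, ?)", rows)
    conn.commit()

    # all reviews for one reviewer, no manual merging needed
    for date, rating in conn.execute(
            "SELECT date, rating FROM reviews WHERE customerid = ? "
            "ORDER BY date", ("123",)):
        print(date, rating)
    ```

    With an index on `customerid`, per-reviewer lookups stay fast even at
    the 450K-review scale mentioned above.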
    Eric Wertman, Sep 2, 2008
  6. Paul Rubin

    Paul Rubin Guest

    Scan through all the files sequentially, emitting records like

    (movie, reviewer, review)

    Then use an external sort utility to sort/merge that output file
    on each of the 3 columns. Beats writing code.
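    A sketch of the emit step; the tab-separated layout and the
    `emit_records` helper are assumptions, and the actual sorting is left
    to an external utility as suggested.

    ```python
    # Emit one flat (movie, reviewer, review) record per line, then hand
    # the file to an external sort utility.
    import csv

    def emit_records(parsed_reviews, out_path):
        # parsed_reviews: iterable of (movie, reviewer, review) tuples
        with open(out_path, "w", newline="") as f:
            writer = csv.writer(f, delimiter="\t")
            for record in parsed_reviews:
                writer.writerow(record)

    # then, outside Python, e.g. on a Unix system, group by reviewer with:
    #   sort -t "$(printf '\t')" -k2,2 -k1,1 reviews.tsv > by_reviewer.tsv
    ```

    External sort utilities handle files larger than memory, which is the
    main attraction over an in-process merge.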
    Paul Rubin, Sep 2, 2008
  7. jay graves

    jay graves Guest

    jay graves, Sep 2, 2008
