Results in multiple pages. Takes too much time

Discussion in 'Perl Misc' started by premgrps@gmail.com, Jul 11, 2006.

  1. Guest

    Hi,
    I have a table of a million records and wrote a CGI-Perl script to
    display results based on the user input. The results might be
    anywhere from 100 to 1000 per query, and presently I am displaying
    them as 25 results per page.

    Problem: Each query takes about 20-30 seconds.

    My solution: I have tried to optimize the table and also index it.
    I actually converted an MS Access database to a MySQL database, so
    it wasn't previously indexed. Neither optimization nor indexing
    gives any good results; I always get a timeout, i.e. it takes
    longer after indexing and optimizing.

    1. I was wondering if someone has a creative solution for this,
    i.e. reduce the time from 20-30 seconds to at least 10 seconds.

    2. I have links to the other pages of results beneath the first
    page. When each of these links is clicked, it takes 20-30 seconds
    again. Is there a way I can reduce the time taken for the
    subsequent pages? I cannot use the LIMIT option in MySQL, since I
    have a WHERE clause which has to search through the whole table. I
    tried using views with limits, but it takes just as much time.

    Please let me know.

    Thanks.
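
    For reference, LIMIT does combine with a WHERE clause, and with a
    suitable index MySQL can stop reading once it has found enough
    matching rows for the requested page. The table and column names
    below are made up for illustration only; substitute the real schema:

    ```sql
    -- Hypothetical table "records" with columns id, name, status.
    -- An index whose leading column matches the WHERE clause lets
    -- MySQL satisfy both the filter and the ORDER BY cheaply:
    CREATE INDEX idx_status_id ON records (status, id);

    -- Page 3 of the results (25 per page), still filtered by WHERE:
    SELECT id, name, status
    FROM records
    WHERE status = 'active'
    ORDER BY id
    LIMIT 25 OFFSET 50;
    ```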
     
    , Jul 11, 2006
    #1

  2. wrote:
    > Hi,
    > I have a table of a million records and wrote a CGI-PERL script to
    > display the results based on the user input.
    [...]
    > I cannot use the LIMIT option in mysql, since I have a where clause
    > which has to search through the whole table. I tried using views and
    > using limits, but it takes as much time.
    >


    At first glance, this is more of a MySQL question than a Perl question.
    How long does a typical query take if executed at the MySQL prompt? What
    happens when you prefix the query with EXPLAIN? You may find you don't
    have all the indexes you need.

    If the query doesn't run fast by itself, no amount of tuning in Perl is
    going to help.
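
    A quick way to try that suggestion at the mysql prompt (the table,
    column, and search pattern below are invented; substitute the real
    query):

    ```sql
    -- Time the raw query by itself first:
    SELECT * FROM records WHERE name LIKE '%smith%' LIMIT 25;

    -- Then ask MySQL how it plans to execute it; "type: ALL" in the
    -- EXPLAIN output means a full table scan, i.e. no usable index:
    EXPLAIN SELECT * FROM records WHERE name LIKE '%smith%';

    -- Note that a leading-wildcard LIKE ('%smith%') cannot use an
    -- ordinary index, while an anchored pattern can:
    EXPLAIN SELECT * FROM records WHERE name LIKE 'smith%';
    ```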

    Mark
     
    Mark Clements, Jul 11, 2006
    #2
