Velocity Reviews - Computer Hardware Reviews

Velocity Reviews > Newsgroups > Programming > Perl > Perl Misc > Results in multiple pages. Takes too much time

Results in multiple pages. Takes too much time

 
 
premgrps@gmail.com
      07-11-2006
Hi,
I have a table of a million records and wrote a CGI Perl script to
display the results based on the user input. Each query returns
anywhere from 100 to 1000 results, and presently I am displaying them
as 25 results per page.

Problem: Each query takes about 20-30 seconds.

My attempted solution: I have tried optimizing the table and also
indexing it. I actually converted an MS Access database to a MySQL
database, so it wasn't previously indexed. Neither optimization nor
indexing gives good results: I always get a timeout, i.e. it takes
even longer after indexing and optimizing.

1. I was wondering if someone has a creative solution for this, i.e. a
way to reduce the time from 20-30 seconds down to at most 10 seconds.

2. I have links to further pages of results beneath the first page of
results. When each of these links is clicked it takes 20-30 seconds
again. Is there a way to reduce the time taken for the subsequent
pages? I cannot use the LIMIT option in MySQL, since I have a WHERE
clause which has to search through the whole table. I tried using
views and using limits, but that takes just as much time.

Please let me know.

Thanks.
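[Editorial note on point 2: LIMIT does combine with a WHERE clause, and with an index on the filtered column the server can stop scanning once the page is full. A minimal sketch of keyset ("seek") pagination, using SQLite as a stand-in for MySQL; the `records` table, `category` column, and `fetch_page` helper are hypothetical names for illustration:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, category TEXT, name TEXT)")
cur.executemany(
    "INSERT INTO records (category, name) VALUES (?, ?)",
    [("widget" if i % 3 == 0 else "gadget", f"item{i}") for i in range(1000)])
# Index the column used in the WHERE clause so the filter
# doesn't have to scan the whole table.
cur.execute("CREATE INDEX idx_category ON records (category)")

PAGE_SIZE = 25

def fetch_page(last_id=0):
    # Keyset pagination: remember the last id from the previous page
    # instead of using OFFSET, so later pages stay as fast as page one.
    cur.execute(
        "SELECT id, name FROM records "
        "WHERE category = ? AND id > ? "
        "ORDER BY id LIMIT ?",
        ("widget", last_id, PAGE_SIZE))
    return cur.fetchall()

page1 = fetch_page()
page2 = fetch_page(last_id=page1[-1][0])
print(len(page1), len(page2))  # 25 25
```

The same pattern works in Perl via DBI with placeholders; the point is that each page request passes the last seen id back in, rather than re-counting rows from the start.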

 
Mark Clements
      07-11-2006
premgrps@gmail.com wrote:
> I have a table of a million records and wrote a CGI-PERL script to
> display the results based on the user input.
[snip]


At first glance, this is more of a MySQL question than a Perl question.
How long does a typical query take if executed at the MySQL prompt? What
happens when you prefix the query with EXPLAIN? You may find you don't
have all the indexes you need.

If the query doesn't run fast by itself, no amount of tuning in Perl is
going to help.

Mark
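[Editorial note: the EXPLAIN check Mark suggests can be illustrated outside Perl entirely. A sketch using SQLite's equivalent, EXPLAIN QUERY PLAN, since a live MySQL server isn't assumed here; the `records` table and `category` column are hypothetical:]

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE records (id INTEGER PRIMARY KEY, category TEXT)")

query = "SELECT id FROM records WHERE category = ?"

# Without an index, the plan falls back to a full table scan.
before = cur.execute("EXPLAIN QUERY PLAN " + query, ("widget",)).fetchall()
print(before[0][-1])   # e.g. "SCAN records"

cur.execute("CREATE INDEX idx_category ON records (category)")

# With the index, the planner can seek directly to matching rows.
after = cur.execute("EXPLAIN QUERY PLAN " + query, ("widget",)).fetchall()
print(after[0][-1])    # e.g. "SEARCH records USING INDEX idx_category (category=?)"
```

In MySQL the corresponding check is `EXPLAIN SELECT ...` at the mysql prompt: a `type` of ALL (full scan) instead of ref/range is the usual sign that an index is missing or unusable for that WHERE clause.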
 
