Debian MySQL Perl DBI - connection terminates unexpectedly after 100 secs.

 
 
John
 
      03-16-2009
Hi

I'm updating a MySQL table with about 2 million records. After about 100
seconds the program ends with no error message. If I rerun it with LIMIT
800000,50000, the part where it stopped is processed fine, so the UPDATE
itself is OK.

Is there a timeout parameter? Or am I running out of memory? Can the
memory be increased?
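
For reference, a stripped-down sketch of this kind of chunked update. It
chunks by primary-key range rather than by LIMIT offset (MySQL's UPDATE
accepts only a plain LIMIT row count, no offset); the table, column and
connection details are invented:

    #!/usr/bin/perl
    use strict;
    use warnings;
    use DBI;

    # placeholder DSN and credentials
    my $dbh = DBI->connect( 'DBI:mysql:database=test;host=localhost',
                            'user', 'password', { RaiseError => 1 } );

    my $chunk = 50_000;
    my ( $min, $max ) =
        $dbh->selectrow_array('SELECT MIN(id), MAX(id) FROM mytable');

    # update one id range at a time so no single statement
    # has to touch all 2 million rows at once
    for ( my $lo = $min; $lo <= $max; $lo += $chunk ) {
        my $n = $dbh->do( 'UPDATE mytable SET flag = 1 WHERE id >= ? AND id < ?',
                          undef, $lo, $lo + $chunk );
        print "rows updated from id $lo: $n\n";
    }

    $dbh->disconnect;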

Regards
John



 
J. Gleixner
 
      03-16-2009
John wrote:
> Hi
>
> I'm updating a MySQL table with about 2 million records. After about 100
> seconds the program ends with no error message. If I rerun it with LIMIT
> 800000,50000, the part where it stopped is processed fine, so the UPDATE
> itself is OK.
>
> Is there a timeout parameter? Or am I running out of memory? Can the
> memory be increased?


Check line 24.
 
John
 
      03-16-2009

"J. Gleixner" <(E-Mail Removed)> wrote in message
news:49be71fd$0$89873$(E-Mail Removed)...
> John wrote:
>> Hi
>>
>> I'm updating a MySQL table with about 2 million records. After about 100
>> seconds the program ends with no error message. If I use LIMIT
>> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
>>
>> Is there a time out parameter? Or am I running out of memory. Can the
>> memory be increased?

>
> Check line 24.


Line 24 says:-

our $old_fh = select(STDOUT); $| = 1; select($old_fh); # make standard output socket hot

Can't see how that can affect it.
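
(For what it's worth, the same effect can be had with IO::Handle, which
ships with Perl:

    use IO::Handle;
    STDOUT->autoflush(1);   # same as the select / $| = 1 / select juggling

but either way it only changes output buffering.)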

Regards
John


 
Jens Thoms Toerring
 
      03-16-2009
John wrote:
[...]
> Line 24 says:-
>
>   our $old_fh = select(STDOUT); $| = 1; select($old_fh); # make standard output socket hot
>
> Can't see how that can affect it.


I rather guess it was a gentle hint that your problem description is
too vague to let anyone deduce what's going wrong. But your subject
line may contain a hint: it looks as if the update is done from within
a CGI script or something similar, running on a web server. Normally
there's a CPU time limit set for such scripts, to catch e.g. scripts
that run in an endless loop. That's what I would check first (note: if
it's not set explicitly, an OS default limit may apply!). If you are
using Apache it might be an RLimitCPU directive. But there could also
be a limit on the memory consumption of such scripts (RLimitMEM with
Apache2)... Perhaps a look at the log files of the web server will
give you some more hints.
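
A sketch of how such limits can look in an Apache2 configuration (the
numbers are invented examples, not recommendations):

    # per-process resource limits for spawned CGI scripts,
    # given as a soft limit followed by a hard limit
    RLimitCPU 300 600                # CPU seconds
    RLimitMEM 268435456 536870912    # memory in bytes (256 MB / 512 MB)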

Regards, Jens
--
Jens Thoms Toerring          http://toerring.de
 
Tad J McClellan
 
      03-16-2009
John wrote:
[...]
> Line 24 says:-
>
>   our $old_fh = select(STDOUT); $| = 1; select($old_fh); # make standard output socket hot
>
> Can't see how that can affect it.



Check line 17.


--
Tad McClellan
email: perl -le "print scalar reverse qq/moc.noitatibaher\100cmdat/"
 
Xho Jingleheimerschmidt
 
      03-17-2009
John wrote:
> Hi
>
> I'm updating a MySQL table with about 2 million records. After about 100
> seconds the program ends with no error message.


Are you sure? If there were an error message, do you know where it
would go and do you know you could find it?
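
One thing worth ruling out is DBI quietly warning instead of dying; with
RaiseError every database error becomes fatal and hard to miss (the DSN
and credentials here are placeholders):

    my $dbh = DBI->connect( $dsn, $user, $password,
                            { RaiseError => 1, PrintError => 1 } )
        or die "connect failed: $DBI::errstr";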

> If I rerun it with LIMIT 800000,50000, the part where it stopped is
> processed fine, so the UPDATE itself is OK.
>
> Is there a timeout parameter?


I don't think there is one in DBD::mysql. The program that calls your
Perl script could have a timeout, or your MySQL server might have a
timeout, but neither of those is a Perl issue.
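
If you want to see which timeouts the server itself has set, you can ask
it directly from DBI (connection details are placeholders):

    use strict;
    use warnings;
    use DBI;

    my $dbh = DBI->connect( 'DBI:mysql:database=test;host=localhost',
                            'user', 'password', { RaiseError => 1 } );

    # lists wait_timeout, net_read_timeout, net_write_timeout, ...
    my $vars = $dbh->selectall_arrayref(q{SHOW VARIABLES LIKE '%timeout%'});
    printf "%-30s %s\n", @$_ for @$vars;

    $dbh->disconnect;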


> Or am I running out of memory? Can the
> memory be increased?


Unlike a select, there is no reason a large update should consume
much memory in the Perl client. I don't use Debian specifically, but
on other Linux systems running out of memory does produce an error
somewhere, though not always in the first place one might think to look.
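
On Linux the kernel log is one such place, since the OOM killer records
what it killed; something like this (GNU grep syntax) will show it:

    dmesg | grep -i -E 'out of memory|killed process'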

Xho
 
John
 
      03-25-2009

"Jens Thoms Toerring" <(E-Mail Removed)> wrote in message
news:(E-Mail Removed)-berlin.de...
> John <(E-Mail Removed)> wrote:
>
>> "J. Gleixner" <(E-Mail Removed)> wrote in message
>> news:49be71fd$0$89873$(E-Mail Removed)...
>> > John wrote:
>> >> I'm updating a MySQL table with about 2 million records. After about
>> >> 100
>> >> seconds the program ends with no error message. If I use LIMIT
>> >> 800000,50000 the part where it stopped is OK so the UPDATE part is OK.
>> >>
>> >> Is there a time out parameter? Or am I running out of memory. Can
>> >> the
>> >> memory be increased?
>> >
>> > Check line 24.

>
>> Line 24 says:-

>
>> our $old_fh=select(STDOUT); $|=1; select($old_fh); # Make standard output
>> socket hot

>
>> Can't see how that can affect it.

>
> I rather guess it was a gentle hint that your problem description
> is too vague to allow to deduce what's going wrong. But your sub-
> ject line may contain a hint: it looks as if the update is done
> from within a CGI script or something similar, running on a web
> server. And normally there's a CPU time limit set for such scripts
> to catch e.g. script that run in an endless loop. That's what I
> would check for first (note: if it's not set then an OS default
> limit may be used!). If you are using Apache it might be a RLimitCPU
> directive. But there could also be a limit on the memory consumption
> of such scripts (RLimitMEM with Apache2)... Perhaps a look at the log
> files of the web server will give you some more hints.
>
> Regards, Jens
> --
> \ Jens Thoms Toerring ___ (E-Mail Removed)
> \__________________________ http://toerring.de


Many thanks. That was very helpful. I have maxed out both RLimitCPU and
RLimitMEM in Apache2, and that has led to better performance. There are
severe problems in using MySQL once you go past about 2 million rows,
irrespective of how well you tweak your settings.

Regards
John



 
John
 
      03-25-2009

"Xho Jingleheimerschmidt" <(E-Mail Removed)> wrote in message
news:49bf08ca$0$9725$(E-Mail Removed)...
> John wrote:
>> Hi
>>
>> I'm updating a MySQL table with about 2 million records. After about 100
>> seconds the program ends with no error message.

>
> Are you sure? If there were an error message, do you know where it would
> go and do you know you could find it?


>
>> If I use LIMIT 800000,50000 the part where it stopped is OK so the UPDATE
>> part is OK.
>>
>> Is there a time out parameter?

>
> I don't think there is one in DBD::mysql. The program that calls your
> perl script could have a time out, or your MySQL server might have a time
> out, but neither of those are Perl issues
>
>
>> Or am I running out of memory. Can the memory be increased?

>
> Unlike a select, there is no reason that large update should consume much
> memory in the Perl client. And I don't use Debian specifically, but on
> other linux systems running out of memory does produce an error somewhere,
> but not always in the first place one might think to look.
>
> Xho


Yes, I have now found the error log. There is also a special log for slow
queries, so evidently the MySQL developers are aware that some constructs
do take a long time. I am not certain whether it is a Perl problem; my
feeling now is that it is a MySQL problem, and that they are aware of
similar issues.
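
For the archives: the slow query log is switched on in my.cnf. The option
names below are the MySQL 5.0-era ones and have changed in later versions,
so check the manual for your server:

    [mysqld]
    log_slow_queries = /var/log/mysql/mysql-slow.log  # destination file
    long_query_time  = 2                              # seconds before a query counts as slow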

Regards
John




 
sln@netherlands.com
 
      03-25-2009
On Wed, 25 Mar 2009, "John" wrote:
[...]
> Many thanks. That was very helpful. I have maxed out both RLimitCPU and
> RLimitMEM in Apache2, and that has led to better performance.

I am for hire. I've never done your fancy server-name stuff, but I can
assure you I am an expert at databases and Perl. The learning curve is
like the next day.

Pay me and you get quality software! Prices are negotiable. I'm looking
for full-time at 140K but will work at $120/hr for minimal time.

-sln
 
Xho Jingleheimerschmidt
 
      03-26-2009
John wrote:
>
> Many thanks. That was very helpful. I have maxed out both RLimitCPU and
> RLimitMEM in Apache2, and that has led to better performance. There are
> severe problems in using MySQL once you go past about 2 million rows,
> irrespective of how well you tweak your settings.


I have MySQL tables with 150 times that many rows. Obviously I don't
try to run unrestricted update DML on them. It isn't how many rows you
have, it is what you are trying to do with them.

Xho

 