Velocity Reviews - Computer Hardware Reviews

Velocity Reviews > Newsgroups > Programming > Perl > Perl Misc > Communication across Perl scripts


Communication across Perl scripts

 
 
Jean
10-11-2010
I am searching for efficient ways of communicating between two Perl
scripts. I have two scripts; Script 1 generates some data, and I want
Script 2 to be able to access that information. The easiest/dumbest
way is to write the data generated by Script 1 to a file and read it
later from Script 2. Is there any other way? Can I store the data in
memory and make it available to Script 2 (with support from Linux, of
course)? Meaning: malloc some data in Script 1 and make Script 2 able
to access it.

There is no guarantee that Script 2 will be run after Script 1, so
there should be some way to free that memory, e.g. with a watchdog
timer.
 
Ted Zlatanov
10-11-2010
On Mon, 11 Oct 2010 09:25:25 -0700 (PDT) Jean <(E-Mail Removed)> wrote:

J> I am searching for efficient ways of communication across two Perl
J> scripts. [...]
J> There is no guarantee that Script 2 will be run after Script 1. So
J> there should be some way to free that memory using a watchdog timer.

Depends on your latency and load requirements.

If you need speed, shared memory is probably your best bet.
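For instance, on Linux the shared-memory route can be sketched with SysV
shared memory, which Perl supports natively via shmget/shmwrite/shmread
(with constants from the core IPC::SysV module). This is only a minimal
single-process sketch; the key 0x1234 and the 4 kB size are arbitrary
values chosen for illustration, and in real use the two halves would live
in your two separate scripts:

```perl
#!/usr/bin/perl
# Sketch: SysV shared memory round trip using Perl's built-in shm* calls.
use strict;
use warnings;
use IPC::SysV qw(IPC_CREAT IPC_RMID S_IRUSR S_IWUSR);

my $key  = 0x1234;   # arbitrary key both scripts would agree on
my $size = 4096;

# "Script 1": create the segment and write length-prefixed data into it.
my $id = shmget($key, $size, IPC_CREAT | S_IRUSR | S_IWUSR)
    // die "shmget: $!";
my $data = "generated data";
shmwrite($id, pack("N/a*", $data), 0, $size) or die "shmwrite: $!";

# "Script 2": attach by the same key and read the data back.
my $id2 = shmget($key, $size, 0) // die "shmget: $!";
shmread($id2, my $buf, 0, $size) or die "shmread: $!";
my $got = unpack("N/a*", $buf);
print "$got\n";

# The segment persists until explicitly removed -- this is the spot where
# a watchdog would call shmctl with IPC_RMID to free the memory.
shmctl($id, IPC_RMID, 0) or die "shmctl: $!";
```

Note the segment outlives both processes unless removed, which is exactly
why the original poster's watchdog concern applies here.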

If you need an easy, reliable implementation, put the information in files
(you can notify the reader that there's a new file with fam/inotify or
SIGUSR1). That's not a dumb approach as long as you implement it properly
and it fits your requirements.
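The file-drop-plus-signal idea can be sketched as below. For a
self-contained demo both roles run in one process (the writer signals
itself); in real use Script 2 would publish its PID somewhere agreed upon
and Script 1 would `kill 'USR1'` that PID after writing the file:

```perl
#!/usr/bin/perl
# Sketch: drop a Storable file, then use SIGUSR1 to tell the reader
# there is new data. Single-process demo; the drop path is made up.
use strict;
use warnings;
use Storable qw(lock_nstore lock_retrieve);

my $drop = "/tmp/drop_$$.dat";
my $seen = 0;

# "Script 2" side: a handler that notes the notification arrived.
$SIG{USR1} = sub { $seen = 1 };

# "Script 1" side: write the data file FIRST, then signal the reader.
lock_nstore({ answer => 42 }, $drop);
kill 'USR1', $$;                               # nudge the reader

# "Script 2" side again: wait for the signal, then pick up the file.
select(undef, undef, undef, 0.01) until $seen;
my $data = lock_retrieve($drop);
print "answer = $data->{answer}\n";
unlink $drop;
```

Writing the file before sending the signal is the important ordering: the
reader must never be notified about data that isn't on disk yet.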

If you need low latency, use a message queue.

Ted
 
Ted Zlatanov
10-11-2010
On Mon, 11 Oct 2010 13:09:06 -0400 Sherm Pendley <(E-Mail Removed)> wrote:

SP> Jean <(E-Mail Removed)> writes:
>> I am searching for efficient ways of communication across two Perl
>> scripts.


SP> Options are plentiful. Have a look at "perldoc perlipc" for a good
SP> overview.

Unfortunately that page doesn't mention (nor should it) databases,
message queues, ESBs, loopback network interfaces, etc. Each one of
those may have distinct advantages over plain IPC, depending on the OS,
environment, policies, and existing infrastructure.

Ted
 
Martijn Lievaart
10-11-2010
On Mon, 11 Oct 2010 14:17:58 -0500, Ted Zlatanov wrote:

> If you need low latency, use a message queue.


Speaking of message queues, what do people recommend on Unix/Linux?

M4

 
jl_post@hotmail.com
10-11-2010

On Oct 11, 10:25 am, Jean <(E-Mail Removed)> wrote:
> I am searching for efficient ways of communication across two Perl
> scripts. I have two scripts; Script 1 generates some data. I want my
> script two to be able to access that information.


> There is no guarantee that Script 2 will be run after Script 1. So
> there should be some way to free that memory using a watchdog timer.


It sounds like there's no guarantee that the two scripts will even
overlap while running. Unless you write your data to a file on disk,
you'll need another program to act as a broker to manage the data you
want to share.

You could try using a third-party broker, or perhaps use an SQL
database to store your data. ...or you could just write what you want
to share to disk, to be picked up by Script 2.


> The easiest/dumbest way is to write the data generated by
> script 1 as a file and read it later using script 2.


That may be easiest, but I don't think it's the dumbest. And if
you use this approach, I highly recommend using the "Storable" module
(it's a standard module so you should already have it). If you have a
reference to data in Script 1 (for example, $dataReference), you can
save it in one line (if you don't count the "use Storable" line), like
this:

use Storable qw(lock_nstore lock_retrieve);
lock_nstore($dataReference, "file_name");

and then Script 2 can read it in with one line like this:

use Storable qw(lock_nstore lock_retrieve);
my $dataReference = lock_retrieve("file_name");

Now Script 1 and Script 2 should both have a $dataReference that
refers to identical data.

Type "perldoc Storable" at the Unix/DOS prompt to read more about
this module.

It's hard to get much simpler than this. You might be tempted to
write your own file-writing and file-reading code, but if you do,
you'll have to handle your own file locking and your own serialization
to and from disk. (And that'll probably take more than just two lines
of code to implement.)

If you're good with SQL, you may want to try a DBI module like
DBD::SQLite. The SQLite database is stored on disk (so you don't
need a third-party program to manage the data), and it gives you
flexibility: if you ever have to move your shared data to a database
server, most of the data-sharing code will remain unchanged.
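A minimal sketch of that approach, assuming DBD::SQLite is installed from
CPAN (the file name, table layout, and key are made up for illustration):

```perl
#!/usr/bin/perl
# Sketch: share data between two scripts through an on-disk SQLite file.
use strict;
use warnings;
use DBI;

my $dbfile = "shared_data.db";   # hypothetical path both scripts agree on
my $dbh = DBI->connect("dbi:SQLite:dbname=$dbfile", "", "",
                       { RaiseError => 1, AutoCommit => 1 });

# "Script 1": create the table if needed and store the generated data.
$dbh->do("CREATE TABLE IF NOT EXISTS shared (k TEXT PRIMARY KEY, v TEXT)");
$dbh->do("INSERT OR REPLACE INTO shared (k, v) VALUES (?, ?)",
         undef, "result", "generated data");

# "Script 2": read it back (in real use, from a separate process).
my ($value) = $dbh->selectrow_array(
    "SELECT v FROM shared WHERE k = ?", undef, "result");
print "$value\n";

$dbh->disconnect;
unlink $dbfile;   # cleanup for the demo only
```

SQLite also handles the file locking for you, which is one less thing to
get wrong compared to a hand-rolled file format.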

Also, don't forget to "use strict;" and "use warnings;" if you
aren't using them already; they'll save you lots of headaches in the
long run.

I hope this helps,

-- Jean-Luc
 
C.DeRykus
10-11-2010
On Oct 11, 9:25 am, Jean <(E-Mail Removed)> wrote:
> I am searching for efficient ways of communication across two Perl
> scripts. [...]
> There is no guarantee that Script 2 will be run after Script 1. So
> there should be some way to free that memory using a watchdog timer.


It sounds like a named pipe (see perlipc) would be
the easiest, most straightforward solution. (See
T. Zlatanov's suggestions, though, for other possible
non-IPC solutions which, depending on the exact
scenario, may be a better fit.)

With a named pipe though, each script just deals
with the named file for reading or writing while
the OS takes care of the messy IPC details for
you. The 2nd script will just block until data
is available so running order isn't a concern. As
long as the two scripts are running more or less
concurrently, I would guess memory use will be
manageable too since the reader will be draining
the pipe as the data arrives.
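A minimal sketch of the named-pipe approach; here one process forks to
play both roles, but in real use the writer and reader would be the two
separate scripts, and whichever starts first simply blocks in open()
until its peer shows up (the fifo path is a throwaway chosen for the
demo):

```perl
#!/usr/bin/perl
# Sketch: communicate through a named pipe (FIFO) created with mkfifo.
use strict;
use warnings;
use POSIX qw(mkfifo);

my $fifo = "/tmp/demo_$$.fifo";
mkfifo($fifo, 0700) or die "mkfifo: $!";

my $pid = fork() // die "fork: $!";
if ($pid == 0) {
    # Child plays "Script 1": open blocks until a reader opens its end.
    open my $out, '>', $fifo or die "open for write: $!";
    print $out "generated data\n";
    close $out;
    exit 0;
}

# Parent plays "Script 2": blocks here until data is available.
open my $in, '<', $fifo or die "open for read: $!";
my $line = <$in>;
close $in;
waitpid $pid, 0;
unlink $fifo;

chomp $line;
print "$line\n";
```

The blocking open is what makes running order a non-issue, exactly as
described above.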

--
Charles DeRykus
 
Xho Jingleheimerschmidt
10-12-2010
Jean wrote:
> I am searching for efficient ways of communication across two Perl
> scripts. I have two scripts; Script 1 generates some data. I want my
> script two to be able to access that information. The easiest/dumbest
> way is to write the data generated by script 1 as a file and read it
> later using script 2.


This is usually not dumb. It is often the best way to do it.
Intermediate files and shell pipelines are by far the most common way
for me to do this; I never use anything other than those two unless I
have a compelling reason. Maybe you have a compelling reason; I don't
know, and you haven't given us enough information to determine that.

(Well, the third default option is to reconsider whether these really
need to be two different scripts rather than one. I assume you already
considered that and rejected it for some good reason.)

> Is there any other way than this ? Can I store
> the data in memory and make it available to script two (of-course with
> support from my Linux ) ? Meaning malloc somedata by script 1 and make
> script 2 able to access it.


There are many ways to do this, and AFAIK they all either leave a lot to
be desired, or introduce annoying and subtle complexities.

> There is no guarantee that Script 2 will be run after Script 1. So
> there should be some way to free that memory using a watchdog timer.


Can't you control the timing of the execution of your scripts?

Xho
 
Ted Zlatanov
10-12-2010
On Mon, 11 Oct 2010 22:14:29 +0200 Martijn Lievaart <(E-Mail Removed)> wrote:

ML> On Mon, 11 Oct 2010 14:17:58 -0500, Ted Zlatanov wrote:
>> If you need low latency, use a message queue.


ML> Speaking of message queues, what do people recommend on Unix/Linux?

I've heard positive things about http://www.rabbitmq.com/ but haven't
used it myself. There are a lot of others; see
http://en.wikipedia.org/wiki/Categor...ted_middleware

Depending on your needs, TIBCO may fit. It's very popular in the
financial industry, and in my experience it has been a pretty good
system over the three years I've used it. The Perl bindings
are... well... usable. The major headaches I've had came when the
process was slow to handle incoming data. Unless you write your Perl
very carefully, it's easy to block and balloon the memory size (because
TIBCO's queue uses your own application's memory) to multi-gigabyte
footprints. So forget about database interactions, for instance; you
have to move them to a separate process and use IPC or file drops.
Threads (as in "use threads") are probably a bad idea too.

Ted
 
Ted Zlatanov
10-12-2010
On Mon, 11 Oct 2010 14:25:59 -0700 (PDT) "C.DeRykus" <(E-Mail Removed)> wrote:

CD> With a named pipe though, each script just deals with the named file
CD> for reading or writing while the OS takes care of the messy IPC
CD> details for you. [...]

The only warning I have there is that pipes are pretty slow and have
small buffers by default in the Linux kernel (assuming Linux). I forget
exactly why; I think it's due to terminal disciplines or something, and
I didn't dig too much. I ran into this earlier this year.

So if you have a fast writer, pipes can be problematic.
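On Linux you can inspect (and enlarge) a pipe's buffer with fcntl. This
is a Linux-only sketch: F_GETPIPE_SZ and F_SETPIPE_SZ aren't exported by
the core Fcntl module, so their Linux values (1032 and 1031) are
hard-coded here as an assumption:

```perl
#!/usr/bin/perl
# Sketch: query and resize a pipe's kernel buffer (Linux-specific fcntls).
use strict;
use warnings;

my $F_SETPIPE_SZ = 1031;   # Linux fcntl command numbers, hard-coded
my $F_GETPIPE_SZ = 1032;   # because core Fcntl does not export them

pipe(my $r, my $w) or die "pipe: $!";

# See how big the default buffer is (commonly 64 kB on modern kernels).
my $size = fcntl($w, $F_GETPIPE_SZ, 0) or die "fcntl GETPIPE_SZ: $!";
print "default pipe buffer: $size bytes\n";

# Ask for a 1 MB buffer; the kernel rounds and caps it at
# /proc/sys/fs/pipe-max-size for unprivileged processes.
fcntl($w, $F_SETPIPE_SZ, 1 << 20) or warn "could not resize: $!";
my $new = fcntl($w, $F_GETPIPE_SZ, 0);
print "resized pipe buffer: $new bytes\n";
```

A bigger buffer only buys headroom, of course; a writer that is
persistently faster than its reader will still fill any pipe eventually.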

Ted
 
Martijn Lievaart
10-12-2010
On Mon, 11 Oct 2010 19:55:03 -0500, Ted Zlatanov wrote:

> I've heard positive things about http://www.rabbitmq.com/ but haven't
> used it myself. There's a lot of others, see
> http://en.wikipedia.org/wiki/Categor...ted_middleware


Thanks, I'll look into it.

M4
 