Passing hash to another script via commandline

IanW

Hi

How do I pass a hash to another script via the command line?

That is, if I do this

use strict;
my %data = (
field1 => 'f1val',
field2 => 'f2val'
);
my $result = `perl z:/interface.pl \%data`;

and in interface.pl I have the lines:

use strict;
my($dataref) = $ARGV[0];
print $dataref->{'field1'};

I get the error:

Can't use string ("%data") as a HASH ref while "strict refs" in use at
z:/interface.pl line ..

and if I comment out "use strict" then I get nothing.

What am I doing wrong?

Thanks
Ian
 
Jürgen Exner

IanW said:
How do I pass a hash to another script via the command line?
[sample script snipped, thank you for providing it]
What am I doing wrong?

The shell command line argument interface does not provide any means to pass
complex data structures like hashes or references. It can deal with simple
strings only. This is not a limitation of Perl but of the command shell.

In other words: you need to pass the actual keys and values of your hash (as
strings!) and then recompose them into a hash in the called program.

It might be easier to use Data::Dumper to convert the hash into a textual
representation that can readily be loaded into Perl again, or to use some
other form of interprocess communication.
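Jue's first suggestion can be sketched like this (filenames are illustrative;
both halves are shown in one file for brevity). The LIST form of system()
flattens the hash into alternating key/value strings, and the callee rebuilds
a hash from the even-sized list in @ARGV:

```perl
use strict;
use warnings;

# caller.pl side
my %data = (field1 => 'f1val', field2 => 'f2val');
# system('perl', 'z:/interface.pl', %data);   # LIST form: no shell involved

# interface.pl side would then do:
my @argv    = %data;        # stand-in for @ARGV in this one-file sketch
my %rebuilt = @argv;        # key, value, key, value ... back into a hash
print $rebuilt{field1};
```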

jue
 
mattsteel

IanW ha scritto:
Hi

How do I pass a hash to another script via the command line?

You can't.
What am I doing wrong?

Thanks
Ian

Possible workaround: you can try to tie-untie your hash to a file so
you can pass the filename to the second script which can tie-untie the
same hash, then.
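Mattsteel's tie-to-a-file idea can be sketched with the core SDBM_File module
('demo_db' is an illustrative filename; in practice the first script writes,
passes the filename, and the second script ties the same file to read):

```perl
use strict;
use warnings;
use Fcntl;        # exports O_RDWR, O_CREAT, O_RDONLY
use SDBM_File;

# first script: tie, populate, untie (flushes to disk)
tie my %write, 'SDBM_File', 'demo_db', O_RDWR|O_CREAT, 0666 or die "tie: $!";
$write{field1} = 'f1val';
untie %write;

# second script: tie the same file read-only and read the hash back
tie my %read, 'SDBM_File', 'demo_db', O_RDONLY, 0666 or die "tie: $!";
print $read{field1};
untie %read;
```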
 
IanW

It might be easier to use Data::Dumper to convert the hash into a textual
representation that can readily be loaded into perl again

Hi Jue

Thanks for the reply.

If I understand it correctly, doesn't Data::Dumper just convert the hash
into a string representing the hash's structure? In which case, if I tried
to pass that via the command line, the spaces (among other characters,
like double quotes) would cause problems when the string is pulled into
@ARGV by the receiving script?

Eg:

use Data::Dumper;
my $d = Dumper(\%data);
my $result = `perl z:/interface.pl $d`;

That would result in $ARGV[0] containing only "$VAR1". One could put $d
in double quotes, but if there were double quotes in one of the hash values
then it would presumably mess things up again.

Regards
Ian
 
IanW

Possible workaround: you can try to tie-untie your hash to a file so
you can pass the filename to the second script which can tie-untie the
same hash, then.

I was rather hoping to avoid passing data via a file, but thanks anyway.

Regards
Ian
 
Gunnar Hjalmarsson

IanW said:
How do I pass a hash to another script via the command line?

As others have told you, you can't.

However, nothing prevents you from assigning a hashref to @ARGV
directly, i.e. instead of
my $result = `perl z:/interface.pl \%data`;

you may want to try:

$ARGV[0] = \%data;
do 'z:/interface.pl';
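Both halves of Gunnar's approach can be sketched in one file (the comment
marks where interface.pl would begin). Because do() runs the second script
in the same process, $ARGV[0] here is a genuine hash reference rather than
the string "%data":

```perl
use strict;
use warnings;

# caller side
my %data = (field1 => 'f1val', field2 => 'f2val');
$ARGV[0] = \%data;

# --- what interface.pl would contain ---
my ($dataref) = @ARGV;        # a real reference, not a string
print $dataref->{field1};
```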
 
Tim Southerwood

IanW said:
It might be easier to use Data::Dumper to convert the hash into a textual
representation that can readily be loaded into perl again

Hi Jue

Thanks for the reply.

If I understand it correctly, doesn't Data::Dumper just convert the hash
into a string representing the hash's structure?
[rest of quote snipped]

Regards
Ian

I think there is a simple point being missed here.

Why not use Data::Dumper to serialise the data (because that's what it
does), print the result to STDOUT, have the called program read it from
STDIN into a scalar, then eval the scalar?

In other words, pipe the serialised version of the hash between the two
programs. Very much the "unix way" (TM).
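The eval half of the pipe idea can be sketched like this (the actual pipe
is elided for brevity; the caller would open a '|-' pipe to the child and
print Dumper(\%data) down it, and the child would slurp STDIN into
$serialised). Dumper output assigns to $VAR1, so the eval has to relax
'strict vars' for that one variable:

```perl
use strict;
use warnings;
use Data::Dumper;

my %data = (field1 => 'f1val', field2 => 'f2val');
my $serialised = Dumper(\%data);      # "$VAR1 = { ... };"

# child side: eval the slurped text; the assignment's value (a hashref)
# is what eval returns
my $dataref = eval "no strict 'vars'; no warnings 'once'; $serialised"
    or die "eval failed: $@";
print $dataref->{field1};
```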

Another way to do it would be to use IPC::Shareable and tie the hash to a
blob of shared memory. All that the callee program needs to know is the
shared memory key id which is trivially passed as an argument.

Probably not a big win efficiency-wise, as the latter method has to
serialise the data anyway, but it's neat, and potentially bi-directional -
i.e. the callee can modify the data and have the caller see it.

Cheers

Tim
 
Tad McClellan

Others have explained why you cannot do this.
I would simply
create an array, pass the array, then work with the array @ARGV
in my secondary script, without creating a new hash.

PRIMARY:

#!perl

%Data = (field1 => 'f1val', field2 => 'f2val', field3 => 'f3val');


Try it with these values:

%Data = (field1 => 'f1 val', field2 => 'f2val', field3 => 'f3val');

then rethink your "solution".

system ("perl test2.pl @{[%Data]}");
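Tad's caution can be demonstrated directly: interpolating a hash into a
command line word-splits on whitespace, so a value like 'f1 val' breaks
the key/value pairing before the second script ever sees it.

```perl
use strict;
use warnings;

my %Data = (field1 => 'f1 val', field2 => 'f2val', field3 => 'f3val');

# Approximate what the shell hands to the child: split on whitespace.
my @what_the_child_sees = split ' ', "@{[%Data]}";

# Six keys and values become seven words, so @ARGV can no longer be
# paired back into the original hash.
print scalar @what_the_child_sees;
```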
 
IanW

Gunnar Hjalmarsson said:
IanW said:
How do I pass a hash to another script via the command line?

As others have told you, you can't.

However, nothing prevents you from assigning a hashref to @ARGV directly,
i.e. instead of
my $result = `perl z:/interface.pl \%data`;

you may want to try:

$ARGV[0] = \%data;
do 'z:/interface.pl';

I did want, I did try and it did work :)

I had actually found another way that seemed to work, just before I read
your post, and that was to use Data::Dumper to stringify the hash, then
MIME::Base64 to encode it, then remove the line breaks, and then pass the
resulting string via the command line to then decode it at the other end.
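Ian's Dumper-plus-Base64 approach can be sketched like this (both ends are
shown in one process for brevity; in practice $encoded travels on the
command line and the decode happens in interface.pl):

```perl
use strict;
use warnings;
use Data::Dumper;
use MIME::Base64;

my %data = (field1 => 'f1val', field2 => 'f2val');

# Encode: the '' argument suppresses line breaks, so the result is a
# single shell-safe token of Base64 characters.
my $encoded = encode_base64(Dumper(\%data), '');

# Decode at the other end and eval the recovered Dumper text.
my $dataref = eval "no strict 'vars'; no warnings 'once'; "
                 . decode_base64($encoded)
    or die "eval failed: $@";
print $dataref->{field1};
```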

However, your way is decidedly simpler, thanks.

Thanks also for the other responses. One thing I noticed from the different
replies is that there seem to be three ways to call another Perl script -
the backticks that I used, 'system', and 'do'. Are there any real
differences between them?

Ian
 
Mumia W.

[...]
If I understand it correctly, doesn't Data::Dumper just convert the hash
into a string representing the hash's structure?
[rest of quote snipped]

Regards
Ian

You can use quotemeta() to help the string survive the shell. The
\Q...\E syntax is a shortcut for quotemeta():

my $result = `perl z:/interface.pl \Q$d\E`;

You can also avoid using the shell entirely. Look at "perldoc -f open"
and examine the "open FILEHANDLE,MODE,EXPR,LIST" syntax.

IPC::Run also lets you receive input from external programs without
putting the arguments in the shell.
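The list-form open that Mumia points to can be sketched like this (the
child command here is illustrative, standing in for z:/interface.pl). The
arguments go straight to the child process, so the shell never rewrites
them; note that the '-|' list form may not work on older Windows perls:

```perl
use strict;
use warnings;

# Run a child that prints back its first argument; the space in 'f1 val'
# survives because no shell is involved.
open my $fh, '-|', 'perl', '-le', 'print $ARGV[0]', 'f1 val'
    or die "can't run child: $!";
my $out = <$fh>;
close $fh;
print $out;
```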


HTH
 
Tim Southerwood

IanW said:
Thanks also for the other responses. One thing I noticed from the
different replies is that there seem to be 3 ways to call another Perl
script - the backticks that I used, 'system', and 'do' - are there any
real differences between them?

Ian

I would *always* use the system(LIST) form unless I really needed the
shell to do expansions, or the convenience of capturing output via
backticks.

Using system(LIST) without a shell avoids most quoting and injection
nasties, and avoids spaces-in-arguments problems. The only problem you
are left with is the command-line length limit, but that's quite large
on Linux anyway.
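Tim's recommendation, sketched: with system(LIST), each list element
arrives at the child as exactly one argument, spaces and quotes included
(the child command below is illustrative):

```perl
use strict;
use warnings;

# The child counts its arguments; both awkward strings arrive intact
# as single elements of @ARGV, so it prints 2.
my $rc = system('perl', '-le', 'print scalar @ARGV',
                'f1 val', 'he said "hi"');
die "child failed" if $rc != 0;
```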

Cheers

Tim
 
IanW

Tim Southerwood said:
I would *always* use the "system (@)" method unless I really needed the
shell to do expansions or the convenience of capturing output via
backticks.

I do quite often like to get the output. "do" seems to do this as well -
is there a difference between "do" and backticks? (I suppose "do" is
easier to spot than backticks when reviewing the script!)

Thanks
Ian
 
Tim Southerwood

IanW said:
I do quite often like to get the output. "do" seems to do this as well -
is there a difference between "do" and backticks? (I suppose "do" is
easier to spot than backticks when reviewing the script!)

Thanks
Ian

do() only works for executing lumps of perl in the same process as the
caller of do(), whereas system() spawns a subprocess and can execute any
binary that the underlying OS can.

They are quite different - though if you had been accustomed to calling a
second perl program, then do() will have worked for you, but possibly not
quite in the way that you thought.

Cheers

Tim
 
Gunnar Hjalmarsson

Also require(), use() and exec().

Yes. See "perldoc perlfunc".

Always system() to call another Perl script??
do() only works for executing lumps of perl in the same process as the
caller of do(),

Which makes it fit well for the OP's problem, don't you think?
 
Peter Wyzl

Tim Southerwood said:
[quoted discussion of Data::Dumper serialisation, piping, and
IPC::Shareable snipped]

The Storable module can also serialise data and retrieve it again, via its
freeze and thaw functions. Might be worth an investigation.

use strict;
use warnings;
use Storable qw(freeze thaw);

# Serialising to memory and back
my $serialized = freeze \%table;
my %table_clone = %{ thaw($serialized) };
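For two separate scripts, Storable's file interface (store/retrieve) may
suit better than in-memory freeze/thaw, since a file is the easiest channel
between processes; the filename below is illustrative:

```perl
use strict;
use warnings;
use Storable qw(store retrieve);

my %table = (field1 => 'f1val', field2 => 'f2val');

store \%table, 'data.stor';                   # caller writes the hash
my %table_clone = %{ retrieve 'data.stor' };  # callee reads it back
print $table_clone{field1};
unlink 'data.stor';                           # tidy up the temporary file
```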

P
 
Tim Southerwood

Gunnar said:
Also require(), use() and exec().


Yes. See "perldoc perlfunc".


Always system() to call another Perl script??

Quite frankly, yes, in the general case. Without knowledge of what the
callee script does, it is natural to expect to call it as a sub process
where it cannot interfere with the caller's environment in any way.

Also, the subject of this message is "Passing hash to another script via
commandline" which implies an exec() at some point. I would suggest reading
the original post again where the example code uses backticks.

I agree that there are special cases where it is fine to do() or eval() a
script - but then the OP's problem can be handled by letting the callee
script access the hash directly, since it is running in the same process,
provided the hash is declared "our".
Which makes it fit well for the OP's problem, don't you think?

With respect, no.

Personally, I would not run a foreign script via a do() except in very
special cases. I know there is lexical separation of the callee from the
caller, but not everything in the caller's space is protected from the
actions of the script (package globals, filehandles, signal handlers etc).
If the script were not foreign to the caller, then I would probably have
integrated its code into the caller (as a module or subroutine) anyway.

Perhaps the OP needs to clarify a bit more - the subject line is
inconsistent with where this discussion is going, too much is at
cross-purposes now.

Cheers

Tim
 
Joe Smith

Purl said:
Irrelevant. Your comments do not comply with the originating
author's stated parameters.

You're assuming that the author's stated parameters are the only
parameters that he is ever going to use. Bad assumption.

A competent programmer provides solutions that work in the general
case, not just the case with "field1 => 'f1val', field2 => 'f2val'".

-Joe
 
Tad McClellan

l v said:
Purl Gurl wrote:
Oh look, another mindless, senseless, and ignorant post from the Purl Gurl.



[ASCII art: an owl holding a sign reading "PLEASE DO NOT FEED THE TROLLS -
Thank you, Management"]
 