Leo
Dear all,
I'm writing code to run on a cluster machine. Basically, I need to run
a Perl program 200 times, with each run generating one line of output.
I use a C-shell script to call the Perl program and distribute it
through Sun Grid Engine (SGE) to the nodes of the cluster (because SGE
only accepts shell scripts). I have two implementations: (1) write the
output of each run to a different file; (2) append the output to a
single shared file. Obviously the first approach is not efficient, but
with the second one I get fewer than 200 results (120-180 output
lines). My questions are: (1) Is this a problem with concurrent writes
to the file? If so, how does Perl handle this issue? (As in C, does it
mean I have to write my own code to deal with the situation?) (2) Or
does SGE have some way to prevent simultaneous writes to the same
file? I hope someone can explain this to me; I would appreciate your
help!
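For reference, here is a minimal sketch of what I had in mind for the second implementation, using Perl's flock to take an exclusive advisory lock before appending (the filename and the output line are placeholders, and I'm assuming advisory locking actually works on the cluster's filesystem - it may not over NFS):

```perl
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock :seek);

# Hypothetical shared output file; each of the 200 runs appends one line.
my $outfile = 'results.txt';

open(my $fh, '>>', $outfile) or die "Cannot open $outfile: $!";
flock($fh, LOCK_EX)          or die "Cannot lock $outfile: $!";
seek($fh, 0, SEEK_END);  # re-seek to end of file after acquiring the lock

# Placeholder output line; SGE_TASK_ID is set when submitted as an array job.
my $id = $ENV{SGE_TASK_ID} // 'local';
print $fh "result line from job $id\n";

flock($fh, LOCK_UN);
close($fh) or die "Cannot close $outfile: $!";
```

Without the lock, two nodes appending at the same moment can interleave or clobber each other's writes, which would explain lines going missing.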
Thanks.
Leo