popen pipe limit


skunkwerk

I'm getting errors when reading from/writing to pipes that are fairly
large. To work around this, I wanted to redirect output to a file
in the subprocess.Popen call, but couldn't get it to work (even
after setting shell=True). I tried adding ">","temp.sql" after the
password field but mysqldump gave me an error.

the code:
p1 = subprocess.Popen(["mysqldump","--all-databases","--user=user","--
password=password"], shell=True)
p2 = subprocess.Popen(["gzip","-9"], stdin=p1.stdout)
output = p2.communicate()[0]
file=open('test.sql.gz','w')
file.write(str(output))
file.close()

the output:
gzip: compressed data not written to a terminal. Use -f to force
compression.
For help, type: gzip -h
mysqldump: Got errno 32 on write

I'm using python rather than a shell script for this because I need to
upload the resulting file to a server as soon as it's done.

thanks
 

Gabriel Genellina

You need a pipe to chain subprocesses. Also drop shell=True: combined
with an argument list it doesn't do what you expect (the extra list items
become arguments to the shell itself, not to mysqldump), and ">"
redirection only works inside a single shell command string anyway:

import subprocess
p1 = subprocess.Popen(
    ["mysqldump", "--all-databases", "--user=user", "--password=password"],
    stdout=subprocess.PIPE)
ofile = open("test.sql.gz", "wb")
p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout, stdout=ofile)
p1.wait()
p2.wait()
ofile.close()
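
One refinement, borrowed from the pipeline example in the subprocess
documentation: after starting gzip, close the parent's copy of p1.stdout
so that mysqldump receives SIGPIPE (instead of blocking) if gzip exits
early:

p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout, stdout=ofile)
p1.stdout.close()  # let mysqldump get SIGPIPE if gzip exits first
p1.wait()
p2.wait()
ofile.close()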

If you don't want the final file on disk:

p1 = subprocess.Popen(
    ["mysqldump", "--all-databases", "--user=user", "--password=password"],
    stdout=subprocess.PIPE)
p2 = subprocess.Popen(["gzip", "-9"], stdin=p1.stdout,
                      stdout=subprocess.PIPE)
while True:
    chunk = p2.stdout.read(4192)
    if not chunk:
        break
    # do something with the read chunk

p1.wait()
p2.wait()
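
Since you said you need to upload the result as soon as it's done:
p2.stdout is an ordinary file object, so instead of the read loop you
can hand it straight to an upload routine. A minimal sketch, assuming
an FTP server (the host, credentials, and remote filename below are
placeholders, not anything from your setup):

import ftplib

ftp = ftplib.FTP("ftp.example.com")   # placeholder host
ftp.login("user", "password")         # placeholder credentials
# storbinary() reads the pipe in blocks and streams them to the server
ftp.storbinary("STOR test.sql.gz", p2.stdout)
ftp.quit()
p1.wait()
p2.wait()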
 

skunkwerk

thanks Gabriel - tried the first one and it worked great!
 
