configure 'time out' time for urllib

Andreas Dahl

Hi,

I use urllib to retrieve data via HTTP. Unfortunately my program crashes
after a while (after some loop iterations) because a connection times out:

raise socket.error, msg
IOError: [Errno socket error] (60, 'Connection timed out')

I am not very familiar with Python, but is there a way to configure the
'waiting time'? Or how can I handle such an event? Skipping the failing
query and going on to the next one would also work.

Many thanks in advance, Andreas

code:
params = urllib.urlencode({'rs': rs})
try:
    file = urllib.urlopen("http://www.ncbi.nlm.nih.gov/SNP/snp_ref.cgi?%s" % params)
except IOError, message: # file open failed
    print >> sys.stderr, "File could not be opened:", message
    sys.exit(1)
data = file.readlines() # list with the html-doc content
file.close()
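For the skip-and-continue fallback Andreas mentions, one option is to catch the error and return a sentinel instead of exiting. A minimal sketch in modern Python (3.x), where urllib.urlopen and urllib.urlencode have moved to urllib.request and urllib.parse; the `fetch_or_none` name and the `query_ids` loop are illustrative, not from the thread:

```python
import sys
from urllib.request import urlopen   # Python 2: urllib.urlopen
from urllib.parse import urlencode   # Python 2: urllib.urlencode

def fetch_or_none(url):
    """Return the page body as a list of lines, or None if the request fails."""
    try:
        f = urlopen(url)
    except OSError as message:       # URLError and socket errors are OSErrors in 3.x
        print("File could not be opened:", message, file=sys.stderr)
        return None
    data = f.readlines()             # list with the html-doc content
    f.close()
    return data

# Instead of sys.exit(1), skip the failing query and continue the loop
# (query_ids is a hypothetical list of rs identifiers):
# for rs in query_ids:
#     params = urlencode({'rs': rs})
#     data = fetch_or_none("http://www.ncbi.nlm.nih.gov/SNP/snp_ref.cgi?%s" % params)
#     if data is None:
#         continue  # this query failed; go on to the next one
```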
 
wes weston

Andreas,
I had a similar problem and coded in two tries; could be 3,4 etc.
I couldn't find a way to set the timeout.
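The "two tries; could be 3, 4 etc." approach wes describes can be coded as a small retry wrapper. A sketch, assuming modern Python's urllib.request; the `fetch_with_retries` name and `tries` parameter are illustrative:

```python
from urllib.request import urlopen   # Python 2: urllib.urlopen

def fetch_with_retries(url, tries=2):
    """Attempt the request up to `tries` times before giving up."""
    last_error = None
    for attempt in range(tries):
        try:
            return urlopen(url).read()
        except OSError as err:       # timeouts and connection errors
            last_error = err         # remember the error and try again
    raise last_error                 # all attempts failed
```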
 
wes weston

Andreas,
Follow the link in Pieter's reply. It explains that the method
changes the timeout for ALL connections. You won't need to rewrite
your code against the socket module directly. Thanks Pieter.
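The method in question is socket.setdefaulttimeout, which, as wes notes, changes the timeout for every socket the process creates afterwards, including the ones urllib opens internally. A minimal sketch (the 15-second value is an arbitrary example):

```python
import socket

# One call, process-wide effect: every socket created after this point
# (including those urllib opens internally) uses a 15-second timeout.
socket.setdefaulttimeout(15)

s = socket.socket()
print(s.gettimeout())   # new sockets inherit the default
s.close()
```

Because the setting is global, it also affects unrelated network code in the same process; set it once at program start rather than per request.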
wes
 
