Python "robots.txt" parser broken since 2003


John Nagle

This bug, "[ 813986 ] robotparser interactively prompts for username and
password", has been open since 2003. It killed a big batch job of ours
last night.

Module "robotparser" naively uses "urlopen" to read "robots.txt" URLs.
If the server asks for basic authentication on that file, "robotparser"
prompts for the password on standard input, which is rarely what you
want. You can demonstrate this with:

import robotparser
url = 'http://mueblesmoraleda.com' # this site is password-protected.
parser = robotparser.RobotFileParser()
parser.set_url(url)
parser.read() # Prompts for password

That's the standard, although silly, "urllib" behavior.

This was reported in 2003, and a patch was uploaded in 2005, but the patch
never made it into Python 2.4 or 2.5.

A temporary workaround is this:

import robotparser

def prompt_user_passwd(self, host, realm):
    # Never prompt; act as if no credentials are available.
    return None, None

robotparser.URLopener.prompt_user_passwd = prompt_user_passwd  # temp patch
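Another way to sidestep the prompt entirely is to fetch "robots.txt" yourself, so you control authentication and error handling, and hand the text to the parser via parse() instead of read(). A minimal sketch (in later Python versions the module lives at urllib.robotparser; the idea is the same, and the rules text here is just an example):

```python
try:
    from urllib import robotparser  # Python 3 location
except ImportError:
    import robotparser  # Python 2 name

# Text you fetched yourself (here hard-coded for illustration).
# parse() never touches the network, so no password prompt is possible.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""

parser = robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("MyBot", "http://example.com/private/page.html"))  # False
print(parser.can_fetch("MyBot", "http://example.com/public/page.html"))   # True
```

This keeps the stock parser but moves the fetch, and any decision about 401s, into your own code.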


John Nagle
 

Terry Reedy

| This was reported in 2003, and a patch was uploaded in 2005, but the
| patch never made it into Python 2.4 or 2.5.

If the patch is still open, perhaps you could review it.

tjr
 

John Nagle

Terry said:
| This was reported in 2003, and a patch was uploaded in 2005, but the
| patch never made it into Python 2.4 or 2.5.
|
| If the patch is still open, perhaps you could review it.

I tried it on Python 2.4 and it's in our production system now.
But someone who regularly does check-ins should do this.

John Nagle
 

Steven Bethard

John said:
| I tried it on Python 2.4 and it's in our production system now.
| But someone who regularly does check-ins should do this.

If you post such a review (even just the short sentence above) to the
patch tracker, it often increases the chance of someone committing the
patch.

Steve
 

Nikita the Spider

John Nagle said:
This bug, "[ 813986 ] robotparser interactively prompts for username and
password", has been open since 2003. It killed a big batch job of ours
last night.

Module "robotparser" naively uses "urlopen" to read "robots.txt" URLs.
If the server asks for basic authentication on that file, "robotparser"
prompts for the password on standard input, which is rarely what you
want. You can demonstrate this with:

import robotparser
url = 'http://mueblesmoraleda.com' # this site is password-protected.
parser = robotparser.RobotFileParser()
parser.set_url(url)
parser.read() # Prompts for password

That's the standard, although silly, "urllib" behavior.

John,
robotparser is (IMO) suboptimal in a few other ways, too.
- It doesn't handle non-ASCII characters. (They're infrequent, but when
writing a spider which sees thousands of robots.txt files in a short
time, "infrequent" can become "daily".)
- It doesn't account for BOMs in robots.txt (which are rare).
- It ignores any Expires header sent with the robots.txt file.
- It silently handles some ambiguous return codes (e.g. 503) that it
ought to pass up to the caller.
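For the BOM and non-ASCII points, one way to tolerate them while keeping the stock parser is to strip a leading UTF-8 BOM and decode permissively before calling parse(). A sketch only, not how robotparser itself works; the module import is the Python 3 name, and the helper function is my own invention:

```python
import codecs
from urllib import robotparser  # plain "robotparser" under Python 2

def parse_robots_bytes(raw):
    """Parse raw robots.txt bytes, tolerating a UTF-8 BOM and
    non-ASCII bytes that would otherwise trip up the parser."""
    if raw.startswith(codecs.BOM_UTF8):
        raw = raw[len(codecs.BOM_UTF8):]
    # robots.txt carries no declared charset, so replace
    # undecodable bytes rather than raise.
    text = raw.decode("utf-8", "replace")
    parser = robotparser.RobotFileParser()
    parser.parse(text.splitlines())
    return parser

p = parse_robots_bytes(codecs.BOM_UTF8 + b"User-agent: *\nDisallow: /tmp/\n")
print(p.can_fetch("MyBot", "http://example.com/tmp/x"))  # False
```

Without the BOM stripping, the first line reads as "\ufeffUser-agent: *" and the whole record is silently ignored.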

I wrote my own parser to address these problems. It probably suffers
from the same urllib hang that you've found (I have not encountered it
myself) and I appreciate you posting a fix. Here's the code &
documentation in case you're interested:
http://NikitaTheSpider.com/python/rerp/

Cheers
 

John Nagle

Steven said:
If you post such a review (even just the short sentence above) to the
patch tracker, it often increases the chance of someone committing the
patch.

Steve

OK, updated the tracker comments.

John Nagle
 
