Almost Have Threads Working for HTTP Scan


Bart Nessux

This is almost working. I've read up on queues and threads and learned a
lot; still not enough to fully understand what I'm doing, but I'm getting
there. Much of this script was copied straight from this example:

http://aspn.activestate.com/ASPN/Cookbook/Python/Recipe/284631

My goal is to scan a big private network (65,000 hosts) for HTTP
servers, and to make a list of the IPs that are running servers and a list
of those that are not. I need to use threads to speed up the process. I can
do it sequentially, but it takes 2 days!

Everything in the script works except that it just hangs and never
produces the two result lists. Could a threading guru please offer some
advice?

Thanks... Bart

import urllib
import socket
import time
import Queue
import threading

######################
#  Network Section   #
######################

urls = []
x = 0
while x < 255:
    x = x + 1
    urls.append('http://192.168.1.' + str(x))

######################
#   Queue Section    #
######################

url_queue = Queue.Queue(65536)
for url in urls:
    url_queue.put(url)

######################
#   Thread Section   #
######################

def test_http(url_queue, result_queue):

    def sub_thread_proc(url, result):
        try:
            data = urllib.urlopen(url).read()
        except Exception:
            result.append(-1)
        else:
            result.append(1)

    while 1:
        try:
            url = url_queue.get()
            size = url_queue.qsize()
            print size
        except Queue.Empty:
            return
        print "Finished"
        result = []
        sub_thread = threading.Thread(target=sub_thread_proc,
                                      args=(url, result))
        sub_thread.setDaemon(True)
        sub_thread.start()
        sub_thread.join(HTTP_TIMEOUT)
        if [] == result:
            result_queue.put((url, "TIMEOUT"))
        elif -1 == result[0]:
            result_queue.put((url, "FAILED"))
        else:
            result_queue.put((url, result[0]))

HTTP_TIMEOUT = 20
workers = []
result_queue = Queue.Queue()

for thread_num in range(0, 64):
    workers.append(threading.Thread(target=test_http,
                                    args=(url_queue, result_queue)))
    workers[-1].start()
for w in workers:
    w.join()

web_servers = []
failures = []

while not result_queue.empty():
    url, result = result_queue.get(0)
    if isinstance(result, str):
        failures.append((result, url))
    else:
        web_servers.append((result, url))

web_servers.sort()
failures.sort()

for result, url in web_servers:
    print "%7d %s" % (result, url)
for result, url in failures:
    print "%7s %s" % (result, url)

#############
#    END    #
#############
 

Eddie Corns

Some observations:

Since every worker thread creates another thread for each host, you will
end up with 65,535 threads all active; that seems like overkill to me.

On closer inspection it's going to be massively skewed towards thread 1, since
it could simply empty the entire url_queue before the others get started.

I presume the network section isn't finished since it's only actually scanning
255 addresses.

Wouldn't it be enough to just try to connect to ports 80, 8080, etc., using
just a socket?

Why not use separate queues for failures and successes? (Not sure what the
failures give you anyway.)

As for it hanging, I'm guessing most of the threads are sitting on
url = url_queue.get()
since the Queue.Empty exception will never be raised by a blocking get().
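
The standard fix for that (a sketch, not something posted in the thread) is to pull
from the queue without blocking, so that Queue.Empty really is raised once the work
runs out; the URL list and worker count below are placeholders:

import Queue
import threading

def worker(url_queue, result_queue):
    while True:
        try:
            # get_nowait() (equivalently get(block=False)) raises Queue.Empty
            # once the queue is drained, unlike the blocking get() above.
            url = url_queue.get_nowait()
        except Queue.Empty:
            return
        result_queue.put((url, len(url)))  # placeholder for the real probe

url_queue = Queue.Queue()
for n in range(1, 255):
    url_queue.put('http://192.168.1.%d' % n)

result_queue = Queue.Queue()
workers = [threading.Thread(target=worker, args=(url_queue, result_queue))
           for dummy in range(8)]
for w in workers:
    w.start()
for w in workers:
    w.join()
print result_queue.qsize(), 'results collected'

The only catch is that the queue must be fully loaded before the workers start,
which is already the case in the script above.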

How about a simpler approach of creating 256 threads (one per subnet), each of
which scans its own subnet sequentially?

def test_http(subnet):
    for i in range(256):
        ip = '192.168.%d.%d' % (subnet, i)
        x = probe(ip)    # returns one of 'timeout', 'ok', 'fail'
        Qs[x].put(ip)

Qs = {'timeout': Queue.Queue(), 'ok': Queue.Queue(), 'fail': Queue.Queue()}
for subnet in range(256):
    workers.append(threading.Thread(target=test_http, args=(subnet,)))
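
Fleshed out a little (this is only a sketch of Eddie's suggestion, not code from
the thread), the per-subnet idea with a plain socket connect on port 80 might look
something like the following; probe(), the five-second timeout and the single port
are assumptions:

import Queue
import socket
import threading

def probe(ip, port=80, timeout=5.0):
    # Plain TCP connect; classify the outcome as 'ok', 'timeout' or 'fail'.
    s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    s.settimeout(timeout)
    try:
        try:
            s.connect((ip, port))
            return 'ok'
        except socket.timeout:
            return 'timeout'
        except socket.error:
            return 'fail'
    finally:
        s.close()

def test_http(subnet):
    # One thread walks one /24 sequentially and files each IP by outcome.
    for i in range(256):
        ip = '192.168.%d.%d' % (subnet, i)
        Qs[probe(ip)].put(ip)

Qs = {'timeout': Queue.Queue(), 'ok': Queue.Queue(), 'fail': Queue.Queue()}
workers = []
for subnet in range(256):
    w = threading.Thread(target=test_http, args=(subnet,))
    workers.append(w)
    w.start()
for w in workers:
    w.join()
print Qs['ok'].qsize(), 'hosts answered on port 80'

Each thread only scans 256 addresses sequentially, so the worst case per thread is
256 connect timeouts, and all 256 subnets run in parallel.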
 

Bart Nessux

Eddie said:
> Some observations:
>
> Since every worker thread creates another thread for each host, you will
> end up with 65,535 threads all active; that seems like overkill to me.
>
> On closer inspection it's going to be massively skewed towards thread 1, since
> it could simply empty the entire url_queue before the others get started.
>
> I presume the network section isn't finished since it's only actually scanning
> 255 addresses.

Yes, I'm only testing one subnet as it's quicker. It takes about an hour
to do 65,536 urls.
> Wouldn't it be enough to just try to connect to ports 80, 8080, etc., using
> just a socket?

I guess it would, but isn't that the same as urllib trying to read a URL?
> Why not use separate queues for failures and successes? (Not sure what the
> failures give you anyway.)

This is a good point. I could probably just have a queue for the URLs that
urllib can read.
> How about a simpler approach of creating 256 threads (one per subnet), each of
> which scans its own subnet sequentially?
>
> def test_http(subnet):
>     for i in range(256):
>         ip = '192.168.%d.%d' % (subnet, i)
>         x = probe(ip)    # returns one of 'timeout', 'ok', 'fail'
>         Qs[x].put(ip)
>
> Qs = {'timeout': Queue.Queue(), 'ok': Queue.Queue(), 'fail': Queue.Queue()}
> for subnet in range(256):
>     workers.append(threading.Thread(target=test_http, args=(subnet,)))

I like the idea of having one thread per subnet, but I don't really
understand your example. Sorry, I'm having trouble with threads; I can't
get my head around them just yet. I'll probably stick with my earlier
script (I can use netstat to watch it make the SYN requests) so I know
that it's working. I just can't figure out how to make it write out the
results.

Thanks for the tips,

Bart
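
On the writing-out question, one possibility (again just a sketch, not from the
thread; the file names and sample entries are made up, but the (url, result) tuples
match what the earlier script puts on result_queue) is to drain the queue into two
files once the workers have been joined:

import Queue

# Hypothetical results in the same (url, result) shape the earlier script collects:
# a string for timeouts/failures, an integer for successful reads.
result_queue = Queue.Queue()
result_queue.put(('http://192.168.1.10', 2048))
result_queue.put(('http://192.168.1.11', 'TIMEOUT'))
result_queue.put(('http://192.168.1.12', 'FAILED'))

web_servers = open('web_servers.txt', 'w')
failures = open('failures.txt', 'w')
while not result_queue.empty():
    url, result = result_queue.get_nowait()
    if isinstance(result, str):
        failures.write('%s %s\n' % (result, url))
    else:
        web_servers.write('%d %s\n' % (result, url))
web_servers.close()
failures.close()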
 

Pierre-Frédéric Caillaud

Sometimes doing it yourself is nice, but I'd suggest you use nmap, which
is a very powerful port scanner. It is designed to do such jobs very, very
fast. All you have to do then is parse its output.

get it from http://www.insecure.org
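
For what it's worth, a rough sketch of driving nmap from Python and parsing its
grepable output; the flags, address range and single port here are assumptions
rather than anything given in the thread:

import os

# Scan a /16 for port 80 and collect the hosts reporting the port open.
# -oG - writes nmap's grepable format to stdout.
cmd = 'nmap -p 80 -oG - 192.168.0.0/16'
open_hosts = []
for line in os.popen(cmd):
    # Grepable lines look like:
    # Host: 192.168.1.10 ()  Ports: 80/open/tcp//http///
    if 'Ports:' in line and '/open/' in line:
        open_hosts.append(line.split()[1])

print len(open_hosts), 'hosts with port 80 open'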
 
