Does anyone know of a program that can crawl a website and tell what
files are not used any more?
Obviously, just by crawling the site (i.e. following links) you can only tell
which files ARE in use (disregarding files that may be dynamically
referenced by scripts).
Try Xenu. Its primary intended use is checking for broken links, but it
can also crawl a website, then crawl the server using FTP, and finally
compare the two structures to find redundant files. You can easily get,
install and configure a simple free FTP server just for this purpose (not
as much work as it sounds).
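The comparison Xenu does boils down to a set difference: everything on the server minus everything the crawl can reach. A minimal sketch of that idea in Python (the page contents and file names here are made-up examples, not anything Xenu-specific):

```python
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Collects href/src targets from HTML documents."""
    def __init__(self):
        super().__init__()
        self.refs = set()

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in ("href", "src") and value:
                self.refs.add(value.lstrip("/"))

def find_unused(pages, server_files):
    """pages: dict of path -> HTML source (the crawl result).
    server_files: iterable of paths (the FTP listing).
    Returns server files never referenced by any crawled page."""
    collector = LinkCollector()
    for html in pages.values():
        collector.feed(html)
    # The crawled pages themselves count as "in use" too.
    referenced = collector.refs | set(pages)
    return sorted(set(server_files) - referenced)

# Example: old_logo.gif sits on the server but nothing links to it.
pages = {
    "index.html": '<a href="about.html">About</a> <img src="logo.png">',
    "about.html": '<a href="/index.html">Home</a>',
}
server = ["index.html", "about.html", "logo.png", "old_logo.gif"]
print(find_unused(pages, server))  # -> ['old_logo.gif']
```

Note this has the same blind spot mentioned above: files referenced only from scripts (or from pages the crawler never reaches) will wrongly show up as unused.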
The servers are running IIS.
Servers, plural? That may be less convenient... I don't know how Xenu
handles that; play with it.