Here's what I am looking for. I don't care if you use existing open-source tools as a starting point. I already have code for checking PageRank if need be.
The program must do the following:
1) Must have a low CPU/memory footprint.
2) Must be fully multi-threaded/multitasking.
3) Able to spawn as many processes/tasks as my machine can handle.
4) Be able to specify how many threads/connections to spawn at once and how many per domain.
5) Take a text file list of domains as a starting [login to view URL] can be either a URL like <[login to view URL]> or [login to view URL] etc.
6) Follow all links/JavaScript redirects, looking for domains that don't exist.
7) If such a domain is found, keep track of the referring page's PageRank and the PageRank of the domain found.
8) Crawling must support setting the browser type (User-Agent) and the last referred page (Referer) when following links.
9) Full support for encoded/JavaScript URLs.
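To make requirements 3, 4, 6, and 8 concrete for bidders, here is a minimal sketch of the intended concurrency and crawling model: a global cap on simultaneous connections plus a per-domain cap, custom User-Agent/Referer headers on each fetch, and a DNS check to flag domains that no longer resolve. This is illustrative only, assuming Python and the standard library; the names (`CrawlLimiter`, `domain_of`, `domain_resolves`, `fetch`) are hypothetical, not part of the original request, and the final program may use any language or tooling.

```python
# Illustrative sketch only -- names and structure are assumptions,
# not the required design.
import socket
import threading
import urllib.request
from urllib.parse import urlparse


def domain_of(url: str) -> str:
    """Normalise a full URL or a bare hostname to its domain."""
    parsed = urlparse(url if "://" in url else "http://" + url)
    return parsed.hostname or ""


class CrawlLimiter:
    """Requirement 4: cap total in-flight fetches AND fetches per domain."""

    def __init__(self, total: int, per_domain: int):
        self._global = threading.Semaphore(total)
        self._per_domain_cap = per_domain
        self._domains: dict[str, threading.Semaphore] = {}
        self._lock = threading.Lock()

    def _sem(self, domain: str) -> threading.Semaphore:
        with self._lock:
            if domain not in self._domains:
                self._domains[domain] = threading.Semaphore(self._per_domain_cap)
            return self._domains[domain]

    def acquire(self, url: str) -> None:
        self._global.acquire()               # global connection cap
        self._sem(domain_of(url)).acquire()  # per-domain cap

    def release(self, url: str) -> None:
        self._sem(domain_of(url)).release()
        self._global.release()


def domain_resolves(domain: str) -> bool:
    """Requirement 6: a domain that fails DNS lookup is a candidate hit."""
    try:
        socket.getaddrinfo(domain, 80)
        return True
    except socket.gaierror:
        return False


def fetch(url: str, referer: str, user_agent: str = "Mozilla/5.0") -> bytes:
    """Requirement 8: send a configurable browser type and referring page."""
    req = urllib.request.Request(
        url, headers={"User-Agent": user_agent, "Referer": referer}
    )
    return urllib.request.urlopen(req, timeout=10).read()
```

Worker threads (requirement 3: as many as the machine can handle) would call `limiter.acquire(url)` before `fetch` and `limiter.release(url)` after, and run `domain_resolves` on every domain extracted from links and redirects.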
## Deliverables
1) Complete and fully-functional working program(s) in executable form as well as complete source code of all work done.
2) Deliverables must be in ready-to-run condition, as follows (depending on the nature of the deliverables):
a) For web sites or other server-side deliverables intended to only ever exist in one place in the Buyer's environment: deliverables must be installed by the Seller in ready-to-run condition in the Buyer's environment.
b) For all others including desktop software or software the buyer intends to distribute: A software installation package that will install the software in ready-to-run condition on the platform(s) specified in this bid request.
3) All deliverables will be considered "work made for hire" under U.S. Copyright law. Buyer will receive exclusive and complete copyrights to all work purchased. (No GPL, GNU, 3rd party components, etc. unless all copyright ramifications are explained AND AGREED TO by the buyer on the site per the coder's Seller Legal Agreement).
## Platform
Must run on Linux/FreeBSD.