Grub To Do List
Last Updated 03/22/02

Client Stuff
------------
 * get it working on BSD again (threading problems)
 * finish GUI for Windows and get it linked with crawler
 * bandwidth limiting on return to server
 * bandwidth limiting by time (could be hacked via cron)
 * GTK interface/GUI for X11
 * add local database indexing
 * enable scheduled run times
 * MD5 sums for sets of URLs
 * improve lockfile intelligence
 * sanity checking on bandwidth limits
 * Mbps or Kbps units for bandwidth limits, as appropriate
 * completion bar for sending back to the server
 * let configure print where default directories are
 * client sleep indicator
 * honor the meta robots tag
 * add keyboard controls for gui
 * make a standalone gui to be started and stopped
   as needed, connecting to an existing crawler
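The scheduled run times item above (and the cron hack noted for time-based bandwidth limiting) amounts to checking whether the current time falls inside an allowed crawl window. A minimal sketch, assuming a hypothetical `in_run_window` helper that is not part of the Grub client:

```python
# Sketch of a time-of-day run window check for scheduled crawl times.
# Handles windows that wrap past midnight (e.g. 22:00-06:00 overnight).
from datetime import time

def in_run_window(now: time, start: time, end: time) -> bool:
    """Return True if 'now' falls inside the allowed crawl window."""
    if start <= end:
        # Window contained in a single day, e.g. 09:00-17:00.
        return start <= now < end
    # Window wraps past midnight, e.g. 22:00-06:00.
    return now >= start or now < end
```

The client's main loop could sleep (and show the sleep indicator) whenever this returns False.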

Server Stuff
------------
 * enable scheduling by:
    - file size 
    - mime type
    - extension
    - update rate
 * interface into mnogosearch
 * get xml export of database working
 * improve stats for clients
 * make use of URL status information
 * logging for the scheduler
 * filter for porn and other objectionable content
 * unschedule certain URLs based on a substring match
 * implement redundant crawling for security
 * MD5 sums for sets of URLs
 * honor the meta robots tag
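The MD5-sums item, listed for both client and server, suggests an order-independent checksum so both sides can verify they agree on a batch of URLs. A sketch under that assumption (the `urlset_md5` helper is illustrative, not the actual Grub protocol):

```python
# Sketch: order-independent MD5 checksum over a set of URLs.
# Sorting and de-duplicating first makes the digest canonical,
# so client and server compute the same sum for the same set.
import hashlib

def urlset_md5(urls):
    """Return a hex MD5 over the sorted, newline-joined URL set."""
    canonical = "\n".join(sorted(set(urls)))
    return hashlib.md5(canonical.encode("utf-8")).hexdigest()
```

Joining with a fixed separator before hashing avoids ambiguity between, say, two short URLs and one long concatenated one.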

Website Stuff
-------------
 * improve client stats
 * add graph of current users
 * improve overall stats
 * add interface for controlling local crawl URLs
 * email reminder when a site updates 

Done Stuff (see ChangeLog for more info)
----------
 * add daily ranking
 * port to windows via Cygwin
 * remove tcp4u/ and win32/ directories from CVS
 * enable restarting of client upon crash
 * allow submission of URLs from website
 * make it work with BSD
 * enable version printing
 * robots.txt fetching and honoring
 * enabled URL randomization function
 * got mnogosearch running
 * enable less verbose mode w/stats
 * added text/curses gui interface
 * make move to cURL library
 * create new verbose mode that gives better info 
 * add a web interface like larbin (did gui instead)
 * enable email and/or clientid login on website
 * identify client as a grub client to servers
 * clean up output/interface
 * added nice, clean ctrl-c shutdown
 * bandwidth limiting on cURL routines
 * check if it compiles ok on gcc 3.0
 * per run limits on bandwidth
 * on kill, finish crawling first
 * unlink archive, save to one large file before sending 
 * improve gui display refresh
 * new version available flag 
 * added 'q' exit key for GUI
 * fix arch_dir so directories are locked in it
 * local crawling implementation
 * support for a proxy
 * fix small bad session bugs (client side)
 * client authentication (server side)
 * fix bad session bug (server side)
