File name: baidu
- Category:
- Web source code
- Resource attributes:
- [Windows] [Visual C] [Basic/ASP] [Source code]
- Upload date:
- 2012-11-26
- File size:
- 19 KB
- Downloads:
- 0
- Contributor:
- 黄**
- Related links:
- None
- Download notes:
- Do not download with Thunder (Xunlei); if the download fails, retry. Retrying does not cost points!
Description -- all downloaded content comes from the Internet; please study and use it at your own discretion.
Function: records the crawl tracks of search-engine spiders. Visits within the past week are logged in full, ordered by crawl time; for visits older than a week, only the crawl time is kept. (There is no built-in way to delete records; if you need to remove them, delete them directly in the database.)
main.asp lets you see at a glance which page has been crawled the most, and robots.asp shows the spiders' crawl tracks over the past week.
Usage: just insert the tracking code into your site's ASP pages (see the sketch below). The whole spider-crawl analyzer consists of only four files, so it is very compact, haha!
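The original include code is not reproduced in this listing, so here is a minimal Classic ASP (VBScript) sketch of what such a logging include might look like. The file name spider_log.asp, the table Records, and the columns SpiderName, PageURL, and CrawlTime are all assumptions for illustration, not the real robots.mdb schema.

<%
' spider_log.asp -- hypothetical logging include (all names assumed).
' Detects a few well-known spiders by User-Agent and appends a row
' to the Access database shipped with the package.
Dim ua, spiderName, conn
ua = LCase(Request.ServerVariables("HTTP_USER_AGENT"))

spiderName = ""
If InStr(ua, "baiduspider") > 0 Then
    spiderName = "Baidu"
ElseIf InStr(ua, "googlebot") > 0 Then
    spiderName = "Google"
ElseIf InStr(ua, "yahoo! slurp") > 0 Then
    spiderName = "Yahoo"
End If

If spiderName <> "" Then
    Set conn = Server.CreateObject("ADODB.Connection")
    conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
              Server.MapPath("robots.mdb")
    ' The logged page is whatever ASP page included this file.
    conn.Execute "INSERT INTO Records (SpiderName, PageURL, CrawlTime) " & _
                 "VALUES ('" & spiderName & "', '" & _
                 Request.ServerVariables("SCRIPT_NAME") & "', Now())"
    conn.Close
    Set conn = Nothing
End If
%>

Each tracked page would then pull this in with a server-side include directive, e.g. <!--#include file="spider_log.asp"-->.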
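A "most crawled page" report like the one main.asp is described as producing could, under the same assumed schema, be generated with a simple GROUP BY query. This too is only a sketch, not the actual main.asp code.

<%
' Hypothetical report: crawl counts per page, most-crawled first.
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" & _
          Server.MapPath("robots.mdb")
Set rs = conn.Execute( _
    "SELECT PageURL, COUNT(*) AS Hits FROM Records " & _
    "GROUP BY PageURL ORDER BY COUNT(*) DESC")
Do While Not rs.EOF
    Response.Write rs("PageURL") & " : " & rs("Hits") & " crawls<br>"
    rs.MoveNext
Loop
rs.Close
conn.Close
%>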
Related searches: website asp
(Auto-generated by the system; you can review the file list before downloading.)
Download file list
#robots.mdb
index.asp
main.asp
robots_conn.asp
使用方法.txt (usage instructions)
说明.htm (description)