ANALYSIS OF THE USAGE STATISTICS OF ROBOTS EXCLUSION STANDARD
The Robots Exclusion standard [4] is a de-facto standard used to inform crawlers, spiders, and other web robots about the sections of a web server they are not permitted to access. Since its inception in 1994, the standard has been used extensively. In this paper, we present the results of a statistical analysis of the usage of the Robots Exclusion standard. Based on the results obtained, we propose that organizations such as the W3C should adopt the Robots Exclusion standard and make it an official standard.
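For context, the standard is expressed as a plain-text robots.txt file served at a site's root, which crawlers voluntarily consult before fetching pages. The following minimal sketch is not part of the original paper; the policy, agent names, and URLs are hypothetical. It shows how a compliant crawler could check URLs against such a file using Python's standard-library urllib.robotparser:

    # Minimal sketch (not from the paper): checking URLs against a
    # hypothetical robots.txt policy with the standard-library parser.
    from urllib.robotparser import RobotFileParser

    # A typical robots.txt: bar one specific crawler from the whole site,
    # and disallow all other robots from the /private/ section.
    ROBOTS_TXT = """\
    User-agent: BadBot
    Disallow: /

    User-agent: *
    Disallow: /private/
    """

    parser = RobotFileParser()
    parser.parse(ROBOTS_TXT.splitlines())

    # can_fetch(agent, url) applies the standard's per-agent matching rules.
    print(parser.can_fetch("GoodBot", "http://example.com/index.html"))  # True
    print(parser.can_fetch("GoodBot", "http://example.com/private/x"))   # False
    print(parser.can_fetch("BadBot",  "http://example.com/index.html"))  # False

Here parse() consumes the file's lines and can_fetch() reports whether the named user agent may retrieve a given URL; because compliance is voluntary, such checks express a publisher's intent rather than enforced access control.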