In this paper, we introduce a probabilistic modeling approach to the problem of Web robot detection from Web-server access logs. More specifically, we construct a Bayesian network that automatically classifies access-log sessions as crawler- or human-induced by combining various pieces of evidence known to characterize crawler and human behavior. Our approach uses machine learning techniques to determine the parameters of the probabilistic model. We apply our method to real Web-server logs and obtain results that demonstrate the robustness and effectiveness of probabilistic reasoning for crawler detection.
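To illustrate the kind of probabilistic session classification the abstract describes, the following is a minimal sketch of a Bayesian classifier over per-session log evidence. The feature set, conditional probabilities, and prior are all hypothetical placeholders, not the paper's actual model or learned parameters; it uses a naive-Bayes factorization (conditionally independent evidence) purely for illustration.

```python
import math

# Hypothetical evidence variables extracted from an access-log session.
# The paper's real evidence set and learned parameters are not shown here;
# these names and numbers are illustrative assumptions only.
FEATURES = ["requested_robots_txt", "used_head_requests", "empty_referrer"]

# P(feature = True | class) -- assumed values, not learned from data.
P_GIVEN_CRAWLER = {"requested_robots_txt": 0.90,
                   "used_head_requests": 0.60,
                   "empty_referrer": 0.80}
P_GIVEN_HUMAN = {"requested_robots_txt": 0.01,
                 "used_head_requests": 0.05,
                 "empty_referrer": 0.20}
PRIOR_CRAWLER = 0.3  # assumed prior probability of a crawler session


def crawler_probability(session):
    """Posterior P(crawler | evidence), combining each observed or
    absent feature under a naive-Bayes independence assumption."""
    log_c = math.log(PRIOR_CRAWLER)
    log_h = math.log(1.0 - PRIOR_CRAWLER)
    for f in FEATURES:
        pc, ph = P_GIVEN_CRAWLER[f], P_GIVEN_HUMAN[f]
        if session.get(f, False):
            log_c += math.log(pc)
            log_h += math.log(ph)
        else:
            log_c += math.log(1.0 - pc)
            log_h += math.log(1.0 - ph)
    # Normalize the two class scores into a posterior probability.
    return 1.0 / (1.0 + math.exp(log_h - log_c))


session = {"requested_robots_txt": True, "empty_referrer": True}
print(round(crawler_probability(session), 3))  # ≈ 0.985
```

A full Bayesian network would additionally encode dependencies among the evidence variables rather than assuming independence; the sketch above shows only the general shape of evidence combination and posterior inference.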
|Title of host publication||International Conference on Internet Surveillance and Protection, ICISP'06|
|Publication status||Published - 2006|
|Event||International Conference on Internet Surveillance and Protection, ICISP'06 - Côte d'Azur, France|
Duration: 26 Aug 2006 → 28 Aug 2006