First, as others have mentioned, I really appreciate your work on this Protector module, GIJOE.
I installed it yesterday, and now I'm seeing the following records in the Protect Center log:
2005/1/6 23:53:26 Guests 144.41.125.69
IE 6.0 CRAWLER
2005/1/6 23:53:00 Guests 144.41.125.69
IE 6.0 DoS
repeatedly...
My questions are:
1) Crawlers should be fine, right? Why do we block them from crawling the site?
2) In 12 hours I've already got 2 pages of log entries. Should I clean the log up manually on a regular basis, or will it be deleted automatically once it reaches some limit?
Thanks in advance.
Have a look at the "Watch time for high loadings (sec)" and "Bad counts for Crawlers" values in the preferences section of Protector; you may need to increase them. Have a look at this page for more details on that IP.
A crawler is usually fine, but it can also be too heavy on your resources and become a problem. That's why you'll need to tweak the settings mentioned above.
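To make it concrete how those two settings interact, here is a rough sketch of the idea: count each request from an IP inside the watch-time window, and flag the client once the count passes the bad-count threshold. This is only an illustration in Python; the names, defaults, and logic are mine, not Protector's actual PHP code.

```python
import time
from collections import defaultdict, deque

# Illustrative values mirroring the two Protector preferences discussed above.
WATCH_TIME = 60   # "Watch time for high loadings (sec)"
BAD_COUNT = 10    # "Bad counts for Crawlers"

_hits = defaultdict(deque)  # ip -> timestamps of recent requests


def is_too_busy(ip, now=None):
    """Return True once `ip` has made more than BAD_COUNT requests
    within the last WATCH_TIME seconds (hypothetical helper)."""
    now = time.time() if now is None else now
    window = _hits[ip]
    window.append(now)
    # Drop timestamps that have fallen out of the watch window.
    while window and now - window[0] > WATCH_TIME:
        window.popleft()
    return len(window) > BAD_COUNT
```

Under this model, a crawler making 11 requests inside one minute trips the flag, while the same 11 requests spread 30 seconds apart never do; raising WATCH_TIME or BAD_COUNT loosens the check accordingly.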
2 pages of logs! You either have your thresholds set way too tight, or someone finds your site REALLY interesting.
Thanks, brash, for the feedback.
I have used the standard watch time and bad count: 60 s and 10. Should those be sufficient in most cases?
I'm lost: how can I determine whether it is a true crawler and not a real DoS? There are basically two IPs in the log (always a pair, crawler and DoS). When I check through the link you provided, one is from Universitaet Hohenheim in Stuttgart, Germany, and the other is from iana.org. How can I tell it is a true crawler and not a DoS?
I don't want to block the IPs, or some members of my website won't be able to access it anymore.
I appreciate your help, thanks (and sorry for the stupid questions).
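On telling a real crawler from a spoofed one: a common verification technique, separate from anything Protector does for you, is forward-confirmed reverse DNS. Reverse-resolve the IP, check that the hostname belongs to the crawler's domain, then resolve that hostname forward and confirm it points back at the same IP. A minimal Python sketch (the function name and the example domain suffixes are mine, purely for illustration):

```python
import socket


def looks_like_crawler(ip, allowed_suffixes=(".googlebot.com", ".google.com")):
    """Forward-confirmed reverse DNS check (hypothetical helper):
    the IP must reverse-resolve to a hostname under an allowed domain,
    and that hostname must resolve forward to the same IP."""
    try:
        host, _, _ = socket.gethostbyaddr(ip)
    except OSError:
        return False  # no reverse DNS record: not a well-behaved crawler
    if not host.endswith(allowed_suffixes):
        return False  # claims to be a crawler, but the domain doesn't match
    try:
        # Forward-confirm: the hostname must map back to the original IP.
        return ip in socket.gethostbyname_ex(host)[2]
    except OSError:
        return False
```

An IP that fails this check while hammering your pages is much more likely a misbehaving client than a legitimate crawler, so blocking it shouldn't lock out real members.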