Quote:
The reason for asking is that currently I am banning certain sub-sect IPs through .htaccess. Sub-sect IPs are an issue with some European and Asian IPs - I have to ban 12.34.[5-6][0-9] in order to spare the IPs in the range 12.34.[1-4][0-9], because they may well be allocated to two different countries. I thought maybe Protector could do the same as .htaccess does, since the BAD IPs are now text-file based.
Protector is meant for banning automatically.
If you already know which IPs should be banned, it is better to write those addresses into .htaccess than into Protector.
.htaccess is much faster than Protector.
(Though Protector's ban is much faster than XOOPS's ban)
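For banning a sub-range like the one described above, plain `Deny from` prefixes are not enough, because they cannot express "third octet 50-69 only". One way to do it in .htaccess is with a `SetEnvIf` regex (a sketch for Apache 2.2-style directives; the 12.34.x.x range is just the example from the quote, not a real allocation):

```apache
# Mark only 12.34.50.* through 12.34.69.* as banned,
# sparing 12.34.10.* - 12.34.49.* in the same /16.
SetEnvIf Remote_Addr ^12\.34\.[5-6][0-9]\. banned_range
Order Allow,Deny
Allow from all
Deny from env=banned_range
```

Because Apache evaluates this before XOOPS (and Protector) ever run, the PHP stack is never loaded for a banned visitor, which is why it is so much faster.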
Quote:
Bad counts for Crawlers
The default is 30 - is this number per minute? I just checked one of my logs - one of the googlebots from a single IP was banned because of high load, even though Googlebots have been classified as safe crawlers. Should I increase this to 60?
You can set the period with "Watch time for high loadings (sec)" (default: 60).
But this sounds like a User-Agent problem rather than a problem with the number settings.
Which User-Agent is the googlebot?
And which patterns do you use for "Welcomed User-Agent"?
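A crawler from a "safe" company can still be banned if its exact User-Agent string does not match the welcomed pattern. A quick way to check is to test the pattern against the UA string from your log (a minimal sketch; the pattern and UA strings below are examples, not Protector's actual configuration):

```python
import re

# Hypothetical "Welcomed User-Agent" pattern - substitute your own setting.
welcomed_pattern = r"Googlebot|msnbot|Yahoo! Slurp"

user_agents = [
    # Standard Googlebot UA: contains "Googlebot", so it matches.
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    # AdSense crawler from Google: does NOT contain "Googlebot",
    # so it would not be welcomed by the pattern above.
    "Mediapartners-Google",
]

for ua in user_agents:
    welcomed = re.search(welcomed_pattern, ua) is not None
    print(f"{'welcomed' if welcomed else 'NOT welcomed'}: {ua}")
```

If the banned bot's UA turns out to be one of these variants, widening the pattern is usually the right fix rather than raising the bad count.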