hi tl.
Quote:
I assume Bad IPs are not conflicting with XOOPS's own (?)
They are independent of each other.
Protector's bad-IP check is evaluated before XOOPS's own,
because Protector's bad-IP check does not need a DB connection.
(You will find Protector's bad-IP check is much faster than XOOPS's own.)
Quote:
Request: Any possibility of using the following format for bad IPs?
^123\.456\.8[1-9]\. or
^123.456.8[1-9].
Protector's bad-IP check has to be FAST to evaluate.
preg_match() is a much more expensive function than substr()+strlen().
(Though preg_match() is much faster than ereg().)
Almost every purpose can be achieved by writing a prefix such as '192.168.1.'.
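As an illustration (a hypothetical sketch, not Protector's actual code), a plain prefix such as '192.168.1.' can be tested with substr()+strlen() as a cheap byte comparison, while the regex form pays the preg_match() cost on every request:

```php
<?php
// Hypothetical sketch -- not Protector's actual implementation.

// Prefix check: substr()+strlen() only compares the leading bytes.
function ip_matches_prefix($remoteAddr, $badPrefix)
{
    return substr($remoteAddr, 0, strlen($badPrefix)) === $badPrefix;
}

// Regex check: works too, but invokes the regex engine on every request.
function ip_matches_regex($remoteAddr, $pattern)
{
    return preg_match($pattern, $remoteAddr) === 1;
}

var_dump(ip_matches_prefix('192.168.1.42', '192.168.1.'));      // bool(true)
var_dump(ip_matches_regex('192.168.1.42', '/^192\.168\.1\./')); // bool(true)
```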
Thanks, GIJOE.
The reason for asking is that currently I am banning certain IP sub-ranges through .htaccess. Sub-range banning is an issue with some European and Asian IPs - I have to ban 12.34.[5-6][0-9] in order to spare the IPs in the range 12.34.[1-4][0-9], because the two ranges may well be allocated to two different countries. I thought maybe Protector could do what .htaccess does, since BAD IPs are now text-file based.
Bad counts for Crawlers
The default is 30 - is this number per minute? I just checked one of my logs - one of the Googlebots, from a single IP, was banned for high load, even though Googlebots are classified as safe crawlers. Should I increase this to 60?
Quote:
The reason for asking is that currently I am banning certain IP sub-ranges through .htaccess. Sub-range banning is an issue with some European and Asian IPs - I have to ban 12.34.[5-6][0-9] in order to spare the IPs in the range 12.34.[1-4][0-9], because the two ranges may well be allocated to two different countries. I thought maybe Protector could do what .htaccess does, since BAD IPs are now text-file based.
The point of Protector is to "ban automatically".
If you already know which IPs should be banned, you'd better write such addresses into .htaccess instead of Protector.
.htaccess is much faster than Protector.
(Though Protector's ban is much faster than XOOPS's ban.)
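For reference, here is a hedged .htaccess sketch (Apache 2.2-style syntax; the range is just the example from the question above, so adjust it to your own). SetEnvIf can match the remote address against a regex, so a sub-range like 12.34.50-69 can be denied while 12.34.10-49 stays allowed:

```apache
# Sketch only -- example range from the discussion above.
# Mark requests from 12.34.50-69.* with an env variable, then deny them.
SetEnvIf Remote_Addr "^12\.34\.[5-6][0-9]\." bad_ip
Order Allow,Deny
Allow from all
Deny from env=bad_ip
```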
Quote:
Bad counts for Crawlers
The default is 30 - is this number per minute? I just checked one of my logs - one of the Googlebots, from a single IP, was banned for high load, even though Googlebots are classified as safe crawlers. Should I increase this to 60?
You can see it in "Watch time for high loadings (sec)" (default: 60).
But it sounds like a problem with the User-Agent rather than with the number settings.
Which User-Agent does the Googlebot send?
And which pattern do you use for "Welcomed User-Agent"?
Quote:
But it sounds like a problem with the User-Agent rather than with the number settings.
Which User-Agent does the Googlebot send?
And which pattern do you use for "Welcomed User-Agent"?
The user-agent is
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)
Welcomed User-Agent:
/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)/i
Quote:
tl wrote:
Welcomed User-Agent:
/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)/i
That looks like a wrong pattern.
You have to escape the / inside it, or use another delimiter:
?(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)?i
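Assuming the pattern is fed to preg_match() (a reasonable guess for how Protector applies it), here is a hypothetical check showing both fixes against the Googlebot User-Agent quoted above. The unescaped / in "Ask Jeeves/Teoma" ends the original pattern early, because / is also the delimiter:

```php
<?php
$ua = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

// Fix 1: keep the / delimiter but escape the inner slash.
$escaped = '/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves\/Teoma)/i';

// Fix 2: use a delimiter that does not occur in the pattern, e.g. ?.
$altDelim = '?(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)?i';

var_dump(preg_match($escaped, $ua) === 1);   // bool(true)
var_dump(preg_match($altDelim, $ua) === 1);  // bool(true)
```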
Thanks. After my post, I realized that the pattern might be wrong - that I would have to escape "/" or remove it. I'll give it a try and see if this fixes the Googlebot problem.