PEAK XOOPS - Re: Xoops Protector 3.00beta2

Re: Xoops Protector 3.00beta2

Target: Downloads
Subject: Protector 3.41
Summary: Protector is a module that protects various XOOPS2-based CMSes from a variety of malicious attacks. It blocks the following attacks: DoS, malicious crawlers (e-mail harvesting bots, etc.), SQL Injection, XSS (though not all of it), system glo...
Posted on 2007/2/1 5:22
tl (Sergeant 3rd Class) Posts: 84
GIJOE:

Thanks for the release, now it is more secure than ever. :-)

I have one question and one request.

Question: I assume Protector's Bad IPs do not conflict with XOOPS's own?
Request: Any possibility of using the following format for bad IPs?
^123\.456\.8[1-9]\. or
^123.456.8[1-9].

Thanks again.

tl
Posted on 2007/2/1 18:12
GIJOE (First Sergeant) Posts: 4110
hi tl.

Quote:

I assume Protector's Bad IPs do not conflict with XOOPS's own?
They are independent of each other.
Protector's Bad IPs check is assumed to run earlier than XOOPS's own, because Protector's bad-IP list does not need a DB connection.
(You can see that Protector's bad-IP check is much faster than XOOPS's own.)

Quote:
Request: Any possibility of using the following format for bad IPs?
^123\.456\.8[1-9]\. or
^123.456.8[1-9].
Protector's bad-IP system has to be FAST, because it is evaluated on every request.

preg_match() is a much more expensive function than substr()+strlen().
(Though preg_match() is much faster than ereg().)

Almost every purpose can be achieved by writing a plain prefix such as '192.168.1.'.
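For illustration, a minimal sketch of the kind of prefix check this implies (my own example, not Protector's actual code):

<?php
// Minimal sketch of prefix-based bad-IP matching with substr()+strlen().
// Illustration only, not Protector's actual implementation.
$bad_ip_prefixes = array('192.168.1.', '10.0.');

function ip_is_banned($ip, array $prefixes)
{
    foreach ($prefixes as $prefix) {
        // A plain string prefix comparison is far cheaper than a regex.
        if (substr($ip, 0, strlen($prefix)) === $prefix) {
            return true;
        }
    }
    return false;
}

var_dump(ip_is_banned('192.168.1.25', $bad_ip_prefixes)); // bool(true)
var_dump(ip_is_banned('192.169.1.25', $bad_ip_prefixes)); // bool(false)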
Posted on 2007/2/2 3:53 | Last modified
tl (Sergeant 3rd Class) Posts: 84
Thanks, GIJOE.

The reason for asking is that I currently ban certain IP sub-ranges through .htaccess. Sub-ranges are an issue with some European and Asian IPs: I have to ban 12.34.[5-6][0-9] in order to spare the IPs in the range 12.34.[1-4][0-9], because they may well be allocated to two different countries. I thought maybe Protector could also do what .htaccess does, since the BAD IPs list is now text-file based.

Bad counts for Crawlers
The default is 30. Is this number per minute? I just checked one of my logs: one of the Googlebots coming from a single IP was banned for high load, even though Googlebots are classified as safe crawlers. Should I increase this to 60?
Posted on 2007/2/2 5:05
GIJOE (First Sergeant) Posts: 4110
Quote:

The reason for asking is that I currently ban certain IP sub-ranges through .htaccess. Sub-ranges are an issue with some European and Asian IPs: I have to ban 12.34.[5-6][0-9] in order to spare the IPs in the range 12.34.[1-4][0-9], because they may well be allocated to two different countries. I thought maybe Protector could also do what .htaccess does, since the BAD IPs list is now text-file based.
The point of Protector is to ban automatically.
If you already know which IPs should be banned, you'd better write those addresses into .htaccess instead of Protector.
.htaccess is much faster than Protector.
(Though Protector's ban is much faster than XOOPS's ban.)
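For example, your hypothetical range 12.34.[5-6][0-9].* could be banned with something like this (an illustrative snippet, assuming Apache with mod_rewrite enabled):

# Illustrative .htaccess snippet (assumes mod_rewrite is available).
# Returns 403 Forbidden to any client in 12.34.50.* through 12.34.69.*.
RewriteEngine On
RewriteCond %{REMOTE_ADDR} ^12\.34\.[5-6][0-9]\.
RewriteRule .* - [F,L]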

Quote:
Bad counts for Crawlers
The default is 30. Is this number per minute? I just checked one of my logs: one of the Googlebots coming from a single IP was banned for high load, even though Googlebots are classified as safe crawlers. Should I increase this to 60?
You can see it in "Watch time for high loadings (sec)" (default: 60). So the default settings mean more than 30 hits within a 60-second window.
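Put concretely, a rough sketch of that rate check (my illustration, not Protector's actual code):

<?php
// Rough sketch of the rate check the two settings describe:
// more than $bad_count hits from one IP within a $watch_time-second
// window counts as "high loading". Not Protector's actual code.
$bad_count  = 30; // "Bad counts for Crawlers" (default: 30)
$watch_time = 60; // "Watch time for high loadings (sec)" (default: 60)

function is_high_loading(array $hit_timestamps, $bad_count, $watch_time)
{
    $window_start = time() - $watch_time;
    $recent = 0;
    foreach ($hit_timestamps as $t) {
        if ($t > $window_start) {
            $recent++;
        }
    }
    return $recent > $bad_count;
}

// Example: 35 hits within the last few seconds from one IP.
$hits = array_fill(0, 35, time());
var_dump(is_high_loading($hits, $bad_count, $watch_time)); // bool(true)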

But it sounds like a problem with the User-Agent rather than with the number settings.
Which User-Agent is the Googlebot using?
And which pattern do you use for "Welcomed User-Agent"?
Posted on 2007/2/2 5:34
tl (Sergeant 3rd Class) Posts: 84
Quote:
But it sounds like a problem with the User-Agent rather than with the number settings.
Which User-Agent is the Googlebot using?
And which pattern do you use for "Welcomed User-Agent"?


The User-Agent is
Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

Welcomed User-Agent:
/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)/i
Posted on 2007/2/2 5:42
GIJOE (First Sergeant) Posts: 4110
Quote:

tl wrote:
Welcomed User-Agent:
/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)/i
That pattern looks wrong.
You have to escape the "/", or use a different delimiter:

?(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)?i
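To see the difference (my own test snippet, not Protector's internal code):

<?php
// The unescaped "/" in "Ask Jeeves/Teoma" ends a /-delimited pattern
// early, so the whole pattern fails to compile and no crawler is ever
// "welcomed". With the "?" delimiter the same pattern compiles fine.
$ua = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

$broken = '/(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)/i';
$fixed  = '?(msnbot|Googlebot|Mediapartners-Google|Yahoo! Slurp|Ask Jeeves/Teoma)?i';

var_dump(@preg_match($broken, $ua)); // bool(false): compile error ("Unknown modifier 'T'")
var_dump(preg_match($fixed, $ua));   // int(1): Googlebot is welcomed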

Posted on 2007/2/2 6:45
tl (Sergeant 3rd Class) Posts: 84
Thanks. After my post, I realized that the pattern might be wrong and that I would have to escape or remove the "/". I'll give it a try and see if it fixes the Googlebot problem.
