PEAK XOOPS - Re: Xoops Protector 3.00beta2

Re: Xoops Protector 3.00beta2

Target Downloads
Subject: Protector 3.41
Summary: Protector is a module that protects XOOPS2-based CMSes from various malicious attacks. This module prevents the following attacks: - DoS - malicious crawlers (such as e-mail harvesting bots) - SQL injection - XSS (though not all of it) - system glo...

List posts in the topic

Re: Xoops Protector 3.00beta2

msg# 1.1.1.1
depth: 3
Previous post - Next post | Parent - Children.1 | Posted on 2007/2/2 5:05
GIJOE  Master Sergeant  Posts: 4110
Quote:

The reason for asking is that currently I am banning certain sub-sect IPs through .htaccess. Sub-sect IPs are an issue with some European and Asian IPs - I have to ban 12.34.[5-6][0-9] while sparing the IPs in the range 12.34.[1-4][0-9], because they may well be allocated to two different countries. I thought maybe Protector could do what .htaccess does, since the bad-IP list is now text-file based.
The point of Protector is that it bans automatically.
If you already know which IPs should be banned, you'd better write those addresses into .htaccess instead of Protector.
.htaccess is much faster than Protector.
(Though Protector's ban is much faster than XOOPS's ban.)
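For the sub-range case in the quoted question, a .htaccess fragment along these lines would do it. This is only a sketch, using Apache 2.2-style mod_setenvif and mod_authz_host directives (Apache 2.4 would use `Require` instead); the 12.34.x.x addresses are just the example range from the quote:

```apache
# Deny 12.34.50-69.* while sparing 12.34.10-49.* (example range from the quote).
# Plain "Deny from 12.34.5" only matches whole octets, so a regex on the
# remote address is used to express the 50-69 range.
SetEnvIf Remote_Addr ^12\.34\.(5[0-9]|6[0-9])\. ban_ip
Order Allow,Deny
Allow from all
Deny from env=ban_ip
```

Because Apache evaluates this before XOOPS (or Protector) even starts, it is the faster place for a fixed, known-bad range, as the answer says.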

Quote:
Bad counts for Crawlers
The default is 30 - is this number per minute? I just checked one of my logs - one of the googlebots from a single IP was banned because of high load, even though Googlebots have been classified as safe crawlers. Should I increase this to 60?
You can set that with "Watch time for high loadings (sec)" (default: 60).
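Taken together, the two settings imply a sliding-window counter: an IP is flagged when it makes more than "Bad counts" requests within "Watch time" seconds. The following is only a sketch of that counting logic under that assumption, not Protector's actual PHP implementation:

```python
import time
from collections import deque

BAD_COUNT = 30   # "Bad counts for Crawlers" (default 30)
WATCH_TIME = 60  # "Watch time for high loadings (sec)" (default 60)

class HighLoadWatcher:
    """Flags an IP that exceeds BAD_COUNT hits within WATCH_TIME seconds."""

    def __init__(self):
        self.hits = {}  # ip -> deque of hit timestamps

    def record(self, ip, now=None):
        now = time.time() if now is None else now
        q = self.hits.setdefault(ip, deque())
        q.append(now)
        # Drop hits that have fallen out of the watch window.
        while q and now - q[0] > WATCH_TIME:
            q.popleft()
        return len(q) > BAD_COUNT  # True -> ban candidate
```

On this model, raising "Bad counts" to 60 halves the sensitivity, but as the answer goes on to say, a well-behaved Googlebot should be exempted by User-Agent instead.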

But it sounds like a problem with the User-Agent rather than with the number settings.
Which User-Agent does that googlebot send?
And which patterns do you use for "Welcomed User-Agent"?
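The question above matters because a crawler is only spared if its User-Agent string actually matches one of the welcomed patterns. As a sketch of that kind of matching (the pattern list and matching rules here are assumptions for illustration, not Protector's exact preference syntax):

```python
import re

# Hypothetical patterns in the spirit of a "Welcomed User-Agent" setting;
# the exact syntax Protector expects may differ.
WELCOMED_PATTERNS = [r"Googlebot", r"msnbot", r"Yahoo! Slurp"]

def is_welcomed(user_agent: str) -> bool:
    """Return True if any welcomed pattern matches the User-Agent string."""
    return any(re.search(p, user_agent, re.IGNORECASE)
               for p in WELCOMED_PATTERNS)
```

If the banned bot sent a User-Agent that none of the configured patterns match (or sent none at all), it would be treated like any other high-load client, which would explain the ban.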
