PEAK XOOPS - Ban IP when accessing a web page

Ban IP when accessing a web page

Previous post - Next post | Parent - Children.1 | Posted on 2006/8/1 19:10
danand725  From: London, UK  Posts: 7
Dear all,

First many thanks for Protector.

I've been getting lots of spam in a guestbook, basically via direct access to sign.php. None of the spam ever gets through, but it does increase our bandwidth.

I want to have a go at creating a fake sign.php page that will automatically ban IPs in Protector, but I'm not quite sure how to do this. I'm happy that any access to the file will be from a hostile IP.

I'm guessing I can just extract parts of the postcheck.inc.php file and then keep playing with it until it works?

// If precheck has already judged that he should be banned
	if( $can_ban && $protector->_should_be_banned ) {
		$protector->register_bad_ips() ;
	}

Any help appreciated if anyone has already done this kind of thing to beat spam!

Daniel
Votes:364 Average:9.97
Previous post - Next post | Parent - Children.1 | Posted on 2006/8/2 5:38
GIJOE  Posts: 4110
hi Daniel.

Here is an example of sign.php as a dummy for spammers, though I don't know whether it works well or not:

<?php
// Dummy sign.php: any client that requests this file gets its IP banned by Protector.
include '../../mainfile.php' ;

require_once( XOOPS_ROOT_PATH . '/modules/protector/class/protector.php' ) ;
$db =& Database::getInstance() ;
$protector =& Protector::getInstance( $db->conn ) ;
$conf = $protector->getConf() ;
$protector->register_bad_ips() ;

Don't forget to register your rescue password before testing it.
Votes:10 Average:1.00
Previous post - Next post | Parent - Children.1 | Posted on 2006/8/2 18:36
danand725  From: London, UK  Posts: 7
Hi GIJOE,

Many thanks for that. I'll give it a test in a few days (as I'm going away and don't want to cripple the band website before I go).

To be clear, I'm going to add the offending folder to robots.txt so hopefully I don't ban Google bots etc... that would be bad.
Votes:3 Average:3.33
Previous post - Next post | Parent - Children.1 .2 | Posted on 2006/8/4 6:07
GIJOE  Posts: 4110
Oh, you are right.

You have to add the file's path to robots.txt (a sample entry is sketched after the list below).

- Good crawlers like googlebot
-- read robots.txt first
-- they don't access this trap file
-- your site will be indexed correctly

- Bad crawlers
-- ignore robots.txt
-- they access the trap file
-- the crawler is banned!
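
For example, something like this in robots.txt should keep compliant crawlers away from the trap. The /modules/guestbook/sign.php path is only a guess at where the dummy file lives; substitute the real path on your site:

User-agent: *
Disallow: /modules/guestbook/sign.php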

Votes:11 Average:8.18
Previous post - Next post | Parent - No child | Posted on 2006/8/9 19:37
danand725  From: London, UK  Posts: 7
OK, the page is up. I've updated robots.txt and will keep an eye on the ever growing list of IP bans.

Thanks for your support. It's a pity that the guestbook was attacked so often, but on the upside, it makes it very easy to block rogue IPs.

I expect after a few days to be able to re-instate the guestbook. My bandwidth usage is already falling rapidly!



Daniel
Votes:185 Average:9.95
Previous post - Next post | Parent - Children.1 | Posted on 2006/8/9 20:09
danand725  From: London, UK  Posts: 7
Whooo, the list of IP bans has doubled overnight! Looks like it's working.

On the topic of not banning good bots and crawlers by mistake, I might have a go at getting each IP ban caused by the file to appear in the Protector log, say under a "Type" of "TRAP" or something... That way it would be easy to see the referer info, even if ultimately you still have to remove the IP address from the ban list manually.

...or is that something else you could help me with? I guess I just add:

$this->message .= "IP Ban. ($val)\n" ;
to my sign.php file?

Daniel
Votes:5 Average:4.00
Previous post - Next post | Parent - Children.1 | Posted on 2006/8/10 6:10
GIJOE  Posts: 4110
OK.
try this.

<?php
// Dummy sign.php: record the hit in the Protector log as 'Ban by trap', then ban the client's IP.
include '../../mainfile.php' ;

require_once( XOOPS_ROOT_PATH . '/modules/protector/class/protector.php' ) ;
$db =& Database::getInstance() ;
$protector =& Protector::getInstance( $db->conn ) ;
$conf = $protector->getConf() ;
$protector->output_log( 'Ban by trap' ) ;
$protector->register_bad_ips() ;
Votes:4 Average:5.00
Previous post - Next post | Parent - No child | Posted on 2006/8/10 19:29
danand725  企霹始 From: London. UK  Posts: 7
Wow, this really is working well. So far I've not seen any good bots banned; I've used www.spidertrack.org to check the IP addresses for Google, MSN and Yahoo.

Yes, the blocks now appear in the log for easy reference. This is a really great way of blocking bad IPs, even if it did mean losing the guestbook for a while.

Many thanks,

Daniel
Votes:68 Average:8.97
