YaBB Community and Support Forum
xrumer in clicklog (Read 2,097 times)
 Jan 2nd, 2012 at 1:42pm

BloodyRue 
Junior Member
Offline
Posts: 83


YaBB 2.5
xrumer in clicklog
I am writing a bot search-and-log module for my board.

Looking for ways to find new search engines and add them to the search engine admin list. (I will write this mod eventually, but I bet someone can do it faster than I can. I do have it working now, though.)

Anyway, checking through the clicklog for potentials I found this:

Code:
91.207.6.170|1325502722|/yabb2/YaBB.pl?num=XXXXXXXXX||xpymep.exe
 



There were several hits from this guy. Searching for xpymep.exe shows it is an XRumer executable.
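For context, a clicklog line splits on `|`. Here is a small Python sketch of pulling one apart; the field meanings are my reading of the sample line above, not documented YaBB behavior:

```python
# Hypothetical helper: split a YaBB clicklog line into named fields.
# Assumed layout: IP | epoch time | request URI | referrer | user agent.
def parse_clicklog_line(line):
    ip, epoch, uri, referrer, agent = line.rstrip("\n").split("|", 4)
    return {"ip": ip, "time": int(epoch), "uri": uri,
            "referrer": referrer, "agent": agent}

sample = "91.207.6.170|1325502722|/yabb2/YaBB.pl?num=XXXXXXXXX||xpymep.exe"
hit = parse_clicklog_line(sample)
if "xpymep" in hit["agent"].lower():
    print("XRumer suspect:", hit["ip"])   # prints: XRumer suspect: 91.207.6.170
```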

I added xpymep.exe to my main .htaccess to automatically ban anyone who shows up with that.

If you want to stop XRumer users, that is one potential way of doing it.
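The post doesn't show the exact .htaccess lines used for this, so here is a minimal sketch of one way to do it with Apache 2.2-style directives, reusing the `getout` environment variable that appears later in this thread:

```apache
# Sketch only: flag any request whose URI or user agent mentions
# xpymep.exe, then deny flagged requests. Test on your own setup.
SetEnvIfNoCase Request_URI "xpymep\.exe" getout
SetEnvIfNoCase User-Agent "xpymep\.exe" getout

<Files *>
order allow,deny
allow from all
deny from env=getout
</Files>
```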

Perhaps I ought to put the stuff I use to auto-ban things in here also:

(This stuff works for me, but it needs editing to work for you. Please take extreme caution in adapting it to your site.)

trap.cgi (located in the cgi-bin):

Code:
#!/usr/bin/perl
use strict;
use warnings;
use Fcntl qw(:flock);

my $basedir   = $ENV{DOCUMENT_ROOT};
my $htafile   = "/.htaccess";
my $termsfile = "/badbot.htm";

# Form full pathname to the .htaccess file
my $htapath = $basedir . $htafile;

# Form full pathname to the badbot.htm file
my $termspath = $basedir . $termsfile;

# Get the bad bot's IP address and convert it to regular-expression
# (regex) format by escaping all periods.
my $remaddr = $ENV{REMOTE_ADDR};
$remaddr =~ s/\./\\./g;

# Get the user agent & current time
my $usragnt = $ENV{HTTP_USER_AGENT} || '';
my $date    = scalar localtime(time);

# Open the .htaccess file and wait for an exclusive lock. This
# prevents multiple instances of this script from running past
# the flock statement, and prevents them from trying to read and
# write the file at the same time, which would corrupt it.
# When .htaccess is closed, the lock is released.
#
# Open the existing .htaccess file in read/write append mode, lock it,
# rewind to the start, and read the current contents into an array.
open(HTACCESS, "+>>", $htapath) || die $!;
flock(HTACCESS, LOCK_EX);
seek(HTACCESS, 0, 0);
my @contents = <HTACCESS>;

# Empty the existing .htaccess file, then write the new IP ban line
# followed by the previous contents
truncate(HTACCESS, 0);
print HTACCESS "SetEnvIf Remote_Addr ^$remaddr\$ getout # $date $usragnt\n";
print HTACCESS @contents;

# Close the .htaccess file, releasing the lock and allowing other
# instances of this script to proceed.
close(HTACCESS);

# Write the HTML output to the server response
if (open(TERMS, "<", $termspath)) {
    # Copy the badbot.htm file as output here.
    print "Content-type: text/html\n\n";
    my @terms = <TERMS>;
    print @terms;

    # Close the badbot.htm file.
    close(TERMS);
}
else {
    # If we can't open badbot.htm, output a canned error message
    print "Content-type: text/html\n\n";
    print "<html><head><title>Fatal Error</title></head>\n";
    print "<body text=\"#000000\" bgcolor=\"#FFFFFF\">\n";
    # Tell them something; I chose a joke
    print "<p>Fatal error: See instructions on page 3 or enter any key.</p></body></html>\n";
}

# Try to send an e-mail notification.
# Note the escaped \@ in the addresses: an unescaped @ inside a
# double-quoted Perl string is interpolated as an array.
# Edit the e-mail addresses to reflect your own.
open(MAIL, "|/usr/sbin/sendmail -t") || die "Content-type: text/plain\n\nCan't open /usr/sbin/sendmail!";
print MAIL "To: me\@myemail.whatever\n";
print MAIL "From: me\@myemail.whatever\n";
print MAIL "Subject: You caught another one!\n";
print MAIL "The IP address ^$remaddr\$ has been banned on $date\n";
print MAIL "The associated user agent was $usragnt\n";
close(MAIL);
exit;
 




Add lines like these to your .htaccess:

Code:
<Files .htaccess>
order deny,allow
deny from all
</Files>

# Block bad bots using the lines written by the trap.cgi script above
SetEnvIf Request_URI "^(/403.*\.shtml|/robots\.txt|/badbot\.htm)$" allowsome

<Files *>
order deny,allow
allow from env=allowsome
deny from env=getout
deny from env=spam_bot
</Files>

RewriteEngine on

# xrumer captcha defeater
# edit the mydomain info to match your own site
RedirectMatch xpymep.exe http://mydomain.place/cgi-bin/trap.cgi

# Note: on their own these three rules send every request to trap.cgi;
# they need a RewriteCond in front of them (see Reply #3 below).
RewriteRule   !^(.*)403\.shtml - [C]
RewriteRule   !^(.*)trap\.cgi - [C]
RewriteRule (.*) http://mydomain.place/cgi-bin/trap.cgi [L]

ErrorDocument 403 http://mydomain.place/403.shtml
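For reference, each run of trap.cgi prepends one line of this shape to the top of .htaccess; the IP, date, and user agent below are made up:

```apache
SetEnvIf Remote_Addr ^10\.0\.0\.5$ getout # Mon Jan  2 13:42:02 2012 BadBot/1.0
```

The date and user agent are appended after the directive purely for record-keeping.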
 




badbot.htm:
Code:
<html><head><title>Fatal Error 693: Bad trip sense</title></head>
<body text="#000000" bgcolor="#ffffff">
<p>Fatal Error: See instructions on page 3 or enter any key.</p>
<p>Something for spambots to choke on:<br>
The following addresses are not for use on this page but for<br>
potential spammers to help report themselves.</p>
</body></html>




I have other things in there for them also, but that is it in a nutshell.

« Last Edit: Jan 2nd, 2012 at 1:50pm by BloodyRue »  
 Reply #1 - Jan 2nd, 2012 at 4:45pm

Dandello 
Global Moderator
YaBB Next Team
Operations Team
Beta Testers
Support Team
Offline
Posts: 1,856
Earth


YaBB 2.5
Re: xrumer in clicklog
We need a 'bowing' smilie.  Smiley
 
 Reply #2 - Jan 2nd, 2012 at 8:12pm

BloodyRue 
Junior Member
Offline
Posts: 83


YaBB 2.5
Re: xrumer in clicklog
Been thinking about this one today.
I have to wait and see if that catches him without further coding. What I put in the .htaccess file catches page hits and attempted commands, but based on where the XRumer info showed up, I think it's in his browser (user agent) info.

Anyway, I put a marker in the search-engine hunter module I am working on to find him again and see what happens to him. If that doesn't trap him, I can write some code to do it elsewhere. I can definitely isolate him with the search-engine hunter I have.
 
 Reply #3 - Jan 3rd, 2012 at 3:21am

BloodyRue 
Junior Member
Offline
Posts: 83


YaBB 2.5
Re: xrumer in clicklog
Just as I thought, that trapper isn't catching the XRumer info in their browser string, but my bot hunter picks them up now and reports them to me:

###XRUMER###|SPAMMER ALERT|91.207.6.170

(snippet)
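A hypothetical Python sketch of the kind of check a bot hunter might run over clicklog entries, building an alert line in the format shown above; all function and variable names here are mine, not the actual module's:

```python
import re

# Flag a hit whose user agent matches a known XRumer marker and build
# an alert line like the one reported in the post.
XRUMER_MARKER = re.compile(r"xpymep|xrumer", re.IGNORECASE)

def alert_line(ip, agent):
    if XRUMER_MARKER.search(agent):
        return "###XRUMER###|SPAMMER ALERT|" + ip
    return None

print(alert_line("91.207.6.170", "xpymep.exe"))
# prints: ###XRUMER###|SPAMMER ALERT|91.207.6.170
```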

Hmmm, let's see them get through this rewrite in my .htaccess file:

Code:
RewriteEngine on
# Only requests whose user agent starts with "xpymep" match this condition
RewriteCond %{HTTP_USER_AGENT} ^xpymep

# The condition guards the first rule; the [C] chain flags carry it
# through to the final rule, which sends the request to trap.cgi
RewriteRule   !^(.*)403\.shtml - [C]
RewriteRule   !^(.*)trap\.cgi - [C]
RewriteRule (.*) http://mydomain.place/cgi-bin/trap.cgi [L]
 

 
 Reply #4 - Jan 3rd, 2012 at 4:22pm

JonB 
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
Offline
Posts: 3,617
Land of the Blazing Sun!


None
Re: xrumer in clicklog
"bowing smiley"

...

~ tada ~
Wink
 
I find your lack of faith disturbing.
 
 Reply #5 - Jan 3rd, 2012 at 11:25pm

BloodyRue 
Junior Member
Offline
Posts: 83


YaBB 2.5
Re: xrumer in clicklog
I never META bot I didn't fully understand.

BotHunter: WebmasterCoffee

Who knew? http://webmastercoffee.com/blog/about-webmastercoffee crawled my stuff.

added:
WebmasterCoffee|Webmaster Coffee
 
 Reply #6 - Mar 8th, 2012 at 1:03pm

BloodyRue 
Junior Member
Offline
Posts: 83


YaBB 2.5
Re: xrumer in clicklog
KKman2.0

This user agent showed up in my clicklog; it seems to be an XRumer variant.
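If you want to ban that one by user agent as well, here is a hedged sketch using the same `getout` convention as the rest of the thread; whether a substring match this broad fits your board is a judgment call:

```apache
# Flag any request whose user agent contains "KKman" (case-insensitive)
SetEnvIfNoCase User-Agent "KKman" getout
```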
 
YaBB Chat and Support Community » Powered by YaBB 3.0 Beta!
YaBB Forum Software © 2000-2011. All Rights Reserved.