VP-ASP :: Shopping Cart Software

Shopping Cart Software Solutions for anywhere in the World

US/Canada(Toll Free): +1 888 587 2278
Europe/UK: +44 (020) 7193 9408
Australia/New Zealand: +61 3 9016 4497

VP-ASP Shopping Cart Customer Forum


 A Word Of Warning About Bots and Spiders
Steve2507
VP-ASP Expert

590 Posts

Posted - January 14 2011 :  11:01:24
I thought this might be of interest to new website owners and maybe experienced ones who had never looked at this.

Our visitor numbers have more than doubled in the past 12 months, which is obviously good. But because of this, and the customisation we have done to VP-ASP, I have spent a disproportionate amount of time looking after our server. By this I mean making sure it doesn’t crash.

The problem was that with the increase in traffic the server was coming under more pressure, so I would have to restart WWW services maybe 4 or 5 times a day.

At first I thought it was VP-ASP, MSSQL or our hosts. So I checked code (not a fun job – I wouldn’t recommend it unless you want to be really bored), made sure MSSQL was fully up to date, and got our hosts to check the server. I made sure CSS files were compressed. I even purchased a second server and moved all images and static files there to reduce the calls to the main server, leaving the main server free to just process database requests.

Everything I did helped at first, but then after a few days it went back to how it had been. On one day, when we had sent a newsletter out, I had to restart services nearly every hour. I was literally tearing my hair out (and if you know me I haven’t got much as it is).

Then I read on a forum about bots, spiders and robots.txt files. So I created a robots.txt file, blocked various spiders with it, and forced a crawl delay on the others.
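For anyone who has never set one up, the robots.txt I mean looks something like this (the bot names here are just examples, not real offenders):

```
# robots.txt - goes in the root of the website

# Block a spider completely (example name)
User-agent: BadBot
Disallow: /

# Ask an aggressive but compliant spider to slow down
# (Crawl-delay is a non-standard extension that only some bots honour)
User-agent: HungryCrawler
Crawl-delay: 10

# Everyone else can crawl normally
User-agent: *
Disallow:
```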

Did this work? Like hell it did; in fact I would say it made things even worse. It was as if the spiders were saying, “Don’t like us? Well, tough – we’re going to attack your site even more now”. Basically they just ignored the robots file. To be fair, only a few ignored it, but those few were the worst offenders in the first place.

So I went to the ISAPI_Rewrite forum and found a post about pushing bad bots to a trap page and completely blocking them from the site.

What I’ve done is block various bots so they can only reach a static page with the word “thanks” on it. The page is only 299 bytes, so serving it costs next to nothing.
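For anyone wanting to try the same thing, the rules look roughly like this, assuming ISAPI_Rewrite 3's Apache-style syntax (the bot names and trap filename are just examples, not my actual config):

```
RewriteEngine On

# Match the bad bots by User-Agent (example names), case-insensitive
RewriteCond %{HTTP_USER_AGENT} (BadBot|EvilScraper|HungryCrawler) [NC]

# Don't rewrite requests that are already for the trap page
RewriteCond %{REQUEST_URI} !^/trap\.html$

# Send everything else they ask for to the tiny static trap page
RewriteRule .* /trap.html [L]
```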

Now comes the interesting bit.

I’ve been monitoring this file, and one bot is accessing it every 10 seconds – even though that bot has been blocked in the robots file. And that is only one bot. Overall, bots are being forced to this page every 2.6 seconds. This shows just how much strain they were putting on our server.
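If anyone wants to do the same monitoring, here is a rough sketch of the idea in Python: it counts hits on the trap page in a W3C-style IIS log and works out the average gap between them. The field positions are an assumption – check them against the #Fields line in your own logs.

```python
from datetime import datetime

# Sample W3C log lines (field layout is an assumption; adjust the
# indexes below to match the #Fields header in your own IIS logs).
SAMPLE_LOG = """\
#Fields: date time c-ip cs-method cs-uri-stem sc-status
2011-01-14 11:00:00 203.0.113.7 GET /trap.html 200
2011-01-14 11:00:10 203.0.113.7 GET /trap.html 200
2011-01-14 11:00:20 203.0.113.7 GET /trap.html 200
2011-01-14 11:00:05 198.51.100.9 GET /default.asp 200
"""

def trap_hit_rate(log_text, trap="/trap.html"):
    """Return (hit count, average seconds between trap-page hits)."""
    times = []
    for line in log_text.splitlines():
        if line.startswith("#"):          # skip W3C header lines
            continue
        fields = line.split()
        if fields[4] == trap:             # cs-uri-stem column
            times.append(datetime.strptime(
                fields[0] + " " + fields[1], "%Y-%m-%d %H:%M:%S"))
    times.sort()
    if len(times) < 2:
        return len(times), None
    span = (times[-1] - times[0]).total_seconds()
    return len(times), span / (len(times) - 1)

hits, avg = trap_hit_rate(SAMPLE_LOG)
print(hits, avg)  # 3 hits, one every 10.0 seconds in this sample
```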

Finally I feel we have solved the problem of the server crashing. It wasn’t the server, VP-ASP, MSSQL or human visitors; it was an artificial menace.

So if you are having problems with your site speed, don’t just look at the software or the hardware. Look at your logs and see if you are getting excessive visits from bad bots. If you pay extra for bandwidth this will save you money, but more importantly, human visitors will get a site that is up more often.


Hope this helps someone.



Steve
Sex toys from a UK sex shop including vibrators and dildos.

support
Administrator

4266 Posts

Posted - January 15 2011 :  04:44:40
Hi Steve,

Great post! I might get our hosting guys to look into this one for our site actually.

Have a great weekend!

Thank you.

Cam Flanigan
VP-ASP Cart Support

Follow us on Twitter:
http://www.twitter.com/vpasp

Mark Priest
VP-ASP Expert

United Kingdom
570 Posts

Posted - January 17 2011 :  19:34:26
Hi Steve

Great post. Once you've isolated the bad bots, you should have an automated script that bans the offending IP subnet. Puts paid to them for good.
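To sketch the idea: collapse the offending IPs into their /24 subnets and generate one firewall block rule per subnet. This is only an illustration – the rule names are made up, and you'd want to review the generated commands before running them against a live server.

```python
import ipaddress

def subnets_to_ban(offender_ips, prefix=24):
    """Collapse individual offender IPs into their covering subnets."""
    return sorted(
        {ipaddress.ip_network(f"{ip}/{prefix}", strict=False)
         for ip in offender_ips},
        key=str,
    )

def netsh_rules(subnets):
    """Generate one Windows Firewall block rule per subnet
    (rule names are hypothetical)."""
    return [
        ('netsh advfirewall firewall add rule '
         f'name="badbot {net}" dir=in action=block remoteip={net}')
        for net in subnets
    ]

# Example offenders pulled from the logs (documentation addresses)
bad_ips = ["203.0.113.7", "203.0.113.99", "198.51.100.9"]
for rule in netsh_rules(subnets_to_ban(bad_ips)):
    print(rule)
```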

Regards,

Mark
Fireworks

Steve2507
VP-ASP Expert

590 Posts

Posted - January 18 2011 :  07:32:37
Thanks for the comments.

Thought I'd give a quick update on this.

At peak times the trap page was being accessed by the worst offender once every second. Absolutely unbelievable.

What I don't understand is why the programmers stop their bots from following the robots.txt file in the first place. I have nothing against bots in general, as long as they follow the rules.

But if you don't follow the rules then you can't play with us.


Steve
Sex toys from a UK sex shop including vibrators and dildos.