Block Malicious Bots and Spiders in web.config

Search engine bots, crawlers, and spiders help users discover your site. However, some of them are very aggressive and consume a lot of server bandwidth, and some bots and spiders are outright malicious and try to extract sensitive data. In this article I will give you a step-by-step guide on how to block malicious bots and spiders in web.config.

1. Open the web.config file of your site or ASP.NET application.

2. Look for the <security> tag:

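In a typical IIS web.config, the <security> section lives inside <system.webServer>. A minimal sketch of where to look (your file will contain additional sections):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<configuration>
  <system.webServer>
    <!-- The security section is where request filtering rules go -->
    <security>
    </security>
  </system.webServer>
</configuration>
```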

3. Inside this tag, we are going to apply a Request Filtering rule. Open and close a <requestFiltering> and a <filteringRules> tag:

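After this step, the <security> section should look like the sketch below; the <filteringRule> elements from the next step will go inside <filteringRules>:

```xml
<security>
  <requestFiltering>
    <filteringRules>
      <!-- filtering rules will be added here -->
    </filteringRules>
  </requestFiltering>
</security>
```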

4. Now, let's create a new filtering rule. For this example, we will block YandexBot. We name the rule, tell it to scan the User-Agent request header, and define the string to deny within the add element.

<filteringRule name="BlockSearchEngines" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <clear />
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <appliesTo>
    <clear />
  </appliesTo>
  <denyStrings>
    <clear />
    <add string="YandexBot" />
  </denyStrings>
</filteringRule>



Note: You can block additional bots by adding a new filtering rule for each one, or by adding more <add string="..." /> entries to the same <denyStrings> list.
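For instance, a second rule targeting another crawler could look like this (the name and user-agent substring here are just illustrative; substitute the bot you actually want to block):

```xml
<filteringRule name="BlockAnotherBot" scanUrl="false" scanQueryString="false">
  <scanHeaders>
    <clear />
    <!-- Match against the User-Agent request header -->
    <add requestHeader="User-Agent" />
  </scanHeaders>
  <denyStrings>
    <clear />
    <!-- Requests whose User-Agent contains this substring are denied -->
    <add string="AhrefsBot" />
  </denyStrings>
</filteringRule>
```

Because the match is a substring check on the User-Agent header, be careful not to use a string that also appears in legitimate browsers' user agents.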

5. Save your changes and that's it! The automated bot will be filtered and won't waste valuable server resources.

This concludes how to block malicious bots and spiders in web.config.

Looking for quality ASP Web Hosting? Look no further than Arvixe Web Hosting!

Happy Hosting!

Rodolfo Hernandez


