How to deny bots access to your website:
As you know, various bots crawl your website. Most of them are search engine crawlers such as Googlebot, Bingbot, and Yahoo's crawler. Frequent fetches by these bots can sometimes slow a website down, so in that case you may want to deny bots access to your site. In some cases, though, you still need Google's bots to crawl your website.
You can use a robots.txt file in your website's document root to control how bots access your site. There are different options for denying bots: you can block them from the whole website or only from a specific location.
First, create a file called robots.txt in your website's document root. On a typical cPanel server this is the public_html folder.
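As a quick sketch, you can create the empty file from a shell. This assumes a typical cPanel layout where the document root is ~/public_html; adjust the path for your own account.

```shell
# Assumption: document root is ~/public_html (typical cPanel layout).
# Create the document root if it does not exist yet.
mkdir -p ~/public_html

# Create an empty robots.txt; the rules shown below go into this file.
touch ~/public_html/robots.txt
```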
Now add the rules you need to the file. Some example rules are shown below.
To deny all bots access to your entire website, use the rule below:
User-agent: *
Disallow: /
To deny bots access to a specific folder, use the rule below:
User-agent: *
Disallow: /folder/
Please note that you need to replace “folder” with the name of the folder you want to block from bots.
To deny bots access to multiple folders, use the rule below:
User-agent: *
Disallow: /folder1/
Disallow: /folder2/
To deny bots access to a single page, use the rule below:
User-agent: *
Disallow: /iserversupport.html
Please note that you need to replace iserversupport.html in the rule above with the name of the page you want to block.
Save the file after editing. That’s it!
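As mentioned earlier, you may still want Google's bot to crawl your site while blocking everything else. A sketch like the following should work: under the Robots Exclusion Protocol, a crawler follows the most specific User-agent group that matches it, and an empty Disallow line allows everything.

```
# Allow Googlebot everywhere (empty Disallow = no restriction).
User-agent: Googlebot
Disallow:

# Deny all other bots access to the entire site.
User-agent: *
Disallow: /
```

Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control mechanism.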
If you need our help fixing any issues on your server, please feel free to contact us. Simply email [email protected]
Monthly server support with unlimited tickets, 24×7 monitoring, a security audit, and a lot more for just $59.