In this article I will explain how to set up the robots.txt file in Elgg 1.9.
The robots.txt file gives Web robots and spiders instructions about a site; this convention is known as the Robots Exclusion Protocol. In its latest version, Elgg added a robots.txt configuration option to the administration back-end.
1. Sign in to your Elgg site as an administrator.
2. Open the Administration Dashboard.
3. Go to the menu “Configure -> Utilities -> Robots.txt”.
4. In this section you can configure your robots.txt file.
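As a rough sketch, the rules entered here follow the standard robots.txt format: a User-agent line naming which crawlers the rules apply to, followed by Disallow lines listing paths those crawlers should skip. The paths below are illustrative examples, not Elgg defaults:

```
User-agent: *
Disallow: /admin/
Disallow: /action/
```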
Note: If you have installed Elgg in a subdirectory, this may not work correctly. Crawlers only request /robots.txt from the root of a domain, so for the file to be served properly, Elgg must be installed on your main domain.
Save the changes and that’s it! You have set up your robots.txt file.
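You can sanity-check your rules before (or after) saving them with Python's built-in `urllib.robotparser` module, which implements the Robots Exclusion Protocol. The rules and paths below are hypothetical placeholders; substitute the content you entered in the Elgg admin panel:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration only;
# your actual rules come from the Elgg admin panel.
rules = """\
User-agent: *
Disallow: /admin/
Disallow: /action/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under a Disallow prefix are blocked for the matching user agent;
# everything else is allowed.
print(parser.can_fetch("*", "/admin/settings"))  # False
print(parser.can_fetch("*", "/blog/view/1"))     # True
```

This is a quick way to confirm that the paths you intended to block are actually matched by your Disallow rules.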
Note: Elgg 1.9 has been released. We will be sharing articles for both Elgg 1.8 and Elgg 1.9.
This concludes our guide to configuring robots.txt in Elgg 1.9.