Controlling Website Crawling Using Search Engine Visibility
The Control Crawling option in Search Engine Visibility lets you control which pages of your website search engines can crawl, and generates the required robots.txt file or meta tags.
From the Optimize tab, select Control Crawling to get started.
To control crawling for your website
Each tab sets a different crawling policy and generates the necessary file. On each tab you can do the following:
- Allow All — Lets crawlers access all of your Website's pages.
- Block specific web pages and search engines — On this tab, click Add new or modify existing rule, then choose which pages of your website to block and select the search engines that cannot crawl them.
- Block All — Blocks all of your Website's pages from crawlers. NOTE: This is not recommended.
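As a rough illustration, the Block specific web pages and search engines option generates robots.txt rules along these lines (the crawler name and paths below are hypothetical examples, not output from the tool):

```
# Block one crawler from two hypothetical directories
User-agent: Googlebot
Disallow: /private/
Disallow: /drafts/

# All other crawlers may access everything
User-agent: *
Disallow:
```

The Allow All option corresponds to an empty Disallow rule for all user agents, while Block All corresponds to `Disallow: /` for all user agents.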
Once you have decided how you want to control crawling for your website, click Get File, and then click Create robots.txt or Create Meta Tag to generate the appropriate file. For more information on robots.txt and meta tags, see What's the difference between robots.txt and Meta Tags?
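To illustrate the difference, a meta tag with a similar blocking effect might look like this (a hypothetical example, not output generated by the tool):

```
<!-- Placed in the <head> of an individual page:
     asks all crawlers not to index this page or follow its links -->
<meta name="robots" content="noindex, nofollow">
```

A robots.txt file sits at the root of your domain and applies site-wide, while a robots meta tag controls only the page it appears on.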