You can find the simulator attributes in the Robot Utilities page under the Utilities tab. The Robot Utilities page is a debugging tool that performs a partial simulation of robot filtering on a URL; you can type in a new URL to check. The simulator checks the URL, DNS translations (including Smart Host Heuristics), and site redirections. It does not check the contents of the document specified by the URL, so it does not detect duplications, MIME types, network errors, permissions, and the like. The simulator indicates whether each listed site would be accepted by the robot (ACCEPTED) or not (WARNING).
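The checks described above can be sketched roughly as follows. This is a hypothetical illustration, not the simulator's actual code: it performs only the DNS-translation step and a simple URL sanity check, returning the same ACCEPTED/WARNING verdicts, and deliberately never fetches the document body.

```python
import socket
from urllib.parse import urlsplit

def simulate_robot_check(url):
    """Partial simulation in the spirit of the Robot Utilities page:
    validates the URL and its DNS translation only. It never retrieves
    the document, so duplications, MIME types, and permissions go
    unchecked, as in the real simulator. (Illustrative helper only.)"""
    parts = urlsplit(url)
    # Reject URLs that are not well-formed http/https URLs.
    if parts.scheme not in ("http", "https") or not parts.hostname:
        return ("WARNING", "malformed or unsupported URL")
    # DNS translation check: can the host name be resolved?
    try:
        addr = socket.gethostbyname(parts.hostname)
    except socket.gaierror:
        return ("WARNING", "DNS lookup failed for %s" % parts.hostname)
    return ("ACCEPTED", "%s resolves to %s" % (parts.hostname, addr))

def is_server_redirect(status_code):
    """Classify an HTTP status code as a server redirect (3xx),
    the condition the simulator's redirect check looks for."""
    return 300 <= status_code < 400
```

A redirect check would additionally issue a request for the URL without following redirects and apply `is_server_redirect` to the response status; that step is omitted here to keep the sketch free of network I/O.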
The table below describes the attributes in the Simulator section of the Robot Utilities page.
Table 4–8 Robot Simulator Attributes
| Attribute | Default Value | Description |
|---|---|---|
| Run Simulator on | URLs you have already defined, plus one blank text box | To check access to a new site, type its URL in the blank text box; the simulator reports whether the site accepts crawling. Format: `http://www.sesta.com:80/` |
| Show advanced DNS information | Unselected | When selected, displays additional DNS information about the site. |
| Check for server redirects | Selected | When selected, checks for any server redirects. |