Working with Robots.txt

A robots.txt file provides instructions to search engines and other web robots about how to crawl your site. (This is known as the Robots Exclusion Protocol.) Crawling and indexing by search engines can slow the delivery of web pages to potential customers. Reduce the impact of web robots on site performance by using robots.txt to slow the crawl rate or to exclude certain web robots from access to your site.

For information about how to create a robots.txt file, visit www.robotstxt.org.
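
As a brief illustration only, a robots.txt file similar to the following excludes one robot entirely and asks all other robots to pause between requests. The robot name, path, and delay value are placeholders, and Crawl-delay is a non-standard directive that only some crawlers honor:

    # Block one specific robot from the entire site (robot name is illustrative)
    User-agent: ExampleBot
    Disallow: /

    # Ask all other robots to wait 10 seconds between requests
    # (Crawl-delay is non-standard; some crawlers ignore it)
    # and to skip the checkout pages
    User-agent: *
    Crawl-delay: 10
    Disallow: /checkout/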

To upload your robots.txt file:

  1. Find the root folder for your web site in the Web Site Hosting Files folder.

  2. Click the hosting root folder associated with the web site domain for which you want to use robots.txt.

  3. Click Add File.

The file you upload overrides the default robots.txt file that NetSuite generates for your site.

Related Topics

Adding Page Titles in Site Builder
Adding META Tags in Site Builder
Adding Alt Text to Website Images in Site Builder
Descriptive URLs in Site Builder
Keyword Marketing With Search Engines
Using the Sitemap Generator in Site Builder