Updating the robots.txt File

You can update the robots.txt file to prevent random web crawler (or spider) searches against an Oracle B2C Service site.

To prevent random spider site searches against Oracle B2C Service sites, a robots.txt file is installed on each site interface. If Sitemap is enabled on the interface, the robots.txt file can be edited in Configuration Assistant. See Enable or Disable the Sitemap Protocol.

There are some considerations when updating the robots.txt file:
  • If you add lines to the text file, ensure each line ends with: # CUSTOM

    Any lines not ending with # CUSTOM are automatically deleted.

  • Don't include lines ending with: # ADDED BY HMS

    These are default entries that are read-only and are already included in the robots.txt file.
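For example, an edited file might look like the following. The Disallow paths shown here are illustrative only, not actual default entries; in robots.txt, text after # is a comment, so the markers don't change how crawlers read the rules.

    User-agent: *           # ADDED BY HMS
    Disallow: /internal/    # CUSTOM
    Disallow: /drafts/      # CUSTOM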

  1. Click the name of the site where you want to update the robots.txt file.
  2. Click the Interfaces tab.
  3. Click the menu icon for the interface that you want to update.
  4. Click the file name: robots.txt
  5. Edit the robots.txt file as necessary.
  6. Click Submit.
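Because any added line that doesn't end with # CUSTOM is automatically deleted on submit, you may want to sanity-check your edits before clicking Submit. The following is a minimal local check, assuming your edited file content is available as a string; the sample entries are hypothetical:

```python
def invalid_custom_lines(robots_text: str) -> list[str]:
    """Return non-blank added lines missing the required '# CUSTOM' suffix.

    Lines ending with '# ADDED BY HMS' are treated as read-only defaults
    and skipped; every other non-blank line must end with '# CUSTOM',
    or it will be deleted when the file is submitted.
    """
    bad = []
    for line in robots_text.splitlines():
        stripped = line.rstrip()
        if not stripped:
            continue  # blank lines are fine
        if stripped.endswith("# ADDED BY HMS"):
            continue  # read-only default entries already in the file
        if not stripped.endswith("# CUSTOM"):
            bad.append(stripped)
    return bad

# Hypothetical file content: the last line is missing the '# CUSTOM' marker.
sample = """User-agent: *  # ADDED BY HMS
Disallow: /internal/  # CUSTOM
Disallow: /drafts/
"""
print(invalid_custom_lines(sample))  # → ['Disallow: /drafts/']
```

Any line the function reports would be silently dropped on submit, so fix those before clicking Submit.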

Results:

The robots.txt file is updated. To view the updated file on the interface, you can do one of the following:
  • Click Download robots.txt on the robots.txt editor window.
  • Access the URL: https://<interface_name>.custhelp.com/robots.txt