Robots.txt

Your online store is indexed by search engines, which makes it possible for your customers to find it. The Robots.txt file is used to prevent certain pages of your online store from being indexed and displayed in search results.

SEOshop does not provide technical support for creating your own Robots.txt file. An incorrect Robots.txt file can prevent your online store from being properly indexed and, as a result, make it difficult to find. We therefore recommend only editing the Robots.txt file if you have the required expertise.

Creating a Robots.txt file

Navigate to GENERAL > Settings in the left menu of the back office. You will find the Robots.txt option under the Other caption. To create the Robots.txt file, you first have to activate its status. After doing so, an input field opens in which you can enter your code.

Entering code

User-agent: Here you indicate which robots the rules that follow apply to. For example, you can choose to have a page indexed by search engines such as Yahoo! and Bing, but not by Google. You can even target individual crawlers of a single search engine. If you want the rules to apply to all search engines, use an asterisk (*) in the User-agent field, as in the sketch below.
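For example, a minimal sketch that blocks one page for Google while leaving it open to all other robots (the path /example-page/ is a hypothetical placeholder):

User-agent: Googlebot
Disallow: /example-page/

User-agent: *
Disallow:

Most crawlers follow the group that names them most specifically, so Googlebot obeys the first group while all other robots fall back to the asterisk group; an empty Disallow value blocks nothing.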

Disallow: Enter the URL you wish to block here. You do not have to specify the entire URL; the file path will do. For example, if you wish to exclude http://domeinnaam.com/category1/ from being indexed, you only have to enter /category1/. There is a difference, however, between using and omitting a slash (/) at the end of a rule:

Disallow: /category1 - This applies to every URL that begins with /category1: the folder and all files and sub-folders in it, but also pages such as /category1-sale.
Disallow: /category1/ - This applies only to the /category1/ folder and everything inside it.

The best way to check this is to look at the URL of the page in question. If it ends with a slash, be sure to include it in your Disallow rule. The sketch below illustrates the difference.
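A brief sketch of the difference (both paths are hypothetical placeholders):

User-agent: *
# Blocks /category1, /category1/page.html, but also /category1-sale
Disallow: /category1
# Blocks only the /category2/ folder and everything inside it
Disallow: /category2/

Lines starting with # are comments and are ignored by robots.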

Always check your code thoroughly for the proper use of slashes, and make sure there are no blank lines within a group of rules: a blank line marks the start of a new group.

Sitemap - If desired, you can also indicate the location of your sitemap in the Robots.txt file. In this case, however, you have to enter its URL in full, as in the code example below.

Code example:

User-agent: *
Disallow: /categorie1/subcategorie1/

User-agent: Googlebot
Disallow: /product1.html

Sitemap: http://www.domeinnaam.nl/sitemap.xml


For more information on the Robots.txt file, please refer to Google's support page on the subject.

A 'Crawl-delay: 2' notification is nothing serious and will not affect the operation of your online store. The Crawl-delay rule asks search engine robots, such as Google's, to pause between requests when visiting your online store, which prevents the server overload that could otherwise slow your store down.
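In the Robots.txt file itself, such a rule typically looks like this (the delay of 2 seconds is only an illustration; note that Crawl-delay is not part of the official robots.txt standard and not every search engine honours it):

User-agent: *
Crawl-delay: 2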