Robots.txt Generator

Use our tool to generate a robots.txt file that helps Google and other search engines crawl and index your website properly.

This is a premium tool. Please purchase a subscription to gain access to all premium tools and features.

About the Robots.txt generator tool

What is a robots.txt file? It’s a file that tells search engine bots how to crawl a website. Part of the robots exclusion protocol, a robots.txt file can also be used to specify which parts of a website you don’t want crawled, e.g. duplicate content, thank-you pages, or pages still in draft.

The contents of a robots.txt file include the User-agent line and, below it, other directives such as Allow, Disallow and Crawl-delay.
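As a minimal illustration (the paths shown are hypothetical), a robots.txt file groups directives under a User-agent line like this:

```
# Apply these rules to all crawlers
User-agent: *
# Block a draft area, but allow one page inside it
Disallow: /drafts/
Allow: /drafts/published-preview.html
# Ask bots to wait 10 seconds between requests (not honoured by all crawlers)
Crawl-delay: 10
```

Each User-agent block applies to the named crawler (or to all crawlers when `*` is used), and the directives below it are read until the next User-agent line.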

Writing a robots.txt file manually takes a lot of time, and one wrong line can exclude your pages from being indexed. It is better to leave the task to professionals, or to tools such as Urgent Expert’s Robots.txt generator.

Robots.txt file and SEO

A robots.txt file is one of the first files a search engine bot will look for on a website. If it is not found, crawlers may not index all of the pages of your website, which can affect your ranking in search engines.

You can also use robots.txt to disallow pages, so only your rich pages are indexed and your thin or unnecessary pages are ignored.

Difference between a sitemap and a robots.txt file

A sitemap is important for all websites as it holds useful information for search engines: it tells bots how often you update your website and what kind of content your site has. A robots.txt file, by contrast, is for crawlers; it tells them which pages to crawl and which not to.

How to use the Robots.txt Generator tool

To create your robots.txt file, follow the steps below:

Robots.txt generator tool - fields to build the file

  1. You can select whether all bots are allowed to crawl by clicking the Default – All Robots are dropdown. This selection becomes the default for all crawlers.
  2. You can then set a crawl delay of 5, 10, 20, 60 or 120 seconds. Why is this useful? If you have many pages that link to one another, a bot that starts crawling may issue too many requests in a short time. This can cause a traffic peak and deplete hosting resources that are monitored on an hourly basis, so setting a crawl delay is a good way to avoid load spikes.
  3. Enter your sitemap if you have one. If you don’t, why not use our sitemap generator tool?
  4. Our Robots.txt generator allows you to choose which search engines should crawl your site. For instance, you may not want your site indexed in China’s Baidu, or you may want it indexed only in Baidu.
  5. You can also restrict folders from being crawled by entering the directory path.
  6. When ready, click the Create Robots.txt button to see your robots.txt file in the box below, or click Create and view Robots.txt file to view it in a new window.
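The steps above might produce a file along these lines (the sitemap URL and folder path are placeholder examples, not output from the tool itself):

```
# Default rule for all crawlers, with a crawl delay (step 1 and 2)
User-agent: *
Crawl-delay: 20
# Folder restricted in step 5
Disallow: /private/

# Per-engine rule from step 4: block Baidu's crawler entirely
User-agent: Baiduspider
Disallow: /

# Sitemap entered in step 3
Sitemap: https://www.example.com/sitemap.xml
```

Once generated, the file should be uploaded to the root of your domain (e.g. example.com/robots.txt) so crawlers can find it.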

40+ optimisation tools

Over 40 online tools to help you improve your search engine optimisation, on-page copy and email marketing, growing your traffic and increasing conversions.

Buy now

Pay monthly or annually and cancel anytime. We also offer a complete technical setup and maintenance package with an Urgent Expert specialist.