Guide: Creating a robots.txt file for WordPress

As a new business we need to play our cards close to our chest: we sell SEO and we sell websites, so we can't give all of our secrets away. But we're also here to help, so from this day on we'll be publishing a series of quick, actionable guides you can follow yourself. Our first is this…

How to create a basic robots.txt file for WordPress

This is a pretty basic one, so let’s keep it short and sweet.

Your website is super important (not sure if I've mentioned that before?), so it's essential to take good care of it. Treat it like a puppy: give it all the vaccinations it needs to keep it safe from external threats. WordPress is the most widely used content management system in the world, yet it doesn't ship with a default robots.txt file that excludes some relatively simple stuff. That's why you need to add one to your site manually.

What is a robots.txt file, and why do I need one?

A robots.txt file is a small text file that tells search engines which pages they should and shouldn't crawl on your website. It can be used to stop search engines from crawling private pages such as cart and login pages, and it's an important part of getting your site ready to rank in the search engines.

Simply create a file called robots.txt in the root directory of your site and add the code below to it. This is only a basic file; you can extend it to stop search engines crawling other directories and subdirectories. It also lets you specify the location of your sitemap: simply swap out http://example.com for your own domain.

User-agent: *
Disallow: /wp-admin/
Disallow: /trackback/
Disallow: /xmlrpc.php
Disallow: /feed/
Sitemap: http://example.com/sitemap.xml
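If you'd like to double-check that your rules do what you expect before uploading, here's a quick sketch using Python's built-in urllib.robotparser. The robots.txt content is pasted inline for the example (the example.com URLs are just illustrative); in practice the parser can fetch it straight from your live site.

```python
# Sanity-check robots.txt rules with Python's standard library.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /trackback/
Disallow: /xmlrpc.php
Disallow: /feed/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Crawlers may fetch ordinary pages...
print(parser.can_fetch("*", "http://example.com/about/"))      # True
# ...but not the admin area or the XML-RPC endpoint.
print(parser.can_fetch("*", "http://example.com/wp-admin/"))   # False
print(parser.can_fetch("*", "http://example.com/xmlrpc.php"))  # False
```

This is the same logic well-behaved crawlers apply when they read your file, so it's a handy way to catch a typo in a Disallow line before it goes live.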
