Many bloggers find this difficult: optimising your blog’s robots.txt file isn’t as much fun as it may seem.
The robots.txt file plays an important role in SEO: it helps search engines locate your blog and understand what it’s all about.
What is robots.txt?
A robots.txt file is a text file that a blog or website owner creates to tell search engine bots how to crawl and index the pages on their site.
Robots.txt files are always stored in the root directory (main folder) of a blog or website.
A robots.txt file generally follows the format below:
1. User-agent: [user-agent name]
Disallow: [URL string not to be crawled]
2. User-agent: [user-agent name]
Allow: [URL string to be crawled]
Sitemap: [URL of your XML sitemap]
You are free to add multiple lines of instructions allowing or disallowing specific URLs, and you can also add multiple sitemaps.
I would advise you to always define your instructions explicitly, because if you don’t disallow a URL, search engine bots will assume they are allowed to crawl it.
Below is an example of what a robots.txt file looks like:
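A file along these lines (a sketch, assuming a standard WordPress directory layout; the sitemap URL is a placeholder) might look like:

```
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/

Sitemap: https://example.com/sitemap_index.xml
```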
The rules above allow search engine bots to crawl and index files in our WordPress uploads folder, while disallowing them from crawling and indexing the plugins and WordPress admin folders. Finally, we provide the URL of our XML sitemap.
Search engines will still crawl your blog even if you don’t have a robots.txt file, but without one you can’t tell search engine bots which pages or folders they should not crawl.
A robots.txt file helps a lot if you own a blog with lots of content, as it lets you take control of how search engines crawl and index your site.
If you create a robots.txt file for your blog, your content is more likely to be featured in search engines, because search engine bots visit your blog from time to time to crawl and index it.
Website owners with lots of content
If you own a blog with lots of content and don’t have a robots.txt file, search engines may not always feature your content, because search engine bots only crawl a certain number of pages per crawl session. If they don’t finish crawling all of your blog’s pages, they come back and resume the crawl in the next session. This can slow down your blog’s indexing rate and cost you traffic.
You can fix this kind of issue by disallowing search engine bots from crawling unnecessary pages (admin login pages, theme files, and so on). This helps the rest of your blog get crawled and indexed faster.
You can also use the robots.txt file to ask search engines not to crawl a particular post or page on your blog or website.
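For instance, to keep bots away from a single page, you would disallow its path (the path below is hypothetical):

```
User-agent: *
Disallow: /private-page/
```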
Generally, most blogs and websites use a simple robots.txt file; it all depends on their search engine indexing needs.
Below is an example of a simple robots.txt file format:
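A minimal file with that effect (the sitemap URLs are placeholders) could be:

```
User-agent: *
Disallow:

Sitemap: https://example.com/post-sitemap.xml
Sitemap: https://example.com/page-sitemap.xml
```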
The robots.txt file above allows all bots to index all content and pages on your blog, and also gives them links to the blog’s XML sitemaps.
I recommend that users with a WordPress blog follow the rules in the robots.txt file below:
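A sketch of such a ruleset (the paths assume a default WordPress install, and the affiliate link path is hypothetical):

```
User-agent: *
Allow: /wp-content/uploads/
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
Disallow: /readme.html
Disallow: /refer/

Sitemap: https://example.com/sitemap_index.xml
```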
This simply allows search engine bots to index all of the blog’s images and files. It then disallows them from indexing the plugin files, the WordPress admin area, the WordPress readme file, and affiliate links.
How to create a robots.txt file in WordPress
There are two ways to create a robots.txt file in WordPress; I will explain both, and you can choose which method to go for.
Creating or editing a robots.txt file using Yoast SEO
Yoast SEO is an amazing SEO plugin that comes with a robots.txt file generator. You can use it to create and modify a robots.txt file right from your WordPress admin area.
Simply navigate to SEO → Tools in your WordPress admin area, then click on the “File editor” link.
On the next page, the Yoast SEO plugin will show your existing robots.txt file, ready to be modified. If you don’t have a robots.txt file yet, you can create one with Yoast SEO.
By default, Yoast SEO generates a robots.txt file in the format shown below:
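If the generated file looks like the following, it blocks all bots from the entire site (this is the kind of default described here; your version of the plugin may generate something different):

```
User-agent: *
Disallow: /
```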
You should delete this, because it disallows all search engines from crawling and indexing your blog or website. Once you have deleted it, go ahead and create your own robots.txt file, adding your own rules as explained above.
Do not forget to click on the “Save robots.txt file” button to save your changes.
Editing a robots.txt file manually using FTP
This method requires an FTP client to edit the robots.txt file.
Simply connect to your WordPress hosting account using the FTP client. Once connected, you will find the robots.txt file in your blog or website’s root folder.
If you can’t see one, it’s because a robots.txt file hasn’t been created for your blog or website yet, and you can simply create one. Otherwise, go ahead and download the existing file to your computer.
Then use a plain text editor such as Notepad or TextEdit to edit your robots.txt file, adding your own rules. Save it when you’re done, then upload it back to your blog or website’s root folder for the changes to take effect.
Checking your robots.txt file with Google Search Console
If you’re curious whether your blog’s robots.txt file is working, you can test it with a robots.txt tester tool. Google Search Console has one built in, and it’s easy to use.
With the above tutorial, these methods will help your blog stand out from your competitors, improving the way your content is shown to users in search engines.
Thank you for reading. If you found this article helpful, please like and share it. And don’t forget to subscribe to our blog to receive the latest post updates on technology, earning tips and tricks, and website tips.