Help Me/Question: How Can I Create a Robots.txt File Easily?


Active Member
Let's handle each question one at a time...

"How and where on my website do I add a robots.txt file?"

You add the robots.txt file to the root directory of your website's Web server. Once you've added it, you should be able to access it at yourdomain.com/robots.txt (where yourdomain.com is your website's domain).

"Where do I get one from?"

There are numerous robots.txt generators, but it's also very easy to create your own (it's a very simple text file).

"What do they do?"

They tell search engine crawlers what content on your website you would like them NOT to index.

"How do they increase traffic?"

They don't increase traffic. In fact, they can easily decrease traffic if you choose not to index sections of your website (since those sections are not indexed, potential users won't find them when they perform search queries).

"Do they work in conjunction with the sitemap on the website?"

You can specify the location of your XML Sitemap in the robots.txt file, but other than that, they don't explicitly work together.
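As a sketch of that one point of overlap (www.example.com is a placeholder for your own domain), the Sitemap directive simply sits alongside the other rules:

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```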


Content Writer
It is a simple text file. All you have to do is create a blank Notepad document called "robots" and save it as plain text.

Zirkon Kalti

Content Writer
You can easily create a robots.txt file by opening a new text document in a text editor such as Notepad and saving it as robots.txt. After that, add some rules to the file, save it, and upload it to your domain root. You can set the file permission to 644.
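The create/permission/upload steps above can be sketched from a Linux shell. This is a minimal sketch: the Disallow path is just an example rule, and the host and public_html path in the upload comment are assumptions that vary by hosting provider.

```shell
# Create a minimal robots.txt locally (the /admin/ rule is an example).
printf 'User-agent: *\nDisallow: /admin/\n' > robots.txt

# 644 = owner can read/write, everyone else read-only.
chmod 644 robots.txt

# Upload to your web root (hypothetical host and path shown):
# scp robots.txt user@example.com:~/public_html/robots.txt

# Confirm the permission bits took effect.
stat -c '%a' robots.txt
```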

Swati Mishra

Content Writer
It is a plain Notepad file with a few lines of directives. Please keep in mind that each website has different demands and different requirements for what to allow and disallow, so you cannot reuse one site's robots.txt on another website. Sites contain many different kinds of pages, and you have to understand their paths and extensions as well as your own needs.

niranjan kumar

White Belt
User-agent: *
Disallow: /dev/
Disallow: /test/
Disallow: /account/
Disallow: /admin.php
Disallow: /ajax/
Disallow: /login/
Adjust the rules according to your website and how much of it you want search engine crawlers to reach.
Make the file on your computer and add the URL paths of the pages you want blocked to robots.txt.
Log in to the server where your website and files are hosted.
Upload robots.txt to your public_html folder.
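Before uploading, you can sanity-check rules like the ones above with Python's standard-library `urllib.robotparser`. This is a quick sketch; the two test paths are made-up examples, not anything from the original post.

```python
from urllib.robotparser import RobotFileParser

# Rules taken from the example file above.
rules = """\
User-agent: *
Disallow: /dev/
Disallow: /test/
Disallow: /account/
Disallow: /admin.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Paths under a Disallow rule are blocked; everything else is allowed.
print(parser.can_fetch("*", "/dev/build.log"))  # False
print(parser.can_fetch("*", "/blog/post-1"))    # True
```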

Manish Mishra

Content Writer
A robots.txt file can vary from one website to another. A Disallow rule tells search engines not to read the listed content. By default, you are allowing anything and everything to be crawled and indexed by search engines.
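That default can also be written out explicitly; a file like this allows everything to be crawled:

```
User-agent: *
Disallow:
```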


Well-Known Member
A robots.txt file is just a text file; you can create it using Notepad. Specify entries like these:
User-agent: *
Disallow: /dev/
Disallow: /test/

Specify the user agent, which is the search engine, and then specify the location of the pages you don't want crawled.


Money Making Ideas Online UAE, UK, USA
These are very simple text files placed in the root folder of your website; if one doesn't exist, you can create it. If there are files and directories you do not want indexed by search engines, you can use the "robots.txt" file to define where the robots should not go.


Well-Known Member
The "/robots.txt" file is a text file with one or more records. It usually contains a single record looking like this:
User-agent: *
Disallow: /cgi-bin/

daisy gorard

New Member
Just create a simple .txt file named robots.txt and give robots instructions about which pages or paths are allowed to be crawled or indexed and which are not.


Well-Known Member
A robots.txt file is a text file containing instructions for web crawlers (especially search engine robots) about how to crawl the pages of a particular site.


White Belt
Writing a robots.txt is an easy process. Follow these simple steps:

  • Open Notepad, Microsoft Word or any text editor and save the file as ‘robots’, all lowercase, making sure to choose .txt as the file type extension (in Word, choose ‘Plain Text’).
  • Next, add the following two lines of text to your file:
User-agent: *
Disallow:

‘User-agent’ is another word for robots or search engine spiders. The asterisk (*) denotes that this line applies to all of the spiders. Here, there is no file or folder listed in the Disallow line, implying that every directory on your site may be accessed. This is a basic robots text file.

  • Blocking the search engine spiders from your whole site is also one of the robots.txt options. To do this, add these two lines to the file:
User-agent: *
Disallow: /

  • If you’d like to block the spiders from certain areas of your site, your robots.txt might look something like this:
User-agent: *
Disallow: /database/
Disallow: /scripts/

The above three lines tell all robots that they are not allowed to access anything in the database and scripts directories or their sub-directories. Keep in mind that only one file or folder can be used per Disallow line. You may add as many Disallow lines as you need.

  • Be sure to add your search engine friendly XML sitemap file to the robots text file. This will ensure that the spiders can find your sitemap and easily index all of your site’s pages. Use this syntax, substituting your own sitemap URL:
Sitemap: https://www.example.com/sitemap.xml
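Putting the steps above together, a complete robots.txt combining example Disallow rules with a sitemap reference (the paths and www.example.com URL are placeholders) might look like this:

```
User-agent: *
Disallow: /database/
Disallow: /scripts/

Sitemap: https://www.example.com/sitemap.xml
```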