Help Me/Question: How to Make Robots.txt

niranjan kumar

White Belt
How can I make a robots.txt file for an e-commerce site and a normal site, for various search engines? And how can I test whether it is working properly or not?


Well-Known Member
You can write robots.txt by hand, or you can try a robots.txt generator tool to create the file for your site automatically.


Well-Known Member
  • Use Google Search Console – With its robots.txt tester tool you can analyze the latest cached version of the page, and use the Fetch and Render tool to see renders from both the Googlebot user agent and the browser user agent. Things to note: GSC only works for Google user agents, and only single URLs can be tested.
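For quick offline checks outside GSC, Python's standard-library urllib.robotparser can answer the same single-URL question for any user agent. The rules and URLs below are invented for illustration:

```python
from urllib.robotparser import RobotFileParser

# Invented example rules; for a live site you would instead do
# rp.set_url("https://example.com/robots.txt") and rp.read()
rules = [
    "User-agent: *",
    "Disallow: /checkout/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Test single URLs against the rules, per user agent
print(rp.can_fetch("Googlebot", "https://example.com/products/shoes"))  # True
print(rp.can_fetch("Googlebot", "https://example.com/checkout/cart"))   # False
```

Unlike the GSC tester, this works for any user-agent string, not just Google's.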

Draftify Essay

New Member
You can write a robots.txt file with two easy methods, explained below:

# Method 1
User-agent: Googlebot
Disallow: /nogooglebot/

# Method 2
User-agent: *
Allow: /
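If it helps, the Method 1 group above can be sanity-checked with Python's urllib.robotparser (example.com is a placeholder domain). Note that a group naming Googlebot does not affect other crawlers; with no matching group, they default to allowed:

```python
from urllib.robotparser import RobotFileParser

# The Method 1 rules from above; example.com is a placeholder
rules = [
    "User-agent: Googlebot",
    "Disallow: /nogooglebot/",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("Googlebot", "https://example.com/nogooglebot/page"))  # False
print(rp.can_fetch("Googlebot", "https://example.com/other/page"))        # True
print(rp.can_fetch("Bingbot", "https://example.com/nogooglebot/page"))    # True (no group matches Bingbot)
```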


New Member
Robots.txt is a file used to prevent particular pages from being indexed by search engines. To keep sensitive pages out of the index, you use Disallow. The difference between an e-commerce site and a normal site: on an e-commerce site, the checkout page carries money-transaction data and should not be indexed, while on a normal website it is the admin login pages that should not be indexed. You can Disallow these pages for any crawler by its user-agent name. And to check that it is active, visit your site URL followed by /robots.txt.
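A minimal robots.txt along those lines might look like this (the /checkout/ and /admin/ paths are illustrative; use whatever paths your site actually has):

```
User-agent: *
# E-commerce: keep checkout / transaction pages out of the index
Disallow: /checkout/
# Normal site: keep the admin login out of the index
Disallow: /admin/
```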


White Belt
Simply type in your root domain, then add /robots.txt to the end of the URL. For instance, Moz's robots.txt file is located at its root domain followed by /robots.txt.

Basic robots.txt examples:
  • Allow full access: User-agent: * Disallow:
  • Block all access: User-agent: * Disallow: /
  • Block one folder: User-agent: * Disallow: /folder/
  • Block one file: User-agent: * Disallow: /file.html
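These basic patterns can be verified offline with Python's urllib.robotparser; the helper function and test URL below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

def allowed(rules: str, url: str, agent: str = "*") -> bool:
    """Return True if `agent` may fetch `url` under the given robots.txt rules."""
    rp = RobotFileParser()
    rp.parse(rules.splitlines())
    return rp.can_fetch(agent, url)

url = "https://example.com/folder/file.html"

print(allowed("User-agent: *\nDisallow:", url))           # True: full access
print(allowed("User-agent: *\nDisallow: /", url))         # False: everything blocked
print(allowed("User-agent: *\nDisallow: /folder/", url))  # False: folder blocked
print(allowed("User-agent: *\nDisallow: /file.html", url))  # True: only /file.html at the root is blocked
```

The last case shows that Disallow matches by path prefix, so /file.html does not block the same filename inside a folder.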