There are numerous robots.txt generators, but it's also very easy to create your own (it's a very simple text file).
"What do they do?"
They tell search engine crawlers what content on your website you would like them NOT to index.
"How do they increase traffic?"
They don't increase traffic. In fact, they can easily decrease traffic if you choose not to index sections of your website (since those sections are not indexed, potential users won't find them when they perform search queries).
"Do they work in conjunction with the sitemap on the website?"
You can specify the location of your XML Sitemap in the robots.txt file, but other than that, they don't explicitly work together.
You can easily create a robots.txt file by opening a new text document in a text editor such as Notepad and saving it as robots.txt. After that, add your rules to the file, save it, and upload it to your domain root, for example www.xyz.com/robots.txt. You can set the file permission to 644.
A robots.txt is just a plain text file with a few lines of directives. Keep in mind that every website has different requirements for what to allow and disallow, so you cannot reuse one site's robots.txt on another. Each site has different pages, and you have to understand its extensions as well as its needs.
Most CMSs come with a robots.txt file, so you may not need to create one. Since I think you have a WordPress blog, you probably won't need to create it. Have you checked your files to see whether it exists? I am not sure about WordPress, but in Drupal it does.
Here is a link about the WordPress robots.txt file: WordPress Ideas — WordPress Needs a Default robots.txt File and More...
Tailor the file to your website and to how much search engine crawling you want on it.
Create a robots.txt file on your computer and add the URLs of the pages you want to control.
Log in to the server where your website and its files are hosted.
Upload robots.txt to your public_html folder.
A robots.txt file can vary from one website to another. A Disallow rule tells search engines not to read the listed content. By default, you are allowing anything and everything to be crawled and indexed by search engines.
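That default-allow behavior is easy to check with Python's standard-library robots.txt parser. The rules and URLs below are hypothetical examples, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block only the /private/ directory for every crawler.
rules = [
    "User-agent: *",
    "Disallow: /private/",
]

parser = RobotFileParser()
parser.parse(rules)

# Anything not explicitly disallowed may be crawled.
print(parser.can_fetch("AnyBot", "http://www.example.com/index.html"))      # True
print(parser.can_fetch("AnyBot", "http://www.example.com/private/x.html"))  # False
```

This is how well-behaved crawlers interpret your file: they look for the most specific matching rule and fall back to "allowed" when nothing matches.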
These are very simple text files placed in the root folder of your website (www.yourwebsite.com/robots.txt); if one doesn't exist, you can create it. If there are files and directories you do not want indexed by search engines, you can use the robots.txt file to define where the robots should not go.
Writing a robots.txt is an easy process. Follow these simple steps:
Open Notepad, Microsoft Word or any text editor and save the file as ‘robots’, all lowercase, making sure to choose .txt as the file type extension (in Word, choose ‘Plain Text’).
Next, add the following two lines of text to your file:
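Based on the description that follows, these are the standard allow-all directives:

```
User-agent: *
Disallow:
```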
‘User-agent’ is another word for robots or search engine spiders. The asterisk (*) denotes that this line applies to all of the spiders. Here, there is no file or folder listed in the Disallow line, implying that every directory on your site may be accessed. This is a basic robots text file.
Blocking the search engine spiders from your whole site is also one of the robots.txt options. To do this, add these two lines to the file:
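The standard block-all directives are:

```
User-agent: *
Disallow: /
```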
If you’d like to block the spiders from certain areas of your site, your robots.txt might look something like this:
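Matching the database and scripts directories described next, for example:

```
User-agent: *
Disallow: /database/
Disallow: /scripts/
```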
The above three lines tell all robots that they are not allowed to access anything in the database and scripts directories or their sub-directories. Keep in mind that only one file or folder can be used per Disallow line. You may add as many Disallow lines as you need.
Be sure to add your search engine friendly XML sitemap file to the robots text file. This will ensure that the spiders can find your sitemap and easily index all of your site’s pages. Use this syntax:
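The standard Sitemap directive looks like this (replace the URL with your own sitemap's location):

```
Sitemap: http://www.yourwebsite.com/sitemap.xml
```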