Yes, it's basically a plain text file created in any editor such as Notepad, saved as robots.txt and uploaded to the root directory of your site. It is the standard way to tell search engine bots (all major crawlers respect it) which parts of your site to crawl and index and which to skip. Note, though, that it is purely advisory: it was never a real means of protecting material, since nothing forces a crawler to obey it.
These days its main value is helping search engine crawlers save time by skipping pages you don't want crawled. If you do a Google search for 'robots.txt' you will find the format you need to apply - it's pretty simple, usually only 2-3 lines unless your site has specific requirements.
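For reference, a minimal file along those lines might look like this (the /private/ path and sitemap URL are just placeholders for your own site):

```
User-agent: *
Disallow: /private/

Sitemap: https://www.example.com/sitemap.xml
```

The first rule applies to all bots; the Sitemap line is optional but commonly included.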
Robots.txt lets you control crawlers by allowing or disallowing them. It is wise to have this file in the root of a website, since that is where search engines look for it. Requests for the file also show up in your server logs, so you can see and filter which spiders visit your website. There are lots of reasons why this file is useful.
The primary purpose of a robots.txt file is to control which of your pages the searchbots crawl. Implement one when you want to keep unwanted web pages out of the crawl - though keep in mind that blocking a URL does not guarantee it stays out of the index if other sites link to it. The file is always placed in the root folder of the website, where the searchbots can find it.
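If you want to check how crawlers will interpret your rules before uploading the file, Python's standard library includes a parser for this format. A quick sketch, using a made-up rule set blocking a hypothetical /private/ directory:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules for illustration: block /private/ for all bots.
rules = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# can_fetch() answers: may this user agent crawl this URL?
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))           # True
```

This is handy for sanity-checking a draft robots.txt locally; the same parser can also fetch and read the live file from a site via set_url() and read().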