Robots.txt is a file that gives search engines instructions about crawling and indexing the pages of a website. With it you can allow or disallow specific user-agents from crawling your site. It won't improve your rankings directly, but it is still a standard part of technical SEO.
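As a sketch of the allow/disallow syntax described above, a minimal robots.txt might look like the following (the bot name and the /private/ path are illustrative assumptions, not taken from the original):

```
# Block one specific crawler from the whole site
User-agent: BadBot
Disallow: /

# All other crawlers may fetch everything except a hypothetical /private/ section
User-agent: *
Disallow: /private/
```

The file must be served from the root of the site, e.g. https://example.com/robots.txt, for crawlers to find it.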
Robots.txt does not help in ranking keywords. It is a plain text file containing instructions for search engine robots, listing which pages they are allowed and disallowed to crawl.
A robots.txt file tells search engine crawlers which pages or files the crawler can or can't request from your site. It is used mainly to avoid overloading your site with requests; it is not a mechanism for keeping a web page out of Google.
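To see how a crawler interprets these rules, here is a minimal sketch using Python's standard-library `urllib.robotparser`; the rules and URLs are hypothetical examples, not from the original text:

```python
from urllib import robotparser

# Hypothetical robots.txt content: block /private/, allow everything else
rules = """
User-agent: *
Disallow: /private/
Allow: /
""".splitlines()

# Parse the rules the way a well-behaved crawler would
rp = robotparser.RobotFileParser()
rp.parse(rules)

# Ask whether a given user-agent may fetch a given URL
print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

Note that `can_fetch` only reports what the rules say; an ill-behaved crawler can ignore robots.txt entirely, which is why it should not be relied on to keep pages out of search results.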