As the name indicates, these files are not meant to be read by humans. Instead, we create them and add them to our website to let "robots", in this case search engine crawlers, know which pages on our site they are allowed to access. Telling crawlers exactly which pages to crawl helps avoid overloading our server with unnecessary requests for pages we don't want indexed by search engines.
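For illustration, here is a minimal robots.txt sketch; the directives shown are standard, but the paths and sitemap URL are placeholders, and the rules your site needs will depend on what you want crawled:

```
# Apply the following rules to all crawlers
User-agent: *
# Block crawling of a hypothetical admin area
Disallow: /admin/
# Allow everything else
Allow: /
# Optional: point crawlers at your sitemap (placeholder URL)
Sitemap: https://www.example.com/sitemap.xml
```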
1. Fill in the necessary fields
2. Generate the robots.txt file
3. Copy the robots.txt file to your site
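Note that crawlers only look for this file at the root of your domain, so the uploaded file should be reachable at a URL like `https://example.com/robots.txt` (with `example.com` as a placeholder for your own domain); a robots.txt placed in a subdirectory will be ignored.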