Now, create a file named 'robots.txt' in your site's root directory, then copy the text above and paste it into that file.
The robots.txt file, also known as the robots exclusion protocol or standard, is a text file that tells web robots (most often search engine crawlers) which pages on your site to crawl and which pages to skip.
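As a sketch of how crawlers interpret these rules, Python's standard-library `urllib.robotparser` can parse robots.txt directives and answer whether a given URL may be fetched. The directives and URLs below are illustrative examples, not taken from your site's actual robots.txt:

```python
from urllib.robotparser import RobotFileParser

# Parse an example robots.txt that blocks one directory for all robots.
rp = RobotFileParser()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# A URL under the disallowed directory is blocked...
print(rp.can_fetch("*", "https://example.com/private/page"))  # False
# ...while everything else remains crawlable.
print(rp.can_fetch("*", "https://example.com/index.html"))    # True
```

Well-behaved crawlers perform essentially this check before requesting each page; robots.txt is advisory, so it does not technically prevent access by robots that ignore it.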