Create a clean, correctly formatted robots.txt file for your website
Choose which search engine bots the rules will apply to. Select "*" for all bots or specify particular crawlers.
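For example, a rule group can target every crawler or a single named one. A minimal sketch of both forms (Googlebot is Google's main crawler; the comments use standard robots.txt # syntax):

    # Apply the rules that follow to every crawler:
    User-agent: *

    # Or target one specific crawler instead:
    User-agent: Googlebot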
Decide whether to allow or disallow crawling by default. Most websites should allow crawling with specific exceptions.
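In robots.txt, the default is expressed through the Disallow directive: an empty value matches nothing (everything is crawlable), while a bare slash matches every path. A sketch of both defaults:

    # Allow everything by default (empty Disallow matches no paths):
    User-agent: *
    Disallow:

    # Block everything by default:
    User-agent: *
    Disallow: /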
List any specific paths you want to allow or disallow. These will override your default permission setting.
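For instance, assuming a hypothetical /admin/ section with one public subfolder, a more specific Allow rule can carve an exception out of a broader Disallow. Major crawlers such as Googlebot resolve conflicts by applying the most specific (longest) matching rule:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/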
Include your sitemap URL to help search engines discover your content more efficiently.
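The Sitemap directive takes a full absolute URL and can appear anywhere in the file, outside any user-agent group. The domain here is a placeholder:

    Sitemap: https://example.com/sitemap.xml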
Click Generate, then copy the result and upload it as robots.txt to your website's root directory.
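Putting the steps together, a complete file might look like the sketch below (the domain and paths are placeholders). The file must sit at the root of the host so crawlers can fetch it at https://example.com/robots.txt; robots.txt files placed in subdirectories are ignored:

    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/

    Sitemap: https://example.com/sitemap.xml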