Optimize Robots.txt
Configuring your robots.txt file is essential for controlling how search engines crawl your website. It guides crawlers toward your important pages and keeps them away from irrelevant or sensitive areas. (Note that robots.txt controls crawling, not indexing: a page blocked here can still end up in the index if other sites link to it.)
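At its core, robots.txt is a plain-text file served at the root of your domain (yourdomain.com/robots.txt) made up of User-agent and Disallow rules. A minimal sketch, assuming a hypothetical /private/ folder you want to keep crawlers out of (the path and sitemap URL are placeholders, not Webflow defaults):

# Applies to all crawlers
User-agent: *
# Block a hypothetical private area
Disallow: /private/
# Optional but recommended: point crawlers at your XML sitemap
Sitemap: https://www.example.com/sitemap.xml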
How to do it on Webflow?
- Access project settings: Open your project's settings in Webflow.
- Navigate to SEO settings: Under the “SEO” tab, find the robots.txt settings.
- Edit the robots.txt file: Add directives to allow or block search engine access to specific parts of your site.
If your goal is to allow search engines to crawl your entire website, use the following:
User-agent: *
Disallow:
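Note that the empty Disallow value above permits crawling of everything. The inverse, which blocks all crawlers from the entire site (sometimes used for staging environments), differs by a single slash:

User-agent: *
# A lone slash blocks the whole site
Disallow: /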
To learn more, have a look at the deep dive from Finsweet: https://finsweet.com/seo/article/robots-txt
How to set up your robots.txt in seconds on Webflow?
- Access the Robots.txt Builder: In the Graphite app, locate the "Robots.txt Builder" feature.
- Configure directives: Use the builder to specify which parts of your site search engines should or shouldn't crawl (see the example output after this list).
- Save and publish: Apply your changes and republish your site so the updated robots.txt takes effect.
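The builder ultimately generates standard robots.txt directives like the ones shown earlier. As an illustration (not Graphite's actual output), a configuration that blocks a couple of utility pages while keeping the rest of the site crawlable might produce:

User-agent: *
# Placeholder paths; substitute the pages you chose in the builder
Disallow: /search
Disallow: /401
Sitemap: https://www.yourdomain.com/sitemap.xml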