Optimize Robots.txt

Configuring your robots.txt file is essential for controlling how search engines crawl your website. It guides crawlers toward your important pages while keeping them out of irrelevant or sensitive areas. Keep in mind that robots.txt controls crawling, not indexing: a page blocked here can still end up in search results if other sites link to it.

How to do it on Webflow?

  1. Access project settings: Open your project settings in Webflow.
  2. Navigate to SEO settings: Under the “SEO” tab, find the robots.txt field.
  3. Edit the robots.txt file: Add directives to control which parts of your site search engines can crawl (see the examples below).

If your goal is to allow your entire website to be crawled by search engines, use the following:

User-agent: * 
Disallow:
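
Conversely, to keep crawlers out of specific areas while allowing everything else, add one Disallow rule per path. As a sketch, assuming your site has a /search results page and a /drafts folder you don't want crawled (hypothetical paths; substitute your own), the file could look like this:

User-agent: *
Disallow: /search
Disallow: /drafts

Sitemap: https://www.yoursite.com/sitemap.xml

Disallow rules match URL prefixes, so Disallow: /search also covers everything nested under /search. The Sitemap line is optional, but it points crawlers straight to your sitemap.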

To learn more, have a look at the deep dive from Finsweet: https://finsweet.com/seo/article/robots-txt

How to set your Robots.txt in seconds on Webflow?

  1. Access the Robots.txt Builder: Within the Graphite app, locate the "Robots.txt Builder" feature.
  2. Configure directives: Use the builder to specify which parts of your site search engines should or shouldn't crawl.
  3. Save and publish: Push the changes live, then verify the result (see the quick check below).
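
After publishing, it's worth confirming the file is live. Robots.txt is always served from the root of your domain, so you can open it in a browser or fetch it from the command line (replace yoursite.com with your published domain):

curl https://www.yoursite.com/robots.txt

If the directives you configured come back in the response, search engines will see the same file on their next crawl.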

