How to Fix Robots.txt Blocking Important Pages

The Problem

Your robots.txt file contains rules that block search engine crawlers from accessing important pages: products, collections, or even your entire site. Google cannot crawl blocked pages, so their content never reaches the index and they won't appear in search results no matter how good your SEO is everywhere else.

Why It Matters for SEO

Robots.txt is the first file Google checks when it visits your site. If it says 'Disallow: /products', none of your product pages will be indexed. This is the most destructive SEO issue possible — it's like locking the front door of your store and wondering why no one comes in. It's also one of the easiest to fix.
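For example, a single broad rule is enough to hide an entire catalog. A robots.txt like the following (a hypothetical illustration, not Shopify's default output) blocks every URL under /products:

```
User-agent: *
Disallow: /products
```

Because Disallow rules match by URL prefix, this one line covers /products, /products/blue-t-shirt, and everything else beneath that path.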

How to Fix It

  1. Check your current robots.txt: visit yourdomain.com/robots.txt in a browser.

  2. On Shopify, robots.txt is auto-generated but can be customized via a theme template.

  3. Go to Online Store → Themes → Edit Code.

  4. Look for templates/robots.txt.liquid (on newer themes) or create it if it doesn't exist.

  5. Edit the file to remove any overly broad Disallow rules.

  6. Make sure your sitemap URL is referenced at the bottom.

  7. Save and verify by visiting /robots.txt again.
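Before editing anything, you can test how a crawler interprets your current rules. Python's standard `urllib.robotparser` applies the same Allow/Disallow prefix matching that search engines use; the `rules` string below is a hypothetical example, so substitute the contents of your own live file:

```python
from urllib import robotparser

# Hypothetical rules for illustration; in practice, paste in the
# contents of https://yourdomain.com/robots.txt instead.
rules = """\
User-agent: *
Disallow: /admin
Disallow: /cart
Allow: /products/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Product pages should be crawlable...
print(parser.can_fetch("Googlebot", "/products/blue-t-shirt"))  # True
# ...while the admin area stays blocked.
print(parser.can_fetch("Googlebot", "/admin/settings"))  # False
```

If `can_fetch` returns False for a page you want ranked, that is the rule you need to remove in the steps above.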

templates/robots.txt.liquid
{%- comment -%}
  templates/robots.txt.liquid — Shopify robots.txt customization
  This replaces the default auto-generated robots.txt
{%- endcomment -%}
User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /orders
Disallow: /checkouts/
Disallow: /checkout
Disallow: /carts
Disallow: /account
Disallow: /search
{%- comment -%} Allow all product and collection pages {%- endcomment -%}
Allow: /collections/
Allow: /products/

Sitemap: {{ shop.url }}/sitemap.xml
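Once the template is saved, you can confirm the deployed file from the command line rather than eyeballing it in a browser. This sketch assumes yourdomain.com is a placeholder for your store's domain:

```shell
# Fetch the live robots.txt and flag any blanket /products block.
curl -s https://yourdomain.com/robots.txt -o /tmp/robots.txt

if grep -q '^Disallow: /products' /tmp/robots.txt; then
  echo "WARNING: /products is still disallowed"
else
  echo "OK: product pages are crawlable"
fi
```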
Read Google's official documentation


Want to check if YOUR store has this issue?

Scan your site for free — get copy-paste fixes in 60 seconds.

Scan Now — It's Free