In the world of e-commerce, optimizing how search engines crawl your website is essential to its visibility. Shopify, one of the most popular e-commerce platforms, lets online store owners control crawling through the correct configuration of the robots.txt file.
Below, we explain in detail how to create and edit the robots.txt file in Shopify.
What are Robots.txt files used for?
Robots.txt files give search engines specific instructions on how they should crawl and access a website. Below we list some of their most important functions. Read carefully!
- Control access to image files. One of the main functions of the robots.txt file is to prevent search engines from indexing image files on your site and displaying them in search results.
- Control access to web pages. This allows you to block search engine access to pages that you consider add no value to your SEO strategy.
- Block access to resource files. The robots.txt file can also be used to block access to other files, such as scripts and stylesheets, that are not critical to the functioning of your site. This helps reduce the load on your servers and improve the loading speed of your site.
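As a minimal sketch, a robots.txt file covering the three functions above might look like this (the paths are hypothetical examples, not Shopify defaults):

```
User-agent: *
# Keep image files out of the index
Disallow: /images/
# Block a page that adds no SEO value
Disallow: /internal-search
# Block non-critical resource files
Disallow: /assets/scripts/
```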
Steps to follow to examine the Robots.txt file
Using a robots.txt Tester tool, you can check which URLs are being restricted and which are not. Follow these steps to perform the check:
- Run the tool: Open your web browser and search for "robots.txt tester." Make sure you use a reputable tool.
- Enter the URL of the page to check: At the bottom of the tool page, you will find a field where you can enter the URL of the page you want to check. Enter the full URL, including the protocol (http:// or https://).
- Select the appropriate User-Agent: A User-Agent is a text string that identifies the crawler or browser that is accessing your website. The robots.txt Tester tool will allow you to select the user-agent you want to use for testing.
- Hit the "Test" button: Once you have entered the URL and selected the User-Agent, simply hit the "Test" button to start testing.
- Check the result: After the tool has performed the check, the status next to the "Test" button will change. If the URL is blocked by robots.txt, you will see "Blocked"; if the URL is not restricted, you will see "Accepted."
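The same blocked/accepted check can also be scripted with Python's standard library, whose `urllib.robotparser` module applies robots.txt rules the way a crawler would. The rules and URLs below are hypothetical examples; in practice you would fetch your own site's robots.txt:

```python
from urllib import robotparser

# Hypothetical robots.txt content, parsed directly instead of
# being fetched from a live site.
RULES = """\
User-agent: *
Disallow: /checkout
Allow: /
""".splitlines()

parser = robotparser.RobotFileParser()
parser.parse(RULES)

# can_fetch(user_agent, url) returns False for "blocked", True for "accepted".
for url in ("https://example.com/checkout", "https://example.com/products"):
    status = "accepted" if parser.can_fetch("Googlebot", url) else "blocked"
    print(url, "->", status)
# https://example.com/checkout -> blocked
# https://example.com/products -> accepted
```

This mirrors the tester's behavior: the `Disallow: /checkout` rule matches the first URL's path, so it is reported as blocked, while the second URL falls through to `Allow: /`.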
How to edit the Robots.txt file in Shopify
If you're looking to edit the robots.txt.liquid file, we recommend working with a Shopify expert or someone experienced in code editing and SEO. You can use Liquid to add or remove directives in the robots.txt.liquid template, which allows Shopify to keep the file up to date automatically in the future. For a complete guide on how to edit this file, you can check out the Shopify developer page "Customizing robots.txt.liquid." Before editing the robots.txt.liquid file, remove any previous customizations made with a third-party service such as Cloudflare. Here are the steps to follow:
- In your Shopify admin, click "Settings" and then "Apps and sales channels."
- From the "Apps and sales channels" page, click "Online store."
- Click "Open sales channel."
- Next, select "Themes."
- Click the options button (...) and then click "Edit code".
- Select "Add a new template" and choose "robots".
- Click "Create Template."
- Make your desired changes to the default template. If you need more information about Liquid variables and common use cases, you can check out the Shopify developer page "Customizing robots.txt.liquid."
- Save your changes to the robots.txt.liquid file in your published theme.
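As a sketch of what such an edit might look like, the snippet below is based on the Liquid objects documented in Shopify's "Customizing robots.txt.liquid" guide (`robots.default_groups`, `group.user_agent`, `group.rules`, `group.sitemap`). It keeps all of Shopify's default rules and adds a hypothetical `Disallow: /internal-search` directive for the wildcard user-agent group:

```liquid
{% for group in robots.default_groups %}
  {{- group.user_agent }}

  {%- for rule in group.rules -%}
    {{ rule }}
  {%- endfor -%}

  {%- comment -%} Hypothetical extra rule, applied only to the "*" group {%- endcomment -%}
  {%- if group.user_agent.value == '*' %}
    {{ 'Disallow: /internal-search' }}
  {%- endif %}

  {%- if group.sitemap != blank -%}
    {{ group.sitemap }}
  {%- endif -%}
{% endfor %}
```

Because the default groups are still rendered by the loop, Shopify can continue to update its built-in rules while your custom directive is preserved.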