Crawl directives archives

Recent Crawl directives articles


How to keep your page out of the search results

25 February 2019 | 10 Comments
Michiel Heijmans

If you want to keep your page out of the search results, there are a number of things you can do. Most options aren’t hard, and you can implement them without a ton of technical knowledge. If you can check a box, your content management system will probably have an option for that. Or allows …
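For instance, a minimal sketch of the most common approach (assuming your CMS lets you edit the page’s head section, or sets this for you when you check that box) is a meta robots tag:

  <!-- asks search engines not to index this page, while still following its links -->
  <meta name="robots" content="noindex, follow" />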

Read: "How to keep your page out of the search results"

What are crawl errors?

11 April 2018 | 23 Comments
Michiel Heijmans

Crawl errors occur when a search engine tries to reach a page on your website but fails. Let’s shed some more light on crawling first. Crawling is the process where a search engine tries to visit every page of your website via a bot. A search engine bot finds a link to your website and starts to find …
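As an illustration (not from the post itself), you can mimic what a bot sees by requesting just a URL’s headers; a 404 or 5xx status on a page that should exist is what shows up as a crawl error (the URL below is hypothetical):

  $ curl -I https://www.example.com/some-page/
  HTTP/1.1 404 Not Found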

Read: "What are crawl errors?"


Closing a spider trap: fix crawl inefficiencies

12 October 2017 | 4 Comments
Davey Smeekens

Quite some time ago, we made a few changes to how yoast.com is run as a shop and how it’s hosted. In that process, we accidentally removed our robots.txt file and caused a so-called spider trap to open. In this post, I’ll show you what a spider trap is, why it’s problematic and how you …
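As a hedged sketch of the kind of fix involved: spider traps often come from endlessly combinable filter or parameter URLs, and a few robots.txt rules can close them off (the URL patterns here are hypothetical):

  User-agent: *
  # keep bots out of an infinite, filterable URL space
  Disallow: /*?filter=
  Disallow: /calendar/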

Read: "Closing a spider trap: fix crawl inefficiencies"


What’s the X-Robots-Tag HTTP header? And how to use it?

3 January 2017
Davey Smeekens

Traditionally, you’d use a robots.txt file on your server to manage which pages, folders, subdomains, or other content search engines are allowed to crawl. But did you know there’s also such a thing as the X-Robots-Tag HTTP header? Here, we’ll discuss the possibilities and how this might be a better option for your …
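To give an idea of what that looks like in practice, here is a sketch for Apache with mod_headers enabled (your server setup may differ); the header is especially useful for non-HTML files, where a meta robots tag isn’t possible:

  # in .htaccess or a vhost config: keep PDF files out of the index
  <FilesMatch "\.pdf$">
    Header set X-Robots-Tag "noindex, nofollow"
  </FilesMatch>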

Read: "What’s the X-Robots-Tag HTTP header? And how to use it?"

Google Panda 4, and blocking your CSS & JS

A month ago Google introduced its Panda 4.0 update. Over the last few weeks we’ve been able to “fix” a couple of sites that got hit by it. Both sites lost more than 50% of their search traffic in that update. Once fixed, their previous positions in the search results came back. Sounds too good to be …
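In cases like these, the culprit is usually a robots.txt that blocks the CSS and JavaScript Google needs to render the page properly. A hedged sketch of the fix (the patterns are illustrative) is to explicitly allow those assets:

  User-agent: Googlebot
  # let Google fetch the assets it needs to render pages
  Allow: /*.css
  Allow: /*.js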

Read: "Google Panda 4, and blocking your CSS & JS"