Resolving rogue robots directives

Last updated: Jun 21, 2023

The video presents a case study in which a website's search traffic declined because of bad snippets appearing in search results. The SEO team investigated and found that the entire site was disallowed from being crawled by a directive in the robots.txt file. They recommended changing the file, which was a simple fix, and the website's traffic recovered. The video also discusses the importance of regularly checking Search Console and the challenges of implementing best practices when resources are limited.

  • Bad snippets and decreased traffic due to a Disallow directive in the robots.txt file.
  • Investigation using data and tooling, including Search Console.
  • Resolving the issue by changing the robots.txt file.
  • Identifying issues in Search Console and using web analytics.
  • Components of the SEO process, including setting up analytics and directing Google to the right parts of the site.
  • Recommended tooling, including Search Console, Lighthouse, and Core Web Vitals.
  • Preventing future issues through monitoring and changing robots.txt carefully.
  • Fixing the issue by removing the Disallow directive from the robots.txt file.
  • Robots.txt is a powerful tool that needs to be used carefully.

Resolving rogue robots directives 001

Introduction

  • Jason Stevens, an SEO specialist at Google, shares a case study about a website that had bad snippets and decreased traffic due to a Disallow directive in its robots.txt file.
Resolving rogue robots directives 002

Bad Snippets and Decreased Traffic

  • The website was getting bad snippets in the form of "no information available for the site" for many of their ranked queries.
  • This confused users and impacted traffic to the site.
  • The problem was traced to the robots.txt file, which contained a Disallow directive covering the entire site.
  • Because nothing on the site could be crawled, Google could not generate proper snippets, which caused the bad snippets and the drop in traffic (see the example after this list).
  • The change to the robots.txt file had happened a month earlier; the previous file was boilerplate with no directives.
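
A site-wide block of this kind typically looks like the two lines below (the actual file from the case study is not shown in the video, so this is a generic illustration):

```
# Applies to all crawlers; "/" matches every path, so nothing may be crawled.
User-agent: *
Disallow: /
```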
Resolving rogue robots directives 003

Investigation and Hypotheses

  • The team investigated the issue using data and tooling, including Search Console.
  • They noticed that the crawl requests had dropped to zero over the last month, indicating a problem with site-wide crawling.
  • They hypothesized that the Disallow directive in the robots.txt file was the cause of the problem (a quick check like the sketch after this list can confirm such a hypothesis).
  • This was based on their experience with similar issues in the past.
  • The team did not know why the change was made to the robots.txt file.
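
One way to verify this kind of hypothesis, not shown in the video, is to test the live robots.txt against URLs that should clearly be crawlable. A minimal Python sketch using the standard-library robotparser (the site URL is a placeholder):

```python
from urllib import robotparser

# Fetch the live robots.txt and check whether Googlebot may crawl
# pages that should obviously be crawlable, such as the homepage.
parser = robotparser.RobotFileParser()
parser.set_url("https://www.example.com/robots.txt")  # placeholder site
parser.read()

for url in ["https://www.example.com/", "https://www.example.com/products/"]:
    allowed = parser.can_fetch("Googlebot", url)
    print(f"{url} -> {'allowed' if allowed else 'BLOCKED'}")
```

If important URLs come back as blocked, the robots.txt file is the first place to look.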
Resolving rogue robots directives 005

Resolving the Issue

  • The team recommended changing the robots.txt file, which was a fairly simple change.
  • The website team worked with them to make the change, as they had noticed traffic declines.
  • It was noted that some platforms or tools might make it more difficult to change the robots.txt file.
  • The website team had Search Console set up, but may not have had the bandwidth to monitor it regularly.
  • Part of the SEO team's process is to configure Search Console and monitor it for issues.
Resolving rogue robots directives 006

Identifying Issues in Search Console

  • Search Console reports can identify big problems with the site.
  • Warnings and alerts are sent through email.
  • The Crawling section of Search Console can show how Google is crawling the site.
  • A drop in traffic can indicate a problem.
  • Web analytics can also help identify issues; a sketch of checking for a traffic drop programmatically follows this list.
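
Search Console performance data can also be pulled programmatically. The sketch below assumes a service account that already has read access to the property; the property name and file path are placeholders, and this is only one possible setup rather than anything described in the video:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Assumed: a service account with read access to the Search Console property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",  # placeholder path
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Daily clicks and impressions for one month; a sustained slide toward zero
# mirrors what the team saw once crawling stopped.
response = service.searchanalytics().query(
    siteUrl="sc-domain:example.com",  # placeholder property
    body={"startDate": "2023-05-01", "endDate": "2023-05-31", "dimensions": ["date"]},
).execute()

for row in response.get("rows", []):
    print(row["keys"][0], row["clicks"], row["impressions"])
```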
Resolving rogue robots directives 007

Components of SEO Process

  • Setting up analytics can show if pages are getting traffic from search.
  • Checking if content is indexed in Search Console.
  • Ensuring Google is crawling the right parts of the site.
  • Using different methods to direct Google to the right places on the site (see the example after this list).
  • Using tools like Core Web Vitals and mobile-friendliness tests.
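
As a contrast to the blanket block shown earlier, a robots.txt that steers crawlers away from only a few sections while pointing them at a sitemap might look like this (the paths and sitemap URL are made up for illustration):

```
# Block only sections that should not be crawled; everything else stays open.
User-agent: *
Disallow: /cart/
Disallow: /internal-search/

# Point crawlers at the canonical list of URLs.
Sitemap: https://www.example.com/sitemap.xml
```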
Resolving rogue robots directives 008

Recommended Tooling

  • Search Console is a comprehensive tool for identifying issues.
  • Lighthouse and Google Analytics can also be used to understand traffic and user experience.
  • The Rich Results Test can be used for structured data and rich results.
  • Using a mix of tools helps build a fuller picture of what is going on.
  • Tools like Lighthouse and Core Web Vitals are commonly used by developers; a sample command-line run follows this list.
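
For reference, one common way to run Lighthouse from the command line (the URL is a placeholder and the exact flags may vary by Lighthouse version):

```
npm install -g lighthouse
lighthouse https://www.example.com/ --only-categories=performance,seo --output=html --output-path=./report.html
```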
Resolving rogue robots directives 010

Fixing the Issue with Robots.txt

  • The issue was resolved by removing the Disallow directive from the robots.txt file.
  • The fix was simple: the forward slash ("/") that disallowed everything was the only thing that needed to change (see below).
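
Based on the description in the video, the corrected rule group would look roughly like this (an empty Disallow value, or removing the line entirely, permits crawling of the whole site):

```
# Nothing is disallowed, so the whole site may be crawled again.
User-agent: *
Disallow:
```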
Resolving rogue robots directives 011

Fixing the Issue

  • Removed the disallow directive from the robots.txt file.
  • Crawl requests in the report started going up and traffic began returning.
  • Snippets started appearing the way they wanted.
  • It is an easy fix if you have the ability to make the change.
  • It is not always an easy fix.
Resolving rogue robots directives 012

Preventing Future Issues

  • Put monitoring in place for important parts of the website.
  • Use a crawler to see if something has changed.
  • Change detection is useful for SEO (a minimal sketch follows this list).
  • It prevents potentially long-lasting effects.
  • Configure robots.txt to block crawling for certain parts of the site.
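
A minimal change-detection sketch, assuming a scheduled job (cron, CI, etc.) and a hypothetical site URL; the video recommends monitoring in general but does not prescribe any particular script:

```python
import hashlib
import pathlib
import urllib.request

ROBOTS_URL = "https://www.example.com/robots.txt"  # placeholder site
STATE_FILE = pathlib.Path("robots_txt.sha256")     # last known fingerprint

def fetch_robots() -> bytes:
    """Download the current robots.txt."""
    with urllib.request.urlopen(ROBOTS_URL, timeout=10) as resp:
        return resp.read()

def check_for_change() -> None:
    current = hashlib.sha256(fetch_robots()).hexdigest()
    previous = STATE_FILE.read_text().strip() if STATE_FILE.exists() else None
    if previous is None:
        print("Baseline recorded.")
    elif current != previous:
        print("ALERT: robots.txt changed -- review the new directives.")
    else:
        print("No change.")
    STATE_FILE.write_text(current)

if __name__ == "__main__":
    check_for_change()  # run on a schedule to catch rogue edits early
```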
Resolving rogue robots directives 013

Changing Robots.txt

  • Use the robots.txt testing tool to check how URLs would be excluded or included before making a change live; a rough local equivalent is sketched after this list.
  • Robots.txt needs to be treated with care.
  • Always interesting to see how people debug these issues.
  • Monitoring is important to prevent future issues.
  • Use different tools and services to monitor the website.
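
This is not Google's robots.txt testing tool, but the same pre-deployment check can be approximated locally with Python's robotparser; the draft rules and URLs below are hypothetical:

```python
from urllib import robotparser

# Hypothetical draft rules to validate before deploying.
DRAFT_ROBOTS = """\
User-agent: *
Disallow: /internal-search/
Disallow: /cart/
"""

MUST_STAY_CRAWLABLE = [
    "https://www.example.com/",
    "https://www.example.com/products/widget",
]

parser = robotparser.RobotFileParser()
parser.parse(DRAFT_ROBOTS.splitlines())

for url in MUST_STAY_CRAWLABLE:
    if not parser.can_fetch("Googlebot", url):
        raise SystemExit(f"Draft robots.txt would block {url} -- do not deploy.")
print("Draft keeps all critical URLs crawlable.")
```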
Resolving rogue robots directives 015

Conclusion

  • Robots.txt is a powerful tool that needs to be used carefully.
  • Preventing future issues is important through monitoring and using different tools.
  • Changes to robots.txt can be tested with the testing tool first to prevent potential issues.
  • Thank you to Jason for being on the show.
  • Stay safe and take care.

Watch the video on YouTube:
Resolving rogue robots directives - YouTube

Related summaries of videos: