Allow robots to crawl your wp-content folder

An alternate title for this post could be, “How disallowing robots from your wp-content folder could cost you mobile rankings in Google.”

On April 21st, 2015, Google will change the way it ranks sites for users on mobile devices. By blocking Googlebot from your plugins folder, you could be preventing Google from deciding that your site is mobile-friendly. If you are skeptical about this change, or want to dive into the details, read this.

So, why?

Why does Google need to crawl your plugins folder? Plugins often contain CSS and JavaScript files, and Google needs those files to understand what a page actually looks like. Google Webmaster Tools told me I was preventing Googlebot from crawling CSS files it was interested in. If robots cannot download all of a page's CSS and JavaScript, they cannot determine whether the page is friendly to mobile users.

I found this line in my client’s robots.txt:
Disallow: /wp-content/plugins/

Why would this line be in robots.txt at all? My client lives on GoDaddy Managed WordPress Hosting, and that service creates a robots.txt file that looks like this (as of the date I published this post):

User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
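You can verify the effect of that file yourself. A minimal sketch using Python's standard-library `urllib.robotparser`, with a hypothetical plugin stylesheet path standing in for a real one:

```python
from urllib import robotparser

# The GoDaddy default robots.txt shown above.
rules = """\
User-agent: *
Crawl-delay: 1
Disallow: /wp-content/plugins/
Disallow: /wp-admin/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A stylesheet inside the plugins folder is blocked for every crawler,
# including Googlebot, so Google cannot render the page the way a
# mobile visitor would see it. (The plugin path here is hypothetical.)
print(rp.can_fetch("Googlebot", "/wp-content/plugins/example/style.css"))  # False

# Files outside the disallowed folders remain crawlable.
print(rp.can_fetch("Googlebot", "/wp-content/uploads/photo.png"))  # True
```

The same check works against a live site if you call `rp.set_url(...)` and `rp.read()` instead of parsing a string.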

Plenty of blog posts about the “ideal WordPress robots.txt file” recommend blocking the plugins folder, and some plugins alter robots.txt to block this directory, too. Before February 2015, even Yoast SEO did this. It’s no longer a good idea.
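If you can edit your robots.txt, the simplest fix is to drop the Disallow line for the plugins folder entirely. A sketch of the corrected file, keeping the rest of GoDaddy's defaults:

```
User-agent: *
Crawl-delay: 1
Disallow: /wp-admin/
```

If you would rather keep the plugins folder blocked but still let Google render pages, Googlebot supports Allow rules and `*` wildcards as extensions to the basic robots.txt format, so lines like `Allow: /wp-content/plugins/*.css` and `Allow: /wp-content/plugins/*.js` above the Disallow can open up just the CSS and JavaScript. Note that these extensions are not honored by every crawler, so the first approach is the safer one.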