Enable Googlebot to fetch your JavaScript and CSS files

Google recently announced that they have updated their webmaster guidelines to note specifically that blocking your CSS or JavaScript files may have a negative impact on your indexing and search rankings. They claim that to understand your site better they need access to these files too (and next time they will probably want to take a peek in your laundry bag as well...).

Like it or not, if you are SEO conscious you should follow the guidelines. There is a serious debate in the Joomla community about the best way to do this, with a couple of recommendations in circulation; here are my favourites. By the way, this article may soon become obsolete: the core Joomla team has already announced that the next releases of Joomla will follow these rules by default.

But back to business. Allowing access to your image folder is relatively simple: just comment out the line containing /images in your robots.txt. But what about the JS and CSS files?
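For example, here is an excerpt of a stock Joomla robots.txt after that change; the leading # turns the rule into a comment, so crawlers ignore it (your file may list slightly different directories):

User-agent: *
Disallow: /administrator/
Disallow: /cache/
# Disallow: /images/
Disallow: /includes/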

a. The geeky one:

Add this line to your robots.txt:

Allow: /templates/mytemplate/

Where "mytemplate" is the directory of your active template. This solution opens up only the minimum area of your site that needs to be accessible.
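Googlebot resolves conflicting rules by the longest matching path, so the Allow line can sit in the same User-agent: * group as the Disallow rule it overrides. A minimal sketch, assuming the stock Joomla rules and a template directory named "mytemplate":

User-agent: *
Disallow: /administrator/
Disallow: /templates/
Allow: /templates/mytemplate/

Here /templates/ stays off limits, except for the subdirectory of your active template.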

You can refine it by allowing only the CSS and JS directories, like this:

Allow: /templates/mytemplate/css/
Allow: /templates/mytemplate/js/

If you want even more control, you can add this at the top of your robots.txt, before the

User-agent: *

line:


#Googlebot
User-agent: Googlebot
Allow: /templates/mytemplate/css/
Allow: /templates/mytemplate/js/

This will do the job; generally Google does not complain, even though plenty of CSS/JS files remain inaccessible to Google. To name just a few:

media/system/js/
media/jui/js/

But some components and modules can also ship their own CSS/JS files, for example:

components/com_rsform/assets/
modules/mod_autsonslideshow/js/
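
In theory you could keep extending the Googlebot block with one Allow line per such path. The sketch below simply reuses the example paths above; your installed extensions will differ:

#Googlebot
User-agent: Googlebot
Allow: /templates/mytemplate/css/
Allow: /templates/mytemplate/js/
Allow: /media/system/js/
Allow: /media/jui/js/
Allow: /components/com_rsform/assets/
Allow: /modules/mod_autsonslideshow/js/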

So the solution isn't perfect: hunting down every extension path is a moving target, and deactivating half of the Disallow rules instead is something you can't afford (security-wise), even if some "Joomla experts" are recommending it.

Here comes into play the second solution, my preferred one:

b. The Lazy Bear solution

Add these lines to the top of your robots.txt:

#Googlebot
User-agent: Googlebot
Allow: /*.css
Allow: /*.js

This allows Googlebot to access ALL your JavaScript and CSS files while keeping the other bots locked out, exposing your site only minimally to prying eyes.
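Put together, the top of your robots.txt could then look like the sketch below (the Disallow lines are just the usual Joomla defaults, trimmed for brevity). One caveat: once a User-agent: Googlebot group exists, Googlebot obeys only that group and ignores the User-agent: * rules entirely, so any Disallow rule you still want Googlebot to honour has to be repeated inside its group:

#Googlebot
User-agent: Googlebot
Allow: /*.css
Allow: /*.js
Disallow: /administrator/

User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /includes/
Disallow: /templates/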

And finally, how can you check whether any of the above tricks works?

Go to Google Webmaster Tools and run the "Fetch as Google" utility, which will show you a report of inaccessible directories and files. If everything is OK, you have done what you should to keep Google happy, and you can hope that your site won't be downgraded for this reason.