A Major SEO Mistake on Drupal Sites

Google Image Search is used every day by millions of people to find products, people, and pictures. If you're running a Drupal site, there's a good chance you are missing out on that traffic. Here's why: Drupal's default robots.txt file contains a major mistake. Incredibly, the mistake has been there for years, yet only a handful of people have noticed it.

Here is a fragment of the default Drupal robots.txt file. Take a look. Can you spot the mistake?

    Disallow: /includes/
    Disallow: /misc/
    Disallow: /modules/
    Disallow: /profiles/
    Disallow: /scripts/
    Disallow: /sites/
    Disallow: /themes/
    Disallow: /node/

The problem is this line:

    Disallow: /sites/

By default, every image you upload to a Drupal site is stored somewhere inside the "sites" directory. And, by default, Drupal blocks every search engine from looking inside that directory. In other words, your images aren't being indexed at all. That's a huge problem if your Drupal site contains images you want people to find.

To show how widespread this problem is, consider the blog of Dries Buytaert. Dries is best known as the creator of Drupal, but he's also a very talented photographer. He has uploaded a huge number of pictures to his blog, including hundreds of photos from DrupalCon and thousands of great charts and graphs. Yet the number of those images indexed by Google is ridiculous: just 13. Sadly, Dries's robots.txt file includes the standard "Disallow: /sites/" line. If even Dries is affected, you probably are too.

Do you run an e-commerce site? Google Image Search is ignoring your entire product line. A photography blog? Everything you post may be invisible to Yahoo and Bing. If no one can search for your images, you're literally turning traffic away. Keep in mind: indexable, high-quality images are a key feature of high-ranking sites. If your images aren't indexable, you're making a major SEO mistake.

The worst part is that the problem isn't limited to images. Flash files, PDFs, text documents, and much of the other content you upload also end up in the "sites" folder. Google knows how to index these files, but your robots.txt file is stopping Googlebot.

Don't panic, though: the fix is easy. The robots.txt file sits in your main Drupal directory and can be edited with any standard text editor. Simply remove the "Disallow: /sites/" line. Google should pick up the change and start indexing your files within a few days.

This is a major problem with a simple solution, and fixing the default robots.txt file should be a priority for the next Drupal point release.
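If you'd rather not open the whole "sites" directory to crawlers, a narrower fix is possible. As a rough sketch, assuming your uploads live in Drupal's standard public files location at sites/default/files (your path may differ, especially on a multisite install), you can add an Allow directive for that path while keeping the broader Disallow. Google and Bing honor Allow, and because the Allow rule is the more specific (longer) match, it takes precedence for them:

    User-agent: *
    # Hypothetical fragment: let crawlers reach uploaded files
    # while still blocking the rest of the /sites/ directory.
    Allow: /sites/default/files/
    Disallow: /sites/

Adjust the Allow path to wherever your site actually stores its files, and keep in mind that some less common crawlers interpret robots.txt rules more strictly than Google does, so simply deleting the Disallow line remains the most broadly compatible option.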