Bill Hartzer’s recent post reminds us to mask some of the more sensitive spots on our blogs. Although this only applies to self-hosted blogs, I think everyone will benefit from this tip for keeping a blog secure. It’s another one of those tasks we should do habitually but often forget.
Here’s the idea: we should not allow sensitive directories on our blogs to list their contents publicly. We don’t want malicious visitors getting any hints about how to compromise our websites, and we don’t want search engines listing irrelevant folders in their results.
Hartzer wrote specifically about denying other people access to one’s WordPress plugins directory, but when you go through your site carefully, you’ll notice several more directories you might want to protect:
Folders You Might Want To Protect
Folders for your photos, music, and videos. Unless you uploaded your multimedia so that anyone can download (or hotlink to) them, it’s best to hide the directory index from other people. This can potentially save you lots of bandwidth!
Folders for your blog admin panel. If possible, avoid revealing which locations need to be hacked to get into your blog.
Folders for your blog themes. This specifically applies to bloggers who have a custom-made theme. Don’t make it devastatingly simple for copycats to clone your blog design.
How to Protect These Directories
Here are some ways to protect your important and sensitive blog folders. You can apply the same techniques to other non-blog folders if you’re running some other type of website, whether static or dynamic.
Disable directory listings. JavascriptKit explains how to hide files from being listed inside a directory using .htaccess. Check out their other pages to learn how .htaccess works and what else you can do with it.
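As a quick sketch of the technique (assuming an Apache server that permits .htaccess overrides — check with your host if unsure), a single directive in an .htaccess file inside the folder turns off the automatic index page:

```apache
# Disable automatic directory listings for this folder and its subfolders
Options -Indexes

# Alternative: keep listings enabled but hide every file from them
# IndexIgnore *
```

With `Options -Indexes`, anyone browsing to the folder without naming a specific file gets a “403 Forbidden” error instead of a list of your files.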
Password-protect the directories. Michi Kono has written a tutorial on how to rename and protect the WordPress administration folder (wp-admin). You can also apply this to specific directories one by one.
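For the general case of password-protecting a single directory, here is a minimal sketch using HTTP basic authentication on Apache. The file paths and user name below are illustrative — substitute your own, and keep the password file outside your web root:

```apache
# .htaccess inside the directory you want to protect
AuthType Basic
AuthName "Restricted Area"
# Illustrative path to the password file -- store it outside the web root
AuthUserFile /home/yourusername/.htpasswd
Require valid-user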
Prevent search bots and spiders from accessing those directories. Most search engines honor a robots.txt file and skip the directories and files it lists. Here’s Google’s own robots.txt file. To make bots and spiders skip directories and files, create a text file called robots.txt in your site’s root folder and enter the following:
User-agent: *
Disallow: */feed*
Disallow: */trackback
Disallow: */wp-admin
Disallow: */wp-content
Disallow: */wp-includes
Disallow: *wp-login.php
You can hide more directories from search engines by adding lines in the same format. Learn more about robots.txt at its own website.
The only catch is that these methods require access to your blog’s folders on the server. But if you’ve been uploading images for your blog posts, installing plugins, and adding new themes, you probably already know how.
Originally posted on June 20, 2007 @ 2:23 pm