Technical SEO Tune Up

Giving your site a tech tune-up is simple

I’m always amazed, even with large corporate brands that should know better, just how many sites are crippled by severe technical obstacles. These issues, left unresolved, restrict all or part of a site’s content from being crawled effectively by search engines. That translates into poor rankings, meager traffic, and unnecessarily poor overall site performance.

Many marketers & webmasters feel overwhelmed by the prospect of tinkering with the more technical side of things. Fortunately, most common technical obstacles are avoidable and easily resolved once identified. Let’s take a look at some of the most common technical SEO obstacles.

Talk To The Spiders Using a Robots.txt File

I like to start with the robots.txt file – a simple text file that lives at the root of a website (i.e. www.website.com/robots.txt) and gives search engines special instructions about which pages they should not crawl/index. The biggest issue with a robots.txt file is ensuring you aren’t inadvertently blocking valuable pages.

Believe it or not, I’ve had potential clients come to me with a site that ‘just won’t perform in search,’ only to find out they had accidentally left the entire site blocked after a recent redesign.

So check to make sure you DO have a robots.txt file (even if it’s empty), ensure it does NOT block valuable pages, and while you’re at it, include a link to your sitemap so search engines don’t have to work too hard to find it.
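
As a quick sketch, here’s what a healthy robots.txt might look like – the domain and blocked paths below are placeholders, so substitute your own:

    User-agent: *
    Disallow: /admin/
    Disallow: /search/

    Sitemap: http://www.example.com/sitemap.xml

And beware the one-character disaster: a bare "Disallow: /" (nothing after the slash) tells spiders to stay out of the entire site – exactly the leftover-from-redesign mistake described above.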

There are a number of useful tools out there to assist in creating and testing your robots.txt files. Personally, I like to use Robots.txt Generator Tool and the Robots.txt tester recently added to Google Webmaster Tools.

Address 404 Errors & Have a Custom 404 Page

The next item I look at is how wrong/broken URLs are treated. I enter a made-up URL string (e.g. www.epicainteractive.com/IKnowKungFu) and pay close attention to the server response. You can check server header responses using any number of online tools or browser plugins (my favorites are here and here).
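
If you’re comfortable with the command line, curl will show you the raw response headers directly. A minimal sketch, reusing the made-up URL above:

    curl -I http://www.epicainteractive.com/IKnowKungFu

A properly configured site should answer with a status line beginning "HTTP/1.1 404 Not Found" rather than a 200 (OK) or a redirect.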

A good 404 page should inform the search engine that the page is in fact missing or no longer available by returning a 404 header response, but the experience should keep the user within the site’s structure and provide enough navigation to help the user (or search engine) get back on track.

I always recommend setting up a custom 404 template that maintains the look and feel of the site; this goes a long way toward ensuring users who wander off the beaten path don’t completely leave your site experience.

The custom 404 page template should include the normal site navigation, and perhaps a more sitemap-like sampling of the site’s pages for larger sites. For very large sites, including a search field might be a good idea.
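
On an Apache server (an assumption – other web servers have equivalent settings), pointing the server at your custom template is a one-line directive in the site config or .htaccess file. The /404.html path is a placeholder for wherever your template lives:

    # Serve the custom template while still returning a 404 status code
    ErrorDocument 404 /404.html

Note the local path rather than a full URL – give Apache a full http:// address and it redirects the visitor, who then receives a redirect status code instead of a true 404.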

Some 404 responses occur because a piece of content legitimately no longer exists, but often 404 errors are the result of poor URL management and improper linking.

When moving content from one URL to another, be sure to set up a 301 (permanent) redirect from the old URL to the new one. Additionally, review the 404 errors report in Google Webmaster Tools frequently to identify opportunities to capture mistyped/mislinked URL traffic.
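
Again assuming Apache, a permanent redirect is another one-liner in .htaccess (the paths and domain are placeholders):

    # Permanently send visitors, spiders and link authority to the new address
    Redirect 301 /old-page/ http://www.example.com/new-page/

The 301 status tells search engines the move is permanent, so the old URL’s accumulated authority is passed along to the new one.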

Clean Up Your URL Structure

There’s a ton that can go wrong with URL structures – dynamic name/value pairs, underscores instead of hyphens, nonsensical strings instead of legible words & phrases, deeply nested hierarchies, case sensitivity, and the list goes on.

Try to keep the URL structure basic. Avoid complex, dynamic name/value pair strings. Use real words instead of abbreviations or a nonsensical series of letters/numbers. The folder & file names should make sense and be relevant to the page’s content. Use hyphens to separate words in a phrase rather than underscores (or worse, spaces).

Keep the hierarchy simple – avoid deeply nesting pages under layers of folders. The further a page is nested away from top-level pages, the less valuable that page may appear.
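
To make that concrete, here’s a hypothetical before-and-after (the domain and paths are invented for illustration):

    Before: http://www.example.com/index.php?cat=7&id=1382&s=0
    After:  http://www.example.com/blog/technical-seo-tune-up/

The second URL uses real, hyphen-separated words, reflects the page’s content, and sits just one folder deep.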

Most modern content management systems understand the critical part URLs play in the indexation of a website and automatically create clean, optimized URLs. If your platform does not have that capability built in, the process of manually rewriting URLs is a bit more technical and worthy of several posts of its own.
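
As a taste of what that involves, here’s a minimal sketch assuming an Apache server with mod_rewrite enabled and a hypothetical index.php that looks pages up by slug – placed in the site’s .htaccess file:

    RewriteEngine On
    # Internally map the clean address /blog/some-post-slug/
    # to the dynamic one: /index.php?post=some-post-slug
    RewriteRule ^blog/([a-z0-9-]+)/?$ /index.php?post=$1 [L,QSA]

Visitors and spiders only ever see the clean /blog/some-post-slug/ address; the name/value pair stays behind the curtain.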

Introduce Canonical Tags

Canonicalization, one of my all-time favorite industry-speak terms, refers to the process by which URLs are standardized for search engines. Search engine spiders see each unique URL string as a completely separate page, even when the strings are essentially the same and lead your browser to the same place.

For example – www.epicainteractive.com vs. epicainteractive.com vs. www.epicainteractive.com/index.php – each URL essentially takes you to the same place, but for search engines each of those URLs is a different address and is treated as a different page. This introduces two primary problems – diffused authority and duplicate content.

The solution for both is the same: on each version of the page, provide Google with the preferred URL string via the canonical tag – <link rel="canonical" href="http://www.epicainteractive.com/blog/" /> – thereby consolidating any link authority or indexed value for the variant URLs into a single URL/page.
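
The tag belongs in the <head> of every variant of the page. A minimal sketch, reusing this blog’s URL from the example above:

    <head>
      <title>Technical SEO Tune Up</title>
      <!-- Whichever variant URL the spider arrived on, this address gets the credit -->
      <link rel="canonical" href="http://www.epicainteractive.com/blog/" />
    </head>

Whether the spider lands on the www version, the non-www version, or /index.php, the tag points it back to the same preferred address.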

Some very simple sites can get away without canonical tags, but most sites missing canonical tags suffer major setbacks in organic search. You can easily check the canonical tags across your entire site using a crawler like Screaming Frog.

Create & Include an XML Sitemap

Give Google a break. An XML sitemap helps search engines understand where ALL of your site’s pages are, how frequently each page changes, and how important each page is to the overall site. Providing search engines with a clean, well-formatted XML sitemap ensures that all of your site’s pages get indexed properly and with appropriate frequency.
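
Each of those signals has a matching element in the sitemap protocol. A minimal sketch – the domain, dates, and values are placeholders:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <!-- loc = where the page lives; changefreq and priority are hints, not commands -->
        <loc>http://www.example.com/</loc>
        <lastmod>2014-06-01</lastmod>
        <changefreq>weekly</changefreq>
        <priority>1.0</priority>
      </url>
      <url>
        <loc>http://www.example.com/blog/</loc>
        <lastmod>2014-06-15</lastmod>
        <changefreq>daily</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>

<loc> is the page’s address, <lastmod> and <changefreq> describe how it changes over time, and <priority> (0.0–1.0) suggests its importance relative to the rest of the site.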

For added assurance, include a link to your XML sitemap within your robots.txt file so search engine spiders are sure to find it.

Many content management systems produce XML sitemaps by default; if your system does not, coding an XML sitemap by hand can be a little overwhelming. I prefer to use a sitemap generator like XML-Sitemaps.com. It’s free, simple to use, and generates properly formatted XML sitemaps in minutes.

Are You a Technical SEO Master Yet?

Perhaps not – after all, this is by no means an exhaustive list of technical obstacles – but it tackles some of the bigger items. Just know that technical SEO doesn’t have to be beyond your reach. Don’t be afraid to take on some of these critical areas of your website’s SEO; you’ll be glad you did.

Questions?

Share your technical SEO questions in the comments below – perhaps we can supply you with solutions to get your site on track!
