The Ultimate SEO Audit
The purpose of an SEO Audit is to paint an overall picture of what you’re doing right, what needs to be improved, and what issues may be hurting your rankings. If you’re familiar with SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), that’s exactly how you should approach the SEO Audit.
Usability Review
- Evaluate the visual design. Navigate your site through the eyes of a visitor. Is the site aesthetically appealing? What type of user experience does your site offer? Is there a logical hierarchy between pages and subpages? Is there a comfortable flow between content areas? Does your site look like it was made in 1999 or 2009?
If the visual design is driving visitors away, then no amount of SEO effort will increase traffic and conversions. If your website is in rough shape or built entirely in Flash, now is the best time to redesign it.
- Check browser compatibility. When designing and optimizing a site, it’s important to see how your site renders in operating systems (e.g. Windows 7 or Mac OS X) and browsers (e.g. Firefox, Safari, or IE7) other than your own. Browser compatibility has a huge impact on usability. I suggest using Browsershots and/or NetRenderer.
- Custom 404 page. Generally, if a user reaches a 404 page (“page not found”) they’ll bounce off your site. Improve your visitor retention by customizing your 404 page. At the very least, it should include an apology, a prominent search box, a link to your sitemap, a link to your homepage, and your standard navigation bar. Add in a funny picture and some self-deprecating humor and you may just keep that visitor. (A minimal sketch of a custom handler follows below.)
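If your site runs on a web framework, wiring up a custom 404 page is usually only a few lines. Here’s a minimal sketch assuming a Flask application; the framework choice and the “404.html” template name are illustrative, not a requirement of this checklist:

```python
# Minimal custom 404 handler, assuming a Flask app.
# The "404.html" template is assumed to contain the apology, search box,
# sitemap link, homepage link, and standard navigation described above.
from flask import Flask, render_template

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Return the friendly page but keep the 404 status code,
    # so search engines don't treat the error page as real content.
    return render_template("404.html"), 404
```

Whatever stack you’re on, the key detail is the same: serve the friendly page with a real 404 status, not a 200.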
Accessibility / Spiderability Review
- Look through the eyes of an SE spider. Run your site through the YellowPipe Lynx Viewer to see what your site looks like from a search engine spider’s perspective. (A scripted approximation appears after this list.)
- Turn off Flash, JavaScript, and cookies in your browser and see if you can still navigate your site.
- Look at the robots.txt file. If you’re using a robots.txt file, make sure you aren’t excluding important sections of content. A misconfigured robots.txt file (an overly broad Disallow rule, for example) can keep spiders out of whole sections of your site. Use the robots.txt checker in Google’s Webmaster Tools. (A scripted spot-check appears after this list.)
- Navigation location. Your primary navigation should be located at the top or top-left of your site and appear consistently at the same place on every page. Make sure your navigation links to all major content areas of your site and includes a link back to your homepage.
- JavaScript in navigation = Bad! When JavaScript is embedded inside navigation elements (e.g. to generate a drop-down menu), the links in that menu can be invisible to search engine spiders. It’s also a cross-browser compatibility issue: your fancy drop-down menu will likely break in some browsers. Don’t use JavaScript or Flash navigation, but if for some reason you must (really?), be sure the same menu items also appear in an HTML-only navigation bar in your footer.
- Breadcrumbs. Use ‘em. Not only do breadcrumbs show visitors where the current page sits in the site hierarchy and provide shortcuts to higher-level pages, but optimizing your breadcrumb anchor text with keywords also helps your SEO efforts.
- Frames. Don’t use them.
- Splash page. Don’t use one.
- Flash. Flash is OK in small doses, but if the site is built entirely in Flash then just give up now; don’t even try to optimize it. You won’t be able to rank competitively until you do a top-down redesign.
- Clicks-from-homepage. Every page on the site should be accessible from the homepage within 3 or 4 clicks; anything deeper becomes harder for visitors to find and risks being ignored by spiders. (A crawl script for measuring click depth appears after this list.)
- Restricted access. Are there any pages on the site that (1) require a login, or that are reachable only via (2) a search box or (3) a select form plus submit button? Spiders will not index these pages. That’s not a problem unless they contain content you want indexed.
- Broken links. Use a tool such as the W3C Link Checker to find broken links. A page whose only inbound links are broken is unreachable to SE spiders, and search engines may penalize sites with an abundance of broken links. (A simple scripted check is sketched below.)
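Several of the checks above are easy to script if you’d rather not rely on third-party viewers. First, the spider’s-eye view: the rough sketch below approximates a text-only rendering using just the Python standard library, dumping the visible text and the followable links on a page (the URL is a placeholder):

```python
# Rough approximation of a text-browser / spider view of one page,
# using only the Python standard library. The URL is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen

class SpiderView(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_script_or_style = False
        self.text = []
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.in_script_or_style = True
        elif tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self.in_script_or_style = False

    def handle_data(self, data):
        if not self.in_script_or_style and data.strip():
            self.text.append(data.strip())

viewer = SpiderView()
with urlopen("https://www.example.com/") as response:
    viewer.feed(response.read().decode("utf-8", errors="replace"))

print("\n".join(viewer.text))   # the visible text a spider can read
print(viewer.links)             # the links it can follow
```

If the output is missing your navigation or your main content, a spider is probably missing it too.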
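For the robots.txt check, the standard library’s robot parser can confirm that your important URLs aren’t accidentally blocked. Another small sketch; the user agent and URLs are placeholders for your own:

```python
# Spot-check important URLs against robots.txt using the standard library.
# The URLs and user agent below are placeholders.
from urllib.robotparser import RobotFileParser

rp = RobotFileParser("https://www.example.com/robots.txt")
rp.read()

important_urls = [
    "https://www.example.com/",
    "https://www.example.com/products/",
    "https://www.example.com/blog/",
]

for url in important_urls:
    allowed = rp.can_fetch("Googlebot", url)
    print(f"{'OK     ' if allowed else 'BLOCKED'} {url}")
```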
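The clicks-from-homepage rule can be verified with a breadth-first crawl that records how many clicks each internal page sits from the homepage. This simplified sketch assumes the third-party requests and beautifulsoup4 packages and deliberately skips robots.txt handling and rate limiting, so treat it as a starting point rather than a production crawler:

```python
# Breadth-first crawl recording each internal page's click depth from
# the homepage. Simplified: same-host HTML pages only, no robots.txt
# handling, no rate limiting. START is a placeholder.
from collections import deque
from urllib.parse import urljoin, urlparse, urldefrag

import requests
from bs4 import BeautifulSoup

START = "https://www.example.com/"
MAX_PAGES = 500                      # safety cap for the sketch

host = urlparse(START).netloc
depth = {START: 0}
queue = deque([START])

while queue:
    url = queue.popleft()
    try:
        response = requests.get(url, timeout=10)
    except requests.RequestException:
        continue
    if "text/html" not in response.headers.get("Content-Type", ""):
        continue
    soup = BeautifulSoup(response.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link, _ = urldefrag(urljoin(url, a["href"]))
        if (urlparse(link).netloc == host
                and link not in depth
                and len(depth) < MAX_PAGES):
            depth[link] = depth[url] + 1
            queue.append(link)

deep_pages = sorted((d, u) for u, d in depth.items() if d > 4)
print(f"Discovered {len(depth)} pages; {len(deep_pages)} are more than 4 clicks deep.")
for d, u in deep_pages:
    print(f"  {d} clicks: {u}")
```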
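Finally, broken links: the W3C Link Checker is the convenient option, but a quick script can flag the obvious failures on a single page. Again, a minimal sketch using requests and beautifulsoup4, with a placeholder URL:

```python
# Report links on a single page that return an HTTP error.
# Minimal sketch: one page, no retries, no rate limiting. PAGE is a placeholder.
from urllib.parse import urljoin

import requests
from bs4 import BeautifulSoup

PAGE = "https://www.example.com/"

response = requests.get(PAGE, timeout=10)
soup = BeautifulSoup(response.text, "html.parser")

for a in soup.find_all("a", href=True):
    link = urljoin(PAGE, a["href"])
    if not link.startswith(("http://", "https://")):
        continue                     # skip mailto:, javascript:, and other non-HTTP links
    try:
        result = requests.head(link, allow_redirects=True, timeout=10)
        if result.status_code >= 400:
            # Some servers reject HEAD; double-check with GET before reporting.
            result = requests.get(link, timeout=10)
        if result.status_code >= 400:
            print(f"{result.status_code}  {link}")
    except requests.RequestException as exc:
        print(f"ERROR  {link} ({exc})")
```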