A lot of companies over-think or over-engineer their SEO efforts but forget the basics, like keeping the site up and keeping it fast. After working with over 20 companies in the last year or so, and having been in-house myself, I’ve seen a lot of these issues first hand.
Here are eight of the most common technical issues that I’ve seen or that companies regularly overlook:
1. Site Speed:
A few years back, Google announced that it was including site speed as a ranking factor. On many of the sites I’ve worked on, like Ranker, once we decreased the time it took a page to load, we saw an increase in pages crawled and indexed. You can get your site speed information in Google Webmaster Tools, and Google also offers a PageSpeed tool that gives you a score out of 100, tells you what you’ll save in terms of page size, and shows you how to fix each item.
2. Sitemaps: For large e-commerce or content sites, sitemaps can be extremely important to ensure all content is crawled and indexed in a timely manner. They also let you prioritize your most important content so it gets crawled and indexed first. Make sure you set sitemap priority and change frequency based on the importance of the content.
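As a sketch, here’s what that prioritization looks like in a sitemap file. The URLs and values are hypothetical examples; `<priority>` and `<changefreq>` are hints to search engines, not guarantees:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- High-value category page: crawl often, high priority -->
  <url>
    <loc>http://somecompany.com/credit-cards/</loc>
    <changefreq>daily</changefreq>
    <priority>0.9</priority>
  </url>
  <!-- Older archive page: lower priority, crawled less often -->
  <url>
    <loc>http://somecompany.com/archive/2011/</loc>
    <changefreq>monthly</changefreq>
    <priority>0.3</priority>
  </url>
</urlset>
```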
3. Crawl Efficiency: For reasons similar to sitemaps, having too many pages open to search engine robots can eat into your crawl budget — search engines typically assign each site a budget in terms of pages crawled and indexed. A common fix is finding low-value pages that are being crawled on a regular basis and removing them from the crawl, either by applying a noindex robots meta tag or by disallowing the content via robots.txt.
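For example, a robots.txt like the sketch below (the paths are hypothetical) keeps crawlers out of sections that add no search value:

```
# robots.txt — keep crawlers out of low-value sections
User-agent: *
Disallow: /search/
Disallow: /print/
```

For pages you want crawled but not indexed, use `<meta name="robots" content="noindex">` in the page’s head instead — note that robots.txt blocks crawling, while noindex blocks indexing, and a noindexed page must remain crawlable for the tag to be seen.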
4. JavaScript/CSS Issues: I’ve seen companies either make mistakes or forget to build progressive enhancement into their site architecture. What happens then is that things like filters on results pages end up as plain text instead of links that search engines can crawl into deeper content. Here is a quick read on how to build with progressive enhancement in mind:
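A rough sketch of the idea (URLs and the filter handler are made up for illustration): render filters as real links first, then layer JavaScript on top, so crawlers always have an href to follow.

```html
<!-- Works without JavaScript: a plain link crawlers can follow -->
<a href="/shoes/?color=red" class="filter">Red</a>

<script>
  function applyFilter(url) {
    // Hypothetical helper: fetch and render filtered results in place.
    console.log('filtering via', url);
  }
  document.querySelectorAll('a.filter').forEach(function (link) {
    link.addEventListener('click', function (event) {
      event.preventDefault(); // users with JS get a faster in-page update
      applyFilter(link.getAttribute('href'));
    });
  });
</script>
```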
5. Duplicate Or Missing Meta Data (Title Tags / Meta Descriptions): With just about every client I’ve had, when we dug into their Google Webmaster Tools account, I’ve found missing or duplicate meta data. And even when the meta data is there, it often doesn’t truly describe the page. For example, we had one client who had “credit cards” as the title tag for all of their credit card product pages rather than the name of the card. Once we switched to the names of the cards, we ended up ranking in the top 10 for many of the card names within 30-90 days.
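In markup, that fix looks something like this (the card name is a hypothetical example):

```html
<!-- Before: the same generic title on every card page -->
<title>Credit Cards</title>

<!-- After: unique, descriptive meta data per page -->
<title>Acme Platinum Rewards Card - Rates &amp; Rewards</title>
<meta name="description"
      content="Fees, rates, and rewards details for the Acme Platinum Rewards Card.">
```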
6. Capitals in URLs:
Capitals in URLs can be tricky because upper- and lowercase paths are treated as different URLs, which can lead to multiple pages being indexed for no reason. For example, somedomain.com/SomePage.html would be seen as an entirely different page than somedomain.com/somepage.html, which can hurt crawl efficiency and create duplicate content concerns.
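One common fix is to 301-redirect any mixed-case URL to its lowercase version. Here’s a minimal sketch of the normalization logic in Python, just to illustrate; in practice you’d usually implement the same rule in your web server or framework:

```python
from urllib.parse import urlsplit, urlunsplit

def lowercase_redirect_target(url):
    """Return the lowercased version of a URL's host and path,
    or None if the URL is already lowercase (no redirect needed)."""
    parts = urlsplit(url)
    normalized = urlunsplit((
        parts.scheme,
        parts.netloc.lower(),  # hostnames are case-insensitive anyway
        parts.path.lower(),    # paths are case-sensitive: normalize them
        parts.query,           # leave query strings alone
        parts.fragment,
    ))
    return normalized if normalized != url else None

# A mixed-case URL gets a redirect target...
print(lowercase_redirect_target("http://somedomain.com/SomePage.html"))
# -> http://somedomain.com/somepage.html
# ...an already-lowercase URL does not.
print(lowercase_redirect_target("http://somedomain.com/somepage.html"))
# -> None
```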
7. Soft 404s: A soft 404 is a status code issue where a page that should deliver a 404 Not Found status code instead serves a 200 OK. You can find these in Google Webmaster Tools and address them on a case-by-case or sitewide basis pretty easily. Here is some more info from Google’s webmaster blog:
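You can also catch these yourself. Here’s a rough sketch of the check — the phrase list is a made-up heuristic, not anything Google publishes: a response is suspect when it returns 200 OK but its body reads like an error page.

```python
# Hypothetical phrases that suggest an "error" page body.
SOFT_404_PHRASES = (
    "page not found",
    "no results found",
    "this product is no longer available",
)

def looks_like_soft_404(status_code, body):
    """Flag responses that return 200 OK but read like an error page.
    A true soft 404 should return a real 404 status code instead."""
    if status_code != 200:
        return False  # a real 404/410 is fine; nothing to flag
    text = body.lower()
    return any(phrase in text for phrase in SOFT_404_PHRASES)

print(looks_like_soft_404(200, "<h1>Page Not Found</h1>"))  # True: a soft 404
print(looks_like_soft_404(404, "<h1>Page Not Found</h1>"))  # False: a proper 404
```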
8. Canonical Tags:
After the canonical tag was released, a lot of websites implemented it, but countless sites actually implemented it incorrectly on things like product pages, category pages, pagination, and so on. As a quick template, a product page would carry <link rel="canonical" href="http://somecompany.com/product1/" /> and any variation with query strings, etc. would point back to that original URL.
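For example, a tracking-parameter variant of that product page would carry the same tag (URLs hypothetical):

```html
<!-- Served at http://somecompany.com/product1/?utm_source=email -->
<!-- The canonical still points at the clean product URL -->
<link rel="canonical" href="http://somecompany.com/product1/" />
```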