Let’s continue our discussion of the Top 10 Technical SEO recommendations that will help improve your site’s ranking. If you missed the first three recommendations (site speed, mobile-first design, and link building), you can check them out right here.
A quick reminder that the goal of this little series is to introduce you to the fundamentals of digital marketing and help you become better clients, thinkers, and innovators. This is not a training manual.
All of these recommendations focus on working within your site’s content and code to make your site as crawlable (visible) to search engines as possible and to make your content relevant for target keyword searches.
OK, on with the show!
4. Conduct a Site Audit
Identifying technical issues starts with a good site audit. Most SEO practices start their engagements with this type of audit, but brands can perform one themselves. There are several great SEO site audit tools.
SEMrush is the most popular audit tool and one we use regularly. My favorite is Moz (I’m showing my age …) though you can’t go wrong with Ahrefs or SpyFu either. Any of these audits will result in a list of issues that reduce a search engine’s ability to properly crawl and evaluate your site. The next step involves fixing those issues.
5. Fix Your Site/Code Errors
Most site audit tools will list a broad variety of “errors.” These issues don’t necessarily impact a user’s experience, but they do slow down a search engine’s ability to review (crawl) your site.
Again, if a search engine can’t quickly crawl your site, it will not be ranked as well as sites that are more quickly crawled. Here are a few examples of common site errors:
- Missing Meta Description: There are many errors that a site audit will identify. According to Raven Tools, over one third of pages analyzed don’t include a meta description. Though not as critical as it once was, including metadata, especially a description, remains an SEO best practice. This piece of copy should be between 155 and 160 characters in length and provide a quick summary of the content that a user (and the search engine’s crawler bot) will find on the page. Best of all, the meta description is typically the copy that displays in a SERP result below the page title. Think of the meta description as the page’s elevator pitch for the search engine (as well as the searcher). There are other pieces of metadata; be sure to include those as well!
- Link Errors: Nothing will stop a search engine crawl like a broken link or internal links (links that go to other pages on your site) that carry a “nofollow” attribute. Remember that the bot is just trying to go from page to page, review the content, and determine if your page should be displayed. Any link that stops the crawl will negatively impact your ranking. Fix broken links; either remove the link or update it so it works correctly. And, on links within the site, be sure to remove “nofollow” references unless you deliberately want to stop search crawlers from following a link.
- Site Structure: Since the goal of fixing site errors is to make it easier for search crawlers to review your content, most SEO experts agree that you need to manage your site’s structure. Rather than a subdomain (cool.xyzdomain.com), search crawlers prefer a subfolder (xyzdomain.com/cool). There is debate on this topic, and I don’t recommend rebuilding your site if you do use subdomains, but if you have the option, use categories and subfolders. Similarly, try to limit redirects (commonly implemented as 301 permanent or 302 temporary redirects). When possible, update those links to point directly at the final URL so the crawlers can get to your content faster.
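Most of the fixes above come down to a few lines of HTML. Here is a minimal sketch of a crawl-friendly page head and link markup (the URLs and copy are placeholder examples, not from a real site):

```html
<head>
  <!-- Meta description: the page's "elevator pitch," aim for roughly 155-160 characters -->
  <meta name="description" content="Placeholder summary of this page's content, written for searchers and crawlers alike, and kept within the character guideline.">
  <title>Example Page Title</title>
</head>
<body>
  <!-- Internal link with no rel="nofollow", so crawlers can continue from page to page -->
  <a href="/cool/related-article">Related article</a>

  <!-- A link where you DO want to stop the crawl can keep nofollow -->
  <a href="https://example.com" rel="nofollow">Untrusted external link</a>
</body>
```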
One of the easiest ways to differentiate your content/page is by eliminating site errors that slow the crawl of the search engine bots.
6. Add a Sitemap
Once you’ve removed errors that slow the search engine crawlers, take the time to add the tools that will improve your site’s ability to be reviewed. The most important of these tools is a sitemap.
This file provides guidance to search engines on how your site is organized and how to review it. Before you ask, this is not a required element, though Google states, “your site will benefit from having a sitemap, and you’ll never be penalized for having one …”
Sitemaps can be generated in many formats. My experience is that XML is most effective for Google. Once you build a sitemap (be sure that the file is accessible to the search engines), submit it to the search engines:
- For Google, upload the sitemap through the Google Search Console.
- For Bing and Yahoo, upload through Bing Webmaster Tools.
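For reference, a minimal XML sitemap looks like the sketch below (the URLs and dates are placeholders); most CMS platforms and SEO plugins can generate one of these automatically:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2024-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/cool/related-article</loc>
    <lastmod>2024-01-10</lastmod>
  </url>
</urlset>
```

Each `<url>` entry lists one page you want crawled; the optional `<lastmod>` date helps crawlers prioritize recently updated content.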
If your audience uses a specific search engine, community, or network site, check to see if there is a way to submit your site or sitemap to that source.
7. Content in the Code
As you go deeper into the SEO audit, you’ll see recommendations to fold content into the code of your site. This adds content density to a page and reinforces your declaration (to the search engines) that your content is relevant for specific keywords.
The biggest opportunities to improve SEO include:
- Image Alt Tags: HTML allows you to add text information to an image. Known as an “alt tag” or alternative text, this attribute within the image declaration allows you to add information to every image on a page. You should include an alt tag with every image on a page, and it should be descriptive content that reflects the keyword focus of that page. Speaking of images, remember that compressing/optimizing images is essential to site speed, which also dramatically impacts search ranking. A secondary benefit of including an alt tag is that your images (and thus your site) will be more searchable when someone does an image search. And image search results are included in 19% of all Google search results. Image alt tags are a great way to leapfrog your SEO competition.
- Remove Duplicate Content: Duplicate content within your site will confuse a search engine. Which page should it rank? Whenever possible, delete duplicate content.
- Use Canonical URLs: From a marketing perspective, you can’t always remove duplicate content. You may need the same language, word for word, on multiple pages. Some eCommerce websites have hundreds of duplicate pages that still rank well. In these situations, use a tool like Yoast or HTML code (a rel="canonical" tag) to tell the search engine which page it should list (index). Canonical URLs establish authority and help with site management as well.
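In HTML, the alt-text and canonical fixes above are each a single attribute or tag. A minimal sketch, with placeholder file names and URLs:

```html
<!-- Alt text: descriptive copy that reflects the page's keyword focus -->
<img src="/images/blue-widget.jpg" alt="Blue widget with chrome handle, front view">

<!-- Canonical tag (placed in the page <head>): tells search engines which URL
     to index when the same content appears on multiple pages -->
<link rel="canonical" href="https://www.example.com/widgets/blue-widget">
```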
But Wait … There’s More!
In this article, we’ve focused on edits you can make within a site to improve a search engine’s ability to crawl the site and determine how important (relevant) a page is for any specific keyword.
In the next article, we’ll conclude our overview of Technical SEO activities you can pursue to improve your ranking by looking at activities beyond the page content that will give you an advantage over the competition.
Until then, if the idea of executing on an SEO strategy seems overwhelming, know that you don’t need to go it alone. We’re in this with you. If you need a little help, just drop us a line, anytime.
Rainmaker Digital Services