This article concludes our overview of the Top 10 Technical SEO recommendations. Combined with part one and part two, this series includes information our search team shares when trying to help people understand how to improve a site’s search engine ranking.
Key to all these recommendations is the fundamental understanding that a search engine’s goal is to show users content that THEY are seeking and to establish their engine as useful and authoritative for a user’s search needs. All of these Technical SEO recommendations help your site become more friendly to the search engines and more useful to site visitors.
99Signals has a great definition of Technical SEO: “Technical SEO is the process of optimizing the crawling, indexing, and rendering phase of your website to achieve higher search rankings.” As Neil Patel might remind us, great content is the most important factor in site rankings, but this series of Technical SEO recommendations provides tips to leapfrog sites with similar-quality content.
This article’s final three recommendations are all intended to make sure that your site is ready to compete on the search engine results page (SERP).
Make the Site Secure
In 2014, Google started hinting that sites served under HTTPS would be given preference over HTTP. HTTPS is a secure version of the HTTP protocol: the site installs a Secure Sockets Layer (SSL, now technically TLS) certificate on its web server, and data travels to the user in encrypted form.
Today, serving web content over an encrypted SSL connection is the standard security protocol. As web serving has moved to SSL, web browsers and search engines have started to alert users when content is NOT secure. Most modern browsers now flag sites served under HTTP with a “Not Secure” alert.
These flags aren’t simply an annoyance; they are key to making web browsing safer. Traffic to sites served under HTTP is unencrypted, so simply clicking an unsecured link can expose a user to injected malicious code and data mining. We could spend several blog posts on the reasons search engines “worry” about sending users to insecure sites (hint: liability has a cost), but since our topic is SEO, let’s leave it at Google’s claim that “security is a top priority …”
HTTPS has quickly become the norm since Google’s 2014 hint, and research across over one million search results found that “HTTPS correlated with higher rankings on Google’s first page”. Yup, you read that right. If you want on the front page of Google SERPs, you need to make sure you’re serving content under HTTPS.
And, related to higher rankings, Google Chrome has already declared that after October 2020, the browser will not allow users to download files over HTTP. In other words, even if you dismiss the value of high search rankings, Chrome users will be blocked from downloading files from your site if you fail to use HTTPS.
SSL is no longer an option if you want good rankings.
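Moving a site to HTTPS usually comes down to installing a certificate and permanently redirecting all HTTP traffic. Here is a minimal sketch as an nginx configuration; the domain and certificate paths are placeholders, not a definitive setup:

```nginx
# Redirect all HTTP requests to HTTPS with a permanent (301) redirect.
server {
    listen 80;
    server_name example.com www.example.com;   # placeholder domain
    return 301 https://$host$request_uri;
}

# Serve the site itself over HTTPS only.
server {
    listen 443 ssl;
    server_name example.com www.example.com;
    ssl_certificate     /etc/ssl/certs/example.com.pem;   # placeholder paths
    ssl_certificate_key /etc/ssl/private/example.com.key;
    # ... site configuration ...
}
```

The 301 (permanent) status matters for SEO: it tells search engines to transfer the old HTTP URLs’ ranking signals to their HTTPS counterparts.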
Implement a Robots.txt File
One of the easiest and most useful Technical SEO recommendations is implementing a robots.txt file that tells a search engine “bot” how to crawl the files that make up the website.
Robots.txt is used primarily to manage crawler traffic to your site. With simple directives like “Allow:” and “Disallow:”, robots.txt tells search engines which pages to crawl and which to skip, which in turn shapes what ends up in the search engine’s index (the database the engine uses to decide which sites to show for a given keyword search).
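For example, here is a minimal robots.txt (the folder names are hypothetical) that blocks a staging area and duplicate printer-friendly pages while pointing crawlers at the sitemap:

```
User-agent: *
Disallow: /staging/
Disallow: /print/
Allow: /

Sitemap: https://example.com/sitemap.xml
```

The file lives at the root of the domain (e.g., example.com/robots.txt), and crawlers fetch it before crawling anything else.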
This text file is vital to strong SERP rankings as it:
- Prevents content that shouldn’t be indexed from showing up in a search result (no one wants a user to see an unfinished page or old content).
- Eliminates duplicate content from being indexed (duplicate content is deadly for good rankings).
- Ensures that the pages you want indexed are given priority. Crawlers have a limited budget and may not make it through every page on a site; robots.txt tells the crawler which pages the site considers most relevant for indexing.
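If you want to sanity-check how a crawler would interpret your rules before publishing them, Python’s standard library ships a robots.txt parser. The rules and URLs below are illustrative, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules: block a drafts folder, allow everything else.
rules = [
    "User-agent: *",
    "Disallow: /drafts/",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# A normal blog post is crawlable; anything under /drafts/ is not.
print(rp.can_fetch("*", "https://example.com/blog/post"))   # True
print(rp.can_fetch("*", "https://example.com/drafts/wip"))  # False
```

Running checks like this catches a surprisingly common mistake: a stray Disallow rule that quietly blocks pages you very much want indexed.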
Similar to the SSL recommendation, a robots.txt file doesn’t necessarily improve your SERP rankings, but omitting it hurts a site’s searchability. SSL and robots.txt are the “daily exercise” of Technical SEO: they may not lift your rankings, but skipping them will drag your results down.
Keep Content User-Friendly
No, this is not a sneaky attempt to end the Top 10 Technical SEO recommendations with another reminder that great content is the best way to improve a site’s ranking. The user-friendly content recommendation is becoming more critical as sites add technology for popup windows, call-outs, and interstitial pages.
BUT, if you’re putting a page between a user’s click and the content they want (known as an interstitial) or are popping up an ad before sending a user to the content they’re looking to view, your search ranking will decline. This penalty was formalized in 2017 as Google put more priority on a mobile user’s content experience. Known as the mobile intrusive interstitial ad penalty, the idea is that Google will penalize a site that makes its content difficult to reach or view on a mobile device.
Google’s recommendation, and the way to avoid an SEO penalty, is to use these tools only where they enhance the user’s experience. Logins, permission/approval prompts, and banners placed where they don’t block navigation (stick to the top or bottom of the page) are acceptable uses. Beyond that, you run the risk of being unfriendly, and unfriendly content will hurt your ranking.
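To illustrate the “acceptable placement” guidance, here is a sketch of a hypothetical notice banner pinned to the bottom of the viewport, leaving the main content readable and navigable; the class names and copy are made up:

```html
<!-- A non-intrusive banner: fixed to the bottom edge, dismissible,
     and never covering the content the user came for. -->
<div class="notice-banner" role="region" aria-label="Site notice">
  <p>We use cookies to improve your experience.</p>
  <button onclick="this.parentElement.remove()">Dismiss</button>
</div>

<style>
  .notice-banner {
    position: fixed;   /* pinned to the viewport edge, not overlaying content */
    bottom: 0;
    left: 0;
    right: 0;
    padding: 0.75rem 1rem;
    background: #f5f5f5;
    display: flex;
    justify-content: space-between;
    align-items: center;
  }
</style>
```

Contrast this with a full-screen overlay that hides the page until dismissed; on mobile, that is exactly the pattern the intrusive interstitial penalty targets.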
One More Thing …
SEO is no longer a magical backdoor to search visibility. Over the past 10 years, search optimization has developed into a comprehensive digital marketing discipline. Today, it is critical that you accept the importance of this discipline to maximize the return on your digital investments.
If the idea of executing on an SEO strategy seems overwhelming, know that you don’t need to go it alone. We’re in this with you. If you need a little help, just drop us a line, anytime.
Rainmaker Digital Services