Over 3.5 billion Google searches are made every day, and 35% of these are new search terms. No wonder over 40% of advertiser revenue is captured by search engine marketing alone. Given the immense pressure website owners are under to rank well, SEO practitioners deal with a long list of factors affecting website ranking every day.

Well, there are around 200 of them. The good news: not all of them have a direct or immediate impact on you. However, that does not mean there is nothing you can do to rank well on search engines.

You may have a new website, or you may already have well-ranking posts to boast of on search engines. Either way, the need for SEO never dies, and neither should your SEO efforts. Given that publishers make most of their revenue by running ads on their websites, it is imperative for them to rank well on search engines. The intent is to bring in new users who will ultimately click on the ads, and those clicks directly determine the payout publishers receive.

So, what are the possible factors affecting your website ranking or stopping you from appearing on the first page? Let’s go through the factors step by step, and then understand the measures you should take to correct them. Let’s begin!

Site/Page Level Factors Affecting Website Ranking

First, we’ll discuss some site/page-level factors you might want to fix before moving on to the others. Prioritizing site/page-level changes makes sense because these are one-time fixes and do not require ongoing experimentation (which we’ll also get to in a bit). Moreover, these fixes are the backbone of your website.

1. Missing Domain Security

Difficulty score: Easy

A domain name is usually the first thing you get for your website. For example, a new travel blog:

[Image: HTTPS as a ranking signal for increasing website traffic]

Notice the ‘https’ at the very beginning of the URL? As you may already know, HTTP stands for HyperText Transfer Protocol. If you’re thinking the ‘S’ stands for ‘Secure’, you’ve guessed right. The ‘S’ following ‘HTTP’ denotes SSL/TLS certification, and hence the security of the domain.

It’s been a while since Google announced that HTTPS would be considered a ranking signal. Talking of security, do you think Google would like something looking like this:

[Image: HTTP as a negative factor affecting website ranking]

Probably not.
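As a quick illustrative sketch (not any official tooling), a few lines of Python can flag which pages in a list of site URLs are still served over plain HTTP. The URLs below are placeholders:

```python
from urllib.parse import urlparse

def insecure_urls(urls):
    """Return the URLs in the list that are still served over plain HTTP."""
    return [u for u in urls if urlparse(u).scheme == "http"]

# Hypothetical sitemap entries for a publisher site.
pages = [
    "https://publisherwebsite.com/travel/best-places-japan",
    "http://publisherwebsite.com/old-post",
]
print(insecure_urls(pages))  # ['http://publisherwebsite.com/old-post']
```

Running a check like this over a sitemap export is an easy way to catch stray pages that never got migrated to HTTPS.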

2. Poor Site Architecture

Difficulty score: Easy

Just like HTTPS represents a healthy website domain, contextual subdirectories represent a healthy site structure. Having well-defined subdirectories (URL structure/slug) helps your web pages get seen over other sites. Structured URLs tell search engines who you are and what topics you’re trying to rank for.

For example, if you’re publishing a new blog post around ‘must-visit places in Japan’, a healthy URL should look like this:

https://publisherwebsite.com/travel/best-places-japan

Over something as unstructured as this (not to mention the missing ‘S’ after ‘HTTP’):

http://publisherwebsite.com/R200804302411772L39+4QPKA&min_pr500
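If you generate slugs programmatically (say, from post titles in a CMS), the idea can be sketched in a few lines of Python. The title below is just an example:

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a post title into a short, readable URL slug."""
    # Normalize accented characters to plain ASCII, then lowercase.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Drop everything that isn't a letter, digit, space, or hyphen.
    text = re.sub(r"[^a-z0-9\s-]", "", text.lower())
    # Collapse runs of whitespace/hyphens into single hyphens.
    return re.sub(r"[\s-]+", "-", text).strip("-")

print(slugify("Must-Visit Places in Japan!"))  # must-visit-places-in-japan
```

Most CMS platforms (WordPress included) do this for you, but it pays to review auto-generated slugs and trim them to the focus keyword.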

3. Missing Robots.Txt

Difficulty score: Medium

Your site URL proves to be secure with HTTPS. But despite the security, Google needs to know from you (the website owner) whether you want crawlers/web robots to crawl your pages or not. Robots.txt is a file that gives web robots exactly those instructions.

Consider the above-mentioned URL, https://publisherwebsite.com/travel/best-places-japan.

Before indexing this page, Google’s robots will look for https://publisherwebsite.com/robots.txt to find this:

User-agent: *
Disallow: /

Here, User-agent: * applies to all robots/crawlers, and Disallow: / means they shouldn’t crawl any pages of this site (a blanket rule usually reserved for pages that require you to log in).

Many times, a missing robots.txt file feels like a minor factor affecting website ranking. In reality, however, creating one is among the very first steps you should complete. AdSense publishers, especially, should not overlook these basic crawling issues.

4. High Page Load Time

Difficulty score: Medium

Users like friction-free UX, and so does Google. Typically, pages that render in under three seconds are considered to have a good page load time, which boosts your chances of ranking higher on search engines.

For publishers, page load time happens to be one of the most serious factors affecting website ranking. Why? In addition to the July 2018 Google Speed Update (after which slow sites are penalised), more ads mean more browser calls, i.e., higher load time.

Consequently, publishers implement techniques like lazy loading to decrease load time and deliver a better user experience.
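Modern browsers support native lazy loading via the loading="lazy" attribute on image tags. As a simplified sketch (a real site would typically rely on a plugin or template change, and this regex-based rewrite only handles well-formed tags), here is how tagging images for lazy loading might look:

```python
import re

def add_lazy_loading(html: str) -> str:
    """Add loading="lazy" to <img> tags that don't already declare a loading attribute."""
    def rewrite(match):
        tag = match.group(0)
        if "loading=" in tag:
            return tag  # leave explicitly configured images alone
        closing = "/>" if tag.endswith("/>") else ">"
        return tag[: -len(closing)].rstrip() + ' loading="lazy"' + closing
    return re.sub(r"<img\b[^>]*>", rewrite, html)

print(add_lazy_loading('<img src="/images/japan.jpg" alt="Japan">'))
# <img src="/images/japan.jpg" alt="Japan" loading="lazy">
```

With this attribute in place, below-the-fold images (and lazily loaded ad slots) are only fetched as the user scrolls, trimming the initial page weight.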

To gauge your site speed, free tools like PageSpeed Insights come in handy, providing a comprehensive view of a website’s performance (and pointing out exactly what is causing the site to load slowly).

This is how a sample PSI report looks:

[Image: Google PageSpeed Insights report]

5. Schema Markup

Difficulty score: Medium

Schema is basically about bringing context to content. It tells search engines not just what your content says, but what it means. Schema markup is a code snippet that conveys the exact context of a webpage to both Google and users. There are hundreds of use cases where schema can be applied.

As an example, if you’re running a travel blog to attract visitors (to show ads), you might want to consider the ‘Review’ schema. You may have noticed certain search results that look different from the typical ones. A search result carrying review schema should look like this:

[Image: Increase website traffic with schema markup]

Schema does not promise the first rank; in fact, no single factor does. However, it does make for a more credible and contextual result. If you’re using WordPress, custom fields or related plugins should help you implement schema. Even though it’s a piece of code, implementing schema is not that complicated and can be done with minimal technical expertise.
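Schema markup is most commonly written as JSON-LD embedded in the page. The sketch below builds a hypothetical Review object in Python; every name and value in it is a placeholder, not data from any real page:

```python
import json

# Hypothetical Review markup for a travel-blog post; all values are placeholders.
review = {
    "@context": "https://schema.org",
    "@type": "Review",
    "itemReviewed": {"@type": "TouristAttraction", "name": "Fushimi Inari Shrine"},
    "reviewRating": {"@type": "Rating", "ratingValue": "4.8", "bestRating": "5"},
    "author": {"@type": "Person", "name": "Example Author"},
}

# The serialized object goes inside a <script type="application/ld+json">
# tag in the page's <head> so crawlers can read it.
print(json.dumps(review, indent=2))
```

Google’s Rich Results Test can then confirm whether the markup is eligible for an enhanced search appearance.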

6. Overburdened Server

Difficulty score: Medium

One of the most unfortunate factors affecting website ranking is the server. Believe it or not, an overloaded server can actually crash your website when it experiences a sudden spike in traffic. The case of a shared server is worse: there, load on other websites can take yours down with them.

Server breakdowns may sound old-school, but they still happen even in the age of AWS and Azure. To detect server issues, check your server logs for errors and use Google’s URL Inspection tool (formerly ‘Fetch as Google’) to see how a URL on your site is being rendered.

Content Factors Affecting Website Ranking

Content is creative. Unlike site/page-level fixes, content (or creative) factors are ongoing experiments. Hence, what you as a publisher should do is incorporate these basic practices into your everyday content strategy to rank well and increase traffic.

7. Average Content and SEO

Difficulty score: Easy

As mentioned earlier, content is creative, so it does not follow a rulebook. But there are a few predefined factors you’ve got to take care of. While publishing a new piece, make sure you tick all the checkboxes that contribute to strong SEO.

Here are the most evident ones you should implement with every new post going out:

A title containing the focus keyword, images with alt text, internal linking, proper headings, an SEO-proficient meta description, a keyword-focused slug, and a readability analysis.

It sure makes me happy to see my blogs ranking on the first page even for highly competitive keywords (like ‘AdSense arbitrage’, search volume 1,600). You may know you’re producing the best content, but perhaps you can make it better still.

8. Unoptimized Content

Difficulty score: Medium

This is probably not a single immediate factor, but a bunch of teeny-tiny factors that together affect website ranking. Here, content factors like repetition, keyword stuffing, content length, and formatting style come in.

The average content length of a Google first-page result is 1,890 words. Common wisdom says that as content length increases, so does your scope to rank. But lengthy content does not mean unnecessary repetition or keyword stuffing. As a publisher, you’ve got to make sure the quality remains intact.

9. Google RankBrain

Difficulty score: Medium

Neither old nor very recent, RankBrain is the name of Google’s AI system used to process search results. How? The RankBrain algorithm looks at how people act on your (publisher) website. Are they leaving in no time (bounce rate)? Are they clicking through or not (click-through rate)?

These metrics help RankBrain decide whether your site is worthy of good rankings. Relevant, quality content plays a major role in ranking well. For publishers, there are even more content metrics worth keeping an eye on.

10. Inadequate Linking

Difficulty score: Medium

Quality content is key. But Google wants more: it wants to give you the benefit of the doubt and believe that your content is legit and that others on the Internet are talking about it. To show Google your credibility, you’d want to focus on three forms of links: inbound, outbound, and internal.

Inbound links are links from other websites pointing to any of your webpages (in other words, websites sending traffic to you). Getting inbound links from reputed websites should be a constant endeavour. Tools like SEMrush and Ahrefs can help you find the websites that have linked to you.

Outbound links, on the other hand, are something you have more control over. You establish them by linking out to reputed websites at the right spots (with the right context).

Last on the list (and the most undervalued) are internal links, which connect one of your webpages to another, contextually.
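A rough sketch of how an internal-vs-outbound link audit could work using only Python’s standard library; the hostname and HTML snippet below are hypothetical:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

class LinkAudit(HTMLParser):
    """Collect internal and outbound links from a page's HTML."""

    def __init__(self, site_host):
        super().__init__()
        self.site_host = site_host
        self.internal, self.outbound = [], []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links and same-host links count as internal.
        if not host or host == self.site_host:
            self.internal.append(href)
        else:
            self.outbound.append(href)

audit = LinkAudit("publisherwebsite.com")
audit.feed('<a href="/travel/best-places-japan">Japan</a> <a href="https://ahrefs.com">Ahrefs</a>')
print(audit.internal, audit.outbound)
```

Run over a whole site crawl, a tally like this quickly surfaces orphan pages (no internal links pointing in) and posts with no contextual outbound references.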

11. Winning Competitors

Difficulty score: Medium

At the beginning of this post, we discussed the large search volumes on search engines. A large portion of those searches may not affect you as a publisher. However, the competition in your own niche is no less intimidating. One of the factors affecting your website ranking, despite your best efforts, could be competitors performing better than you.

It’s possible your competitors are getting more backlinks, quality mentions, more visits, and better engagement, which is why Google prefers to rank them over you. In this situation, social monitoring or SEO tools can prove to be a lifesaver.

UX Factors Affecting Website Ranking

Having good UX is a no-brainer and non-negotiable. You might be following more SEO practices than the ones stated in this blog, but none of it will matter in the end if your site looks awful (awful might sound harsh, but you know it’s true). UX is all about reducing the friction users face when interacting with your site, and you must take these UX factors into account:

12. No AMP Yet

Difficulty score: Medium

In 2018, nearly 52.2% of website traffic came in via mobile devices. Accordingly, a lot of websites have since implemented AMP (short for Accelerated Mobile Pages), which facilitates lightning-fast page loads on mobile.

[Image: AMP helps in increasing website traffic and ranking]

To be fair, Google does not explicitly ask websites to adopt AMP to boost rankings. AMP makes the most sense for publishers struggling with page latency on their mobile website. Thanks to AMP, publishers improve their page load times, ultimately leading to a better user experience.

13. No Mobile-friendliness/Responsiveness

Difficulty score: Medium

Everybody talks about mobile-friendliness. It’s been a while since Google’s mobile-first index started prioritizing mobile-friendly pages over desktop ones. Considering that mobile-driven traffic has surpassed desktop, responsive websites have gradually become a prominent SEO factor affecting website ranking.

[Image: Responsive websites contribute to better ranking and increased traffic]

Because they depend on content either generated or read by a mobile-first audience, publishers need to take care of a few things while planning their mobile experience. A non-exhaustive list would include focusing on the readability of fonts on small screens, easy navigation (menus), and ensuring interstitial ads do not cover important content.

14. Bad Website Layout

Difficulty score: Medium

For Google, carelessly designed websites (with bad interfaces) are a no-no. Similarly, for AdSense, recklessly placed ads or too many ads are a turn-off. Take a look at this and decide for yourself:

[Image: Website layout and user experience affect website ranking]

Which site would you rather spend more time on? Likewise, Google’s ranking parameters demote websites with below-average user experience. Thankfully, there are ad layout optimization tools out there that do the legwork so your site doesn’t lose its elegance.

15. Ad Mistakes

Difficulty score: Medium

In conjunction with bad ad layouts, there are a number of ways ads can hurt SEO. An ad network’s failure to compress ads, scripting errors, ad blockers, or anything else that causes crawlers to take more time than required can prevent your site/page from getting indexed.

Moreover, going overboard with intrusive ad types (like autoplaying video ads with sound on, or poorly timed pop-up/interstitial ads) makes for a bad user experience. Fortunately, there are best practices publishers can follow to make these kinds of ads work.

Quick Site Health Tips

We have so far discussed the most prominent factors that are probably affecting your website ranking. It’s time to change tracks and focus on some site health tips. Just so you know, these tips won’t change the face of your website ranking at once, but they’ll surely indicate to Google that you’re making the right efforts to improve it. Here you go:

  1. Set up a Google Search Console (formerly Webmasters) account, if you haven’t already.
  2. Find and fix all broken links and redirects. Google despises them.
  3. Take some time to refresh/repurpose old-but-gold content. Don’t archive it.
  4. Set up redirects (or a helpful 404 page) for removed pages. Tell Google you’re not ignoring inactive pages.
  5. Check for and fix bad-quality link penalties. Find the source and stay away from it.
  6. Thoroughly monitor any website overhaul. New changes could be the reason for a ranking drop.
Conclusion

Did you know that the first-ranking webpage on search results receives almost 42.25% CTR? The second result receives almost 11.94%, and the third almost 8.47%. For the top positions, the CTR doesn’t seem bad at all. But you’ve got to work for it.

Ranking on Google is not an overnight job. For weeks, you might not see any results. However, that does not mean publishers should feel disheartened. Remember, SEO is a mixed bag: some factors may work instantly while others take time. The key is to keep going, keep working, and keep improving.


Author

Shubham is a digital marketer with rich experience in the advertising technology industry. He has vast experience in the programmatic industry, driving business strategy and scaling functions including, but not limited to, growth and marketing, operations, process optimization, and sales.
