“SEO Best Practices”: Finally the Same as What’s Best for Users

May 30, 2019

“Make pages primarily for users, not for search engines,” has been Google’s Webmaster Guidelines summary of how to do good SEO for some time.

For an equal amount of time, SEO pros (myself included) have done a partial eye roll and qualified that with rants like, “Yes, part of our job is to make sure we’re speaking the same language as our users (keyword optimization and content development), but even before we do that, there are technical best practices we have to put in place. More than that, if we’re going to win in competitive markets, there are both on- and off-page ranking signals we have to generate that are pretty well independent of which user we’re targeting.”

But how’s this holding up over time? I’d argue not so well – and not just because SEO people have become more astute marketers. I propose it’s because Google’s own ranking factors are becoming more user-centric and, accordingly, targeting the algorithm is now the same as “making pages for users”. We’re not smart – we’re just targeting new ranking factors that truly reflect Google’s users-first mantra.

Let’s dig a bit deeper into some of these ranking factors – some of which are not necessarily new, but have at least been given a user-centric spin in the last year or so.

Precursor 1 – Google’s Panda update penalizes ad-heavy sites

Back in 2011, Google’s Panda series of updates began penalizing “thin” content sites that featured excessive advertisements.  The SEO world found out after the fact, but the message was clear: annoying users is now bad for SEO.

Precursor 2 – Google’s Penguin update penalizes unnatural links

In 2012, Google’s Penguin series of updates began, delivering SEO penalties to sites with unnatural/manipulated link profiles.  The mechanical SEO rule was not to build spammy links.  The underlying intent was that links needed to be earned by developing good content that was useful to visitors and webmasters.

“If your pages contain useful information, their content will attract many visitors and entice webmasters to link to your site.” – Steps to a Google-friendly Site

Precursor 3 – We’re told to allow JS and CSS indexation

We’d argue that Google’s 2014 update of their Webmaster Guidelines – specifically their request/requirement that webmasters allow Googlebot to crawl JS and CSS files – was one of the most important changes that signaled SEO’s imminent move from a mechanical discipline to one focused on user needs.  Why? Because this meant Google had moved to rendering-based indexation and SEO’s world was now one and the same as the world users experienced.
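
If a site’s robots.txt still blocks script and style directories – a common holdover from older SEO advice – Googlebot can’t render pages the way users see them. A minimal robots.txt sketch (the directory paths are hypothetical):

User-agent: Googlebot
# Don’t do this – it hides the rendered page from Google:
# Disallow: /assets/js/
# Disallow: /assets/css/

# Explicitly allowing JS and CSS files is the safe default:
Allow: /*.js$
Allow: /*.css$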

Mobile Friendliness – Moving SEO from mechanics to users’ needs 

2015’s Mobilegeddon update was one of the first times Google expressly told the SEO world about a future update and how to prepare for it. At the time, a basic SEO interpretation was:

  • Proper redirects needed to be in place to get users from the desktop version of the site to the mobile version
  • The mobile site needed to – well – actually work on mobile. Page content needed to look right, links and buttons needed to be clickable and the whole experience needed to be fast enough to be usable
  • Follow the rules on how to avoid being picked up as duplicate content (whether through responsive design or proper site canonicals) – see the sketch after this list
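
For sites running separate mobile URLs (an m-dot subdomain, for example), those duplicate-content rules boiled down to a pair of annotations. A minimal sketch, using example.com as a stand-in:

On the desktop page:

<link rel="alternate" media="only screen and (max-width: 640px)" href="https://m.example.com/page">

On the mobile page:

<link rel="canonical" href="https://www.example.com/page">

A responsive design sidesteps all of this with a single URL and a viewport declaration:

<meta name="viewport" content="width=device-width, initial-scale=1">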

All of these things were good examples of fairly mechanical SEO with user needs as an undertone. (The same thing happened again when Google established the intrusive mobile interstitial ranking signal – a mechanical “SEO rule” was delivered that, when followed, effectively forced site owners to build pages that improve users’ experience.)

The overarching theme that ran through Google’s mobile friendliness documentation was to become the model for future updates – these instructions were ways to “Make it easy for customers” and we were counseled to “Measure the effectiveness of your website by how easily mobile customers can complete common tasks.” Sure, we needed to make sure mobile users got to our mobile sites, but the big shift was that Mobilegeddon aligned ranking “rules” with formerly diffuse requirements around matching users’ needs. This would later be extended by RankBrain’s focus on user intent. SEO still has rules, but those rules are changing to be about matching site experience and content to the device a user is on at a specific point in their personal journey through search.

Critical Render Path – What is a “fast” page?

“…think broadly about how performance affects a user’s experience of their page and to consider a variety of user experience metrics.” – Google Webmaster Central Blog

“…fast pages … have been set based on human perceptual abilities…” – About PageSpeed Insights

Here’s an age-old SEO checkbox that Google has reformed: technical performance. Page speed has been a topic of focus for SEOs for some time. But the more recent change is that, instead of looking at total page load time (all assets called by the page), we’re now placing increasing emphasis on how a user actually experiences a page load – how long it takes to see “above the fold” elements and start using the page.
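
You can watch those user-perceived milestones directly in the browser. A minimal sketch using the standard PerformanceObserver API – it logs first-paint and first-contentful-paint rather than total load time:

<script>
// Log the paint milestones a user actually perceives, in milliseconds.
new PerformanceObserver(function (list) {
  list.getEntries().forEach(function (entry) {
    console.log(entry.name + ': ' + Math.round(entry.startTime) + 'ms');
  });
}).observe({ type: 'paint', buffered: true });
</script>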

“In the most basic terms, critical render path refers to the series of events that must take place to render, or display, the initial view of a webpage.” In a practical definition by Ilya Grigorik, optimizing the critical render path is the process of “prioritizing the display of content that relates to the current user action.” So, how do you test your site pages to ensure that very process is happening? Luckily for SEOs, Google introduced a new tool in 2016 called Lighthouse – a step above the historical page speed analysis, and now the go-to testing tool for the mobile-first index. Search Engine Journal recently provided an excellent guide to understanding the performance metrics of Lighthouse and using them to improve the overall quality of the webpage experience for the end user.
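
If you’d rather script audits than run them from DevTools, Lighthouse is also available as a command-line tool (assuming Node.js is installed; the URL is a placeholder):

npm install -g lighthouse
lighthouse https://www.example.com --only-categories=performance --view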

A few key practices for optimizing the critical render path:

1. Lazy Loading: Picture an e-commerce site with product pages that house over 100 product images, or a publisher’s site with several rich image and video elements throughout any one of its articles. While the user may, at some point, engage with those page elements by scrolling down their screen, there’s also a chance they never see them. Loading all of those assets up front not only slows down initial page load time, but also drives up device data usage, ultimately creating a negative user experience. Enter the “lazy load” technique, which “defers loading of non-critical resources at page load time, and instead loads them at the moment of need.” In practice, a “placeholder” version of an image may be presented initially, with the full image appearing once the user has scrolled to that portion of the page. Consider initiating a Lighthouse performance audit to identify potential candidates for lazy loading. See below:
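
A minimal sketch of the technique using the IntersectionObserver API – the file names are placeholders, and a tiny blurred thumbnail typically stands in for placeholder.jpg:

<img class="lazy" src="placeholder.jpg" data-src="product-photo.jpg" alt="Product photo">

<script>
// Swap in the real image only when it nears the viewport.
var lazyImages = document.querySelectorAll('img.lazy');
var observer = new IntersectionObserver(function (entries, obs) {
  entries.forEach(function (entry) {
    if (entry.isIntersecting) {
      entry.target.src = entry.target.getAttribute('data-src');
      obs.unobserve(entry.target); // load once, then stop watching
    }
  });
}, { rootMargin: '200px' }); // start fetching just before it scrolls into view
lazyImages.forEach(function (img) { observer.observe(img); });
</script>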

2. In-lining Above the Fold CSS, Defer the Rest: Before rendering a page’s content, the browser must process all style and layout information, which can often slow rendering down. One way to combat this is by inserting the critical CSS into the HTML document itself – if that CSS is small enough. By doing this, the browser only loads the styles required to make the “first screen view” (above the fold) work, deferring everything else. See below:

<style>
/* Critical, above-the-fold CSS goes here */
</style>
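
The rest of the stylesheet can then be loaded without blocking the first paint. One widely used pattern (the file name is a placeholder): browsers fetch print stylesheets at low priority without blocking rendering, and the onload handler promotes the sheet to all media once it arrives.

<link rel="stylesheet" href="non-critical.css" media="print" onload="this.media='all'">
<noscript><link rel="stylesheet" href="non-critical.css"></noscript>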

3. In-lining Above the Fold JavaScript, Defer the Rest: Though not quite as simple to execute, it’s also possible to defer the loading of JavaScript – meaning you can allow a webpage to fully load before loading external JS. Instructions can be found here.
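
A minimal sketch of one common deferral pattern – attach the external script only after the load event fires (deferred.js is a placeholder):

<script>
// Wait until the page has fully loaded, then fetch non-critical JS.
window.addEventListener('load', function () {
  var el = document.createElement('script');
  el.src = 'deferred.js';
  document.body.appendChild(el);
});
</script>

For scripts that can simply run after parsing, the built-in defer attribute on the script tag achieves much the same effect with less code.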

In tandem with the above techniques, our teams are also beginning to test template-based CSS and JS – for instance, one bundle for category pages, one for product pages, one for blog pages and one for homepages – in an effort to further minimize “load bloat”.
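
In practice, that can be as simple as each template referencing only its own bundle (file names hypothetical):

<!-- product template -->
<link rel="stylesheet" href="/css/base.css">
<link rel="stylesheet" href="/css/product.css">

<!-- blog template -->
<link rel="stylesheet" href="/css/base.css">
<link rel="stylesheet" href="/css/blog.css">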

RankBrain – Want to build pages “for users”? For which users? On what device? Looking for what? 

RankBrain is one of Google’s top three ranking factors but – given sub-topics like user intent and artificial intelligence – it remains the most foreign to site owners and traditional SEO practitioners. When asked about RankBrain, Gary Illyes’ perhaps over-simplified explanation was,

RankBrain is a PR-sexy machine learning ranking component that uses historical search data to predict what would a user most likely click on for a previously unseen query…

The “historical” part glosses over the importance of semantics and machine learning’s use of structured data in understanding a vague query (yeah, keyword stuffing and one keyword per page are dead), but it does point us to an important way this SEO ranking factor is all about meeting users’ needs: RankBrain relies on historical user interactions with content to determine what should appear in future search results. How so?

1. CTR from SERPs: In a 2015 tweet, Danny Sullivan shared confirmation from former Googler Udi Manber that Google does, in fact, watch clicks to evaluate results quality and rankings in the SERP. In essence, Google wants to learn what’s successful on the results page by watching how users interact with content – and, more specifically, when the signals say, “this result no longer belongs in the top spot”. Title tags and meta descriptions are the old, mechanical SEO rule, but they’re being put to new, user-centric use: attracting clicks from the SERP, where CTR is the ranking signal, not so much the keywords present in the tags.
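
A hypothetical sketch of the difference – the second pair gives a searcher a reason to click rather than just repeating the keyword:

<!-- Old, mechanical: keyword present, no reason to click -->
<title>Running Shoes | Example Store</title>
<meta name="description" content="Running shoes. Buy running shoes at Example Store.">

<!-- User-centric: same keyword, plus a claim worth clicking on -->
<title>Running Shoes – Free 60-Day Returns | Example Store</title>
<meta name="description" content="Compare 120+ running shoes by cushioning, drop and price. Free next-day delivery and 60-day returns.">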

2. Pogo-Sticking: Though Google fervently denies that they explicitly use “pogo-sticking” as a ranking signal, we suspect that at the very least, they are always monitoring the behavior, which may in turn influence results. Pogo-sticking happens when a user clicks a result, goes to a website or landing page, and almost immediately returns to the search results page – theoretically indicating that the website or landing page content didn’t match with the actual intent behind the search.  (From the site-owner’s side, you’d call these bounces back to the SERP, but Google can’t depend on site-side analytics to monitor this factor so, instead, they watch their own pages for users to leave and quickly return.)

3. Intent: Overall, deciphering a user’s intent and matching content to that intent is one of the key functions of RankBrain. If a user is on mobile and wants to “find a plumber”, the intent calls for largely local results. Further (back to Mobilegeddon), Google looks for sites to make it easy for mobile users to find local contact information. RankBrain is how Google “pays attention” to the language, the device and the search history a searcher is using to determine what to return on the results page – whether that’s local, transactional or informational – and whether to serve a direct answer, a local pack, a paid ad or an opportunity to refine the query through organic faceting. I don’t know of any traditional, mechanical SEO skill that is analogous to this ranking factor. It represents a search engine that has moved beyond a purely math-based algorithm, through mechanical compliance with rules based on user needs, and graduated to content marketing and user experience-based factors. Is RankBrain really an “SEO ranking factor”? I don’t think you can call it that.

To reinforce the complexity of RankBrain, consider how the algorithm must think through searches for coupon codes: the user queries for coupon codes, sees and clicks through to a coupon site, finds their code and then goes back to the search results or to another site very quickly – even though their needs have been fulfilled. RankBrain cannot have a universal set of rules like the ones SEO is used to.

Where’s this leave us? 

So what’s my point?  Other than highlighting the evolution of Google’s algorithm, we have three takeaways:

1. SEO can and should still follow “mechanical” rules, but should understand that those rules aren’t about beating a math formula; they’re about satisfying users, so perceived load time and bounce rate are of preeminent importance.

2. Long-form content and even ecomm landing pages should explore the user’s side of the experience. I don’t mean A/B testing button color; I mean developing content that fills the needs, answers the questions or highlights the unique selling propositions of your site’s products or services. This is natural language – not sales or “SEO copy”. ~shudders~

3. Everything Google is doing with their algorithm today is focused on voice search. How have I reached that conclusion? Other than the fact that this is where the search market is headed (according to multiple sources, voice searches are predicted to exceed other methods of search by 2020), satisfying a voice search query means a search engine must do exactly the type of language- and user-intent interpretation we’re seeing in play with RankBrain and search features like Direct Answers. “Feed” RankBrain today and you’ll win in voice search tomorrow.
