Archive for December, 2016

No More Ads on the Right Side of Google Search Results – except…

Google’s test goes worldwide

It was confirmed earlier this year: Google will stop showing ads on the right side of their search results, everywhere in the world.

The only places you will see paid ads are at the top and bottom of the search page.

This might tidy up the SERPs, but there are two exceptions to this hard-and-fast rule.

[Image: four ads at the top of a Google search results page]

Ah, but there’s more

Google instead now shows more ads at the top of the results, up to four, for “highly commercial queries”.

It has been confirmed by Google that this change is global. No more ads on the right hand side of search results, except for…

  • Product listing ad boxes. These are the ones where you search for, say, a camera and see models for sale right on the search page. They may still appear on the right and contain paid listings.
  • The knowledge panel. This very convenient Google feature can save you from having to click through to a website for the basic information you queried. It can also appear on the right, with ads included.

Google has been testing the four-ads-at-the-top layout since 2010, mostly in countries outside the US. The results, over six years of testing and tweaking, have convinced Google that this is the way to go worldwide.

Here’s what they officially have to say on the matter:

“We’ve been testing this layout for a long time, so some people might see it on a very small number of commercial queries. We’ll continue to make tweaks, but this is designed for highly commercial queries where the layout is able to provide more relevant results for people searching and better performance for advertisers.”

The “commercial queries” mentioned are the competitive, strong buyer-intent searches, such as “great gifts for christmas” or “the best life insurance”.

Effects on your marketing

If you’re running AdWords, then there’s a benefit in knowing that Google has tested this new layout extensively, and that it most probably generates better, more targeted click-through rates.

After all, this has been a six-year-long split test, one that only a business the size of Google could afford to carry out before committing to the change.

The lesson we can take from Google: it pays to test your design – thoroughly.

Extensive SE Study: Why You Still Need Backlinks

A huge study was done, and the results are remarkable

Our good friends over at Backlinko took it upon themselves to trawl data from search engine results.

They did a deep study of over 1 million search query results from Google itself.

1 million.

That’s how dedicated to SEO they really are. They did this in an attempt to “reverse engineer” Google’s search algorithms.

What does Google want to see in a website before ranking it at the top?

The results they got from the study are fascinating.

Backlinks, backlinks, backlinks

They found that inbound links (IBLs) remained a very strong ranking factor across the majority of the 1 million search results.

However, it’s not as simple as lots of IBLs = top Google positions. Domain diversity is far more important than raw quantity here.

The incoming links need to come from different, individual domains. The higher the authority of those domains, the more weight they lend your ranking.

If the domains are related to your niche, even better. There are also some domains that will boost your rankings regardless of niche: the mainstream giants such as CNN, the BBC, the Huffington Post, and so on.
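
To make the domain-diversity point concrete, here is a minimal Python sketch that counts referring hosts rather than raw link totals. The URLs are hypothetical stand-ins for an export from whatever backlink tool you use.

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical backlink URLs, e.g. exported from a backlink tool.
backlinks = [
    "https://www.bbc.co.uk/news/some-article",
    "https://www.bbc.co.uk/news/another-article",
    "https://www.bbc.co.uk/news/a-third-article",
    "https://blog.nichesite-example.com/camera-review",
    "https://www.cnn.com/2016/12/tech-story",
]

def referring_domains(urls):
    """Count links per host: diversity matters more than the raw total."""
    # Note: netloc keeps subdomains; a fuller analysis would collapse
    # blog.example.com and www.example.com into one registered domain.
    return Counter(urlparse(url).netloc for url in urls)

domains = referring_domains(backlinks)
print(f"{sum(domains.values())} links from {len(domains)} unique hosts")
for host, count in domains.most_common():
    print(f"  {host}: {count}")
```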

For a better overview of how to determine a quality backlink, read this post I made earlier.

Other nuggets from the study

Images, video, media.

The presence of at least one image on a page turned out to be a significant factor.

This is because Google considers content to be more engaging to the user if there is media present other than just the written word.

So the fact that Google likes to see multi-dimensional content should be borne in mind for on-page SEO.

The need for speed

Slow-loading pages rarely make it into the top results on Google. If your hosting provider can’t push out data fast enough when called upon, you will need to ditch it for a better (faster) service.

You might have everything else right about your SEO: good backlinks, great content, rich media, perfect keywords. But if your site is a slow loader, forget about your front-page dreams.
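
As a rough first check (no substitute for a proper performance audit), you can time a full page fetch in Python. This sketch assumes the third-party requests package, and example.com is a placeholder for your own site.

```python
import time
import requests  # third-party: pip install requests

def fetch_timing(url):
    """Return time-to-first-byte and total download time, in seconds."""
    start = time.perf_counter()
    response = requests.get(url, stream=True, timeout=10)
    ttfb = response.elapsed.total_seconds()  # time until response headers arrived
    _ = response.content                     # force the full body download
    total = time.perf_counter() - start
    return ttfb, total

ttfb, total = fetch_timing("https://example.com/")
print(f"TTFB: {ttfb:.3f}s, full download: {total:.3f}s")
```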

Security

HTTPS counts. Google gives a certain preference to secure websites. It is certainly an extra pain in the rear end to manage and keep up to date, but the general advice is that if you are starting up online, build in HTTPS.

Is it worthwhile to switch your whole operation over to HTTPS? The experts advise that it is not necessary for SEO purposes at the moment (but do it if you can).
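
If you want a quick way to see whether a site already serves, and redirects to, HTTPS, here is one possible check, again using the requests package (the domain is a placeholder):

```python
import requests  # third-party: pip install requests

def https_status(domain):
    """Report whether a plain-HTTP request ends up on HTTPS."""
    # requests verifies TLS certificates by default, so a broken
    # certificate raises an exception instead of passing silently.
    response = requests.get(f"http://{domain}/", timeout=10, allow_redirects=True)
    return response.url.startswith("https://"), response.url

secure, final_url = https_status("example.com")
print(f"Redirects to HTTPS: {secure} (final URL: {final_url})")
```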

User retention, no bouncing

Brian Dean has made an interesting video explaining how he discovered this independently of the big study. Due to a keyword mix-up, he shows how he was ranking for the “wrong” keyword for a particular post, as well as for the “right” one at the same time.

When users landed on the “wrong” page and quickly bounced away, his rankings for that keyword plummeted, while they steadily increased for the “right” keyword, where users found what they actually wanted and stuck around.

Here is that video:

Long content

Ever wondered why you never see an individual tweet in any search result, even though Twitter is easily one of the highest-authority domains on the entire planet? Even though some people can pack a lot of information into 140 characters, it is simply not long enough to show as a result.

You do, however, often see Pinterest or Tumblr posts on the front page, where there is enough content for a user to get stuck into.

The Backlinko study found that content of around 1,900 words generally ranked higher than shorter blocks of information.

Make sure your keywords match your topic

The keywords of a page should match the overall topic of that page. Google has the ability to scan for this and figure it out with a high degree of accuracy.

If you’ve used keywords in the usual places (title, headers, description, and so on), then the body of your content had better match the topic those keywords suggest.

This measure prevents sites from targeting popular keywords to pull in a bigger audience, then switching to a different topic once visitors arrive.

Basically, as should be obvious by now, that won’t work.
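
To get a feel for how a machine might check this match, here is a toy Python sketch comparing term-frequency vectors with cosine similarity. Google’s actual topic analysis is far more sophisticated, and the example texts are invented, but the intuition carries over: an off-topic body scores near zero against the page’s keywords.

```python
import math
import re
from collections import Counter

def tf_vector(text):
    """Lowercased word counts: a crude stand-in for real topic analysis."""
    return Counter(re.findall(r"[a-z']+", text.lower()))

def cosine_similarity(a, b):
    """1.0 means identical word distributions, 0.0 means no overlap."""
    dot = sum(a[word] * b[word] for word in a)
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

page_keywords = "best life insurance quotes"
on_topic_body = "Compare life insurance quotes to find the best policy and premium for your family."
off_topic_body = "Our new casino bonus codes give you free spins on every slot game this weekend."

for body in (on_topic_body, off_topic_body):
    score = cosine_similarity(tf_vector(page_keywords), tf_vector(body))
    print(f"{score:.2f}  {body[:40]}...")
```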

Google’s perfect web page

To sum up what Google wants to deliver to their users, we could say they like:

  1. very high quality content
  2. with pictures and interesting artifacts
  3. on a (secure) page which loads at lightning speed
  4. that has lots of backlinks from similar (or more authoritative) sites across a wide range of domains

If you can deliver this within your niche, then your site is in with a chance of hitting the front page. After that point, you just have to hope that your competition is not as diligent as you are.

How Artificial Intelligence has Switched-Up the SEO Game – More than you Imagined

You’d better have your intelligent head on, because you’ll need it to keep up with this post and stay on the sweet spot of the SEO curve.

Notice I say the “sweet spot” because you don’t want to be too far behind, e.g. spamming keywords (which used to work), nor too far ahead because you don’t know which direction search will take in the future. Get it wrong and you risk dropping off any and all indexes.

I don’t mean you will have to use all 180 of your IQ points while you read; I’m going to lay this out in a way that the average, switched-on human can fully understand.

Today we’re going to focus on artificial intelligence (AI from now on) and how recent advances in implementation have radically altered the way SEO works on the most cutting-edge search engines.

The use of AI in search engine technology is not particularly new. The very simplest idea of counting keywords in a document to score its relevance is actually a form of AI.

Now we have come much further and the game has changed. Let me bring you up to speed.

RankBrain

Google’s RankBrain is one of the newer AI algorithms currently deployed by our old friend, the big G.

It is far more advanced than a simple keyword counter and backlink checker. This system “thinks” about a website in a way loosely analogous to a human brain.

RankBrain relies heavily on neural networks.

AI has two categories: strong and weak. Strong AI is something that you might see in a science fiction movie, such as Ex Machina, 2001: A Space Odyssey, or JARVIS in Iron Man. These have intelligence capabilities that approach and even surpass human abilities.


Weak AI is pretty much what we have in real life. Everything from IBM’s Watson to contextual advertisements on web pages.

No surprise, then, that RankBrain falls into the weak AI camp, but it is several steps closer to the “strong” category than, say, a very good chess-playing program.

Since Google’s public announcement in early 2014 that they were buying the highly specialized AI company DeepMind, we can deduce the types of technology they are integrating into their platform.

It’s a very safe bet that backpropagating neural networks are being integrated into Google’s search process, as that is what DeepMind was particularly good at, producing almost scarily impressive machine learning results from its labs and think tanks.

Backpropagation

If you are not trained in the black arts of AI, you may well be wondering what in the sweet blue and purple is a “backpropagating neural network”?

A simplified explanation is as follows:

A neural network is built from units that take one or more inputs and either produce an output or not, based on whether the combined value of those inputs reaches a threshold (a predetermined cut-off value).

Bear with me, we’ll get to the relevance to SEO very shortly.

With backpropagation, the expected output is compared with the actual output. If the actual output is not what was expected, the weights between the inputs and the threshold are adjusted, working from output back to input, so the next pass produces a smaller error.

Read that last paragraph again carefully. When the concept clicks in your mind, you will see how the machine “learns” new information. Your own brain will have also gone through a very similar process.

[Diagram: error flowing backward through a neural network]
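
Here is a deliberately tiny Python sketch of the idea. A single sigmoid “neuron” learns the logical OR function by nudging its weights backward from the output error on every pass. Strictly speaking, backpropagation earns its name when errors flow back through hidden layers of many such units, but this single unit shows the core weight-update loop:

```python
import math
import random

# Training data for logical OR: inputs -> expected output.
examples = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]

def sigmoid(x):
    """Smooth threshold: output slides from 0 to 1 as inputs cross it."""
    return 1.0 / (1.0 + math.exp(-x))

random.seed(0)
weights = [random.uniform(-1, 1), random.uniform(-1, 1)]
bias = random.uniform(-1, 1)  # the (negated) threshold, learned like a weight
learning_rate = 0.5

for epoch in range(5000):
    for (x1, x2), expected in examples:
        actual = sigmoid(weights[0] * x1 + weights[1] * x2 + bias)
        error = expected - actual
        # The "learning" step: push each weight in the direction that
        # shrinks the output error on the next pass.
        gradient = error * actual * (1 - actual)  # sigmoid derivative
        weights[0] += learning_rate * gradient * x1
        weights[1] += learning_rate * gradient * x2
        bias += learning_rate * gradient

for (x1, x2), expected in examples:
    actual = sigmoid(weights[0] * x1 + weights[1] * x2 + bias)
    print(f"OR({x1}, {x2}) = {actual:.3f} (expected {expected})")
```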

What it means for SEO

It means the way current SEO experts have been evaluating Google’s algorithm changes has become almost obsolete.

So now, every time Google looks at a site, it learns something new about it: either how good the information within the content is, or how it relates to changes across all the other content in the same field.

RankBrain also constantly learns about particular search results.

Intelligent search engines no longer take the blanket view of ranking signals that may have prevailed in the past. By that I mean not every search will be judged by the same criteria.

For example, traditional SEO advice has been to optimize keywords, have great content and get as many high quality backlinks as possible, no matter what your site is about.

Sound advice indeed, but now the search engine may not give a damn about your backlinks, depending on the specific search carried out by the user, and where your content sits within their ecosystem.

On another type of site and/or search, backlinks may be more important than ever, but keyword density not so much.

Again, this depends on what the backpropagating algorithm has learned about that specific search, and possibly even on who is doing the searching.

Google never got rid of their pre-update algorithms

They have certainly maintained and updated those earlier core algorithms: the ones that count keyword density and inbound links, and read meta titles, tags, descriptions, and so on.

RankBrain uses all those “old” algorithms, plus the newly developed ones, on a case-by-case, search-by-search basis.

To stay ahead of the game, you will need to know how Google ranks your site, according to a particular search term, so that you know where to best invest resources.

This will mean a lot of testing and observing of results, as well as studying how high-ranking competing websites and pages are optimized, in order to glean some insight where possible.

If you are doing SEO, you will need to dissect these sites at every level to figure out what Google finds important for your target keyword.
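
As one possible starting point for that dissection, here is a Python sketch that pulls a few of the on-page signals discussed in the study above (title, meta description, headings, images, word count) out of a competitor’s HTML. It assumes the third-party requests and beautifulsoup4 packages, and the URL is a placeholder:

```python
import requests                # third-party: pip install requests
from bs4 import BeautifulSoup  # third-party: pip install beautifulsoup4

def on_page_signals(url):
    """Extract a handful of on-page ranking signals from one page."""
    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    description = soup.find("meta", attrs={"name": "description"})
    return {
        "https": url.startswith("https://"),
        "title": soup.title.get_text(strip=True) if soup.title else None,
        "meta_description": description.get("content") if description else None,
        "h1_headings": [h.get_text(strip=True) for h in soup.find_all("h1")],
        "image_count": len(soup.find_all("img")),
        "word_count": len(soup.get_text(" ", strip=True).split()),
    }

print(on_page_signals("https://example.com/"))
```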

Specialize

A short-term optimization that makes sense within this relatively new search paradigm is to dig deeper into your niche and not diverge from it.

RankBrain will be learning constantly about your niche and already has a profile of what it wants to put at the top of Google’s results pages for your target keyword.

Study what’s already there, emulate the consistencies that you find and make improvements wherever possible.

Competition for search engine rank has become orders of magnitude more focused.

This specialization extends to your backlink profile. RankBrain is certainly checking what types of backlinks are attached to top-quality websites, and you will need to emulate that profile to remain competitive.

The general advice here is to keep your backlinks topic related as well, so as not to be seen stepping out of line according to RankBrain’s expectations.

Where do we go from here?

RankBrain and similar machine learning algorithms represent a quantum leap in search engine evolution. Will we reach the point of hyper-intelligent machines functioning alongside us with god-like computing power? At this point, almost anything seems possible.

SEO is becoming more and more technical, specific, and specialized. Vague generalities in SEO are mere shots in the dark nowadays. SEO practitioners are going to have to swot up on the technical nature of artificial intelligence in order to stay relevant and competitive.

Each search term and keyword must now be analyzed individually, in isolation from the others.

Narrow-field niche sites are the logical choice for staying above water for the time being.

Those who can navigate this new environment will be in great demand. The computer scientists win again.
