SEO Tip of the Week: Onsite Trust Optimisation – Make it easy to read your content

In this edition of CalvinAyre.com’s SEO Tip of the Week, 90 Digital CEO Nick Garner talks about another important part of Trust Optimisation: making sure that users get what they expect to see.

Short intro to Trust Optimization.

Along with on-site optimization and link acquisition, there seems to be a third element to ranking well on Google: Trust Optimization. Trust Optimization is based on anecdotal evidence about click-through rates and rankings, along with explicit guidelines from Google stating what they are looking for in a trusted website.

On with the post…

In a previous post we talked about being able to access the main content of a page without too many ads getting in the way. Here we go into the disruptive and distracting elements on a page that get in your way.

In the Google Quality Rater Guidelines 2014, they say:

“• Ads and SC should be arranged so as not to distract from the MC—Ads and SC are there should the user want them, but they should be easily “ignorable” if the user is not interested.

• It should be clear what parts of the page are Ads, either by explicit labeling or simply by page organization or design.”

SC = Supplementary Content; MC = Main Content
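As a sketch of what “explicit labeling” and sensible page organization can look like in markup (the element structure, class names and the “Advertisement” label here are illustrative assumptions, not anything Google prescribes):

    <!-- Main content comes first and is clearly the focus of the page -->
    <main>
      <article>
        <h1>Page title</h1>
        <p>The main content the user actually searched for…</p>
      </article>
    </main>

    <!-- Ad slot kept visually separate and explicitly labelled -->
    <aside class="ad-slot">
      <span class="ad-label">Advertisement</span>
      <!-- ad creative goes here -->
    </aside>

The point is simply that a rater, or an algorithm, can tell at a glance which part of the page is main content and which part is advertising.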

It’s tied into trust once again, because users trust Google to deliver content that satisfies their search query. If you land on a page and the first thing you see is nothing other than an interstitial asking you to join a mailing list, you’re not going to be happy.


If Google are asking quality raters to account for this kind of thing, it means they are probably looking for it within their algorithm.

Of course, that’s just me speculating. But I think it’s a reasonable speculation to make. Why?

Google have been subtly but repeatedly asking webmasters to make sure Googlebot can access their CSS and JavaScript.


https://www.youtube.com/watch?v=B9BWbruCiDc

From Google:

“Posted: Monday, October 27, 2014

Webmaster level: All

We recently announced that our indexing system has been rendering web pages more like a typical modern browser, with CSS and JavaScript turned on. Today, we’re updating one of our technical Webmaster Guidelines in light of this announcement.

For optimal rendering and indexing, our new guideline specifies that you should allow Googlebot access to the JavaScript, CSS, and image files that your pages use. This provides you optimal rendering and indexing for your site. Disallowing crawling of Javascript or CSS files in your site’s robots.txt directly harms how well our algorithms render and index your content and can result in suboptimal rankings.”

http://googlewebmastercentral.blogspot.co.uk/2014/10/updating-our-technical-webmaster.html
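In robots.txt terms, the pattern Google is warning against is a blanket Disallow on the directories holding your stylesheets and scripts. A minimal sketch, assuming your assets live in /css/ and /js/ (placeholder paths; swap in your own):

    # robots.txt – hypothetical example
    User-agent: *
    # This is what Google warns against – it blocks rendering:
    #   Disallow: /css/
    #   Disallow: /js/
    # Instead, keep asset folders crawlable and only block what you must:
    Allow: /css/
    Allow: /js/
    Disallow: /admin/

By default anything not disallowed is crawlable, so simply deleting the offending Disallow lines is enough; the explicit Allow lines (a Google-supported extension) just make the intent unmistakable.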

My thoughts

I just get the sense Google are looking at user experience much more than ever before. If your page meets their criteria, and you have some links and the right key phrases on the page, I think you have a good chance of ranking better.

Recapping

Of course site owners have to run adverts; sometimes it’s how you pay the bills. The judgment you have to make is how far you go in disrupting users versus how much you care about traffic from Google.

Bringing this back to trust: if users trust you, you get repeat visits, and Google is seen as the good guy for sending them to your site in the first place.

Nick Garner


Nick Garner is the founder of 90 Digital, the well-known and respected iGaming search marketing agency.

Nick is obsessed with SEO and whatever it takes to rank sustainably on Google.