They have asked for rollout to be delayed in order to fully verify that the terms are not in breach of EU data protection laws.
“If our engineers can see that people are consistently clicking on the top result for any given query, they know they are doing something right. If people are hitting ‘next page’ or typing in another query, they know they’re not delivering the results that people are looking for, and can then take action to try and improve the search algorithms.”
From a search engine optimisation (SEO) perspective, this may well be the most transparent Google has ever been about utilising click-through rate (CTR) data in its natural search algorithm, something I have long suspected.
- Using microformat markup to create “rich snippets” that enhance your search engine result page (SERP) listing with additional information such as product listings, comments, post counts, ratings and reviews
- Optimising your title and meta description so that they are compelling and readable, not just keyword rich
- Having a clean, concise URL structure so that users can see from the display URL that the page they are going to is the right one
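To illustrate the first two points, here is a minimal, hypothetical product page sketch: schema.org microdata that can surface as a star-rating rich snippet, alongside a title and meta description written to be read rather than stuffed with keywords. The product name, rating values and copy are all invented for the example.

```html
<head>
  <!-- A compelling, readable title and meta description, not just keyword-rich -->
  <title>Acme Widget Review: Hands-On After 30 Days</title>
  <meta name="description"
        content="We tested the Acme Widget for a month. Here is what worked, what didn't, and whether it's worth buying.">
</head>
<body>
  <!-- Microdata that can be displayed as a rating rich snippet in the SERP -->
  <div itemscope itemtype="https://schema.org/Product">
    <span itemprop="name">Acme Widget</span>
    <div itemprop="aggregateRating" itemscope
         itemtype="https://schema.org/AggregateRating">
      Rated <span itemprop="ratingValue">4.5</span>/5
      based on <span itemprop="reviewCount">27</span> reviews
    </div>
  </div>
</body>
```

The markup adds no visible clutter for users, but gives the search engine structured values it can pull straight into the listing.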
There’s an interesting link between Google using CTRs in its algorithm and the cyclical rank fluctuations I saw in an in-house experiment (what we termed “Algorithm Testing or User Testing Fluctuations”). My theory was that Google tests how well users respond to a particular page ranking in the search results by artificially improving its ranking for intermittent periods and observing user signals such as CTR, bounce rate and engagement time. Google’s transparency over its use of CTR adds a lot of credibility to this theory.