My Take on Google Removing Authorship Photos
Jun 29


On the 25th of June, Google's John Mueller announced that they were going to remove the authorship pictures and Google+ circle counts from the search results page on a global scale. It's now the 28th of June (as I write this) and they have already implemented the change to the SERPs. The search results already look significantly different, although the casual user probably won't notice: this has really only affected article-related searches, not the product side of search.

So, why, Google? What gives?

Google has put this down to a visual design "clean up", and with Google talking a lot more about the growth of mobile, they are simply trying to cater to a less cluttered GUI... or so they say. There are some theories going around saying that the authorship snippet profile pictures were drawing too much attention away from Google's much more lucrative ads, and some even saying that they have removed the pictures because Google's attempt to force people to use G+ has been a failure.

Whether Google is telling the truth about why they removed it is all too difficult to conclude, as they could have done this for any number of reasons. Hell, perhaps those images were contributing to a slower load time for mobile users and were adding that extra bit of latency? Or perhaps their analysis shows that too many people were clicking on the actual profile pictures (leading to a G+ profile page) rather than the actual articles that people searched for. It could be anything!

In my humble opinion, multiple reasons have factored into this abandonment. I believe one of them is that this is just another part of a mobile SERP update, and that, to be consistent, Google has rolled out the same update on desktop and tablet. If you look into it a bit more, Matt Cutts, Google's head of search spam, stated at the recent SMX West conference that he "wouldn't be surprised" if mobile search surpassed desktop queries this year. And in 2010, Google's Eric Schmidt announced that Google would do everything via "mobile first". This could be fuelling these changes, and with Google wanting to be "consistent" across all platforms, they've applied this reasoning across all verticals.

It makes sense that they don't have just one reason for this move, but several. I believe it's based on these:

It's drawing attention away from PPC ads and gives more visibility to organic search results. (which by the way already...
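For background, the photos in question were driven by authorship markup linking an article to a Google+ profile. Here's a minimal sketch of the typical implementation of the era (the profile URL below is a placeholder, not a real profile):

```html
<!-- Typical rel="author" markup of the era: placed in the page <head>,
     pointing at the author's Google+ profile. Placeholder URL below. -->
<head>
  <link rel="author" href="https://plus.google.com/110000000000000000000"/>
</head>
```

Note that the change discussed here removed the photo from the SERP display; the markup itself remained valid at the time.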

Read More
Using the Structured Data Highlighter in Webmaster Tools
Jun 05


I've never really used this lovely tool in Webmaster Tools all that much, but I've decided to invest some time in it to see what results it brings. I'm quite impressed, actually. I've used it on my personal blog to see what would happen and how my pages would end up looking in the search results as a result of using this feature.

So what is the Data Highlighter?

It's a nifty tool in Google Webmaster Tools that allows you to implement Schema markup without having to edit the hard code on your website. You can essentially tell Google that a certain part of your site corresponds to certain Schema.org markup. Here's a list of what it allows you to define:

- Title
- Author
- Date Published
- Image
- Category
- Average Rating
- Rating Votes

How do I use it?

The tool essentially allows you to define each of the above Schema.org properties by highlighting parts of your page directly via Google Webmaster Tools. Click "Data Highlighter" and it will take you to a page where you'll have to click "Start Highlighting". Once you click this button, type in the URL you want to highlight with structured data. I picked a page on my blog that had a bit of Schema.org markup but was not showing what I wanted in the SERPs. It was a review of a Chinese restaurant in Westfield Shopping Centre in Stratford, for which I had star ratings, review date, and reviewer fields filled in at the bottom of the article.

Tag these properties up with the Data Highlighter by dragging your cursor, whilst holding down left click, over the text you would like to highlight. Then right click and it will give you options to tag what you just highlighted, as you can see from what I've done here:

Hit publish and simply wait for Google to update with this newly crafted information. Here's the result: as you can see, I have the stars set up in my listing along with all the other cool features:

Google will also use this tagging of one page to learn how your other pages work; I have a similar Schema setup on a different page, and here's another review that is now showing star ratings:

*Note: I realise that the "Review by" bit is off, and this is simply because I entered the wrong information in the wrong field when setting it up on that blog post.

Google will also tell you what is showing up with Schema markup in the SERPs directly from Webmaster Tools: I...
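For comparison, here's a minimal sketch of the kind of markup the Data Highlighter saves you from writing by hand, using Schema.org Review microdata. All names and values below are illustrative placeholders, not taken from the post in question:

```html
<!-- A minimal, illustrative Review snippet using Schema.org microdata.
     All names, dates, and ratings are placeholders. -->
<div itemscope itemtype="http://schema.org/Review">
  <span itemprop="name">Review of a Chinese restaurant in Stratford</span>
  by <span itemprop="author">Joe Bloggs</span>
  on <time itemprop="datePublished" datetime="2014-06-01">1 June 2014</time>
  <div itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
    Rating: <span itemprop="ratingValue">4</span> out of
    <span itemprop="bestRating">5</span> stars
  </div>
</div>
```

The Data Highlighter produces an equivalent structured-data signal for Google without any of these attributes ever appearing in your HTML.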

Read More
Halifax Bank Google Penalty Lifted (?)
May 26


At the end of January, Halifax Bank had a partial penalty imposed on its website, which saw a massive visibility drop across many of its product categories for many highly searched generic terms. The only untouched category, from what I could see using the finance keywords that I'm tracking, was mortgages, where they saw no visibility drop.

Graphs of Halifax.co.uk's SEO/organic visibility

You can see from the following graph the keyword groups for which they dropped in rank after Google had evidently identified the link schemes/widgets being used to enhance their search visibility. These graphs are intended only to paint a picture of a signal change within Google's algorithm.

A more recent graph paints a different picture: as you can see, a change occurred on the 8th of May, well before Google's recent announcement that it had launched Panda 4.0, which is suspected to have rolled out on the 18th of May. A SearchMetrics visibility graph also confirms they've seen a visibility increase (before any mention of Panda 4.0, which corroborates my graphs above).

So, Halifax has essentially been able to get out of their bind in a matter of 4 months. From what I've monitored since they took the hit, they took down the site that hosted their widgets' assets, which essentially left the widgets looking like this: (i.e. a CSS-less and image-less widget). This is what the widget looked like before:

Essentially, a very smart move on their part was to host elements of the widget on their own servers, giving them control over how the widget looks. So when push came to shove, they could simply remove the CSS/styling, which would no doubt induce website owners to remove the widget in its entirety from their own websites.

To take one example of a keyword they've seen a rise in ranking for: "loans calculator". When they were penalised, that keyword dropped off the face of the earth. Halifax.co.uk now ranks 3rd for that term, which has a search volume of 70,000+ searches per month.

So, have they been de-penalised? One would assume so with these results. However, they are still not back to where they originally were before they were penalised. It's something no doubt they are working on very carefully...
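As a rough illustration of the set-up being described (entirely hypothetical; the domains and file names are invented, not Halifax's actual embed code): the snippet given to site owners references a stylesheet and script hosted on the widget provider's own servers, so deleting that one stylesheet server-side instantly strips every embedded copy of its styling.

```html
<!-- Hypothetical embed code a site owner would paste in. The CSS and JS
     are served from the provider's domain, so the provider can change
     or remove the widget's appearance at any time without the site
     owner's involvement. -->
<div class="loan-calc-widget">
  <link rel="stylesheet" href="https://widgets.example-bank.com/widget.css"/>
  <script src="https://widgets.example-bank.com/widget.js"></script>
  <a href="https://www.example-bank.com/loans/">Loans calculator</a>
</div>
```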

Read More
Google Releases New Panda 4.0 Algorithm Update
May 21


Matt Cutts recently announced on his Twitter feed that Google would be rolling out Panda 4.0. It's believed, however, that Panda 4.0 was rolled out much earlier; it was likely released over the weekend or on the 19th, according to many of the tools that I subscribe to. The latest update affects ~7.5% of English-language queries, a noticeable change that even casual users of Google would likely notice. Matt Cutts later retweeted an article on Search Engine Land detailing that Google was also rolling out an updated Payday Loans algorithm that would target "spammy search queries."

A great resource if you'd like to see the timeline of Panda updates is Search Engine Roundtable, which lists all of the Panda-related updates. Additionally, feel free to read the guest post written by Amy Harris in January this year, "6 SEO Trends for 2014 by Amy Harris". It's really relevant, as she talks about how you should be looking at your on-page content to see if there is actually enough content on the page – essentially showing Google that you are an authority on the subject you're writing about.

What is Panda?

Panda mainly looks at content, whether that be duplicated content, thin content, or generally low-quality content. It's heavily content-focused, unlike its sister, Penguin, which looks specifically at the quality of backlinks going to a site. These are big, targeted algorithm updates that Google rolls out every so often to keep SEOs, such as myself, on their toes.

What have I noticed?

The update has definitely favoured rich content. I've seen my almost 2,000-word article on what I think are the "Best SEO Tools" increase in rank by 4 positions – from 10th to 6th. Although not scientifically proven to be Panda, I've not actually updated this site in a while and I haven't made any recent on-site changes that would have affected rankings for that article for the keyword "best SEO tools" (720 searches per month on Google US and 210 searches per month on Google UK). The only thing I can put this down to is the Panda update.

Has eBay been penalised?

From the looks of it, eBay has also suffered as a result of this update. This story did make the rounds around my office today. Pete Meyers of Moz.com wrote a brilliant analysis of this on the Moz blog. Further to this, Rishi Lakhani of RefuGeeks did some very interesting analysis of why eBay may have been penalised. Could it have been site architecture? It's looking like that may be the case. So, those that...

Read More
Blogger Outreach is like Online Dating
Apr 27


Well, it's more like matchmaking. If your degree is in creative writing, then you're likely going to be attracted to someone who likes reading, writing, and doing creative things. You aren't likely to fall for someone who is the exact opposite and doesn't like any of these things. It's not natural, and this is what Google looks at and applies weight to. If your client is, for example, in the fashion industry, then you'll want to connect with bloggers in the fashion niche, not blogs about car finance, insurance, or a plethora of other niches. Not only will you be connecting directly with your audience, but you'll be doing something that actually makes sense. It's a no-brainer.

MajesticSEO & Dixon Jones @ BrightonSEO

Last week I attended an SEO event in the UK called BrightonSEO, a yearly event hosted in the lovely seaside town of Brighton. Dixon Jones from MajesticSEO gave a presentation entitled "Do links still matter in 2014?" It was an excellent presentation backed with data and a new MajesticSEO feature that categorises websites' backlink profiles. For example, if we look at Toyota UK's website purely based on backlinks, you'll find that its 'topical trust flow' is mostly based around websites about "Recreation/Autos". I believe this essentially means that Toyota's main pool of backlinks is sourced from motor enthusiasts or many of the popular motoring advice forums out there.

This tool essentially allows you to see what a site is about based purely on links. It doesn't look at the website itself, but at the links associated with that website. This paints a picture of what Google's spiders may be seeing when they go through the indexation process. So if Google knows what sort of sites are linking to Toyota, then why wouldn't it use this as a factor in ranking a website for motor-related terms? Why would Google rank your website higher for a motor-related term when the sites linking to you aren't about cars/motors? I feel it's all about relevancy, and this will prove vital for many websites in the future. It's essentially why you should be targeting websites that are actually related to your own; and don't ever forget the quality of that website, or even the links pointing to that website: are they relevant? I mean, you wouldn't date someone who is interested in American football when you are interested in (rest of the world) football. Or would you? 🙂

Do links really matter anymore in 2014?

In my notes during Dixon Jones' speech I wrote down that Dixon said the following about links: 'Of course links...

Read More
Can Google Crawl Textual Content Within JavaScript
Apr 11


Google has become increasingly pragmatic when it comes to crawling textual content hidden within JavaScript. A perfect example of Google confirming this comes via its head of search spam (or at least via a video of his that explains it all). Matt Cutts openly stated in a Google Webmaster Help video that you should not block Googlebot from crawling JavaScript or CSS; that was 2 years ago. Since then, Google has advanced the way Googlebot detects legitimate content on a page that might have content 'hidden' within JavaScript for the purpose of UX.

New... or Not So New Webmaster Help Video

Just 4 days ago, Matt Cutts released another video on this subject, in which he talked about content being hidden within JavaScript/AJAX on the basis of UX. In this video he stated that Google "has become pretty good at indexing JavaScript and being able to render this in to our search results." See the video below:

I do have to add, before I continue, that Matt Cutts posted this video 4 days ago but states in the video that it was "recorded on May 8th, 2013".

Misconceptions about JavaScript in SEO

There's a common misconception that Google cannot render anything in JavaScript, or that it's not best practice to have content hidden within JavaScript. But that's thinking too much about optimising for search engines rather than for people. People don't want to read a full page of text (unless you're on Wikipedia via desktop) and would rather have content segmented, perhaps with the help of AJAX. A good example of this on mobile is to segment content so that the person on mobile can easily navigate to the part of the content they wish to see. If you've ever browsed Wikipedia via mobile, you'll have seen that it uses AJAX to hide content, so that you don't have to navigate through a massive blob of text to get to the part you want. This is all down to designing for the user, and Google has obviously identified this as a common theme on many websites.

Using iframes to pull content externally into lightboxes

What's interesting, for me at least, is the fact that in that same video Matt Cutts explained how Google is working on pulling content via iframes, stating they are just a "couple months away" from achieving this. Remember: this was on the 8th of May, 2013, and those 2 months have long gone. So I checked to see if this was the case, as 4 months ago a client for the agency I...
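As a rough illustration of the pattern being described (a sketch of my own, not taken from the article; all IDs and class names are invented): the section text ships in the initial HTML, so a crawler that renders the page can still read it, while a small script merely toggles its visibility for readers, much like Wikipedia's mobile sections.

```html
<!-- Collapsible section: the text is present in the HTML (crawlable),
     and JavaScript only toggles whether it is displayed. -->
<h2 class="section-toggle" data-target="history">History</h2>
<div id="history" style="display: none;">
  <p>This paragraph is in the initial HTML, so it can be indexed
     even though it starts out hidden from the reader.</p>
</div>
<script>
  // Show or hide a section when its heading is tapped/clicked.
  document.querySelectorAll('.section-toggle').forEach(function (heading) {
    heading.addEventListener('click', function () {
      var section = document.getElementById(heading.dataset.target);
      section.style.display = section.style.display === 'none' ? '' : 'none';
    });
  });
</script>
```

Contrast this with content fetched via AJAX only after a click: in that case the text is not in the initial HTML at all, which is exactly the situation where Google's rendering capabilities matter.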

Read More