Halifax Bank Google Penalty Lifted (?)
May 26


At the end of January, Halifax Bank had a partial penalty imposed on its website, which saw a massive visibility drop across many of its product categories for many highly searched generic terms. The only untouched category, from what I could see using the finance keywords that I'm tracking, was mortgages, where they saw no visibility drop.

Graphs of Halifax.co.uk's SEO/organic visibility

You can see from the following graph the keywords/groups for which they dropped in rank after Google had evidently identified the link schemes/widgets being used to enhance their search visibility. These graphs are intended only to paint a picture of a signal change within Google's algorithm.

A more recent graph paints a different picture. As you can see, a change occurred on the 8th of May – well before Google's recent announcement that it had launched Panda 4.0, which is suspected to have rolled out on the 18th of May. A SearchMetrics visibility graph also confirms they've seen a visibility increase (before any mention of Panda 4.0, in fact – confirming the validity of my graphs above).

So, Halifax has essentially been able to get out of their bind in a matter of four months. From what I've monitored since they took the hit, they took down the site that hosted their widgets, which essentially left the widgets CSS-less and image-less, compared with the fully styled widget they offered before – a smart move that let them control the way the widget looks.

Essentially, a very smart move on their part was to host elements of the widget on their own servers. When push came to shove, they could simply remove the CSS/styling, which would no doubt induce website owners to remove the widget in its entirety from their own websites (a sketch of this embed pattern follows this excerpt).

To take just one example of a keyword they've seen rise in the rankings: "loans calculator". When they were penalised, that keyword dropped off the face of the earth. Halifax.co.uk now ranks 3rd for that term, which has a search volume of 70,000+ searches per month.

So, have they been de-penalised? One would assume so, given these results. However, they are still not back to where they originally were before they were penalised. It's something they are no doubt working on very carefully...
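To illustrate that embed pattern (purely a sketch with hypothetical URLs and class names, not Halifax's actual widget code), the trick is that the stylesheet and script live on the widget owner's own domain, while only bare HTML sits on the host site:

```html
<!-- Hypothetical embed snippet a third-party site owner would paste in;
     the domain, paths and class names are illustrative only. -->
<div class="loan-calculator-widget">
  <!-- The styling and behaviour are served from the widget owner's servers,
       so deleting those two files reduces the widget to bare, unstyled HTML. -->
  <link rel="stylesheet" href="https://widgets.example-bank.co.uk/calculator.css">
  <a href="https://www.example-bank.co.uk/loan-calculator/">Loan calculator</a>
  <script src="https://widgets.example-bank.co.uk/calculator.js"></script>
</div>
```

The followed link – the part carrying the SEO value – stays in the host page's markup, but once the CSS and images stop resolving, the widget looks broken enough that most webmasters will strip it out themselves.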

Google Releases New Panda 4.0 Algorithm Update
May 21


Matt Cutts recently announced on his Twitter feed that Google would be rolling out Panda 4.0. However, it's believed that Panda 4.0 was rolled out earlier; according to many of the tools that I subscribe to, it was likely released over the weekend, or on the 19th. The latest update affects ~7.5% of English-language queries, which is a noticeable change that even normal users of Google would likely notice.

Matt Cutts later retweeted an article on Search Engine Land detailing that Google was also rolling out an updated Payday Loans algorithm, which would target "spammy search queries." A great resource if you'd like to see the timeline of Panda updates is Search Engine Roundtable, which lists all of the Panda-related updates. Additionally, feel free to read the guest post written by Amy Harris in January this year, "6 SEO Trends for 2014". It's really relevant, as she talks about how you should be looking at your on-page content to see if there is actually enough content on the page – essentially showing Google that you are an authority on the subject you're writing about.

What is Panda?

Panda mainly looks at content, whether that be duplicated content, thin content, or generally low-quality content. It's heavily content-focused, unlike its sister, Penguin, which looks specifically at the quality of backlinks going to a site. These are essentially variations of the big algorithm updates that Google rolls out every so often to keep SEOs, such as myself, on their toes.

What have I noticed?

The update has definitely focused on rich content. I've seen my almost 2,000-word article on what I think are the "Best SEO Tools" increase in rank by 4 positions – from 10th to 6th. Although it's not scientifically proven to be Panda, I've not actually updated this site in a while and I haven't made any recent on-site changes that would have affected rankings for that article for the keyword "best SEO tools" (720 searches per month on Google US and 210 searches per month on Google UK). The only thing I can put this down to is the Panda update.

Has eBay been penalised?

From the looks of it, eBay has also suffered as a result of this update. This story did make the rounds around my office today. Pete Meyers of Moz.com wrote a brilliant analysis of this on the Moz blog. Further to this, Rishi Lakhani of RefuGeeks did some very interesting analysis behind why eBay may have been penalised. Could it have been site architecture? It's looking like that may be the case. So, those that...

Blogger Outreach is like Online Dating
Apr 27


Well, it's more like matchmaking. If your degree is in creative writing, then you're likely going to be attracted to someone who likes reading, writing and doing creative things. You aren't likely to fall for someone who is the exact opposite and doesn't like any of these things. It's not natural, and this is what Google looks at and applies weight to. If your client is, for example, in the fashion industry, then you'll want to connect with bloggers in the fashion niche, and not blogs about car finance, insurance or a plethora of other niches. Not only will you be connecting directly with your audience, but you'll be doing something that actually makes sense. It's a no-brainer.

MajesticSEO & Dixon Jones @ BrightonSEO

Last week I attended an SEO event in the UK called BrightonSEO, a yearly event hosted in the lovely beach town of Brighton. Dixon Jones from MajesticSEO gave a presentation entitled "Do links still matter in 2014?" It was an excellent presentation backed with data and a new MajesticSEO feature that categorises websites' backlink profiles. For example, if we look at Toyota UK's website purely based on backlinks, you'll find that its 'topical trust flow' is mostly based around websites about "Recreation/Autos". I believe this essentially means that Toyota's main pool of backlinks is sourced from motor enthusiasts or many of the popular motor-advice forums out there.

This tool essentially allows you to see what a site is about based purely on links. It doesn't look at the website itself, but at the links associated with that website. This paints a picture of what Google's spiders may be seeing when they go through the indexation process. So if Google knows what sort of sites are linking to Toyota, why wouldn't it use this as a factor in ranking a website for motor-related terms? And why would Google rank your website higher for a motor-related term when the site linking to you isn't about cars/motors?

I feel it's all about relevancy, and this will prove vital for many websites in the future. It's essentially why you should be targeting websites that are actually related to your own; and don't ever forget the quality of that website, or even the links pointing to that website: are they relevant? I mean, you wouldn't date someone who is interested in American football when you are interested in (rest of the world) football. Or would you? 🙂

Do links really matter anymore in 2014?

In my notes during Dixon Jones' talk, I wrote down that he said the following about links: 'Of course links...

SEMrush Site Audit Tool
Apr 21


7SHARESFacebookTwitter I’ve recently been using this new Site Audit Tool that SEMrush has launched that is actually quite powerful and I would say that is very comparable to another tool called DeepCrawl. The unfortunate thing about this tool is that it is still in beta mode and this for some reason means that you will be unable to export the site audit reports into Excel or any other format. The fact you can’t export this data is actually quite annoying as I’m having this issue with a client where Screaming Frog won’t for some reason tell me whether my clients website’s external images have alt-text defined, as they are hosted on Amazon’s content delivery network (CDN). I don’t understand why this is an issue for Screaming Frog, and perhaps there’s something I am missing. Update: You can indeed identify alt-text on images hosted on Amazon or any other sort of content delivery network (CDN) via Screaming Frog, this option is just not where you’d expect it to be. I messaged Dan Sharp via Twitter, the founder of Screaming Frog, and he promptly sent me a response linking to the FAQ where it highlights how you can do this. Anyways, before I derail let’s get back to the Site Audit Tool! Here you can see the overview tab: Duplicate Content This would be a great feature, but I’m afraid it’s just not accurate enough. If there were 223 duplicate pages then I’m pretty sure I would have already worked that out without the use of this tool (BrightEdge for example identifies duplicate content issues), but it has reported back quite a few pages that aren’t similar/duplicate at all. However to give this function a little credit it has found some pages that are duplicate, but the majority are not duplicate/similar pages. Duplicate Title SEMrush Site Audit Tool has found titles that have been duplicated due to parameters at the end of the URLs. With this information I can action changes to the robots.txt or via Google Webmaster Tools to block these URLs with parameters. External Links Broken This is quite useful, however, unfortunately has the same problem as what I’ve written in the “Internal Links Broken” section. If the URL that is externally being linked to has a space in there somewhere, SEMrush’s Site Audit Tool seems to cut this off at the space in the file name and count this as a broken link. If you take a look at the screenshot above, there are 103 listed broken links, but only one of those links are actually broken links — making this feature not very useful. Internal Links Broken It gives you...
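As a sketch of that robots.txt approach (the parameter names here are hypothetical, not ones reported by the audit), wildcard Disallow rules can keep crawlers away from the parameterised duplicates while the clean URLs stay crawlable:

```
# Illustrative robots.txt rules; "sort" and "sessionid" are made-up
# parameter names for the sake of the example.
User-agent: *
# Block any URL carrying these query-string parameters:
Disallow: /*?sort=
Disallow: /*&sort=
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

Worth noting: the * wildcard is honoured by Googlebot but isn't part of the original robots.txt standard, and the URL Parameters settings in Google Webmaster Tools (which the post also mentions) achieve much the same for Google specifically, without affecting other crawlers.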

Can Google Crawl Textual Content Within JavaScript
Apr 11


Google has become increasingly pragmatic when it comes to crawling textual content hidden within JavaScript. A perfect example of Google confirming this comes via its head of spam (or at least via a video of his that explains it all). Matt Cutts openly stated in a Google Webmaster Help video that you should not block Googlebot from crawling JavaScript or CSS; that was 2 years ago. Since then, Google has advanced the way Googlebot detects legitimate content on a page that might have content 'hidden' within JavaScript for the purposes of UX.

New …or Not So New Webmaster Help Video

Just 4 days ago, Matt Cutts released another video on this subject, in which he talked about content being hidden within JavaScript/AJAX for the sake of UX. In this video he stated that Google "has become pretty good at indexing JavaScript and being able to render this in to our search results." See the video below:

I do have to add, before I continue, that Matt Cutts posted this video 4 days ago but states in the video that it was "recorded on May 8th, 2013".

Misconception of JavaScript in SEO

There's a common misconception that Google cannot render anything in JavaScript, or that it's not best practice to have content hidden within JavaScript. But then you're thinking too much about optimising for search engines rather than for people. People don't want to read a full page of text (unless you're on Wikipedia via desktop) and would rather have content segmented, perhaps with the help of AJAX.

A good example of this, when it comes to mobile, is to have content segmented so that the person on mobile can easily navigate to the part of the content they wish to see. If you've ever browsed Wikipedia via mobile, you'll have seen that it uses AJAX to hide content, so that you don't have to wade through a massive blob of text to get to the part you want (a minimal sketch of this pattern follows this excerpt). This is all down to user design, and Google has obviously identified it as a common theme on many websites.

Using iframes to pull content externally into Lightboxes

What's interesting, for me at least, is that in the same video I mentioned above, Matt Cutts explained how Google is working on pulling in content via iframes, stating they are just a "couple months' away" from achieving this. Remember: this was on the 8th of May, 2013, and those 2 months have long gone. So I checked to see if this was the case, as 4 months ago a client for the agency I...
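As a minimal sketch of that Wikipedia-style collapsible pattern (the heading, ID and copy here are invented for illustration), the section text is present in the HTML source and merely hidden until tapped, so a crawler that renders JavaScript, or simply reads the raw markup, can still reach it:

```html
<!-- Illustrative collapsible section; the ID and text are hypothetical. -->
<h2 onclick="toggleSection('history')">History</h2>
<div id="history" style="display: none;">
  <p>This text ships in the page source but starts collapsed for mobile UX;
     it is hidden for readability, not to deceive the search engine.</p>
</div>
<script>
  // Show or hide a section when its heading is tapped.
  function toggleSection(id) {
    var section = document.getElementById(id);
    section.style.display = (section.style.display === 'none') ? '' : 'none';
  }
</script>
```

Content fetched only after page load via a genuine AJAX request is the harder case, and that is presumably what the rendering improvements Cutts describes are aimed at.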
