Using the Structured Data Highlighter in Webmaster Tools
Jun 05

I've never really used this lovely tool in Webmaster Tools all that much, but I decided to invest some time in it to see what it can do. I'm actually quite impressed: I used it on my personal blog to see what would happen and how my pages would end up looking in the search results as a result of using this feature.

So what is the Data Highlighter? It's a nifty tool in Google Webmaster Tools that allows you to implement Schema.org markup without having to edit the underlying code on your website. You essentially tell Google that a certain part of your page corresponds to a particular Schema.org property. Here's what it allows you to define: title, author, date published, image, category, average rating, and rating votes.

How do I use it? The tool lets you define each of the above Schema.org properties by highlighting parts of your page directly in Google Webmaster Tools. Click "Data Highlighter" and it will take you to a page where you'll need to click "Start Highlighting". Once you click this button, type in the URL you want to mark up with structured data. I picked a page on my blog that had a bit of Schema.org markup but was not showing what I wanted in the SERPs: a review of a Chinese restaurant in Westfield Shopping Centre in Stratford, for which I had star rating, review date, and reviewer fields filled in at the bottom of the article.

Tag these properties with the Data Highlighter by holding down the left mouse button and dragging your cursor over the text you would like to highlight. Then right-click and it will give you options to tag what you just highlighted – as you can see from what I've done here. Hit publish and simply wait for Google to update with this newly crafted information.

Here's the result, and as you can see I have the stars set up in my listing along with all the other cool features. Google will also use the page you've tagged to learn how similar pages on your site are structured, and I have a similar Schema setup on a different page. Here's another review that I did that is now showing star ratings.

*Note: I realise that the "Review by" bit is off; this is simply because I entered the wrong information in the wrong field when setting it up on that blog post.

Google will also tell you, directly from Webmaster Tools, what is showing up with Schema markup in the SERPs: I...
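For context, the Data Highlighter is essentially a point-and-click substitute for adding the markup yourself. Here is a minimal sketch of what the equivalent hand-coded Schema.org microdata might look like for a review post, covering roughly the same fields the tool lets you tag; all names, dates, paths and values below are placeholders for illustration, not taken from my actual template:

```html
<!-- Illustrative sketch only: hand-coded Schema.org microdata covering roughly the
     same fields the Data Highlighter exposes (title, author, date published, image,
     category, average rating, rating votes). All values are placeholders. -->
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Review: A Chinese Restaurant in Westfield, Stratford</h1>
  <p>By <span itemprop="author">Author Name</span> on
     <time itemprop="datePublished" datetime="2014-06-01">1 June 2014</time>
     in <span itemprop="articleSection">Restaurant Reviews</span></p>
  <img itemprop="image" src="/images/restaurant.jpg" alt="The restaurant">
  <div itemprop="aggregateRating" itemscope itemtype="https://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars from
    <span itemprop="ratingCount">6</span> votes
  </div>
</article>
```

The appeal of the Data Highlighter is that you get roughly this outcome without touching a template file, which is handy on sites where you can't easily edit the theme.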

Read More
Halifax Bank Google Penalty Lifted (?)
May 26

At the end of January, Halifax Bank had a partial penalty imposed on its website, which saw a massive visibility drop across many of its product categories for many highly searched generic terms. The only untouched category, from what I could see using the finance keywords that I'm tracking, was mortgages, where they saw no visibility drop.

Graphs of Halifax.co.uk's SEO/organic visibility

You can see from the following graph the keyword groups for which they dropped in rank after Google had evidently identified the link schemes/widgets that were being used to enhance their search visibility. These graphs are intended only to paint a picture of a signal change within Google's algorithm. A more recent graph paints a different picture: as you can see, a change occurred on the 8th of May – well before Google's recent announcement that it had launched Panda 4.0, which is suspected to have been launched on the 18th of May. A SearchMetrics visibility graph also confirms they've seen a visibility increase (before any mention of Panda 4.0, actually – confirming the validity of my graphs above).

So, Halifax has essentially been able to get out of their bind in a matter of four months. From what I've monitored since they took the hit, they took down the site that hosted their widgets, which essentially made the widgets look like this (i.e. a CSS-less and image-less widget). This is what the widget looked like before:

A smart move: being able to control the way the widget looks

Essentially, a very smart move on their part was to host elements of the widget on their own servers. So when push came to shove, they could simply remove the CSS/styling, and the now-unstyled widget would no doubt induce website owners to remove it in its entirety from their own websites.

To take just one example of a keyword they've seen a rise in rankings for: "loans calculator". When they were penalised, that keyword dropped off the face of the earth. Halifax.co.uk now ranks 3rd for the term, which has a total search volume of 70,000+ searches per month.

So, have they been de-penalised? One would assume so with these results. However, they are still not back to where they originally were before they were penalised. It's something no doubt they are working on very carefully...
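To illustrate that design choice, here is a purely hypothetical embed snippet of the kind a partner site might paste in; the domain, URLs and markup are invented for this example and are not Halifax's real widget code. Because the stylesheet is served from the widget owner's own domain, deleting that one file strips the styling on every site that embedded it, while the bare HTML and its link remain:

```html
<!-- Hypothetical third-party widget embed (example domain, not Halifax's actual code).
     The stylesheet lives on the widget owner's server, so removing that single file
     instantly leaves an unstyled block of text and links on every embedding site. -->
<div class="loan-calculator-widget">
  <link rel="stylesheet" href="https://www.example-bank.co.uk/widgets/loan-calc.css">
  <h3>Loan Calculator</h3>
  <p>Work out your monthly repayments with our
     <a href="https://www.example-bank.co.uk/loans/calculator/">loans calculator</a>.</p>
</div>
```

Site owners left with the ugly, unstyled remnant are then far more likely to delete the whole embed themselves, which removes the offending link along with it.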

Read More
Google Releases New Panda 4.0 Algorithm Update
May 21

Matt Cutts recently announced on his Twitter feed that Google would be rolling out Panda 4.0. However, it's believed that Panda 4.0 was actually rolled out earlier – likely over the weekend or on the 19th, according to many of the tools that I subscribe to. The latest update affects ~7.5% of English-language queries, which is a noticeable change that even everyday users of Google would likely notice. Matt Cutts later retweeted an article on Search Engine Land detailing that Google was also rolling out an updated Payday Loans algorithm that would target "spammy search queries."

A great resource if you'd like to see the timeline of Panda updates is Search Engine Roundtable, which lists all of the Panda-related updates. Additionally, feel free to read this guest post written by Amy Harris in January this year called "6 SEO Trends for 2014 by Amy Harris". It's really relevant, as she talks about how you should be looking at your on-page content to see if there is actually enough content on the page – essentially showing Google that you are an authority on the subject you're writing about.

What is Panda? Panda mainly looks at content, whether that be duplicated content, thin content, or generally low-quality content. It's heavily content-focused, unlike its sister, Penguin, which looks specifically at the quality of backlinks pointing to a site. These are essentially variations of the specific, big algorithm updates that Google rolls out every so often to keep SEOs, such as myself, on their toes.

What have I noticed? The update definitely seems to have favoured rich content. I've seen my almost 2,000-word article on what I think are the "Best SEO Tools" increase in rank by four positions – from 10th to 6th. Although not scientifically proven to be Panda, I haven't actually updated this site in a while and I haven't made any recent on-site changes that would have affected rankings for that article for the keyword "best SEO tools" (720 searches per month on Google US and 210 searches per month on Google UK). The only thing I can put this down to is the Panda update.

Has eBay been penalised? From the looks of it, eBay has also suffered as a result of this update. This story did make the rounds around my office today. Pete Meyers of Moz.com wrote a brilliant analysis of this on the Moz blog. Further to this, Rishi Lakhani of RefuGeeks did some very interesting analysis into why eBay may have been penalised. Could it have been site architecture? It's looking like that may be the case. So, those that...

Read More
Blogger Outreach is like Online Dating
Apr 27

Well, it's more like matchmaking. If your degree is in creative writing then you're likely to be attracted to someone who likes reading, writing and doing creative things. You aren't likely to fall for someone who is the exact opposite and doesn't like any of these things. It's not natural, and this is what Google looks at and applies weight to. If your client is, for example, in the fashion industry, then you'll want to connect with bloggers in the fashion niche, and not blogs about car finance, insurance or a plethora of other niches. Not only will you be connecting directly with your audience, but you'll be doing something that actually makes sense. It's a no-brainer.

MajesticSEO & Dixon Jones @ BrightonSEO

Last week I attended an SEO event in the UK called BrightonSEO, a yearly event hosted in the lovely seaside town of Brighton. Dixon Jones from MajesticSEO gave a presentation entitled "Do links still matter in 2014?" It was an excellent presentation, backed with data and a new MajesticSEO feature that categorises websites' backlink profiles. For example, if we look at Toyota UK's website purely based on backlinks, you'll find that its 'topical trust flow' is mostly based around websites about "Recreation/Autos". I believe this essentially means that Toyota's main pool of backlinks is sourced from motor enthusiasts or many of the popular motoring advice forums out there.

This tool essentially allows you to see what a site is about — based purely on links. It doesn't look at the website itself, but at the links associated with it. This paints a picture of what Google's spiders might be seeing when they go through the indexation process. So if Google knows what sort of sites are linking to Toyota, then why wouldn't it use this as a factor in ranking a website for motor-related terms? Why would Google rank your website higher for a motor-related term when the sites linking to you aren't about cars/motors? I feel it's all about relevancy, and this will prove vital for many websites in the future. It's essentially why you should be targeting websites that are actually related to your own; and don't ever forget the quality of that website, or even the links pointing to that website – are they relevant? I mean, you wouldn't date someone who is interested in American football when you are interested in (rest-of-the-world) football, would you? Or would you. 🙂

Do links really matter anymore in 2014? In my notes during Dixon Jones' talk I wrote down that Dixon said the following about links: 'Of course links...

Read More
SEMrush Site Audit Tool
Apr 21

I've recently been using the new Site Audit Tool that SEMrush has launched, which is actually quite powerful and, I would say, very comparable to another tool called DeepCrawl. The unfortunate thing is that the tool is still in beta, which for some reason means you are unable to export the site audit reports to Excel or any other format. The fact that you can't export this data is actually quite annoying, as I'm having an issue with a client where Screaming Frog won't, for some reason, tell me whether my client's website's external images have alt text defined, as they are hosted on Amazon's content delivery network (CDN). I don't understand why this is an issue for Screaming Frog, and perhaps there's something I am missing.

Update: You can indeed identify alt text on images hosted on Amazon or any other content delivery network (CDN) via Screaming Frog; the option is just not where you'd expect it to be. I messaged Dan Sharp, the founder of Screaming Frog, via Twitter, and he promptly sent me a response linking to the FAQ that highlights how you can do this.

Anyway, before I derail, let's get back to the Site Audit Tool! Here you can see the overview tab:

Duplicate Content
This would be a great feature, but I'm afraid it's just not accurate enough. If there were 223 duplicate pages, I'm pretty sure I would have already worked that out without the use of this tool (BrightEdge, for example, identifies duplicate content issues), but it has reported back quite a few pages that aren't similar or duplicated at all. To give the feature a little credit, it has found some pages that are genuinely duplicated, but the majority it flags are not.

Duplicate Title
The Site Audit Tool has found titles that are duplicated because of parameters appended to the URLs. With this information I can make changes to robots.txt, or use Google Webmaster Tools, to block these parameterised URLs (see the robots.txt sketch below).

External Links Broken
This is quite useful; however, it unfortunately has the same problem as what I've described in the "Internal Links Broken" section. If the URL being linked to externally has a space somewhere in it, SEMrush's Site Audit Tool seems to cut the URL off at the space in the file name and count it as a broken link. If you take a look at the screenshot above, there are 103 listed broken links, but only one of those links is actually broken — making this feature not very useful.

Internal Links Broken
It gives you...
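On the duplicate titles caused by URL parameters, here is a minimal robots.txt sketch of the kind of rule I'd consider adding; the parameter name is a placeholder, not one the audit actually flagged:

```
# Hypothetical example only: stop Googlebot crawling URLs that carry a given
# parameter. "sessionid" is a placeholder parameter name for illustration.
User-agent: *
Disallow: /*?sessionid=
Disallow: /*&sessionid=
```

The alternative mentioned above is Google Webmaster Tools' URL Parameters feature, which tells Googlebot how to treat a given parameter without blocking it outright; bear in mind that a robots.txt disallow stops crawling but doesn't necessarily remove URLs that are already indexed.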

Read More