Can low quality parts of your site impact other areas of your site?
Jun 28


I brought this question up with John Mueller, Google's Webmaster Trends Analyst, who hosts a weekly Webmaster Hangouts video where he answers webmaster-related questions. I was going to attend one of those hangouts; however, if you want an answer from Google, you can simply create threads on the Google Webmaster Central forum, ask your questions there, and PM them directly to John Mueller. Not a lot of people know that.

How did John respond to this question? His response was just as I had thought. It's obvious, but it's good to qualify and confirm it: Google does look at websites on a domain level, and if one section of your site has low-quality content on it, that may impact the rest of your website, regardless of whether the quality of those other pages is of a high standard. JohnMu said:

"In general, if you host the content on your site, and it's indexable, then Google will count that as a part of your site's content. If you're blindly posting user-generated content, then that can certainly affect how Google's algorithms view your site overall. We have a few tips in our help center on this topic: [on user-generated spam] and [on ways to prevent comment spam]. In general, I'd recommend making sure that you're comfortable hosting what you're providing to search engines (and users!), and if you feel that the overall quality isn't what you'd like it to be, then take appropriate measures to either restrict low-quality content from being published, or to prevent it from being indexed."

I've responded and am attempting to discover whether the core algorithmic updates this year (the Quality Update and, very recently, the largest updates recorded by MozCast and Algoroo in June) have given more weight to domain-level signals. John has said in the past that their algorithm attempts to process websites on a page-by-page level, but clearly updates such as Panda have sitewide implications. I believe, at the moment, that Google's core updates of May 3rd and the 16th and 17th of June quite possibly put more weight on sitewide quality aspects.

Looking at examples of websites that suffered after these updates gives a good indication: the sorts of sites that did suffer were sites that find it difficult to control the quality of the content that gets uploaded to their site, due to user-generated content. An example of where this has occurred is HubPages.com. To quote the CEO of Hub Pages:

22% of Google search traffic disappeared! — Paul Edmondson
http://pauledmondson.hubpages.com/hub/May-Day-2015-Google-Update

Hub Pages suffered as...
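To make the last part of John's advice concrete: one way to keep low-quality user-generated content out of the index while still publishing it for users is to serve a noindex robots meta tag conditionally. Below is a minimal sketch, assuming a hypothetical per-page quality score; the Flask app, the UGC_PAGES store, and the threshold are illustrative assumptions, not anything John or Google prescribes.

```python
from flask import Flask, render_template_string

app = Flask(__name__)

# Hypothetical store of user-generated pages: slug -> (title, body, quality score).
# In practice the score might come from moderation queues, spam filters, or
# engagement data; here it's an assumption for illustration.
UGC_PAGES = {
    "useful-guide": ("A useful guide", "Well-researched body text...", 0.9),
    "thin-page": ("Buy cheap stuff", "One spammy line.", 0.2),
}

QUALITY_THRESHOLD = 0.5  # assumed cut-off below which the page is noindexed

TEMPLATE = """<!doctype html>
<html>
<head>
  <title>{{ title }}</title>
  {% if noindex %}<meta name="robots" content="noindex">{% endif %}
</head>
<body>{{ body }}</body>
</html>"""

@app.route("/hub/<slug>")
def ugc_page(slug):
    title, body, score = UGC_PAGES[slug]
    # The page is still published for users, but low-quality entries are
    # flagged noindex so they don't count towards the domain's content.
    return render_template_string(TEMPLATE, title=title, body=body,
                                  noindex=(score < QUALITY_THRESHOLD))

if __name__ == "__main__":
    app.run()
```

For non-HTML resources, the same effect can be achieved with an X-Robots-Tag: noindex HTTP header.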

Read More
Large Google Update on the 16th of June?
Jun 17


There have been talks of small-scale and large-scale Google updates occurring in June, with webmasters reporting that they've seen either a drop or an increase in rankings in their respective niches. The SEO weather tools have identified big changes in the SERPs in June.

Tool 1: MozCast http://mozcast.com/

Moz has its own tool called MozCast, which attempts to identify turbulence in Google's algorithm by assessing the volatility of a sample of 1,000 keywords. Its most recent report shows that on Tuesday, the 16th of June, there was major turbulence that pushed the MozCast metric to 102°. To put this in perspective, the first ever Penguin update clocked in at 93.1°.

Tool 2: Algoroo https://algoroo.com/

Another tool, Algoroo, samples 17,000 keywords and looks for fluctuations in a similar way to MozCast. It is also reporting a high amount of SERP flux and volatility in the same period as MozCast. If we look at when Google's Phantom II / Quality Update occurred, we can see that the recent spike on the 16th of June shows a far larger fluctuation in the SERPs than the Google Quality Update of the 3rd of May.

Google's Response

Google have confirmed that this is neither a Panda, Penguin, nor HTTPS update. It was thought that this could have been an HTTPS update because Wikipedia recently (12th of June) moved their entire website to HTTPS/SSL. With a large percentage of Wikipedia pages prominently displayed in the top 10 search results, this could have caused the fluctuations the tools picked up as Google indexed the HTTPS version of the site. Pete Meyers at Moz explains this in more depth, here. Pete asked Google's Webmaster Trends Analyst, Gary Illyes, whether an HTTPS update could have caused these high fluctuations. He wouldn't go into further detail and responded on Twitter stating:

At the time of writing, further details explaining the extent of this change haven't been made available by Google. However, they did confirm to Search Engine Land that this was not a Panda update, stating: "This is not a Panda update. As you know, we're always making improvements to our search algorithms and the web is constantly evolving. We're going to continue to work on improvements across the board."

The reason there was speculation around a Panda update is that Gary Illyes announced on June 2nd at SMX Advanced that webmasters should expect a Panda refresh within the upcoming weeks. At the same time, he announced that...
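Neither MozCast nor Algoroo publishes its exact formula, but the underlying idea, comparing day-over-day rankings for a fixed keyword sample and scaling the movement to a temperature, can be sketched. The function below is an illustrative assumption of how such a flux metric might work, not either tool's actual methodology; the drop penalty and the 70° calibration are invented for the example.

```python
# Illustrative sketch of a MozCast-style "weather" metric: compare today's
# top-10 results against yesterday's for a sample of keywords and turn the
# average movement into a temperature-like score. All scaling is assumed.

DROP_PENALTY = 10  # assumed cost when a URL falls out of the top 10


def keyword_flux(yesterday: list[str], today: list[str]) -> float:
    """Average absolute rank change for one keyword's top-10 URLs."""
    total = 0
    for old_pos, url in enumerate(yesterday):
        if url in today:
            total += abs(today.index(url) - old_pos)
        else:
            total += DROP_PENALTY  # URL disappeared from the top 10
    return total / len(yesterday)


def serp_temperature(serps: dict[str, tuple[list[str], list[str]]],
                     baseline: float = 2.0, normal_temp: float = 70.0) -> float:
    """Scale average flux across the keyword sample so that a 'normal'
    day's movement reads as roughly 70 degrees (assumed calibration)."""
    avg = sum(keyword_flux(y, t) for y, t in serps.values()) / len(serps)
    return normal_temp * avg / baseline


# Example: two results swapped positions and one dropped out of the top 10.
sample = {
    "yorkshire pudding": (
        ["a.com", "b.com", "c.com", "d.com", "e.com"],
        ["b.com", "a.com", "c.com", "f.com", "e.com"],
    ),
}
print(f"{serp_temperature(sample):.1f}\N{DEGREE SIGN}")  # -> 84.0°
```

A calm day, where most URLs hold their positions, stays near the baseline temperature; a day like the 16th of June, with URLs dropping out of the top 10 across many keywords, spikes well above it.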

Read More
Why has Search Interest in Free Web Hosting Declined Over Time?
Jun 15


I used to be quite big in the free web hosting industry. After all, I ran my own free web hosting company from 2007 to 2009. Today, I thought I'd check up on a few forums that I used to post on, which were essentially where I used to get my leads. Back then, for me at least, it wasn't about making money. It was about providing a reliable web hosting service and supporting it in any way that I could. It became my first ever project and passion. You'd think that, being an SEO, I got most of my leads from Google, but I didn't. I got my leads by signing up on relevant forums and contributing to discussions to make a name for myself.

While I was in my senior years at secondary school, I tried to run this free web hosting service. It was amazing at first, because I had a forum community that backed me at the time: voluntary staff to whom I'd give free web hosting in return for their help. The business model was to make money from Google AdSense by basically forcing community members to add my Google AdSense ads to their websites. For the majority of the time, I was working at a garden centre every weekend, trying to pay for the upkeep of two dedicated servers running cPanel/WHM. I was paying up to $500 per month at one point. With the GBP worth double the USD, it was affordable and doable, as I rented those servers in the US.

Free Web Hosting Downward Trend

So, that was then. The industry was on a downward trend according to the following Google Trends graph, but at the time my web hosting service was averaging 30,000 unique visitors per month. That was significant enough for me. In the chart below I looked at the top three 'free web hosting' related terms, and as you can see, by May 2015 interest had dropped to almost insignificant levels in comparison to the 2004–2009 period.

Why has search interest died for 'free web hosting'?

Free web hosting used to be quite popular, as did free forum creation services, which started to sprout up from 2005 onwards. I was a really big fan of Invision Free in 2005, a massive forum creation service that allowed anyone to create discussion boards on any subject. I remember that, at the time, Invision Free had around 100 active users on their support forum concurrently. If we look now, there are only...

Read More
Adding sub-directories to Google Webmaster Tools
Feb 15


Google Webmaster Tools (GWT) is a powerful tool that's completely free. It gives you keyword data, page data, the ability to mark up pages via the data highlighter, who's linking to you, and a whole host of other features aimed at providing more insight into organic search and your website. There are limitations, however, that become much more apparent with larger websites that have various directories driving high amounts of traffic from organic search. Unfortunately, Google clumps all this data together in the [search queries] report, so you're left guessing which part of the site ranks for a specific term. The [top pages] report in GWT isn't much help either, as you can only view pages one by one and can't download keyword data for each page if you want a holistic view.

However, to fragment this information you can actually set up profiles in Google Webmaster Tools for specific sub-directories. So, in the example I'm about to show for my travel blog, I've set up a profile in GWT for a specific blog post and for a specific sub-directory. For a small site this is completely unnecessary, as the information is easy to sort; but for sites obtaining traffic from tens of thousands of keywords, this might give you more data about specific sub-directories on your website. Let's take a look at the results…

If you look below, you can see that I've got my main GWT account under 'jargoned.com'. That houses all of my data. The 'jargoned.com/featured' profile houses only data for that sub-directory, so any articles that share the same URL path will also have data listed under that profile. Finally, in 'jargoned.com/featured/things-to-do-in-taichung1' you'll find only data about that specific post, plus anything after that URL path (if there is anything).

What does that look like?

Sub-directory data

Single blog post data

There's much more than this…

- Links to your site [sub-directory].
- Most linked-to content within a sub-directory.
- Internal links.
- Crawl errors.
- Device usability by directory or page.
- Manual actions on a sub-directory level.
- And more.

For larger sites this data can be invaluable, as you'll likely have different sections of the site run by 'journey managers' who may find it very useful. Larger sites usually segment data via Google Analytics, so why not do the same in Google Webmaster Tools? Hopefully this was useful!...
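If you'd rather script this segmentation than register a profile per sub-directory, the Search Console (Webmasters v3) API can filter query data by page path. Below is a minimal sketch assuming google-api-python-client and pre-configured OAuth credentials; the property URL and path are my examples, and the exact request shape should be verified against the current API documentation.

```python
# Sketch: segment search query data by sub-directory via the Webmasters v3
# API instead of registering each sub-directory as its own property.
# Assumes google-api-python-client and OAuth credentials are already set up.
from googleapiclient.discovery import build


def queries_for_subdirectory(credentials, site_url, path_prefix,
                             start_date, end_date):
    service = build('webmasters', 'v3', credentials=credentials)
    request = {
        'startDate': start_date,
        'endDate': end_date,
        'dimensions': ['query', 'page'],
        # Restrict rows to pages under the given sub-directory.
        'dimensionFilterGroups': [{
            'filters': [{
                'dimension': 'page',
                'operator': 'contains',
                'expression': path_prefix,
            }]
        }],
        'rowLimit': 5000,
    }
    response = service.searchanalytics().query(
        siteUrl=site_url, body=request).execute()
    return response.get('rows', [])


# e.g. rows = queries_for_subdirectory(creds, 'http://jargoned.com/',
#                                      '/featured/', '2015-01-01', '2015-02-01')
```

The same filter, pointed at a single post's path, reproduces the per-post profile described above without any extra verification steps.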

Read More
How Aunt Bessie’s Redesign has Ruined Their Search Visibility
Nov 04


Aunt Bessie's is a well-known UK brand that produces processed frozen foods, and they are best known for their Yorkshire Pudding range. In October, Aunt Bessie's website was updated, and from the looks of it, it was updated with interactivity and design in mind only. This has led to a fall in Aunt Bessie's search visibility, as there has been no thought given to SEO. One glance at their SEO Visibility in Search Metrics reveals quite a shocking drop:

This drop in visibility occurred throughout September and October. You might be thinking that either the Panda 4.1 or Penguin 3.0 update caused this. However, I ruled that out once I had identified what they had done when they re-designed their website. The technical elements (redirects) have played a large part in the demise of their visibility. It also hasn't helped that they've deleted pages and not replaced them with like-for-like pages.

What Keywords Dropped?

| Keyword | URL | Pos. | Trend | Search Volume |
| --- | --- | --- | --- | --- |
| yorkshire pudding | www.auntbessies.co.uk/products/yorkshire-puddings/home-bake-yorkshires/ | 16 | -6 | 66,850 |
| roast potatoes | www.auntbessies.co.uk/products/potatoes-1/homestyle-roast-potatoes/ | 13 | -1 | 30,191 |
| yorkshire puddings | www.auntbessies.co.uk/products/yorkshire-puddings/home-bake-yorkshires/ | 17 | -5 | 7,130 |
| yorkshire pudding mix | www.auntbessies.co.uk/products/store-cupboard/yorkshire-pudding-mix-1/ | 9 | -1 | 4,308 |
| mashed potatoes | www.auntbessies.co.uk/products/potatoes-1/homestyle-mashed-potato-1/ | 22 | -8 | 6,154 |
| rice pudding | www.auntbessies.co.uk/products/store-cupboard/classic-rice-pudding/ | 31 | -10 | 17,916 |
| spotted dick | www.auntbessies.co.uk/products/great-british-puddings-1/spotted-dick/ | 17 | -2 | 7,570 |
| dumplings | www.auntbessies.co.uk/products/savoury-additions/light-fluffy-dumplings/ | 28 | -2 | 11,133 |
| onion gravy | www.auntbessies.co.uk/products/store-cupboard/onion-gravy-granules/ | 30 | -5 | 12,441 |

If we look at the table above, Aunt Bessie's has dropped for these highly searched-for keywords, and their website re-design has been the culprit. If you go to any of those URLs, you'll find that all nine of them 301-redirect to one and the same location: http://www.auntbessies.co.uk/Product.

It's because they've mass-redirected everything to a single page that they've lost search visibility. They've not created pages to replace the old ones; instead, they've built one HTML5 interactive website that tries to cater for multiple landing pages.
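A redirect pattern like this is easy to audit with a short script. Here's a minimal sketch using the requests library to follow each old product URL and report its final destination; the URL list is a sample from the table above, and the script is my illustration rather than anything from Search Metrics.

```python
# Sketch: follow 301 redirects for a list of old URLs and report where
# each one lands, to spot mass redirects to a single catch-all page.
import requests

OLD_URLS = [
    "http://www.auntbessies.co.uk/products/yorkshire-puddings/home-bake-yorkshires/",
    "http://www.auntbessies.co.uk/products/potatoes-1/homestyle-roast-potatoes/",
    "http://www.auntbessies.co.uk/products/store-cupboard/yorkshire-pudding-mix-1/",
]

destinations = {}
for url in OLD_URLS:
    resp = requests.get(url, allow_redirects=True, timeout=10)
    destinations[url] = resp.url  # final URL after following all redirects

# If every old URL resolves to the same page, link equity and relevancy
# from the old pages are being funnelled into one generic landing page.
unique_targets = set(destinations.values())
if len(unique_targets) == 1:
    print(f"All {len(OLD_URLS)} URLs redirect to: {unique_targets.pop()}")
else:
    for src, dst in destinations.items():
        print(f"{src} -> {dst}")
```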
On top of the redirects, there is no relevancy on the destination page: there is next to no textual copy, and when it comes to metadata they aren't clearly defining pages. In effect, they aren't helping Google at all with defining their own pages, which in turn is impacting how Aunt Bessie's website ranks.

All Title Tags are the Same

If we take a closer look at their page title tags, one of the most important SEO elements, they've effectively duplicated the title tag throughout their entire website. Every one of the following addresses carries the same page title, "Aunt Bessies | Home Page":

- http://www.auntbessies.co.uk/Product/great-british-puddings/morello-cherry-pie
- http://www.auntbessies.co.uk/Recipe/70
- http://www.auntbessies.co.uk/Recipe/73
- http://www.auntbessies.co.uk/Product/store-cupboard
- http://www.auntbessies.co.uk/Recipe/72
- http://www.auntbessies.co.uk/Recipe/71
- http://www.auntbessies.co.uk/Product/potatoes
- http://www.auntbessies.co.uk/Recipe/74
- http://www.auntbessies.co.uk/Promises
- http://www.auntbessies.co.uk/promises/board-packaging
- http://www.auntbessies.co.uk/
- http://www.auntbessies.co.uk/promises/eggs
- http://www.auntbessies.co.uk/promises/meat
...
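Duplicate title tags like these are easy to detect in bulk. Below is a minimal sketch using requests and BeautifulSoup to fetch a set of pages and group them by title tag; the tooling choice and the page list are my assumptions for illustration.

```python
# Sketch: detect duplicate <title> tags across a set of pages.
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

PAGES = [
    "http://www.auntbessies.co.uk/",
    "http://www.auntbessies.co.uk/Recipe/70",
    "http://www.auntbessies.co.uk/Product/potatoes",
]

titles = defaultdict(list)
for url in PAGES:
    soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")
    title = soup.title.get_text(strip=True) if soup.title else "(no title)"
    titles[title].append(url)

for title, urls in titles.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} on {len(urls)} pages:")
        for u in urls:
            print(f"  {u}")
```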

Read More
BrightEdge Data Cube Time Machine
Oct 18


9SHARESFacebookTwitter I’ve been a user of BrightEdge for around 18 months now and at the beginning of this year I started to distance my ‘liking’ for the platform over other tools such as Search Metrics that provide a more holistic view of a website’s performance. When you’re tracking a defined list of keywords, you can very easily not pick up on successes of other keywords that may have started ranking as a result of your hard work. I always thought that it would be too expensive to track every keyword that a website ranks for until I learned that another SEO company in London had built its own keyword tracking tool that was literally tracking millions of keywords. This is exactly why I’m impressed with the launch of Data Cube – a new feature in BrightEdge that opens up their large database of “1 billion keywords and 150 billion URLs” of which they have converted into information that is actionable. In March 2014, BrightEdge launched a platform called ‘Data Cube‘, which makes use of the massive amounts of data that BrightEdge collects from search engines. BrightEdge says that it digs through over 100 terabytes worth of data, which equates to 1 billion keywords and a 150 billion URLs (as pointed out above). This data is stored on a month to month basis, so you can go back and compare results vs historic records. BrightEdge has named this new feature (historic comparisons) ‘Data Cube Time Machine’. How does it work? What does it actually do? Data Cube – Performance for my ‘Best SEO Tools’ article If I want to see how well my ‘Best SEO Tools’ article is doing and what keywords it is ranking for then you simply paste the URL into the search field and it will look at where that page ranks in the SERPs: Not doing too badly. I do however rank in 6th position for ‘best seo tools’ in the US search results, which has a search volume of 720. It would be pretty neat if you could add another column for rankings and search volume for US and other market keyword positions and search volume. Data Cube Time Machine – Performance for my ‘Best SEO Tools’ article With ‘Data Cube Time Machine’ you can see the performance of an entire website or even a specific page. In the example above, I went from 2 keywords ranked on page 1 in April (for the ‘Best SEO Tools’ article only) to around 6 keywords ranked on page 1 in August. That’s progress! What is Data Cube Score? Data Cube Score is a really simple metric and is based...

Read More
Page 2 of 11