I brought this question up with John Mueller, Google's Webmaster Trends Analyst, who hosts weekly Webmaster Hangouts videos in which he answers webmaster-related questions. I had planned to attend one of those hangouts, but there is a simpler route: create a thread on the Google Webmaster Central forum and PM your question directly to John Mueller if you want an answer from Google. Not a lot of people know that.
How did John respond to this question?
His response was just as I had expected. It's obvious, but it's good to have it qualified and confirmed: Google does look at websites on a domain level, and if one section of your site has low-quality content on it, that can impact the rest of your website, regardless of whether the quality of your other pages is high.
In general, if you host the content on your site, and it’s indexable, then Google will count that as a part of your site’s content. If you’re blindly posting user-generated content, then that can certainly affect how Google’s algorithms view your site overall. We have a few tips in our help center on this topic: [on user-generated spam] and [on ways to prevent comment spam].
In general, I’d recommend making sure that you’re comfortable hosting what you’re providing to search engines (and users!), and if you feel that the overall quality isn’t what you’d like it to be, then take appropriate measures to either restrict low-quality content from being published, or to prevent it from being indexed.
I've responded and am attempting to discover whether the core algorithmic updates this year (the Quality update and, very recently, the largest updates recorded by MozCast and Algoroo in June) have given more weight to domain-level signals. John has said in the past that their algorithms attempt to process websites on a page-by-page level, but updates such as Panda clearly have sitewide implications. My current belief is that Google's core updates of May 3rd and June 16-17 quite possibly put more weight on sitewide quality aspects.
Looking at examples of websites that suffered after these updates is a good indicator. The sorts of sites that did suffer were those that find it difficult to control the quality of the content uploaded to them, because that content is user generated.
An example of where this has occurred is with HubPages.com:
To quote the CEO of HubPages:
22% of Google search traffic disappeared!
— Paul Edmondson
HubPages suffered as a result of the February 24, 2011 Panda update (the first ever), which hit its SEO visibility far harder than the recent 2015 updates have. The site went to great lengths to control the sort of content it was publishing: noindexing newly registered users' content, making use of sub-domains, and employing people to rate the content going up on the site. Yet it still suffered as a result of the May Quality update.
Other examples of UGC websites impacted by the Quality update
Of course, it's all well and good showing one example of a UGC website that suffered as a result of the recent updates, so here's another: Answers.com. The Searchmetrics graph above for HubPages.com looks very similar to the graph below for Answers.com, wouldn't you say?
Both of these sites host plenty of very high-quality content, which likely saw good engagement from Google search. That didn't seem to matter, however, because Google rated these sites at a domain level, as John Mueller noted earlier in this post. It's a scary thought, but as he said, if you aren't comfortable hosting the content on your site, you should either 1) remove it or 2) noindex it. So if you do have user-generated content sections on your site, they should be monitored closely.
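If you go the noindex route for user-generated sections, the standard mechanism is a robots meta tag in the page head (or the equivalent X-Robots-Tag HTTP header for responses where you can't edit the markup). This is just an illustrative sketch; the idea that UGC pages get tagged this way is my suggestion, not something HubPages or Answers.com necessarily did:

```html
<!-- Placed in the <head> of each user-generated page you want kept out of Google's index -->
<meta name="robots" content="noindex">
```

The same directive can be sent server-side as an HTTP response header, `X-Robots-Tag: noindex`, which is handy when UGC lives under a path you can target with a server rule rather than a template change. Note that the page must remain crawlable (not blocked in robots.txt) for Google to see the directive.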
That's why it's important not to silo your efforts into specific sections of your site when parts of the site you aren't monitoring as closely could be dragging down the rest.