What Guest Posting Patterns Can Google Really Pick up on?
Mar 13

Guest posting is a way to increase recognition while driving traffic back to your blog. It’s a method that has proven effective for link-building and for increasing popularity. Based on these facts, bloggers – especially new bloggers – are often eager to find guest posting opportunities. They reach out, write posts and are published without a second thought. This could be detrimental given the advances in search engine technology and the patterns new algorithms can pick up on. Before starting a guest posting campaign, consider the facts below to ensure your attempt at driving traffic doesn’t backfire.

Repetition Can Work Against You

With the onset of new search engine technology, like Google’s Hummingbird algorithm, repetition can be penalized. If a byline is the same across multiple sites, search engines are likely to pick up on the trend, understand the SEO tactic at work and penalize it. Bylines that are the same, or too similar, create a footprint, which makes it easier to track information across the Internet. It can also create duplication issues for the sites you’re guest posting on – for example, if you use the same byline for five sites, Google may see those sites as linked, and penalize them as well as you.

To create guest post bylines, think about the information that matters for the blog you’re writing for and incorporate it. Use two or fewer links and consider taking advantage of Google+’s rel=author tool. Remember, variety is just as critical in bylines as in content.

Commercial Byline Links Are a Trend of the Past

In the past, using a commercial link, especially an exact-match keyword link, was typical of guest posts and generally accepted. In some cases, it was advantageous for link-building, but now the opposite is true. While some blogs still allow the practice, search engines tend to ignore such links. To maintain credibility, and to avoid possible search ranking penalties in the future, it’s best to do away with the practice now.

The Quality of the Site You’re Posting to Matters

Some blogs depend on guest posts entirely, while others have a core team and offer occasional guest posting opportunities. This matters. When a blog consists solely of guest posts, its purpose is probably to increase search rankings. Writing for a site like this could increase your visibility; however, it could also hurt your online reputation, especially where search engine crawlers are concerned. Of course, there are always a few exceptions – the Moz blog and Search Engine People are two examples that come to mind in the SEO niche. When posting to a...
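The “footprint” idea is easy to test for yourself. Below is a minimal sketch – my own illustration, not from the original post – that fetches a handful of pages and checks whether an identical byline string appears verbatim on each; the byline text and URLs are hypothetical placeholders.

    # Minimal sketch: check whether an identical byline appears verbatim across
    # several guest posts - the kind of footprint a search engine could detect.
    # The byline text and URLs below are hypothetical placeholders.
    import requests

    BYLINE = "Jane Doe is a freelance writer. Visit her blog for more SEO tips."
    GUEST_POST_URLS = [
        "https://example-blog-one.com/guest-post",
        "https://example-blog-two.com/guest-post",
        "https://example-blog-three.com/guest-post",
    ]

    matches = []
    for url in GUEST_POST_URLS:
        try:
            html = requests.get(url, timeout=10).text
        except requests.RequestException:
            continue  # skip sites that are down or unreachable
        if BYLINE in html:
            matches.append(url)

    if len(matches) > 1:
        print(f"Identical byline found on {len(matches)} sites - a detectable footprint:")
        for url in matches:
            print("  " + url)

Varying the wording (and the links) from site to site is precisely what breaks this kind of pattern.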

Read More
The Benefits of Ugly SEO
Mar 12

SEO is, essentially, an exercise in efficiency. Much of what we do hinges on ease of access for both the user and the multitude of crawlers that establish how “good” or “quick” a site is to use. This, one would think, couldn’t gel with an ugly site experience – tonnes of widgets, pictures and links plastered all over a site’s front page, usually there to stall load times and try a user’s patience while giving the designer a chance to bask in their own glory. However, there are many ways in which ugly SEO can greatly benefit a site, and my job here is to show you some methods I’ve seen that benefit SEO at what might look like the cost of the user; implemented well, they give SEOers and users alike something to think about.

Scrap the Flash

Install the Web Developer Tool. Then disable images and JavaScript and linearize the page. What you now see is, more or less, what Google sees too. So, what do you see? An abundance of text? Or a lack of it, due to overt use of images, Flash and so on? I took a look at Jonny’s ‘website’ (“vlexo.net”), below:

Dull, I’m sure you’ll agree (sorry Jon, you might want to work on that font), but something that can be easily analysed by all varieties of crawlers. This is not so for the majority of older sites, and especially when designers are on board things can get a little tricky. Flash is especially relevant today: the proliferation of mobile devices that can’t utilise Flash has likely set back older websites that feature much of it. I wouldn’t be surprised to hear of sites losing rank over overt Flash use, so try to minimise the amount of Flash or Java on your website. Unfortunately (for some), our friends at Google deem the use of images vital to relevancy, although of course they can’t quite distinguish between what we see and what they’re told we’re seeing. Regardless, stripping your site down to its component linear elements can give a great boost to both usability and crawlability (a word I’ve just coined). A great example of this is gov.uk, which uses the simplest fonts, colours and designs while providing useful and concise information. It may seem dull, but there’s little better.

Big Footers

I’m a big fan of this concept, which I recently discovered over here on SEW: essentially, create the biggest footer known to man – an internal-linking-site-map-all-within-a-footer, if you will. I’ve borrowed the image, but I hope it’s a tactic that will be...
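If you’d rather script the same check than use the browser toolbar, here is a minimal sketch – my own illustration, not from the original post – that strips scripts, styles and images from a page and prints the linearized text that remains, roughly what a text-only crawler parses. The URL is a placeholder, and it assumes the third-party requests and beautifulsoup4 packages are installed.

    # Minimal sketch: approximate what a text-only crawler sees by removing
    # scripts, styles and images and linearizing the text that remains.
    # Requires the third-party packages: requests, beautifulsoup4.
    import requests
    from bs4 import BeautifulSoup

    url = "https://example.com/"  # placeholder - point this at your own page

    html = requests.get(url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")

    # Strip the elements a crawler gets little or no text from; <object> and
    # <embed> are how Flash content is typically embedded.
    for tag in soup(["script", "style", "noscript", "img", "object", "embed"]):
        tag.decompose()

    # Linearize: one block of text per line, blank lines removed.
    lines = (line.strip() for line in soup.get_text("\n").splitlines())
    text = "\n".join(line for line in lines if line)

    print(text or "No crawlable text found - is the page mostly images/Flash?")

If the script prints next to nothing, you are looking at the same problem the mobile-Flash point above describes.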

Read More
Matt Cutts didn’t expect this response via Twitter (Scraper URL)
Feb 28

Google’s head of search spam, Matt Cutts, recently tweeted a link to a form that allows website owners to report scraper sites – sites that have copied content from other sites yet rank higher than the original source. Cutts is referring to website owners that scrape content from other websites and post that content on their own websites, which means that, in some cases, Google cannot tell which site is the original source.

Probably not what Cutts expected

.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f

— dan barker (@danbarker) February 27, 2014

To put it simply, someone has spotted that Google does exactly what a scraper does. It scrapes content from sites like Wikipedia and uses it for its semantic search functions, presenting the information directly in the SERPs – which, by that logic, means everyone should report Google for doing exactly what it is telling others to report. Whilst Google only extracts an excerpt of information from articles via Wikipedia, the irony is actually quite...

Read More
After just 10 days Rap Genius is back
Jan 04

It’s taken just 10 days for lyrics annotation website Rap Genius to be allowed back into Google’s search results. On 25 December 2013, Rap Genius was penalised by Google for engaging in “link schemes”. In a long-winded blog post in Rap Genius’ news section, they described how they managed to get back into Google.

They explained that they downloaded the RapGenius.com backlink profile and analysed the data to find websites that potentially violated Google’s guidelines. They then contacted those websites, asking the owners either to remove the links pointing to RapGenius or to add the rel=”nofollow” attribute to those links. Any links they could not get removed or changed were added to a disavow list, which was later submitted to Google (a sketch of the disavow file format follows below). They also delved into the more technical side of what they did to get back into Google’s search results, which included writing scripts to speed up the process of finding sites that were potentially in violation of Google’s guidelines.

However, the question remains whether they’ve regained their old rankings for the search terms that were most lucrative for them. Either way, it’s time for them to start analysing what their rankings look like now. Take a look at these comparison screenshots of the search results when they were first penalised on 25 December 2013 and their updated status on 4 January 2014.

When searching for “Rap Genius” on 25 December 2013

When searching for “Rap Genius” on 4 January 2014

I found this little nugget: Rap Genius appears to be back in the search results for Eminem’s song “I’m Back”. It’s listed in second place in the SERP after AZ Lyrics, one of the many other lyrics websites that Rap Genius tried to expose when they themselves were exposed.

In my opinion, they’ve managed to get back in the search results relatively quickly because they did quite a bit to find unnatural links – creating scripts and analysing their backlink profile – and were quite open throughout the whole process. If they did indeed communicate with Google throughout, then their response is one to mark as an example for other big websites that may in the future go through the same...
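For context, here is a minimal sketch of the kind of script Rap Genius describes – my own illustration, not their actual code. It reads an exported list of backlink URLs and writes a disavow file in the plain-text format Google’s Disavow Tool accepts (one URL or domain: entry per line, # for comments). The input file name and the spam-keyword heuristic are hypothetical.

    # Minimal sketch (not Rap Genius's actual scripts): turn an exported list of
    # backlink URLs into a disavow.txt in the plain-text format Google's Disavow
    # Tool accepts. "backlinks.txt" and the keyword heuristic are hypothetical.
    from urllib.parse import urlparse

    SPAM_HINTS = ("free-links", "seo-directory", "article-farm")  # hypothetical

    with open("backlinks.txt") as f:
        urls = [line.strip() for line in f if line.strip()]

    # Flag whole domains whose URLs match a spam hint; in practice each one
    # would first be contacted for removal or a rel="nofollow".
    flagged = sorted({urlparse(u).netloc for u in urls
                      if any(hint in u for hint in SPAM_HINTS)})

    with open("disavow.txt", "w") as f:
        f.write("# Links we could not get removed or nofollowed\n")
        for domain in flagged:
            f.write(f"domain:{domain}\n")

    print(f"Wrote {len(flagged)} domains to disavow.txt")

The real work, of course, is in the manual review and outreach; the disavow file is the last resort for links nobody will take down.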

Read More
Best SEO Tools & Resources You’ll Need in 2014
Jan 01

UPDATE: See my new post on the best SEO tools in 2015.

I’ve found lists of SEO tools that either aren’t very good or are outdated, so I thought I’d make my own list of the SEO tools I’m currently using – whether they’re extensions for Chrome/Firefox or independent software/programs – to help with your search engine optimisation needs. Here are my top 14 SEO tools for 2014:

1. Feed The Bot (on-page recommendations engine)

Feed The Bot is a website by Patrick Sexton. It is essentially a tool whereby you enter the URL of a page you want to check to verify whether it meets Google’s Webmaster Guidelines. Another feature lets you see whether your mobile SEO is up to scratch; it will crawl the page and check for the following issues:

- Mobile user experience problems
- Mobile performance problems
- Googlebot access / robots.txt
- Site load speeds & enhancement recommendations

It’s very useful for a quick snapshot of potential issues. And because mobile SEO is becoming a bigger and bigger issue, this tool will be essential for those that are not getting mobile right. I for sure need to look at the mobile UX on this website! 🙂

Website: Feed The Bot

2. URL Profiler

URL Profiler is an amazing tool that is suitable for anyone who runs a website, and you can use it in conjunction with Screaming Frog. I’ve used Screaming Frog to crawl a competitor’s website and then run those URLs through URL Profiler to get social metrics, such as Facebook and Twitter shares, along with backlink data for those respective pages. It can also give you the data you’d get from the PageSpeed Insights and Mobile-Friendly tools: whether your website is mobile-friendly, and the score out of 100 that Google gives when assessing your site’s mobile usability. It’s a great tool and comes at a really low cost. There’s a 14-day trial you can sign up for if you want to give it a go without having to pay.

Website: URL Profiler

3. Screaming Frog

I praise this program like no other; it’s a multi-purpose tool that will help with your SEO needs, primarily on-site optimisation. If you’re migrating your site (changing URL structures & planning 301 redirects), finding duplicate content, creating meta descriptions, getting an overview of a particular site, or creating technical SEO audits, then this tool is definitely for you (see the crawl-export sketch below for a taste of the workflow). There is a paid and a free version of this tool; however, the free version will...
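As an illustration of the Screaming Frog workflow mentioned above – not part of the original post – here is a minimal sketch that reads a crawl export CSV and flags pages with missing meta descriptions or over-long titles. The file name and column headers are assumptions about the export format.

    # Minimal sketch: scan a Screaming Frog crawl export for quick on-page wins.
    # The file name and column headers ("Address", "Title 1",
    # "Meta Description 1") are assumptions about the export format.
    import csv

    MAX_TITLE_LEN = 60  # rough display limit in the SERPs - an assumption

    with open("crawl_export.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            url = row.get("Address", "")
            title = (row.get("Title 1") or "").strip()
            meta = (row.get("Meta Description 1") or "").strip()

            if not meta:
                print(f"Missing meta description: {url}")
            if len(title) > MAX_TITLE_LEN:
                print(f"Title too long ({len(title)} chars): {url}")

The same export could just as easily feed URL Profiler for the social and backlink metrics described above.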

Read More
Page 1 of 4