Best SEO Tools & Resources for 2015
Jul 08

2014 was so last year, so I thought I'd create a new post on the tools to use in 2015. Of course, preferred tools change all the time, and tools I used last year I may no longer use. Last year I created a list of 14 of the best SEO tools for 2014, and now I look back at some of them and wonder whether they are still as relevant or as important as they were. I've also changed jobs and moved from agency to in-house, so the dynamic has changed in that respect. 'Agency SEO' and 'in-house SEO' are vastly different in my opinion, but that's another subject for another time.

Introducing a twist to this article: I've included real-life examples and scenarios of where I've used each of these tools. Now, let's get on with the list of the best SEO tools so far in 2015:

1. URL Profiler

Website: URL Profiler

URL Profiler is by far one of the most useful tools I've come across in 2015. Its features are similar to those of another tool called Netpeak (free). I was introduced to Netpeak back in 2014 and was impressed with it, but I'm even more impressed with URL Profiler. It does come at a cost, but it's well worth it for the technical abilities you gain from using the tool. This is a definite MUST for agencies and anyone working in-house: it's useful for analysing both the websites you work on and competitor websites. At £9.95 per month for a solo licence, this tool is affordable for anyone doing serious SEO work on a website. The larger packages cost £12.95, £19.95, and £29.95, and allow you to use the software on more than one machine, with increased connection speeds and larger URL imports.
Real-life scenario with this tool: I used URL Profiler to do competitor analysis on Compare The Market and their new '2 for 1 cinema deals'. I wanted to find out how many websites, and which websites, were linking to CTM as a result of their new marketing giveaway. Ahrefs and Majestic were only showing 21 referring domains to the section of their site where the 2 for 1 deal exists as a landing page. However, I knew that they'd seen far greater exposure, so I used a Chrome plugin called OS Scraper to scrape relevant search results about...

Read More
SEMrush Site Audit Tool
Apr 21

I've recently been using the new Site Audit Tool that SEMrush has launched. It's actually quite powerful, and I would say it's very comparable to another tool called DeepCrawl. The unfortunate thing about this tool is that it is still in beta, which for some reason means you are unable to export the site audit reports into Excel or any other format. The fact you can't export the data is quite annoying, as I'm having an issue with a client where Screaming Frog won't tell me whether my client's website's external images have alt-text defined, because they are hosted on Amazon's content delivery network (CDN). I don't understand why this is an issue for Screaming Frog, and perhaps there's something I am missing.

Update: You can indeed identify alt-text on images hosted on Amazon or any other content delivery network (CDN) via Screaming Frog; the option is just not where you'd expect it to be. I messaged Dan Sharp, the founder of Screaming Frog, via Twitter, and he promptly sent me a response linking to the FAQ, which highlights how you can do this.

Anyway, before I derail, let's get back to the Site Audit Tool! Here you can see the overview tab:

Duplicate Content

This would be a great feature, but I'm afraid it's just not accurate enough. If there were 223 duplicate pages, I'm pretty sure I would have already worked that out without this tool (BrightEdge, for example, identifies duplicate content issues), but it has reported back quite a few pages that aren't similar or duplicate at all. To give this function a little credit, it has found some genuinely duplicate pages, but the majority are not duplicate or similar pages.

Duplicate Title

The SEMrush Site Audit Tool has found titles that are duplicated because of parameters at the end of the URLs. With this information I can make changes to the robots.txt, or use Google Webmaster Tools, to block these URLs with parameters.
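To illustrate the kind of parameter-driven duplication described above, here is a minimal Python sketch (purely illustrative, with made-up example URLs — this is not how SEMrush actually works) that groups URLs differing only by their query string, which is exactly the pattern that produces duplicated titles:

```python
from urllib.parse import urlsplit
from collections import defaultdict

def group_by_path(urls):
    """Group URLs that differ only by query-string parameters.

    Groups with more than one member are likely to share a title,
    since the underlying page (host + path) is the same.
    """
    groups = defaultdict(list)
    for url in urls:
        parts = urlsplit(url)
        groups[(parts.netloc, parts.path)].append(url)
    # Keep only the groups with parameter variants
    return {key: variants for key, variants in groups.items() if len(variants) > 1}

# Hypothetical crawl output for illustration:
urls = [
    "https://example.com/shoes",
    "https://example.com/shoes?sort=price",
    "https://example.com/shoes?utm_source=news",
    "https://example.com/about",
]
dupes = group_by_path(urls)
```

Each group this returns is a candidate for a robots.txt disallow rule on the parameter, or for URL parameter handling in Google Webmaster Tools.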
External Links Broken

This is quite useful; however, it unfortunately has the same problem as the one I describe in the "Internal Links Broken" section. If a URL that is externally linked to has a space somewhere in it, SEMrush's Site Audit Tool seems to cut the URL off at the space in the file name and count it as a broken link. If you take a look at the screenshot above, there are 103 listed broken links, but only one of those links is actually broken, which makes this feature not very useful.

Internal Links Broken

It gives you...
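The space-in-URL problem above is really a percent-encoding issue. As a small Python sketch (an illustration of how a crawler could avoid the bug, not a description of SEMrush's actual internals), encoding spaces in the path before testing the link means a file name like "my file.pdf" isn't cut off at the space:

```python
from urllib.parse import quote, urlsplit, urlunsplit

def encode_path_spaces(url):
    """Percent-encode unsafe characters (such as spaces) in the URL path,
    leaving the scheme, host, query string, and fragment untouched."""
    parts = urlsplit(url)
    return urlunsplit((
        parts.scheme,
        parts.netloc,
        quote(parts.path),   # "/docs/my file.pdf" -> "/docs/my%20file.pdf"
        parts.query,
        parts.fragment,
    ))

fixed = encode_path_spaces("https://example.com/docs/my file.pdf")
```

A crawler that normalises URLs this way before requesting them would report the link's real status instead of a false 404.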

Read More
Exciting New Release By BrightEdge: Data Cube
Mar 16

The new security enhancement feature that BrightEdge announced today is not as exciting as the new "Data Cube" feature, which allows BrightEdge customers to leverage the 100+ terabytes of data that BrightEdge processes each week. It's actually very similar to a feature in another platform called SEMrush, which I have recently been trialling on a three-month trial. Before this, BrightEdge would only give you keyword-level data on competitors you had submitted to them, which makes the "SEO Performance" tab quite useful, but I see this new feature being even more useful. Now you can enter any website you want via the "Data Cube" tab, which works rather like a search engine, or like a better version of Google's Keyword Planner. It's far more extensive, though, with options such as "Discover Long Tail Keywords" and "Explore Content Strategies" being the eye candy. BrightEdge has had access to this sort of data for over six years, and in a PDF recently released to customers they stated that "we are giving you direct access!" I'm quite frankly surprised that this feature wasn't already available, as it is something that other platforms such as SEMrush have been refining for several years. Either way, it's a great feature that I welcome and will likely be using. BrightEdge is marketing this as a feature that provides "on-demand research capabilities", and they certainly have the data to back this up. Here's a small screenshot of the tool in action (via BrightEdge): I really like the fact that you can enter either a domain or a keyword. For example, if I wanted to leverage BrightEdge for my military website, I would simply enter "military vehicles" and should hopefully get a full list of keywords that I could then target.
Then, if I had my military website plugged into BrightEdge, I could track those keywords and start creating content around them, or I could use the "Discover Long Tail Keywords" tab to find the long-tail keywords that I should be ranking for. The idea is to identify the keywords I want to rank for, track them, and then create content around them. And because the keywords are being tracked, I can then tell whether the SEO work I do on my military website is working or not.

Best feature yet

The fact they are now giving us access to this data is going to make the Google Keyword Planner somewhat irrelevant. With this new tool you'll be able to find keywords that...

Read More
What Guest Posting Patterns Can Google Really Pick up on?
Mar 13

Guest posting is a way to increase recognition while driving traffic back to your blog. It's a method that has proven effective for link building and for increasing popularity. Because of this, bloggers, especially new bloggers, are often eager to find guest posting opportunities. They reach out, write posts, and are published without a second thought. This could be detrimental, given the advances in search engine technology and the patterns new algorithms can pick up on. Before starting a guest posting campaign, consider the points below to ensure your attempt at driving traffic doesn't backfire.

Repetition Can Work Against You

With the arrival of new search engine technology, like Google's Hummingbird algorithm, repetition can be penalized. If a byline is the same across multiple sites, search engines are likely to pick up on the trend, understand the SEO tactic at work, and penalize it. Bylines that are the same, or too similar, create a footprint, which makes it easier to track information across the Internet. It can also create duplication issues for the sites you're guest posting on. For example, if you use the same byline on five sites, Google may see those sites as linked and penalize them as well as you. To create guest post bylines, think about the information that matters for the blog you're writing for and incorporate it. Use two or fewer links and consider taking advantage of Google+'s rel=author markup. Remember, variety is just as critical in bylines as in content.

Commercial Byline Links Are a Trend of the Past

In the past, using a commercial link, especially an exact-match keyword link, was typical of guest posts and generally accepted. In some cases it was advantageous for link building, but now the opposite is true. While some blogs still allow the practice, search engines tend to ignore such links. To maintain credibility, and to avoid possible search-ranking penalties in the future, it's best to do away with the practice now.
The Quality of the Site You're Posting to Matters

Some blogs depend on guest posts entirely, while others have a core team and offer occasional guest posting opportunities. This matters. When a blog consists solely of guest posts, its purpose is probably to increase search rankings. When you take the time to write for a site like this, it could increase your visibility, but it could also hurt your online reputation, especially where search engine crawlers are concerned. Of course, there are always a few exceptions; the Moz blog and Search Engine People are two examples that come to mind in the SEO niche. When posting to a...

Read More
Useful SEO Tools & January SEO Recap
Feb 01

I'll be introducing a weekly update to this blog from now on. Essentially, I'll be writing about anything noteworthy that I've done during the week, and hopefully it will be useful to those reading. If anything, it might only be useful to myself; in that case, my apologies.

Tools

To start off, I was recently introduced to a new SEO tool called Netpeak, another of the tools I've listed in the "Best SEO Tools for 2014" article. The tool practically runs on APIs and is quite versatile, allowing you to check the metrics of hundreds, if not thousands, of URLs. The metrics it includes are: Domain Authority; Page Authority; PageRank; Citation Flow; Trust Flow; Ahrefs backlinks; MozScape (OSE) backlinks; MajesticSEO backlinks; Domain Age; Google Index; and social metrics (StumbleUpon, Facebook Likes/Shares, Twitter, Google+, and more).

This is very useful for a number of reasons. If you're doing outreach and want to get your content on a high-quality site, there is no better way to identify a decent website (aside from reading the site's content) than by analysing it with Netpeak. I've used it not only for outreach but also for analysing clients' websites; it's another way of identifying backlinks you didn't know you had. I actually recommend using Screaming Frog to crawl your website and then running all your URLs through this tool. It gives you a nice overview of how your site is doing, especially on the social media side of things. Netpeak essentially takes all the manual work out of the process, and it's a nice package. I really do recommend it. A big thanks to James Phillips, a new co-worker, who recommended it.

Methods

Two of my clients at the agency I work at recently moved to new CMSs (this is partly why I wrote the article on checking the backlink profile of 5,000 URLs), which of course meant that their URL structures changed.
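As a rough sketch of the Screaming Frog-to-Netpeak workflow above, here is how you might pull the URL list out of a crawl's CSV export in Python before feeding it into a bulk metrics tool. The "Address" column name matches my recollection of Screaming Frog's exports, but treat it as an assumption, and the sample data is made up:

```python
import csv
import io

def urls_from_crawl_csv(csv_text, url_column="Address"):
    """Extract the URL column from a crawler's CSV export so the list
    can be pasted into a bulk metrics checker such as Netpeak."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [
        row[url_column]
        for row in reader
        if row.get(url_column, "").startswith("http")
    ]

# Hypothetical two-row export for illustration:
sample = (
    "Address,Status Code\n"
    "https://example.com/,200\n"
    "https://example.com/about,200\n"
)
urls = urls_from_crawl_csv(sample)
```

Filtering on the `http` prefix skips any summary or blank rows the export might contain.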
Setting up 301 redirects is obviously one of the most important things to do when a change like this occurs; otherwise, without a redirect in place, any value a page had in the search results is lost to a 404. What I did actually identified issues that my client was not aware of: we had redirects going to totally irrelevant pages, and pages that had simply been deleted without any redirect in place. How do you find these? Our relationship with our client is moreover one where we do a lot...
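A sketch of how the redirect audit described above could be automated, assuming you already have the old-to-new URL mapping and a list of live pages (all paths here are hypothetical, and this is my own illustration rather than the method used at the agency):

```python
def audit_redirects(redirect_map, live_pages):
    """Flag 301 mappings whose target is itself redirected (a chain)
    or no longer exists (effectively sending visitors to a 404)."""
    issues = {}
    for old_url, new_url in redirect_map.items():
        if new_url in redirect_map:
            issues[old_url] = "chain"    # old -> new -> somewhere else
        elif new_url not in live_pages:
            issues[old_url] = "missing"  # target page was deleted
    return issues

# Hypothetical post-migration state:
redirect_map = {
    "/old-about": "/about",
    "/old-news": "/news",            # /news was deleted with no redirect
    "/ancient-about": "/old-about",  # redirect chain
}
live_pages = {"/about", "/contact"}
issues = audit_redirects(redirect_map, live_pages)
```

Chains should be collapsed so the oldest URL points straight at the final destination, and "missing" targets either need the page restored or the redirect repointed somewhere relevant.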

Read More
Page 1 of 4