Exciting New Release By BrightEdge: Data Cube
Mar 16

The new security enhancement feature that BrightEdge announced today is not as exciting as its new “Data Cube” feature, which lets BrightEdge customers leverage the 100+ terabytes of data that BrightEdge processes each week. It’s very similar to a feature in another platform, SEMRUSH, which I’ve recently been trialling on a three-month trial. Before this, BrightEdge would only give you keyword-level data on competitors you had submitted to them, which makes the “SEO Performance” tab quite useful, but I see this new feature being even more useful still.

Now you can enter any website you want via the “Data Cube” tab – it works rather like a search engine, or a better version of Google’s Keyword Planner. It’s far more extensive, though, with options such as “Discover Long Tail Keywords” and “Explore Content Strategies” being the eye candy within this feature. BrightEdge has had access to this sort of data for over six years, and in a PDF recently released to customers it stated that “we are giving you direct access!” I’m frankly surprised that this feature wasn’t already available, as it is something other platforms such as SEMRUSH have been refining for several years. Either way, it’s a great feature that I welcome and will likely be using.

BrightEdge are marketing this as a feature that provides “on-demand research capabilities”, and they certainly have the data to back this up. Here’s a small screenshot of the tool in action (via BrightEdge):

I really like the fact that you can enter a domain or a keyword. For example, if I wanted to leverage BrightEdge for my military website, I would simply enter “military vehicles” and should get back a full list of keywords that I could then target.
Then, with my military website plugged into BrightEdge, I could track these keywords and start creating content around them, or I could use the “Discover Long Tail Keywords” tab to find the long-tail keywords I should be ranking for. The idea is to identify the keywords I want to rank for, track them, and then create content around them. And because these keywords are being tracked, I can then see whether the SEO work I do on my military website is actually working.

Best feature yet

The fact they are now giving us access to this data is going to make the Google Keyword Planner somewhat irrelevant. With this new tool you’ll be able to find keywords that...

Read More
What Guest Posting Patterns Can Google Really Pick up on?
Mar 13

Guest posting is a way to increase recognition while driving traffic back to your blog. It’s a method that has proven effective for link building and for increasing popularity. Based on these facts, bloggers – especially new bloggers – are often eager to find guest posting opportunities. They reach out, write posts and are published without a second thought. This could be detrimental, given the advances in search engine technology and the patterns new algorithms can pick up on. Before starting a guest posting campaign, consider the facts below to ensure your attempt at driving traffic doesn’t backfire.

Repetition Can Work Against You

With the advent of new search engine technology, like Google’s Hummingbird algorithm, repetition can be penalized. If a byline is the same across multiple sites, search engines are likely to pick up on the trend, understand the SEO tactic at work and penalize it. Bylines that are the same, or too similar, create a footprint, which makes it easier to track information across the Internet. It can also create duplication issues for the sites you’re guest posting on – for example, if you use the same byline on five sites, Google may see those sites as linked, and penalize them as well as you. To create guest post bylines, think about the information that matters to the blog you’re writing for and incorporate it. Use two or fewer links, and consider taking advantage of Google+’s rel=author markup. Remember, variety is just as critical in bylines as in content.

Commercial Byline Links Are a Trend of the Past

In the past, using a commercial link, especially an exact-match keyword link, was typical of guest posts and generally accepted. In some cases it was advantageous for link building, but now the opposite is true. While some blogs still allow the practice, search engines tend to ignore such links. To maintain credibility, and to avoid possible search ranking penalties in the future, it’s best to do away with the practice now.
The Quality of the Site You’re Posting to Matters

Some blogs depend on guest posts entirely, while others have a core team and offer occasional guest posting opportunities. This matters. When a blog is composed solely of guest posts, its purpose is probably to increase search rankings. Writing for a site like this could increase your visibility, but it could also hurt your online reputation, especially where search engine crawlers are concerned. Of course, there are always a few exceptions – the Moz blog and Search Engine People are two examples that come to mind in the SEO niche. When posting to a...

Read More
The Benefits of Ugly SEO
Mar 12

SEO is, essentially, an exercise in efficiency. Much of what we do hinges on ease of access, both for the user and for the multitude of crawlers that establish how “good” or “quick” a site is to use. This, one would think, couldn’t gel with an ugly site experience: tonnes of widgets, pictures and links plastered all over a site’s front page, usually there to stall load times and try a user’s patience while giving the designer a chance to bask in their own glory. However, there are many ways in which ugly SEO can greatly benefit a site, and my job here is to show you some methods I’ve seen that benefit SEO at what might be thought of as the cost of the user. Implemented well, they can give SEOers and users alike something to think about.

Scrap the Flash

Install the Web Developer Tool. Then disable images and JavaScript, and linearise the page. What you now see is, more or less, what Google sees too. So, what do you see? An abundance of text? Or a lack of it, due to overt use of images, Flash and so on? I took a look at Jonny’s ‘website’ (“vlexo.net”), below:

Dull, I’m sure you’ll agree (sorry Jon, you might want to work on that font), but something that can be easily analysed by all varieties of crawlers. This is not so for a majority of older sites, and especially when designers are on board things can get a little tricky. Flash is especially relevant today: the proliferation of mobile devices that can’t utilise Flash has likely set back older websites which feature much of it. I wouldn’t be surprised to hear of sites losing rank over overt Flash use, so try to minimise the amount of Flash or Java on your website. Unfortunately (for some), our friends at Google deem the use of images vital to relevancy, although of course they can’t quite distinguish between what we see and what they’re told we’re seeing. Regardless, stripping your site down to simple, linear elements can give a great boost to both usability and crawlability (a word I’ve just coined). A great example of this is gov.uk, which uses the simplest fonts, colours and designs while providing useful and concise information. It may seem dull, but there’s little better.

Big Footers

I’m a big fan of this concept, which I recently discovered over on SEW: essentially, create the biggest footer known to man – an internal-linking site map all within a footer, if you will. I’ve borrowed the image, but I hope it’s a tactic that will be...
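If you'd rather not install a browser extension, the “what does Google see” exercise can be roughly approximated in code: strip out scripts, styles and images and keep only the text flow. Here is a minimal sketch in Python (standard library only; the sample HTML is made up for illustration, and real crawlers of course do far more than this):

```python
from html.parser import HTMLParser

class Linearizer(HTMLParser):
    """Collects visible text, skipping <script>/<style> blocks –
    a rough stand-in for a linearised, text-only view of a page."""
    SKIP = {"script", "style"}

    def __init__(self):
        super().__init__()
        self._skip_depth = 0   # >0 while inside a skipped element
        self.chunks = []

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth and data.strip():
            self.chunks.append(data.strip())

def linearize(html: str) -> str:
    parser = Linearizer()
    parser.feed(html)
    return " ".join(parser.chunks)

sample = ("<html><head><style>h1{color:red}</style></head>"
          "<body><h1>Title</h1><script>var x=1;</script>"
          "<p>Body text.</p></body></html>")
print(linearize(sample))  # -> Title Body text.
```

For a real page you would feed in the fetched HTML rather than a sample string; a text browser such as lynx, or the Web Developer extension itself, does a more faithful job, but the principle is the same: if little text survives the stripping, crawlers have little to work with.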

Read More
Matt Cutts didn’t expect this response via Twitter (Scraper URL)
Feb 28

Google’s head of search spam, Matt Cutts, recently tweeted a link to a form that allows website owners to report websites that have scraped content from other sites and rank higher than the original source. Cutts is referring to website owners who scrape content from other websites and post it on their own, which means that Google might not always be able to tell which site is the original source.

Probably not what Cutts expected

.@mattcutts I think I have spotted one, Matt. Note the similarities in the content text: pic.twitter.com/uHux3rK57f — dan barker (@danbarker) February 27, 2014

To put it simply, someone has spotted that Google does exactly what a scraper does. It scrapes content from sites like Wikipedia and uses it for its semantic search functions, presenting the information directly in the SERPs – which means everyone should report Google for exactly what it is telling others to report. Whilst Google only extracts an excerpt of information from Wikipedia articles, the irony is actually quite...

Read More
If Guest Blogging is Dead, Which Link Building Methods Still Work
Feb 16

All over the blogosphere this week comes the conclusion that ‘Guest Blogging is Dead’, largely due to Matt Cutts’ recent post on the matter, in which he warns that if you’re using guest blogging as a way to gain links in 2014, you should probably stop. Certainly, this means that one of the last obvious methods for gaining link traction without getting imminently penalised has come to an end. For most real online marketers, however, the news has come neither as a surprise nor as a reason for despair. It’s been obvious for a long time that writing an article in five minutes and sticking a thin bio underneath with a keyword anchor text link in it wasn’t going to make the good people at Google especially happy. Most of these articles were the thinnest excuse for prose anyway, and by no means added value to the web. Nevertheless, for an industry already crouching down in the bunker, this news does appear like the final assault in a very long campaign of attack by Google. Does SEO have any kind of future, some people ask? Are there any link building methods left which still work and which won’t risk a penalty?

Guest Blogging Has a Little Life Left in It Yet

The first thing to say here is that contributing valuable, relevant content to other websites in your niche remains a valuable way to get a link. Certainly, if you’re promoting a marketing company and you write an article about gardening with a link in the bio, you’re going to get into trouble. But if you’re writing an article of genuine quality in a niche similar to your own, you’re in the right ballpark – and if you watch Matt Cutts’ videos on the subject, he confirms this. Secondly, if you do choose to link in your bio, don’t go for that most obvious tactic of spam: the anchor text link. That’s like waving a red flag under Cutts’ nose and is a total waste of time. Better to link to your G+ authorship page and to your brand, by name or as a raw URL.
Even better would be to write an article in which your own website can appear as a contextual link. If you’re writing about ‘Five Killer Landing Pages’, for example, you can link to one of your own and four competitors’. This gives you a contextual link in the most natural setting possible, within an article that is a genuine resource. Even in the case of a manual review, this is always going to pass muster....

Read More
Find Out Who Shared Your URL on Facebook
Feb 09

Finding out who linked to you on social media

When I published an article yesterday, I noticed a few referrals come through from Facebook. But when I clicked through to find out exactly where on Facebook my post had been shared, I couldn’t find the exact location of the referring page. I concluded that I’d need some sort of tool, as neither Google Analytics nor WordPress’ Jetpack was showing me exactly where the referrals came from. I searched and searched with long-tail queries such as “find out who linked to you on facebook” and came up with results from 2011 that really weren’t relevant today. Suggestions included using Facebook’s internal search system, which no longer works how it once did.

Social Mentions – a social listening tool

I then somehow came across a website called “Social Mentions“, and it gave me the ability to find exactly who was sharing my post and where. The tool works like a search engine: you simply type in keywords – in my case, parts of the title of the page being shared on Facebook. Again, that was yesterday’s post about Halifax being partially penalised. You can see the results here:

It picked up exactly where these shares were coming from, and from this I could identify who was sharing and why. For example, yesterday’s post was shared by someone because I had commended their detective work on the Halifax.co.uk penalty, which happened to be a case study on Link Research Tools. Perhaps I should commend people more often!

What I found

A little surprise. That’s pretty neat. I’ll be using this tool a lot more, as it’s such a useful...

Read More