Google Think Finance 2015, Dublin, Ireland #gfinance
Oct 07

I was lucky enough to be invited to Google’s Think Finance Conference in Dublin on Wednesday and Thursday, the 16th and 17th of September. The speakers were great and, more importantly, thought-provoking; they really engaged with the audience. It also helped that most of them are known industry experts and were quite funny, too. It all started off with a choir singing, and then Anita O’Flynn, Sales Manager at Google, introduced everyone to what was on the agenda and how exactly everything would work. Then Emmanuel Dolle, Head of LCS at Google France, gave a funny presentation himself. He was also in the choir, though it was unclear exactly what he was doing, which made the situation quite funny. You can see this in the picture below (credit: Twitter):

Connect: Owning The Moment
Hanne Tuomisto-Inch, Industry Head, Google

Google have talked a lot on the Think Finance website about making the most of micro-moments using display, and have an entire section dedicated to micro-moments. The session was all about brands identifying the moments that may not exactly drive commercial intent, but may serve more of a branding purpose. If a user visits an informational page on your site, then the next time that user does a search they may pick your brand because of the previous association they have with getting valuable information from your site. It becomes all the more important because a lot more people are now using their smartphones to search: 91% of smartphone users will use their mobile to inform their decision when purchasing a product. It’s a new battleground for brands: the want-to-know moments, want-to-go moments, want-to-do moments, and want-to-buy moments.
The session looked at a lot of this from a paid media perspective, and I can imagine this was a push from Google to get finance companies to start bidding in PPC on informational terms, which are not exactly high-ROI keywords or phrases with easy-to-prove return on investment. In organic search, you can argue this is already something we do: we cover the moments that simply won’t give us conversions, but give our users the information they need to make a purchase. The only thing I’d say is that they could have mentioned something about collaborating with content teams or your SEO team to create these pages. There was one example of a moment in Ireland about providing information on the National Car Test, a mandatory car test procedure, and then being front of mind when selling car insurance, for example, because you provided...

Best SEO Tools & Resources for 2015
Jul 08

2014 was so last year, so I thought I’d create a new post on tools to use in 2015. Of course, preferred tools change all the time, and tools I used last year may no longer make the cut. Last year, I created a list of 14 of the best SEO tools to use in 2014, and now I look back at some of them and think: are they still as relevant or important as they were last year? I’ve also changed jobs and gone from agency to in-house, so the dynamic has changed in that respect; ‘agency SEO’ and ‘in-house SEO’ are vastly different in my opinion, but that’s another subject for another time. Introducing a twist to this article: I’ve included real-life examples/scenarios of where I’ve used each of these tools. Now, let’s get on with the list of the best SEO tools so far in 2015:

1. URL Profiler
Website: URL Profiler

URL Profiler is by far one of the most useful tools I’ve come across in 2015. Its features are similar to another tool called Netpeak (free), which I was introduced to back in 2014. Netpeak impressed me, but URL Profiler impresses me even more. It does come with a cost, but it’s well worth it for the technical capabilities you gain from using it. This is a definite MUST for agencies and anyone working in-house, and it’s useful for analysing both the websites you work on and competitor websites. At £9.95 per month for a solo license, this tool is affordable for anyone doing serious SEO work on a website. The larger packages cost £12.95, £19.95, and £29.95, and allow you to use the software on more than one machine, with increased connection speeds and larger URL imports.
Real-Life Scenario Using This Tool

I used URL Profiler to do competitor analysis on Compare The Market and their new ‘2 for 1 cinema deals’. I wanted to find out how many websites, and which websites, were linking to CTM as a result of their new marketing giveaway. Ahrefs and Majestic were only showing 21 referring domains to the section of their site where the 2 for 1 deal exists as a landing page. However, I knew that they’d seen far greater exposure, so I used a Chrome plugin called OS Scraper to scrape relevant search results about...
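The second half of that workflow, reducing a pile of scraped URLs to a count of unique linking sites, is easy to sketch in Python. This is a hypothetical helper, not part of URL Profiler or any scraper plugin, and the URLs below are made up for illustration:

```python
from urllib.parse import urlparse

def referring_domains(urls):
    """Reduce a list of scraped page URLs to the sorted set of
    unique linking domains (stripping any leading 'www.')."""
    domains = set()
    for url in urls:
        host = urlparse(url).netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        if host:
            domains.add(host)
    return sorted(domains)

scraped = [
    "https://www.example-blog.co.uk/cinema-deals-review",
    "http://example-blog.co.uk/about",
    "https://deals-forum.example.com/thread/123",
]
print(referring_domains(scraped))
# → ['deals-forum.example.com', 'example-blog.co.uk']
```

Comparing a list like this against the referring domains a backlink index reports is a quick way to spot the gap between actual exposure and what Ahrefs or Majestic have picked up so far.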

Tweets from Twitter Showing Up in Google News Results
Jul 04

If you’ve been following what has been going on with reddit, you’ll know about the big revolt that has taken place on the front page of the internet. To sum it up in one sentence: popular sub-reddits were temporarily closed down after a popular community manager (officially, communications director) was fired by reddit. It wasn’t handled well; she was integral to managing many of the popular celebrities who posted on /r/IAMA, and the moderators of that sub-reddit complained about the lack of communication. There’s been resentment against the people running reddit, especially the interim CEO, Ellen Pao, who has been ripped to pieces for her apparent mismanagement of the site. Now redditors are looking for alternatives. One alternative is a website called voat.co, which has been bombarded with so many new visitors that it displays an error message when you try to visit it.

Now, on to the point of this post. This is actually the first time I’ve come across normal Twitter users appearing in Google News results. Back on May 19th, Google and Twitter announced a data partnership that allows Google to access data from Twitter. As a result, Twitter has a prominent area in Google’s search results that shows tweets trending in real time. What has surprised me, however, is that Google has also given Twitter another source of traffic: real estate in Google News.

It’s interesting how Unidan, once a popular user on reddit and a normal user on Twitter (not even a verified account), is appearing in Google News. For those that don’t know, Unidan was shadowbanned in 2014 after employees at reddit discovered that he had created several reddit accounts to manipulate upvotes on his own comments.
That’s not to say that Unidan is now gaming Google, but perhaps his popularity on Twitter, with 2.8k followers, has given his account enough authority to be listed in Google News? My theory is that, because Voat is a relatively new brand, Google are still trying to pull in trusted sources for news associated with it, and somehow a random Twitter user who has mentioned the brand is being shown as a result. Whatever the case, could Twitter users showing up in Google News become the norm? There was no hashtag involved; the mere mention of the term ‘Voat’ seems to have triggered this. It’s interesting that this sort of thing can happen when topics or brands are trending, as this is quite obviously a valuable source of traffic. Update: I’ve re-checked 5...

Can low quality parts of your site impact other areas of your site?
Jun 28

I brought this question up with John Mueller, Google’s Webmaster Trends Analyst, who hosts a weekly Webmaster Hangouts video where he answers webmaster-related questions. I was going to attend one of those hangouts; however, you can simply create threads on the Google Webmaster Central forum, ask questions there, and PM those questions directly to John Mueller if you want an answer from Google. Not a lot of people know that.

How did John respond to this question? His response was just as I had thought. It’s obvious, but it’s good to have it qualified and confirmed: Google do look at websites on a domain level, and if one section of your site has low-quality content on it, that may impact the rest of your website, regardless of whether the quality of those other pages is high. JohnMu said:

“In general, if you host the content on your site, and it’s indexable, then Google will count that as a part of your site’s content. If you’re blindly posting user-generated content, then that can certainly affect how Google’s algorithms view your site overall. We have a few tips in our help center on this topic: [on user-generated spam] and [on ways to prevent comment spam]. In general, I’d recommend making sure that you’re comfortable hosting what you’re providing to search engines (and users!), and if you feel that the overall quality isn’t what you’d like it to be, then take appropriate measures to either restrict low-quality content from being published, or to prevent it from being indexed.”

I’ve responded and am attempting to discover whether the core algorithmic updates this year (the quality update and, very recently, the largest updates recorded by MozCast and Algoroo in June) have given more weight to domain-level signals. John has said in the past that their algorithm attempts to process websites on a page-by-page level, but clearly updates such as Panda have sitewide implications.
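John’s advice about preventing low-quality content from being indexed can be automated for user-generated pages. As a rough illustration only: the 250-word threshold below is an arbitrary assumption, and a real quality check should be far smarter than a word count, but the shape of the decision (thin page gets noindex, everything else stays indexable) looks like this:

```python
import re

THIN_PAGE_WORDS = 250  # hypothetical threshold; tune per site

def robots_meta_for(page_html: str) -> str:
    """Return a robots meta tag for a user-generated page:
    noindex for thin pages, index otherwise. Word count is
    used here as a crude proxy for content quality."""
    text = re.sub(r"<[^>]+>", " ", page_html)  # strip HTML tags
    words = len(text.split())
    if words < THIN_PAGE_WORDS:
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'

print(robots_meta_for("<p>short spammy comment</p>"))
# → <meta name="robots" content="noindex, follow">
```

Using `follow` alongside `noindex` keeps link equity flowing through the page even while it is kept out of the index.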
I’m of the belief, at the moment, that Google’s core updates on May 3rd and the 16th-17th of June were quite possibly changes that put more weight on sitewide quality aspects. Looking at examples of websites that suffered after these updates gives a good indication: the sorts of sites that did suffer were sites that find it difficult to control the quality of the content that gets uploaded to their site, due to user-generated content. An example of where this has occurred is HubPages.com. To quote the CEO of Hub Pages, Paul Edmondson: “22% of Google search traffic disappeared!” (http://pauledmondson.hubpages.com/hub/May-Day-2015-Google-Update). Hub Pages suffered as...

Large Google Update on the 16th of June?
Jun 17

There has been talk of small-scale and large-scale Google updates occurring in June, with webmasters reporting that they’ve seen either a drop or an increase in rankings in their respective niches. The SEO ‘weather’ tools have identified big changes in the SERPs in June.

Tool 1: MozCast (http://mozcast.com/)
Moz has their own tool called MozCast, which attempts to identify turbulence in Google’s algorithm by assessing the volatility of a sample of 1,000 keywords. Its most recent report shows that on Tuesday, the 16th of June, there was major turbulence that pushed the MozCast metric to 102°. To put this in perspective, the first ever Penguin update clocked in at 93.1°.

Tool 2: Algoroo (https://algoroo.com/)
Another tool, Algoroo, samples 17,000 keywords and looks for fluctuations in a similar way to MozCast. It is also reporting a high amount of SERP flux and volatility in the same period as MozCast. If we compare this with Google’s Phantom II / Quality Update, the increase on the 16th of June shows a far larger fluctuation in the SERPs than the Quality Update on the 3rd of May.

Google’s Response
Google have confirmed that this is neither a Panda, Penguin, nor HTTPS update. It was thought that it could have been an HTTPS update because Wikipedia recently (12th of June) encrypted their entire website with HTTPS/SSL. With a large percentage of Wikipedia pages prominently displayed in the top 10 search results, this could have caused fluctuations as Google indexed the HTTPS version of the site. Pete Meyers at Moz explains this in more depth, here. However, Pete asked Google’s Webmaster Trends Analyst, Gary Illyes, if an HTTPS update could have caused these high fluctuations.
He responded on Twitter but wouldn’t go into further detail. At the time of writing, further details explaining the extent of this change haven’t been made available by Google. However, they did confirm to Search Engine Land that this was not a Panda update, stating: “This is not a Panda update. As you know, we’re always making improvements to our search algorithms and the web is constantly evolving. We’re going to continue to work on improvements across the board.” The reason there was speculation around a Panda update is that Gary Illyes announced on June 2nd at SMX Advanced that webmasters should expect a Panda refresh within the upcoming weeks. At the same time, he announced that...
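To make the idea behind tools like MozCast and Algoroo concrete, here is a minimal sketch of a SERP-flux metric in Python. It simply averages the absolute rank movement of each URL between two days’ results for a tracked keyword set; MozCast’s real ‘temperature’ formula is more involved, so treat this as illustrative only (the keywords and URLs are invented):

```python
def serp_flux(day1: dict, day2: dict, top_n: int = 10) -> float:
    """Average absolute rank change per tracked result between two days.
    day1/day2 map keyword -> ordered list of ranking URLs.
    A URL that drops out of the top N is scored as moving to position N+1."""
    total, count = 0.0, 0
    for kw in day1:
        before = day1[kw][:top_n]
        after = day2.get(kw, [])[:top_n]
        pos_after = {url: i for i, url in enumerate(after)}
        for i, url in enumerate(before):
            j = pos_after.get(url, top_n)  # dropped out -> index N
            total += abs(i - j)
            count += 1
    return total / count if count else 0.0

stable = {"car insurance": ["site-a", "site-b", "site-c"]}
print(serp_flux(stable, stable))  # → 0.0 (no movement)
```

Run daily over a fixed keyword sample, a spike in this number on a given date is exactly the kind of signal that had everyone pointing at the 16th of June.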
