Google deprecating AJAX crawling scheme

On Wednesday the 14th of October Google announced that it is no longer recommending the AJAX crawling proposal it made back in 2009.
Google now advises crawling and indexing the #! URLs directly, rather than relying on ?_escaped_fragment_= snapshots.

“Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site’s CSS or JS files.”

Questions and answers

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.
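For reference, the deprecated scheme worked by mapping a #! ("hashbang") URL to a ?_escaped_fragment_= equivalent that the server answered with an HTML snapshot. A rough sketch of that mapping in Python (simplified; the original proposal specifies the exact escaping rules):

```python
from urllib.parse import quote

def escaped_fragment_url(url):
    """Map a #! URL to its ?_escaped_fragment_= form, as described in
    Google's 2009 AJAX crawling proposal (simplified sketch)."""
    if "#!" not in url:
        return url  # not an AJAX-crawlable URL; nothing to rewrite
    base, fragment = url.split("#!", 1)
    sep = "&" if "?" in base else "?"
    return base + sep + "_escaped_fragment_=" + quote(fragment, safe="")
```

Under the new guidance a crawler simply fetches and renders the #! URL itself, so no such rewriting is needed.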

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
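One practical safeguard against accidental cloaking is to periodically compare what your server returns to a normal browser user agent with what it returns to Googlebot’s. A minimal sketch; the HTTP fetching is left to a caller-supplied function (wrapping urllib or similar), and a real check would compare rendered output with a fuzzier test than byte equality:

```python
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")
BROWSER_UA = "Mozilla/5.0"

def looks_like_cloaking(fetch, url):
    """Return True if `fetch(url, user_agent)` yields different content
    for a browser UA and for Googlebot's UA -- a possible cloaking
    signal (dynamic pages need smarter comparison than equality)."""
    return fetch(url, BROWSER_UA) != fetch(url, GOOGLEBOT_UA)
```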


Google aggressively tackles hacked web spam

On Monday the 5th of October Google officially confirmed that it has been rolling out updated algorithms geared specifically towards identifying spam in SERPs coming from hacked sites.
With large numbers of legitimate sites being hacked by spammers and used for abusive behaviour, Google has been “forced” to take decisive action. Malware downloads, funnelling traffic to low-quality sites, porn, and the marketing of counterfeit goods or illegal pharmaceutical drugs are some of the most common offences Google is currently targeting.

According to the Webmaster Central Blog, the algorithmic changes will eventually impact roughly 5% of queries, depending on the language. Webmasters are being warned that as Google rolls out the new algorithms, users might notice that for certain queries only the most relevant results are shown, reducing the total number of results displayed.
Google also indicated that, due to the large amount of hacked spam being removed and the fine-tuning of its systems to weed out bad content while retaining legitimate results, some SERPs may effectively thin out substantially.

cleared SERPs

All these changes place ever more responsibility on webmasters to keep their site security up to date.

HTTPS Content Mismatch Errors May Not Earn Ranking Boost

In a recent (28 August) Google hangout on Google+, John Mueller made a very important point at the 18:30 mark: Google may not give a web page the desired HTTPS ranking boost if the page serves up HTTPS content-mismatch errors.

What does this actually mean? It’s relatively simple: your page is served over HTTPS, but some elements on it are not secure, e.g. images loaded over plain HTTP, or third-party social media plugins.
So far Google has been quite lenient, and your pages might have benefited from a move to HTTPS even if they are not fully secure.
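A quick way to audit a page for mixed content is to scan its HTML for subresources still loaded over plain http://. A simplified sketch using only the standard library (it checks src/href/data attributes and ignores ordinary anchor links; a real audit would also cover CSS, XHR and inline URLs):

```python
from html.parser import HTMLParser

class MixedContentScanner(HTMLParser):
    """Collect http:// subresource URLs referenced by a page that is
    itself served over HTTPS -- each one is a mixed-content warning."""
    RESOURCE_ATTRS = {"src", "href", "data"}

    def __init__(self):
        super().__init__()
        self.insecure = []

    def handle_starttag(self, tag, attrs):
        for name, value in attrs:
            if name in self.RESOURCE_ATTRS and value and value.startswith("http://"):
                if tag == "a":
                    continue  # ordinary links are fine; subresources are not
                self.insecure.append(value)

def find_insecure_resources(html):
    scanner = MixedContentScanner()
    scanner.feed(html)
    return scanner.insecure
```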

However, according to John Mueller, this may change moving forward:

I imagine at some point, we will say you will have to serve really clean HTTPS for us to show that in the search results. I don’t know. I don’t think that there will be any kind of penalty around that. Just that we may say, we won’t count this as HTTPS, so it won’t get that small HTTPS ranking boost.

Summarising: think twice when planning a move to HTTPS, as many of my industry peers have already noticed site visibility drops after migrating, even when everything was kept clean and tidy in the process. I’m still of the opinion that there should always be a strong business case to support a move to HTTPS, rather than doing it purely for the sake of compliance with Google’s advice.

Trail Marathon Wales 2015

On Saturday 20 June 2015 I ran my first marathon. I wanted to choose a course I would remember forever, and I have to say Trail Marathon Wales surpassed my wildest imagination. The course runs through Coed y Brenin in the beautiful mountains of the Snowdonia National Park. The 42 km course features single-track paths, over 3,500 ft of elevation gain, and inclines of up to 20%. I finished in 4:49, which gave me 167th place out of 340 participants. A testament to how hard the marathon was is that some experienced marathon runners didn’t manage to finish the full race. Talking to fellow runners before and after the race, the consensus was that you need to add on average an hour to your regular marathon time to cope with TMW.

For running down the hills, which are abundant and steep there, I used sound advice on technique from Sage Canaday.

A couple of pictures, including the course map and elevation profile of the race:



Sascon 2015

Today I had the opportunity to take part in the Technical vs Content panel during the first day of Sascon Manchester. We discussed what has the biggest impact on organic performance: the technical architecture of a website, or the content? We also took a view on the market consensus on where you should be spending your time. Speakers included myself, on behalf of iProspect Manchester; James English, BBC; and Edward Cowell, MediaCom.

Content below has been originally posted by Emily Diment on

The panel for this session comprises Radek Kowalski, Group Technical Search Manager at iProspect Manchester; Teddie Cowell, Director of SEO at MediaCom; and James English, Senior SEO Analyst for BBC Sport.

Everyone on the panel believes that the content vs technical debate is an interesting one. It’s something that is sometimes a bit of a contentious issue, but it’s one that always needs to be considered when planning an SEO campaign or organic strategy.

Technical SEO has moved away from just a tech audit towards mobile, platform builds and commerce platforms, which takes tech away from short-term projects to longer-term technical SEO projects, according to Radek. This makes technical knowledge a requirement for an SEO.

So, how much do other people within agencies know or care about SEO?

At MediaCom, SEO is a core focus within other departments’ agendas. There is a big desire within the industry to get content marketing to work harder. SEOs in general have a really good understanding of how the web works. The majority of the budgets being spent on brands aren’t being spent on SEO; they’re being spent on brand budgets.

SEO considerations are increasingly becoming a part of brand marketing conversations from the very outset because SEOs are good at joining the pieces of a campaign together from the beginning. SEO helps to maximise value from budgets.

There has been a huge change in SEO over the last three years, says Radek, moving heavily towards content. In fact, people are becoming obsessed with content: how to do it better, how to get better ROI. iProspect Manchester has become very good at creating SEO content which supports massive campaigns. Technical optimisation can’t be forgotten. It’s almost like a vehicle for how content can be deployed. If the range of clients is small it can be very easy to technically deploy content, but if you’re trying to work on massive eCommerce systems for very large brands it can be difficult to deploy content. The technical element to content-based SEO is now more important because it has to work with your content.

You could have good content with a good awareness but it could be more successful if you have the technical element sorted.

Whatever you do for a project, you have to make key stakeholders happy by educating them about how SEO can make things better, while still keeping technical standards high.

Educating clients about the value of SEO is one of the most important aspects of any SEO campaign, says Teddie.

SEO at the BBC

One of the most interesting things about doing SEO at the BBC is seeing that they are competing on a different landscape – they’ve got the brand authority, but the way people are able to find the content is becoming one of the challenges, as is the growing amount of competition the BBC has for news. They can’t just rely on the fact that people will be searching for their content via brand searches. It’s easy to become complacent, but it is a highly competitive landscape.

Some big technical SEO priorities for the BBC at the moment are sitemaps and newsfeeds as well as being able to react quickly to big news stories and high profile events such as the World Cup or Olympics.

Educating clients

We can create a piece of content that is technically sound but it still needs to go on a client’s website and that is generally when errors and problems appear, says Radek. Changing wireframes can be expensive and daunting for clients. Similarly, UX does not always go alongside technical SEO priorities. Some elements that are great for UX may not work well with the UI and can break on different devices such as mobile or tablet.

Teddie believes it is an ongoing process because IT professionals sometimes have different ideas and different priorities to people working in and doing SEO.

What would you say works well for smaller websites? Should they focus on technical or content?

According to Radek, small sites with fewer than 500 landing pages should choose a good platform like WordPress with free plugins which can do 80% of technical work for you.

The smaller the site is, the more important technical excellence is for you. He has seen small sites with no backlinks but really well optimised technical elements ranking really well. Sites need to comply with technical SEO standards and make sure their themes pass validations for mobile-friendliness and page speed, amongst others.

Teddie agrees: there is still a role for technical SEO on smaller sites. They need to consider essential technical things, including consistency of URLs and header elements like h1s.

When you see great content being produced, what key technical elements do you see that are often forgotten or missed?

With storytelling content using parallax pages, a popular design trend, the technical issue for SEOs is that parallax pages can lose performance for searchers and some search queries.

James feels that it is important to experiment with different content formats to try to give the best experience for the user online. We are always trying to match the best experience for the user and match it with the site hierarchy.

Designers love Flash and JavaScript, but try to stick to HTML5; it’s very flexible and easy for search engines to digest. Some types of content can make a page not mobile friendly, and you run the risk of making the page unconsumable on certain devices, says Radek.

70% of your audience may access your content using mobile (think about that). If you work in an industry that focuses on young adults, they don’t really care about desktops, laptops or even tablets. They see the world via mobile phones, and any campaign should be mobile friendly.

You have to remember that different clients have different issues, says Teddie. Not every brand was well prepared for the mobile-friendly update. What Google did by forcing the mobile agenda was a really good thing. Brands needed a wake-up call, and that’s what the Google update provided.

How much of a role does UX play for SEO?

SEOs need to distinguish two different elements – UI and UX. The user interface will help to make your site more accessible and more compliant. The BBC has focused for years on making its website accessible to different users, and BBC creative should be used as a standard for accessible UI.

Google has been giving people massive amounts of technical information and guidance which people should be listening to. Google will start to become more and more picky, and is giving messages like “be mobile friendly”, “be clean”, “provide a good UX”. This is because they care, but they want to save their own money because their infrastructure to crawl the www most likely costs billions.

What is content to you? What is a good piece of content, what are you talking to clients about?

At MediaCom content can be lots of things, says Teddie. Anything that connects a brand to their consumers or something that people choose to engage with such as YouTube, press partnership or the website itself. It’s about how the users are moving around the content.

James believes like Teddie that it is anything that connects users to the brand. He tries not to use the word content much around journalists because to journalists content is like a dirty marketing term. They call content other words such as features, articles, interviews, video content, TV content.

The final question for the panel was a good one.

Where do you spend a limited budget that has to get results? Technical or content? Consider that the client is a retail company of a medium size.

James: Technical

Teddie: Depends on the nature of the business, but for medium-size retail: technical.

Radek: Investing in technical can be seen immediately as having a positive impact.


Google Webmaster Tools Search Queries No Longer Rounds Data

In recent days we’ve seen a very promising development in the Google Webmaster Tools console. John Mueller, Google’s Webmaster Trends Analyst, has announced that site data and metrics will be more actionable than ever, with GWT search queries no longer being rounded and bucketed.

Google collects “impressions” and “clicks” to the site and displays them in the Webmaster Tools console for the last 90 days. This is definitely a great development in the light of the sharp increase in “not provided” data.

Google Matt Cutts on guest blogging video

Matt Cutts, Head of Webspam at Google, has just released a new video about guest blogging. In the video Matt explains when guest blogging can be perceived as spammy and what to avoid when guest blogging. He advises using it in moderation, always approaching it with “how can I add value by sharing my knowledge”, and embarking on creating actual relationships with blog owners rather than using it just for link building.

Here is the video

Future of SEO and Google Semantic Search

We all ask this burning question: what will the future of SEO look like? In the wake of Google Penguin 2.0 and 2.1 killing backlinks, the Google Hummingbird algorithm update, and the sudden growth of “not provided” data triggered by Google defaulting to SSL search, just like any other SEO I’ve been trying to systemise and answer my own questions and doubts caused by all the changes we’ve recently experienced.

Some of my recent thoughts and questions worth chewing on:

• Discovering user intent using keyword data
• Predictions based on location, search history, circles and type of the device
• Google Hummingbird, Knowledge Graph, Voice Search and Google Now
• Matt Cutts: search experience optimisation is the future of SEO – more info in the video here
• What experience do we want a visitor to have on our site?
• What are the touch points, key actions and conversion points? (SEO and CRO)
• Technical SEO will prevail
• From links to answers
• Search engines want to “understand” more using structured data / microdata – I recommend you study schema markup if you haven’t already
• Schema as a technical language creating identifiers or entities that Google is very interested in, and will increasingly be. That is how, together with the Knowledge Graph, Hummingbird will be “connecting the dots”.
• Google is rapidly adopting semantic search technologies
• Entity search, knowledge graph helping bots to better understand, match and create better response to a query

Use of semantic search coupled with semantic markup will increase sharply, as Google needs more information, and quickly, for its Knowledge Graph. That is how Hummingbird will be able to provide increasingly better answers to user queries on various devices – wow, that is nearly an intelligent, learning algorithm. I therefore recommend focusing on technical SEO and definitely embracing semantic markup.
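As an illustration of what semantic markup looks like in practice, here is a sketch that emits a minimal schema.org Article block as JSON-LD (the field selection is illustrative only; consult schema.org for the full vocabulary):

```python
import json

def article_jsonld(headline, author, date_published, url):
    """Build a minimal schema.org Article block as JSON-LD, wrapped in
    the <script> tag that would be embedded in the page's HTML."""
    data = {
        "@context": "https://schema.org",
        "@type": "Article",
        "headline": headline,
        "author": {"@type": "Person", "name": author},
        "datePublished": date_published,
        "url": url,
    }
    return '<script type="application/ld+json">%s</script>' % json.dumps(data)
```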

To conclude, this space is definitely worth watching…

Google says thank you to top contributors

Last week Google held its second Top Contributors Summit, after testing the idea for the first time in 2011.

If you don’t know who Top Contributors are, let me explain: Top Contributors are highly experienced individuals helping (for free) to resolve various problems or sharing useful tips on how to get more out of Google’s range of products. They have been a vital line of help in resolving issues with many Google products across the growing range of official Google product forums.

According to Google, Top Contributor volunteers contribute across 250 product communities in as many as 26 languages.
During the summit, Top Contributors were also able to test the latest Google products and technologies, including Google Glass. It’s really nice to see that Google utilises and appreciates the advocacy of such individuals.

My own experience with Top Contributors has always been a positive one, particularly for Webmaster Tools and Google Analytics support forums.

So thank you all – job well done for being there and helping us all when it seems Google itself is not able or willing to help.

You may find more details on the event in the original blog here –

Tough Mudder Google Expert

Google Expert has now officially acquired rights to brag about being the toughest and muddiest Google Expert.

The team effort resulted in a nice amount being collected for The Mustard Tree – – a fantastic organisation that looks after the homeless and those marginalised by society.

Thank you all for your generous donations:

More event pics here

Google Penguin 2.1 spam filtering algorithm update now live

Google is seriously busy this month. We are all still trying to get our heads around the recent new Google “engine”, Hummingbird, and now we already have another confirmed update.

If you are not familiar with what the Hummingbird algorithm is, I encourage you to read my previous post –
Head of webspam Matt Cutts confirmed the launch of the new update on the 4th of October and indicated that roughly 1% of searches will be affected.

Penguin 2.1 is another evolution of Penguin 2.0, which caused a major shift in the whole SEO industry by targeting spammy backlink profiles, including paid links and black-hat SEO tactics.

If you have already been hit by Penguin 2.0 and have taken action to remove and disavow part of your backlink profile, make sure you pay close attention to your rankings this week. You may see either improvements or further drops driven by this recent update.

Here you can find more information on what the Penguin update is –

Is Google PageRank dead?

Google PageRank is something we SEOs used to pay substantial attention to. If you remember, the Google Toolbar used to display values from 0 to 10 indicating how much authority a page had in Google’s eyes. It was a truly helpful tool for quick competitive comparison of websites from the same sector, and a great indicator of a site’s quality when scouting for links.
Over the last couple of years we’ve seen Google substantially limit its support for PageRank; Chrome no longer supports it, and the Google Toolbar for Firefox has been suspended too.

This year we have had only one PageRank update, around February. Additionally, the head of Google’s webspam team, Matt Cutts, has recently indicated that it’s highly unlikely Google will update it in the remainder of 2014.

So, it looks like Google PageRank is dead…

If you are still unsure what the PageRank is I recommend following article -

Google stops mugshot websites

According to the New York Times, Google released another algorithmic update on Thursday the 3rd of October.
Don’t worry, it’s not another Penguin or Panda ready to roam free; it’s a manual fix in the algorithm designed to stop websites featuring mugshot pictures from ranking highly in Google.

If you don’t know what a mugshot site is, then let me explain. It’s a website that makes money by obtaining and publicising mugshot pictures and then extorting money from the affected individuals to get them removed. This is the story that apparently triggered head of search spam Matt Cutts to take action –

I think it’s high time for such action, as apparently the FBI has received thousands of complaints related to such practices. Mugshot sites can charge anywhere between $70 and $1,400 USD for the removal of a mugshot picture, often only for the picture to re-emerge later on another site of the same type.

Time will show how effective the algorithmic update will be.

Help – 100% not provided – Google is now defaulting to the https

SEOs and webmasters, brace yourselves! Google is now defaulting to https:// (encrypted) pages even when users are not signed in, with the impact being that soon 100% of organic keyword data from Google will be displayed as (not provided). The whole thing has been well illustrated by this graph from

Be aware that no tracking solutions are immune to this change and moving forward, it’s important to devise new reporting logic and methodology in order to keep on top of your SEO campaigns.

One of the solutions you can try is to cross-reference brand / non-brand data from both Google Webmaster Tools and Google Analytics. The Webmaster Tools search query report can give you an understanding of how people are searching for and arriving at your site via Google, whilst the Bing / Yahoo / Ask keyword data from Google Analytics will provide an accurate representation of how non-Google users are organically arriving at your site.

Obviously such a solution is far from ideal, as Google Webmaster Tools data is rounded, but at least the process can be replicated on a monthly basis. Still, by combining the two data sets, and also looking into the Bing data, you will gain a good understanding of how searches are being made across all search engines, and be able to accurately apportion brand and non-brand visits.
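The brand / non-brand split itself is easy to automate once you have a query export. A simplified sketch (assuming rows of (query, clicks) and a hand-maintained set of brand terms; real data would need normalisation for misspellings and variants):

```python
def classify_queries(queries, brand_terms):
    """Split (query, clicks) rows -- e.g. from a Webmaster Tools
    search-query export -- into brand and non-brand click totals.
    `brand_terms` is a set of lowercase brand words."""
    totals = {"brand": 0, "non-brand": 0}
    for query, clicks in queries:
        words = query.lower().split()
        bucket = "brand" if any(t in words for t in brand_terms) else "non-brand"
        totals[bucket] += clicks
    return totals
```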

So, dear friends – good luck in the soon-to-arrive 100% not provided age.

What is Google Hummingbird Algorithm?

Busy, busy times for all SEOs. So far we’ve been firefighting the increase in not provided data, algorithmic updates and “penguins” (Penguin 2.0) wreaking havoc across the whole SEO industry.

But that’s not all… Google broke the news on the 26th of September about the completely new Google Hummingbird algorithm. And no, it’s not another algorithmic update but a completely new, shiny algorithm. More interestingly, the new algorithm has been in place for a month now and no one noticed!
Google claims results coming from the new algorithm are faster and more precise, hence the name Hummingbird.

The new algorithm is designed to provide better answers to searchers’ questions, utilising Google’s Knowledge Graph in a much more advanced way than before.
For more information on how Google is evolving its use of the Knowledge Graph and addressing multi-device usage, I encourage you to have a look at the following post –

A couple of facts to digest:
• Google quietly changed the algorithm around a month ago and only announced it on the 26th of September
• It’s supposed to be faster and more precise
• The new algorithm is more “intelligent”, looking not only at keyword matching but also at the intent behind the search query
• The new algorithm takes integration with the Knowledge Graph to a new level
• The new algorithm is supposed to handle complex long-tail queries better
• Social signals are even more important than before
• The new engine still uses learnings from all the Google updates, like Venice, Panda and the Penguin series

I recommend the following article from Search Engine Land, which provides a nice set of Google Hummingbird FAQs –

Google Webmaster Tools search query reports delayed

After the recent Google encrypted search rollout, which shook the whole SEO industry, the eyes of all SEOs and webmasters have inevitably shifted to Google Webmaster Tools data, and to its search query reports specifically.

However, as has been reported, search query data in GWT is currently suffering from a “bug” causing query reports to be delayed.

As per the Search Engine Land article – – the tool stopped showing data on September 23rd. Google claims the issue will be resolved shortly.

Google helps hacked websites

On the 30th of October Google announced a very cool new feature within Webmaster Tools. The whole new section, called Security Issues, offers substantially more information about potential hacking attacks, malware and any other security issues on your site, all conveniently wrapped up in one place. Google can now help with the recovery of a hacked website, enabling webmasters to locate code-injection instances more easily, with examples, rather than just letting you know that something is wrong without further advice. On top of all the new features, Google has also set up a dedicated help portal for hacked sites, with detailed articles explaining each step of the recovery process, including videos.

Now every webmaster will be able to:
• Discover more information about any existing security issues – all in one place.
• Identify the source of the problem using detailed code snippets.
• Request a review for all issues.

It is truly reassuring to know that Google has been so proactive in helping affected sites. Once you’ve performed a clean-up and ensured that your site’s security will not be compromised again, you can request a review of all issues with one click of a button straight from the Security Issues page.
More information can be found on

Google site links for non-branded keywords

Some of you might have noticed that certain websites ranking number one for a specific keyword or search phrase now display an additional set of sitelinks below the meta description. Have a look at the ranking for “bikes” for the Halfords website.

halfords bikes

You will notice that the sitelinks change depending on the search phrase, e.g. “bikes” triggers a different selection than “kids bikes”. Sitelinks basically list other number-one-ranked pages (in most cases) and are triggered by consistent, high-quality, ethical optimisation across your entire site.

halfords kids bikes

What you can see is the direct result of Google determining that a website in general has become truly authoritative, from an optimisation perspective, for a certain subject. In practice, the more pages that are well optimised and consistently come up in high positions in search results for their own unique keyword phrases, the more likely those additional pages will be linked below the website’s primary listing for a particular phrase.

In this way Google attempts to help searchers access deeper, more targeted pages that are also related to the primary search phrase.

Google’s Disavow Links Tool

Last week Google’s head of webspam, Matt Cutts, introduced the much-anticipated Google Disavow Links Tool, which is now available to all webmasters through Google’s Webmaster Tools service.

It’s a follow up to Google’s unnatural links messages sent earlier this year triggered by the evidence of paid links, link exchanges or other link-related schemes violating their quality guidelines.

Google’s Disavow Tool allows webmasters to disavow web addresses linking to one’s website that are considered malicious or spammy in nature. In effect, this informs Google about links and sites that one wishes to have disregarded from the current link profile.
Google’s PageRank system relies primarily on natural links as the basis for its page rankings, meaning it uses links between pages to help the search engine determine which web pages are reputable, informative and relevant to most end users.
Google has been advising site owners to manually remove spammy and/or lower-quality links from affected websites as soon as possible; however, website owners who struggle to remove the links completely are now advised to make use of the new Disavow Links Tool.

As explained by Cutts, the Google Disavow Links Tool works by allowing a website owner to upload a text file of the links to be ignored, with one entry per line, each describing either a full domain or a single web page. Google promises to then review and consider the untrustworthy set of links, which may result in a website slowly regaining some of its lost rankings; however, that process can take several weeks.
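The disavow file itself is plain text: optional # comment lines, domain: entries for whole domains, and bare URLs for individual pages. A small sketch that assembles one:

```python
def build_disavow_file(spam_domains, spam_urls, note=""):
    """Assemble the contents of a disavow file: '#' comment lines,
    'domain:' entries for entire domains, bare URLs for single pages."""
    lines = []
    if note:
        lines.append("# " + note)
    lines += ["domain:" + d for d in sorted(spam_domains)]
    lines += sorted(spam_urls)
    return "\n".join(lines) + "\n"
```

Comments are a good place to record your removal attempts, since Google suggests documenting the clean-up effort.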

To summarise: it’s great that there is now a tool that can help deal with spammy links. At the same time, it looks like Google will gather a very comprehensive database of spammy websites, so be careful of “bad neighbourhoods” when working on your backlink profile.

Google Penguin Update

Penguin Update

What does that animal want with me?
Google has now confirmed that the Penguin update has been fully rolled out; the official launch was on April 24. According to Matt Cutts, the Penguin update isn’t designed as a penalty but is more of a full-on algorithmic change intended to level the playing field for all websites.
Levelling the field can be read as Google’s attempt to punish pages that have been spamming Google search results. If you’re not familiar with spam, it’s when people use things like “link schemes”, “keyword stuffing” and other methods that artificially attempt to improve one’s rankings and in principle violate Google’s guidelines. To learn more, check the Google Webmaster Guidelines.

Was my website affected?

If your rankings suddenly disappeared, then it’s a no-brainer. If you’ve seen changes but can’t pin them down, check your analytics. Assess organic traffic and keyword data pre and post the 24th of April. If you can see sudden drops in traffic which coincide with negative ranking movements, then you have your proof.

How to future proof yourself?

Assess your present web strategy. Do you maximise all your digital channels, or are you mostly reliant on one of them?

Invest in your brand. Building brand authority and citations, concentrating on the user experience, and keeping the technical aspects of your site current will definitely pay off.

Insulate yourself from algorithm changes by choosing the right ways of promoting your website, particularly in the organic search results. Don’t believe in guaranteed positions, which some companies tend to sell – you can follow best practice, which will be rewarded by Google, but no guarantees can be made (those who provide guarantees usually use spammy methods).
Get ready for the change.

Create contingency plans. Forecast whether your business model could afford, and survive, a drop in your current rankings. Should you be forced to rely on PPC, would you go bankrupt or still do well?

Google Six Minutes Evolution Of Search

A very interesting video presenting the evolution of Google Search in just six minutes – enjoy!

Google announces search algorithm change promoting fresh content

The incredibly fast pace at which information propagates across the world today constantly increases the demand for the most up-to-date search results.

Google’s engineers strive to keep up with that constantly changing environment and have now introduced new changes to the search algorithm.

Google has announced that the “freshness” update to its algorithm is designed to provide us with the most up-to-date results, determining fresh results for searches around recent events, the latest topics, recurring events and frequent updates.

Last year’s Caffeine update was a major infrastructure fix to the web indexing system, which enables Google to crawl and quickly index freshly discovered content. Google indicates that the new and improved algorithm now impacts around 35 percent of searches.

It remains to be seen how Google will prove the algorithm works correctly while preventing spam at the same time, says the SEO Google Expert.

Guest Blogging, Sponsored Posts and Becoming a Blogger

I’m pretty sure you’re absolutely sick of hearing about guest blogs now – heralded as the figurative bamboo to the Panda patrolling the Google pen, they are quickly building in popularity and everyone knows it. Especially the blog owners.

I’ve been doing guest blogging for a while now (since early ’10) and I’ve noticed a disconcerting rise in the number of times I am steered towards a sponsored post. Amongst the people I’ve been speaking to, opinion is split on the value of sponsored posts. Obviously, it’s nice to actually have a link on a strong blog and, in many cases, you don’t even have to write it – yet, in the grand scheme of things, this is a bought link. Blog owners don’t just want good content now; they (understandably) want an income.

Now, not every blogger will demand cold hard cash for a post – but if they suspect that you’re a grubby SEO just looking for a link, you’d best grab your wallet. SEOMoz contributor, Michael King, did a fantastic and extensive post on relationship management and how to stay sweet with your contacts.

So what instills that trust in the blog owner that you’re not just going to send ‘The Top 5 reasons why you should visit [COUNTRY]’?

Simply put, it’s owning your own blog.

It’s common knowledge that most companies should have their own regularly-updated in-house blog, but simply due to the extra time it takes (or just due to plain naivety), many don’t, immediately putting them on a weaker footing. So if you’re searching for a good blog to guest post on, you have to think like they do. By which I mean, stop being lazy and start a blog.

Fair enough, it’s not easy; it’s going to take time, writing skill and may cost you another -.25 of your eyesight, but the end result would be worth it. One tip – stay general, but realistic. Blog about things you know about and enjoy, and remember that you can’t be CEO of a shoe firm one day and then Deputy Manager of Laptop Repairs the next. You are not Mr Benn. You should also read ‘The 8 Worst Types of Blog on the Internet’ (NSFW).

If you already have a blog and are doing this then congratulations and I apologise for wasting your time and bandwidth, but if you haven’t, then seriously, get on it! Of course, this is a long-term strategy and you’re not going to become a celebrity within the blogosphere with a PA 1, DA 1, but give it some time, put in an hour a day and you’ll be commanding your own authority, boosted by your own quality content and name within the community.

Heck, with Google’s rel=”author” tag, now is a better time than ever to have your own blog – you’ll even get to stick your mugshot next to your posts. This will undoubtedly increase trust in blogger relationships, as they’ll be confident you’re not an evil robot just scraping content from another blog.

There will be a few people to disagree with me here and state that a mention of their ‘team of content writers’ in an e-mail to a blog owner has never failed to get them a posting, or client domain e-mail addresses commanding far more authority than a blog, but I just don’t feel it. I think it’s all about being on a level playing field and proving you’re not just a faceless company after a link – to entirely misquote Batman Begins – “To befriend the blogger, you must become the blogger”.

So there we go, thanks for reading and please share with me your opinions on sponsored posts, guest blogging and how you’d go about it.

Thomas Clark is a member of the SEO department of Manchester SEO agency, Lakestar Media.

Google Encrypts Organic Search Data

On the 18th of October, Google announced that, in order to protect the personalised search results of signed-in users, it has decided to encrypt all visitor-related data with the SSL protocol, stripping out any useful data, including the keywords that drove visitors to a website.

That means searches can only be seen by Google and the web browser itself; no third party, including any tracking solution you may know, can intercept the search and see what’s being searched for.

The change to SSL is in place now and will be fully released to everyone over the coming weeks.

What does it mean for SEOs?
The change to SSL search means that sites people visit after clicking on results at Google will no longer receive the “referrer” data that reveals what those people searched for, except in the case of paid ads (interesting, isn’t it?).

Google assures us that we’ll still be able to measure SEO traffic and see conversion rates and segmentations, and that a token has been created to identify signed-in users coming via organic search visits, which shows as “(not provided)” within organic search keyword reporting.
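The mechanics are simple: an SSL search drops the query string from the referrer, so analytics tools can no longer read the keyword. A minimal sketch of that classification logic (an illustrative function, not any vendor’s actual implementation):

```python
from urllib.parse import urlparse, parse_qs

def organic_keyword(referrer):
    """Classify the keyword from a Google search referrer URL.

    Returns the keyword when the referrer still carries a `q` query
    parameter, "(not provided)" when an SSL search has stripped it,
    and None for non-Google referrals. Illustrative sketch only.
    """
    parts = urlparse(referrer)
    if "google." not in parts.netloc:
        return None  # not a Google search referral
    query = parse_qs(parts.query)
    keyword = query.get("q", [""])[0]
    return keyword if keyword else "(not provided)"

# Classic HTTP search referrer still exposes the keyword:
print(organic_keyword("http://www.google.com/search?q=seo+expert"))
# SSL search strips the query string entirely:
print(organic_keyword("https://www.google.com/"))
```

The second call is the new normal for signed-in searchers: the visit is still attributable to Google organic search, but the keyword itself is gone.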

From my point of view, it’s very clear that we will lose access to a portion of keyword-level intelligence data, impacting our ability to understand non-brand SEO traffic and conversion patterns.

Beware, as there is also another challenge in sight: mobile users. If you use an Android phone and want to browse the Market or use any Google-owned app, you’ll be required to sign in with your Google account. Mobile users tend to forget they’re logged in, so imagine the impact this may have on mobile data.

We’ve already seen large tracking solution providers, including Omniture, issuing communications to their client bases on the predicted impact of the recent Google changes.

Google software engineer Matt Cutts states that even at full roll-out, only a single-digit percentage of all search data will be affected.
Well, we will see… says the SEO Google Expert.

Lakestar Media Wins Best SEO PPC Agency Award

On the evening of May 26th, at the single biggest gathering of media and marketing services people in the North West, the prestigious How-Do Awards 2011 were announced in various categories.
Over 500 people packed into The Point at Old Trafford Cricket Ground to celebrate the achievements of various media, creative and digital agencies.
Lakestar Media came out on top in the SEO and PPC Agency category to claim their first ever award, recognising dynamic growth and customer service excellence.

About the Google Expert;

Google Expert is a highly experienced Google AdWords Expert and
SEO Expert delivering ROI-driven campaigns for a range of clients.

Bengt Wedemalm Ministry International

SEO Google Expert is delighted to introduce the new Bengt Wedemalm Ministry International website. Bengt Wedemalm is a dynamic preacher and pioneer. Born in 1961 and in ministry for 30 years, he travels extensively and has ministered in more than 60 nations. As early as the beginning of the 80s he went into the former Eastern Bloc as one of the first pioneers proclaiming the good news of Jesus. God used Bengt in miraculous ways to open up closed nations (like Albania), and he was involved in planting churches in all the former communist countries.

Bengt has also been reaching out extensively to Islamic nations; he loves and enjoys the challenge of a closed door. For a good number of years he worked as mission director and was one of the spearheads in Ulf Ekman Ministries. As a sought-after conference speaker, Bengt is known for carrying a spirit of faith and moving strongly in the supernatural. His illustrated and lively sermons help people master their everyday lives by the power of God. Swedish by nationality, Bengt is presently based in London and involved in church networks in England and internationally.

Online Businesses Damaged by Google’s Farmer Update

Mahalo founder Jason Calacanis announced today that he has cut about 10% of the company’s staff due to traffic and revenue losses from the Farmer update. If there’s any recovery for Mahalo, it will come too late for some employees.

Google has become aware that some genuine websites have been hurt by the recent algorithm update and is already working to fix the problem. Amit Singhal from Google comments that the new algorithm’s effects are positive, but unfortunately “no algorithm is 100% accurate.” That obviously presents a new challenge to all SEO experts.

Another victim, the Cult of Mac, an Apple-centric blog, supposedly lost 80% of its keyword rankings shortly after Google announced that the new algorithm had launched.
In response, Google has indicated that its engineers are building a new layer on top of the present algorithm to make it even more accurate.

It’s possible that Google’s changes are already rolling out.


Google algorithm change – winners and losers

Last week Google announced changes to its search engine algorithm. According to many internet sources, the change is going to affect more sites in a shorter time than usual.
Google wants to promote original content of interest to readers; spam pages put up content scraped (stolen) from other sites and present it as their own.
The change is designed to remove low-quality content from the top search results, which means content farms may be the primary target here.

To figure out who was hit and who gained, it’s worthwhile to read a great recent article by Danny Sullivan at Search Engine Land.

SearchReviews has just launched a search engine with more than 40 million reviews

SearchReviews has just launched a search engine with more than 40 million reviews in its system. The company plans to increase that number to 100 million reviews by the end of 2011.

The reviews currently cover about four million products and come from more than a thousand sites, including the likes of TripAdvisor, Amazon and Zappos.
Will SEM and SEO experts benefit from SearchReviews? Let’s look at some facts below.

SearchReviews appears to have the largest searchable database of strictly review content anywhere.

Its main competitors are Buzzillions and Bazaarvoice. There’s also Google Places, which has an enormous collection of reviews and sources.

SearchReviews launches with apps available for both iPhone and Android devices.

SearchReviews crawls more than a thousand websites to obtain review content and aggregates it into its database, supposedly adding about two million reviews every week.
One of the impressive aspects of SearchReviews is that it indexes review content, not just business and product names. So someone looking for “meatball pizza Vancouver” can find reviews that very specifically mention those words. That provides new opportunities for optimising product reviews.
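Indexing full review text rather than just product names is classic inverted-index territory. A toy sketch of the idea (made-up reviews, not SearchReviews data or code):

```python
from collections import defaultdict

# Tiny illustrative corpus of made-up reviews keyed by id.
reviews = {
    1: "Best meatball pizza in Vancouver, generous toppings",
    2: "Great laptop bag, sturdy zips and padding",
    3: "The meatball sub was average but the pizza was superb",
}

def tokenize(text):
    return [w.strip(",.").lower() for w in text.split()]

# Inverted index: each word maps to the set of review ids whose full
# text contains it -- this is what lets a query match words buried
# inside review bodies, not just titles.
index = defaultdict(set)
for review_id, text in reviews.items():
    for word in tokenize(text):
        index[word].add(review_id)

def search(query):
    """Return ids of reviews containing every word in the query."""
    word_sets = [index.get(w, set()) for w in tokenize(query)]
    return set.intersection(*word_sets) if word_sets else set()

print(search("meatball pizza Vancouver"))  # only review 1 has all three words
print(search("meatball pizza"))            # reviews 1 and 3 both qualify
```

Real search engines add ranking, stemming and phrase matching on top, but the intersection of per-word posting sets is the core trick that makes “meatball pizza Vancouver” findable inside review text.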
With some further development, SearchReviews should become a valuable source for consumers seeking review information.
