Google has announced the removal of FAQ rich results from SERPs

What’s up with the FAQ schema?

Google just dropped the bomb that they’re ditching the FAQs that pop up below your search results. Before, you could add fancy code to your page to display questions and answers. But now, they’re only showing them for super official sites like Gov.uk and the NHS. Check out the full scoop here: https://developers.google.com/search/blog/2023/08/howto-faq-changes

So, what does this mean for you?

Well, if your site had the FAQ schema markup, those questions and answers won’t be showing up anymore. You can hop into Google Search Console and see if they were appearing by checking the Search Results Performance Report. Make sure to apply the “FAQ Rich Results” search appearance filter to see the data.

But hey, don’t worry too much about losing all those clicks. The clicks in the FAQ Rich Results report are counted only when someone saw your site in the search results with FAQs attached and clicked on the link to go to your site. The good news is that your page’s ranking won’t be affected, so if people were clicking on it when the FAQs were there, they’re likely to keep clicking even when they’re gone. The only difference is that the report in Google Search Console will show zero impressions and clicks when the FAQ Rich Results search appearance filter is applied.

What do I do now?

Right now, there’s nothing to worry about. If you have FAQ schema code on your website pages, there’s no need to get rid of it. Google will still go through and understand the schema, even though the FAQs won’t appear below your website in search results. Those FAQs could still be useful because they give more information and show your expertise. So, removing them might not be a good idea. And if you’re in the middle of adding FAQ schema, keep going, for the reasons mentioned above.
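For reference, FAQ schema is usually added as a small JSON-LD block like the sketch below – the question and answer text here are just placeholders, so swap in your own:

```html
<!-- Minimal FAQPage markup (schema.org); the Q&A text is placeholder content -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "Do I need to remove my FAQ markup?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "No. Google still reads the markup even though the rich result no longer shows for most sites."
    }
  }]
}
</script>
```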

Why Web Hosting is Super Important for SEO

Listen up, folks! If you want your website to shine on search engines, you absolutely must have stellar web hosting. It won’t magically make your site rocket to the top, but it lays the foundation for rockin’ SEO. Let me break it down for you.

When you choose a web hosting service, you’re basically choosing the home base for your website. Your site lives on servers, and when people type in your web address, the server sends your pages to their browsers. It’s like having a sweet home for your website to party and show off its stuff.

A lot of businesses concentrate on things like design, development, digital marketing, and SEO, but they often forget about web hosting. Oops! But let me tell you, having a fast, functional, and flexible website is just as vital. And not just any web host will cut it – you need a top-notch one to boost your conversion rates and bring in other perks too.

Here’s why having a reliable web host is a genius move:

  1. Better site performance: Picture this – a good web host makes your site load super fast and smooth, so users have a blast browsing your stuff. No more frustrating loading times, my friend!
  2. Smart data management: Web hosting providers can help you keep your website data organized and easy to find. Say goodbye to the headache of hunting down your important files.
  3. Extra security: Trustworthy web hosts have your back with strong security measures that give hackers, spam, and malware a run for their money. Ain’t nobody messing with your site!
  4. Maximum uptime: With a reliable web host, your site will be up and running all the time. No more embarrassing mid-party crashes – you’ll always be online, pulling in the crowds.

Oh, and that’s not all! A good web host should offer you email accounts, FTP access (geeky stuff), support for WordPress (a must-have), and top-notch security features like SSL certificates (fancy shmancy stuff).

Now, let’s talk SEO. Google wants users to have a mind-blowing experience, and a solid web host sets the stage for that. Websites that load fast and give users a smooth ride often get a big boost in rankings.

Speed is a huge deal for SEO, my friend. If you’re stuck with a slow, cheap web host, it can totally tank your rankings, murder your traffic, and leave you in despair with fewer leads. But fear not! A good web host with high speeds can slash your page load time and keep Google happy. And hold on to your hats – Google recommends that mobile pages load in two seconds or less. That means you gotta make sure your site loads faster than a rocket ship. Zoom!
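If you fancy a rough sanity check of how fast your host responds, here’s a tiny sketch (assuming Node 18+ for the built-in fetch; it only times the raw HTML response, not the fully rendered page – a real audit needs something like Lighthouse):

```js
// check-speed.mjs – run with: node check-speed.mjs https://www.example.com/
// Times a single request/response; the URL below is just an example.
const url = process.argv[2] ?? "https://www.example.com/";

const start = performance.now();
const res = await fetch(url);
await res.arrayBuffer(); // drain the body so timing covers the full download
const seconds = (performance.now() - start) / 1000;

console.log(`${url} -> HTTP ${res.status} in ${seconds.toFixed(2)}s`);
```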

Now, let’s talk security, baby! It’s a big deal when it comes to web hosting and SEO. Without a badass security system in place, your site becomes easy prey for hackers, spam, and malware. Yikes! Trust me, you do not want that mess ruining your SEO and possibly getting you kicked off search engine pages. But listen up again – a reliable web host with robust security, like SSL certificates, can keep your site locked down tight and protect your precious rankings. Phew!

And hey, let’s not forget about location. Search engines take into consideration where your site is hosted when they’re figuring out search results. Hosting your site in a data center that’s close to your target audience can crank up its speed and performance. And if you’re conducting business all over the world, a content delivery network (CDN) with hubs in different areas will make sure your hosting is speedy no matter where your users are. It’s like having a global fiesta, my friend!

So there you have it. Web hosting is the bee’s knees when it comes to nailing SEO. Don’t skimp on a reliable web host with lightning-fast speeds, ironclad security, and the perfect location. Invest in greatness and get ready for your site to soar, climb those rankings, and rule the web. Game on!

Can you make your SEO goal SMART?

In SEO, it’s important to have clear goals in mind. Setting vague goals like “get more leads” or “get more traffic” won’t give you focus or direction. These vague goals waste time and result in generic SEO work that doesn’t help your mission.

Goals provide you with a target to work towards. Having a target allows you to check if your actions align with your goal. Goals also help measure your progress. If you’re not making progress, you may need to change tactics or update your goals. On the other hand, seeing progress towards a goal gives you a sense of accomplishment and motivation.

SEO goals aim to improve your SEO and support your wider marketing and business objectives. This involves targeting awareness of, and engagement with, potential customers on search engines. When improving SEO, you’ll look at factors like rankings, impressions, click-through rates, and clicks. You might also weigh keyword search volumes when prioritising.

To set practical goals, follow the SMART goals framework:

Specific: Clearly define exactly what you want to achieve, so there’s no ambiguity about what the goal covers.
Measurable: Make your goals quantifiable so that you can track progress.
Achievable: Ensure your goals are realistic within your resources.
Relevant: Align your goals with higher-order business and marketing goals.
Time-bound: Set a deadline to aid in measuring progress.

SMART goals are widely used in project management, personal development, and psychology. They can be applied to SEO to drive performance. Let’s break down each stage of SMART goals and see how you can create your own SMART SEO goals.

S: Specific
Make sure your SEO goals are specific and actionable. Avoid broad goals like “better rankings” and instead aim for something like ranking in the top 3 results for a specific keyword.

M: Measurable
Ensure that your goals can be measured. There are various SEO metrics you can track, like rankings, organic clicks, impressions, CTR, and more.

A: Achievable
Set goals that are challenging but still within reach. Consider your resources and the competitive landscape. Focus on what you can do better compared to your competitors.

R: Relevant
Make sure your goals align with your overall marketing goals. Explain how your SEO goals contribute to the larger picture and use conversion data from paid search to support your reasoning.

T: Time-bound
Set a target date to achieve your goals and create milestones along the way to track progress.

Remember that SEO takes time, so be patient and review your goals as you make progress.

Google Complementary Robots.txt Protocols & AI

Hey guys! Google just dropped some big news last night. They’re working on developing a new protocol to go hand-in-hand with the good old robots.txt. Why? Well, it’s all because of the fancy new generative AI technologies that Google and other companies are putting out these days.

This comes right after OpenAI made waves when their ChatGPT service was found surfacing paywalled content. But honestly, I’m not surprised that Google and others are exploring complements to robots.txt considering all the wild generative AI stuff happening on the web.

Now, don’t get too excited just yet. This announcement doesn’t mean anything is changing immediately. All Google is saying is that they’re gonna have some discussions with the “community” in the “coming months” to come up with fresh ideas for a new solution.

In Google’s own words, “Today, we’re kickstarting a public discussion and inviting members of the web and AI communities to chime in on complementary protocols. We want a diverse range of voices, from web publishers, to civil society, academia, and everything in between, to join the conversation. We’ll be getting these interested folks together over the next few months.”

On top of that, Google believes that it’s high time for the web and AI communities to explore other machine-readable ways to give web publishers more choice and control for all the new AI and research cases popping up.
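For context, today’s robots.txt already gives publishers a crude lever – the sketch below uses a made-up “ExampleAICrawler” user-agent token, since AI-specific identifiers are exactly the kind of thing the new protocol discussion is meant to sort out:

```
# Block a (hypothetical) AI crawler while leaving everything else open
User-agent: ExampleAICrawler
Disallow: /

User-agent: *
Allow: /
```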

So, there you have it, folks. Google’s looking to shake things up a bit and get some input from all the different peeps in the web and AI worlds. It’ll be interesting to see what they come up with. Stay tuned!

5 steps to an effective brand SERP strategy

So, you wanna know how to analyze your brand search engine results pages (SERPs) and build a killer digital strategy, huh? Well, you’re in luck! Today, we’re gonna break it down for you in simple terms.

First things first, let’s talk about what a brand SERP is. Basically, it’s what shows up when someone searches for your brand on a search engine. It includes all the information and links related to your brand that appear on the search results page.

Now, why should you care about your brand SERP? Well, my friend, it’s all about reputation. The way your brand is presented on the search results page can make or break your reputation. It’s like your online business card. So, it’s crucial to take control of your brand SERP and make sure it reflects positively on your brand.

Alright, let’s get to the nitty-gritty of analyzing your brand SERP. First, you need to conduct a thorough search for your brand name on different search engines. Look at the top results and see what information is displayed. Take note of any negative or irrelevant information that may be harming your reputation.

Next, it’s time to take action and build your digital strategy. Here are a few tips to get you started:

  1. Optimize your website: Make sure your website is search engine friendly. Use relevant keywords in your content, meta tags, and URLs, and mark your brand up with structured data (see the sketch after this list). This will help improve your ranking on the search results page.
  2. Create high-quality content: Producing valuable and engaging content is key. Write blog posts, create videos, or share infographics that are relevant to your brand. This will not only attract more visitors but also improve your SERP presence.
  3. Leverage social media: Social media platforms are a great way to boost your brand’s visibility on the SERP. Create and optimize your social media profiles on platforms like Facebook, Twitter, and Instagram. Remember to engage with your audience and respond to comments and messages.
  4. Manage online reviews: Online reviews can heavily influence your brand’s reputation. Encourage satisfied customers to leave positive reviews and respond promptly to any negative feedback. This will show potential customers that you value their opinions and are committed to providing a great experience.
  5. Monitor and adapt: Keep a close eye on your brand SERP on a regular basis. Set up alerts and use monitoring tools to stay informed about any changes. This will allow you to quickly identify and address any issues or opportunities that may arise.
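On the “optimize your website” point, one common approach is Organization markup, so search engines can connect your brand name, logo and social profiles on the brand SERP. A minimal sketch – every name and URL below is a placeholder:

```html
<!-- Organization markup sketch (schema.org); replace the placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Brand Ltd",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png",
  "sameAs": [
    "https://www.facebook.com/examplebrand",
    "https://twitter.com/examplebrand"
  ]
}
</script>
```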

So, there you have it! By analyzing your brand SERPs and implementing a solid digital strategy, you’ll be well on your way to building a strong online presence and positive brand reputation. Good luck, my friend, and may the search engines be ever in your favor!

Google’s Core Web Vitals INP

Hey folks, let’s talk about this recent email from Google that has got some website owners a little worried. It’s all about their Core Web Vitals and a specific metric known as Interaction to Next Paint (INP). So, in a nutshell, Google sent out an email highlighting problems with input responsiveness on certain websites, pinpointing specific elements that are causing issues.

Now, this email might have caused some panic, but don’t sweat it too much. The main goal here is to inform website owners about potential performance problems related to INP. You see, input responsiveness is all about how quickly your website responds to user interactions like clicking buttons or filling out forms. Google wants to make sure that sites are providing a seamless and smooth experience to users.

The email is not meant to be a punishment or a slap on the wrist. It’s more like a heads-up and guidance from Google to help you improve your site’s performance. They even give you examples of the elements on your website that are causing the INP issues, so you know exactly what to focus on.

The bottom line is, if you received this email, take a close look at it and make sure you address those input responsiveness issues. Google recommends testing and optimizing the problem areas to minimize INP problems and improve your overall site performance.
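If you want to see the same numbers Google is talking about, one common approach is the open-source web-vitals library. A minimal sketch – the /analytics endpoint is made up:

```js
// Logs INP from real user interactions; requires the `web-vitals` package (v3+).
import { onINP } from "web-vitals";

onINP(({ value }) => {
  // `value` is the interaction latency in milliseconds; under ~200ms counts as "good"
  console.log("INP:", Math.round(value), "ms");
  navigator.sendBeacon("/analytics", JSON.stringify({ metric: "INP", value }));
});
```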

So, breathe easy! Google is just trying to help you out here, nudging you in the right direction to make your website better. Time to roll up your sleeves and get to work on those optimizations!

Google deprecating AJAX crawling scheme

On Wednesday the 14th of October Google announced that they are no longer recommending the AJAX crawling proposal they made back in 2009 (http://googlewebmastercentral.blogspot.co.uk/2009/10/proposal-for-making-ajax-crawlable.html).
Going forward, they will generally crawl, render, and index the #! URLs directly rather than requesting the ?_escaped_fragment_= snapshots.

“Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site’s CSS or JS files.”

Questions and answers

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
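To make the “progressive enhancement” advice concrete, here’s a hedged sketch of the modern alternative to #! URLs: normal links that a crawler (or a user without JS) can follow, with JavaScript intercepting clicks and using the History API instead of hash fragments. The #content selector and data-ajax attribute are my own placeholders, not anything Google prescribes:

```js
// Enhance plain links: fetch the real URL and update the address bar with
// history.pushState, so clean URLs replace the old example.com/#!/page style.
document.addEventListener("click", async (event) => {
  if (!(event.target instanceof Element)) return;
  const link = event.target.closest("a[data-ajax]");
  if (!link) return;
  event.preventDefault();

  // The server can render this URL fully on its own, so crawlers see real pages
  const response = await fetch(link.href);
  document.querySelector("#content").innerHTML = await response.text();
  history.pushState({}, "", link.href);
});
```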

Source: http://googlewebmastercentral.blogspot.co.uk/2015/10/deprecating-our-ajax-crawling-scheme.html

Google aggressively tackles hacked web spam

On Monday October the 5th Google officially confirmed that it has been rolling out updated algorithms geared specifically towards identifying spam in SERPs coming from hacked sites.
With large numbers of legitimate sites being hacked by spammers and used for abusive behaviour, Google has been “forced” to take decisive action. Practices such as malware downloads, funnelling traffic to low-quality sites, porn, and the marketing of counterfeit goods or illegal pharmaceuticals are some of the most common offences Google is currently paying attention to.

As per information on the Webmaster Central Blog, the algorithmic changes will eventually impact roughly 5% of queries, depending on the language. Webmasters are being warned that as Google rolls out the new algorithms, users might notice that for certain queries only the most relevant results are shown, reducing the number of results displayed.
Google also indicated that, due to the large amount of hacked spam being removed and the fine-tuning of its systems to weed out bad content while retaining legitimate results, some SERPs may effectively clear out substantially.

[Image: example of a substantially cleared-out SERP]

All these changes now put more and more responsibility on webmasters to keep their site security up to date.

HTTPS Content Mismatch Errors May Not Earn Ranking Boost

In the recent August 28th Google hangout on Google+, John Mueller made a very important point (at the 18:30 mark): Google may not give HTTPS web pages the desired HTTPS ranking boost if a page serves up HTTPS content mismatch errors.

What does it actually mean? It’s relatively simple: your web page is served over HTTPS, but some elements on the page are not secure, e.g. images or social media plugins loaded over plain HTTP.
So far Google has been quite lenient, and your pages might have benefited from a move to HTTPS even if they were not fully secure.
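If you want a quick way to spot obvious offenders before Googlebot does, here’s a rough sketch (Node 18+; a regex over the HTML is crude – a real audit should parse the DOM – but it illustrates the idea):

```js
// find-mixed-content.mjs – run with: node find-mixed-content.mjs https://www.example.com/
const url = process.argv[2] ?? "https://www.example.com/";
const html = await (await fetch(url)).text();

// Flag hard-coded http:// references in src/href attributes
const insecure = html.match(/(?:src|href)=["']http:\/\/[^"']+["']/gi) ?? [];

if (insecure.length === 0) {
  console.log("No obvious insecure references found.");
} else {
  console.log(`${insecure.length} potential mixed-content reference(s):`);
  for (const ref of insecure) console.log("  " + ref);
}
```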

However, according to John Mueller, this may change moving forward:

I imagine at some point, we will say you will have to serve really clean HTTPS for us to show that in the search results. I don’t know. I don’t think that there will be any kind of penalty around that. Just that we may say, we won’t count this as HTTPS, so it won’t get that small HTTPS ranking boost.

Summarising: think twice when planning a move to HTTPS, as many of my industry peers have already noticed site visibility drops after moving, even despite keeping everything clean and tidy in the process. I’m still of the opinion that there should always be a strong business case to support a move to HTTPS, rather than doing it purely for the sake of compliance with Google’s advice.

Trail Marathon Wales 2015

On Saturday 20 June 2015 I ran my first marathon. I wanted to choose a course I would remember forever, and I have to say Trail Marathon Wales actually surpassed my wildest imagination. The course runs through Coed y Brenin in the beautiful mountains of the Snowdonia National Park. The 42km route features single-track paths, over 3,500ft of elevation gain, and inclines of up to 20%. I made it round in 4:49, which gave me 167th place out of 340 participants. A testimony to how hard the marathon was is that some runners who had completed marathons before didn’t manage to finish the full race. Talking to fellow runners pre- and post-race, the consensus was that you needed to add on average an hour to your regular marathon time to cope with TMW.

For running down the hills, which are abundant and steep there, I used some sound advice on technique from Sage Canaday: https://t.co/pxBIzDVPSE

A couple of pictures including course map and elevation profile of the race;

[Photos: race t-shirt, finish line, and Strava course map with elevation profile]

 

Sascon 2015

Today I had the opportunity to take part in the Technical vs Content panel during the first day of Sascon Manchester. We discussed what has the biggest impact on organic performance – the technical architecture of a website, or the content? – and took a view on the market consensus on where you should be spending your time. Speakers included myself on behalf of iProspect Manchester; James English, BBC; and Edward Cowell, Mediacom.

Content below has been originally posted by Emily Diment on http://www.pushon.co.uk/blog/technical-vs-content-sascon-2015/

The panel for this session comprises Radek Kowalski, Group Technical Search Manager at iProspect Manchester; Teddie Cowell, Director of SEO at MediaCom; and James English, Senior SEO Analyst for BBC Sport.

Everyone on the panel believes that the content vs technical debate is an interesting one. It’s something that is sometimes a bit of a contentious issue, but it’s one that always needs to be considered when planning an SEO campaign or organic strategy.

Technical SEO has moved beyond the one-off tech audit to mobile, platform builds and commerce platforms, which shifts tech from short-term projects to longer-term technical SEO projects, according to Radek – and that makes technical knowledge a requirement for an SEO.

So, how much do other people within agencies know or care about SEO?

At MediaCom, SEO is a core focus within other departments’ agendas. There is a big desire within the industry to get content marketing to work harder, and SEOs in general have a really good understanding of how the web works. The majority of budgets, though, aren’t being spent on SEO; they’re being spent on brand campaigns.

SEO considerations are increasingly becoming a part of brand marketing conversations from the very outset because SEOs are good at joining the pieces of a campaign together from the beginning. SEO helps to maximise value from budgets.

There has been a huge change in SEO over the last three years, says Radek, moving heavily towards content. In fact, people are becoming obsessed with content: how to do it better, how to get better ROI. iProspect Manchester has become very good at creating SEO content which supports massive campaigns. Technical optimisation can’t be forgotten. It’s almost like a vehicle for how content can be deployed. If the range of clients is small it can be very easy to technically deploy content, but if you’re trying to work on massive eCommerce systems for very large brands it can be difficult to deploy content. The technical element to content-based SEO is now more important because it has to work with your content.

You could have good content with good awareness, but it could be more successful if you have the technical element sorted.

Whatever you do for a project, you have to make key stakeholders happy by educating them about how SEO can make things better, while still keeping technical standards high.

Educating clients about the value of SEO is one of the most important aspects of any SEO campaign, says Teddie.

SEO at the BBC

One of the most interesting things about doing SEO at the BBC is seeing that they compete on a different landscape – they’ve got the brand authority, but the way people find the content is becoming one of the challenges, as is the growing amount of competition the BBC has for news. They can’t just rely on people finding their content via brand searches. It’s easy to become complacent, but it is a highly competitive landscape.

Some big technical SEO priorities for the BBC at the moment are sitemaps and newsfeeds as well as being able to react quickly to big news stories and high profile events such as the World Cup or Olympics.

Educating clients

We can create a piece of content that is technically sound but it still needs to go on a client’s website and that is generally when errors and problems appear, says Radek. Changing wireframes can be expensive and daunting for clients. Similarly, UX does not always go alongside technical SEO priorities. Some elements that are great for UX may not work well with the UI and can break on different devices such as mobile or tablet.

Teddie believes it is an ongoing process because IT professionals sometimes have different ideas and different priorities to people working in and doing SEO.

What would you say works well for smaller websites? Should they focus on technical or content?

According to Radek, small sites with fewer than 500 landing pages should choose a good platform like WordPress with free plugins, which can do 80% of the technical work for you.

The smaller the site is, the more important technical excellence is for you. He has seen small sites with no backlinks and really well optimised technical elements that rank really well. Sites need to comply with technical SEO standards and make sure themes pass validations for mobile friendliness and page speed, amongst others.

Teddie agrees: there is still the role for technical SEO for smaller sites. They need to consider essential technical things including consistency for URL and header elements like h1s.

When you see content being produced that is great, what do you see are key technical elements that are often forgotten or missed?

With storytelling content using parallax pages, which is a popular design trend, the technical issue for SEOs is that parallax pages can lose performance for searchers and some search queries.

James feels that it is important to experiment with different content formats to try to give the best experience for the user online. We are always trying to match the best experience for the user and match it with the site hierarchy.

Designers love Flash and JS, but try to stick to HTML5 – it’s very flexible and easy for search engines to digest. Sometimes certain types of content make a page not mobile friendly, and you run the risk of making the page unconsumable on certain devices, says Radek.

70% of your audience may access your content using mobile (think about that). If you work in an industry that focuses on young adults, remember they don’t really care about desktops, laptops or even tablets. They see the world via mobile phones, and any campaign should be mobile friendly.

You have to remember that different clients have different issues, says Teddie. Not every brand was well prepared for the mobile-friendly update. What Google did by forcing the mobile agenda was a really good thing to do. Brands needed a wake-up call, and that’s what the Google update gave them.

How much of a role does UX play for SEO?

SEOs need to distinguish between the two different elements – UI and UX. The user interface will help to make your site more accessible and more compliant. The BBC has been focused for years on making its website accessible for different users; BBC creative work could be used as a standard for accessible UI.

Google has been giving people massive amounts of technical information and guidance which people should be listening to. Google will become more and more picky, and is giving messages like “be mobile friendly”, “be clean”, “provide a good UX”. This is partly because they care, but also because they want to save their own money – their infrastructure for crawling the web most likely costs billions.

What is content to you? What is a good piece of content, what are you talking to clients about?

At MediaCom content can be lots of things, says Teddie: anything that connects a brand to its consumers, or something that people choose to engage with, such as YouTube, a press partnership or the website itself. It’s about how the users are moving around the content.

James believes like Teddie that it is anything that connects users to the brand. He tries not to use the word content much around journalists because to journalists content is like a dirty marketing term. They call content other words such as features, articles, interviews, video content, TV content.

The final question for the panel was a good one.

Where do you spend a limited budget that has to get results? Technical or content? Consider that the client is a retail company of a medium size.

James: Technical

Teddie: Depends on the nature of the business, but for medium-size retail, technical.

Radek: Technical – investing in it can immediately be seen to have a positive impact.

 

Google Webmaster Tools Search Queries No Longer Rounds Data

In recent days we’ve seen a very promising development in the Google Webmaster Tools console. John Mueller, Google’s Webmaster Trends Analyst, has announced that site data & metrics will be more actionable than ever, with GWT search queries no longer being rounded & bucketed.

http://googlewebmastercentral.blogspot.co.uk/2014/01/search-queries-not-rounded.html

Google collects “impressions” & “clicks” to the site and displays them in the Webmaster Tools console for the last 90 days. This is definitely a great development in light of the sharp increase in “not provided” data.

Google Matt Cutts on guest blogging video

Matt Cutts, head of the webspam team at Google, has just released a new video about guest blogging. In the video Matt explains when guest blogging can be perceived as spammy and what to avoid when guest blogging. He advises using it in moderation, always approaching it with “how can I add value by sharing my knowledge”, and building actual relationships with blog owners rather than using it just for link building.

Here is the video

Future of SEO and Google Semantic Search

We all ask this burning question – what will the future of SEO look like? In the wake of Google Penguin 2.0 and 2.1 killing backlinks, the Google Hummingbird algorithm update, and the sudden growth of “not provided” data triggered by Google defaulting to SSL search, just like any other SEO I’ve been trying to systemise and answer my own questions and doubts caused by all the changes we’ve recently experienced.

Some of my recent thoughts and questions worth crunching on;

• Discovering user intent using keyword data
• Predictions based on location, search history, circles and type of the device
• Google Hummingbird, Knowledge Graph, Voice Search and Google Now
• Matt Cutts: search experience optimisation is the future of SEO – more info in the video here
• What experience do we want a visitor to have on our site?
• What are the touch points, key actions and conversion points? Where SEO meets CRO.
• Technical SEO will prevail
• From links to answers
• Search engines want to “understand” more using structured data / microdata – I recommend you study schema markups at http://schema.org if you haven’t already
• Schema as a technical language creating identifiers, or entities, that Google is – and will increasingly be – interested in. That is how, together with the Knowledge Graph, Hummingbird will be “connecting the dots”.
• Google is rapidly adopting semantic search technologies
• Entity search, knowledge graph helping bots to better understand, match and create better response to a query

Use of semantic search coupled with semantic markup will sharply increase, as Google needs more information, quickly, for its Knowledge Graph. That is how Hummingbird will increasingly be able to provide better answers to user queries on various devices – wow, that is nearly an intelligent, learning algorithm. I therefore recommend focusing on technical SEO and definitely embracing semantic markup.
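If you haven’t touched semantic markup yet, it’s less scary than it sounds. A minimal JSON-LD sketch for an article – the author name and date below are placeholders:

```html
<!-- Article markup sketch (schema.org); replace the placeholder values -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Future of SEO and Google Semantic Search",
  "author": { "@type": "Person", "name": "Example Author" },
  "datePublished": "2013-11-01"
}
</script>
```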

To conclude, this space is definitely worth watching…

Google says thank you to top contributors

Last week Google launched its second Top Contributors Summit, after testing the idea for the first time in 2011.

If you don’t know who the Top Contributors are, let me explain: Top Contributors are highly experienced individuals helping (for free) to resolve various problems or sharing useful tips on how to get more out of Google’s range of products. They’ve been a vital line of help in resolving issues across the growing range of official Google product forums: https://support.google.com.

According to Google, Top Contributor volunteers contribute across 250 product communities in as many as 26 languages.
During the summit, Top Contributors were also able to test the latest Google products and technologies, including Google Glass. It’s really nice to see that Google utilises and appreciates the advocacy of such individuals.

My own experience with Top Contributors has always been a positive one, particularly for Webmaster Tools and Google Analytics support forums.

So thank you all – job well done for being there and helping us all when it seems Google itself is not able or willing to help.

You can find more details on the event in the original blog post here – http://googleblog.blogspot.co.uk/2013/10/saying-thank-you-to-our-google-top.html

Tough Mudder Google Expert

Google Expert has now officially acquired the right to brag about being the toughest and muddiest Google Expert.

The team effort resulted in a nice amount being raised for The Mustard Tree – http://www.mustardtree.org.uk/ – a fantastic organisation that looks after the homeless and those marginalised by society.

Thank you all for your generous donations: http://www.justgiving.com/tough-muppets

[Image: Tough Mudder Google Expert]
More event pics here

Google Penguin 2.1 spam filtering algorithm update now live

Google is seriously busy this month. We’re all still trying to get our heads around the recent new Google “engine”, Hummingbird, and now we already have a confirmed update to this new algorithm.

If you are not familiar with the Hummingbird algorithm, I encourage you to read my previous post – http://www.googlexpert.co.uk/google_hummingbird_algorithm
Head of webspam Matt Cutts confirmed the launch of the new update on the 4th of October and indicated that roughly 1% of searches will be affected.

Penguin 2.1 is another evolution of Penguin 2.0, which caused a major shift in the whole SEO industry by targeting spammy backlink profiles, including paid links and black-hat SEO tactics.

If you have already been hit by Penguin 2.0 and have taken action to remove and disavow part of your backlink profile, make sure you pay close attention to your rankings this week. You may see either improvements or further drops driven by this recent update.

Here you can find more information on what the Penguin update is – http://googlewebmastercentral.blogspot.co.uk/2012/04/another-step-to-reward-high-quality.html

Is Google PageRank dead?

Google PageRank is something we SEOs used to pay substantial attention to. If you remember, the Google Toolbar used to display values from 0 to 10 indicating how much authority a page had in Google’s eyes. It was a truly helpful tool for quick competitive comparison of websites from the same sector, and a great indicator of a site’s quality when scouting for links.
Over the last couple of years we’ve seen Google substantially limit its support for PageRank: Chrome doesn’t support it any more, and the Google Toolbar for Firefox has been suspended too.
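For the curious, the formula from the original Brin & Page paper, where d is a damping factor (usually set to 0.85), T1…Tn are the pages linking to page A, and C(T) is the number of outbound links on page T:

```latex
PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \dots + \frac{PR(T_n)}{C(T_n)} \right)
```

In plain terms: a page inherits authority from the pages linking to it, diluted by how many other links those pages cast.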

This year we’ve had only one PageRank update, around February. Additionally, head of the Google webspam team Matt Cutts has recently indicated that it’s highly unlikely Google will update it in the remainder of 2014.

So, it looks like Google PageRank is dead…

If you are still unsure what PageRank is, I recommend the following article – http://searchengineland.com/what-is-google-pagerank-a-guide-for-searchers-webmasters-11068

Google stops mugshot websites

According to the New York Times, Google released another algorithmic update on Thursday the 3rd of October.
Don’t worry, it’s not another Penguin or Panda ready to roam free; it’s a manual fix in the algorithm designed to stop websites featuring mugshot pictures from ranking highly in Google.

If you don’t know what a mugshot site is, let me explain: it’s a website that makes money by obtaining and publicising mugshot pictures and then extorting money from the affected individuals to get them removed. This is the story that apparently triggered head of search spam Matt Cutts to take action – http://www.nytimes.com/2013/10/06/business/mugged-by-a-mug-shot-online.html

I think it’s high time for such action, as the FBI has apparently received thousands of complaints related to such practices. Mugshot sites can charge anywhere between $70 and $1,400 for the removal of a mugshot picture, often only for the picture to re-emerge later on another site of the same type.

Time will show how effective the algorithmic update will be.

Help – 100% not provided – Google is now defaulting to the https

SEOs and webmasters, brace yourselves! Google is now defaulting to https:// (encrypted) search even when users are not signed in, with the impact being that soon 100% of organic keyword data from Google will be displayed as (not provided). The whole thing is well illustrated by this graph from http://www.notprovidedcount.com/.

Be aware that no tracking solutions are immune to this change and moving forward, it’s important to devise new reporting logic and methodology in order to keep on top of your SEO campaigns.

One solution you can try is cross-referencing brand / non-brand data from both Google Webmaster Tools and Google Analytics. The Webmaster Tools search query report can give you an understanding of how people are searching for and arriving at your site via Google, whilst the Bing / Yahoo / Ask keyword data in Google Analytics will provide an accurate representation of how non-Google users are organically arriving at your site.

Obviously such a solution is far from ideal, as Google Webmaster Tools data is rounded, but at least the process can be replicated on a monthly basis. Still, by combining the two data sets, and also looking into the Bing data, you will gain a good understanding of how searches are being made across all search engines and be able to apportion brand and non-brand visits reasonably accurately.

So dear friends – good luck in the soon-to-arrive 100% not provided age.

What is Google Hummingbird Algorithm?

Busy, busy times for all SEOs. So far we’ve been firefighting the increase in not provided data sets, algorithmic updates, and “penguins” (Penguin 2.0) wreaking havoc across the whole SEO industry.

But that’s not all: on the 26th of September Google broke the news about the completely new Google Hummingbird algorithm. And yes, it’s not another algorithmic update but a completely new, shiny algorithm. More interestingly, the new algorithm had already been in place for a month and no one had noticed!
Google claims results coming from the new algorithm are faster and more precise, hence the name Hummingbird.

The new algorithm is designed to provide better answers to searchers’ questions, utilising Google’s Knowledge Graph in a much more advanced way than before.
For more information on how Google is evolving its use of the Knowledge Graph and addressing multi-device usage, I encourage you to have a look at the following post – http://insidesearch.blogspot.co.uk/2013/09/fifteen-years-onand-were-just-getting.html

A couple of facts to digest:
• Google quietly changed the algorithm around a month ago and announced it only on the 26th of September
• It’s supposed to be faster and more precise
• The new algorithm is more “intelligent”, looking not only at keyword matching but also at the intent behind a search query
• The new algorithm takes integration with the Knowledge Graph to a new level
• The new algorithm is supposed to handle complex long-tail queries better
• Social signals are even more important than before
• The new engine still uses learnings from all the Google updates like the Venice, Panda and Penguin series.

I recommend the following article from Search Engine Land, which provides a nice Google Hummingbird FAQ – http://searchengineland.com/google-hummingbird-172816

Google Webmaster Tools search query reports delayed

After the recent Google encrypted search roll-out, which shook the whole SEO industry, the eyes of all SEOs and webmasters have inevitably shifted to Google Webmaster Tools data, and its search query reports specifically.

However, as has been reported, search query data in GWT has been suffering from a “bug” causing query reports to be delayed.

As per the Search Engine Land article – http://searchengineland.com/google-webmaster-tools-blocks-keywords-173153 – the tool stopped showing data on September 23rd. Google claims the issue will be resolved shortly.

Google helps hacked websites

On the 30th of October Google announced a very cool new feature within Webmaster Tools. The whole new section, called Security Issues, offers substantially more information about potential hacking attacks, malware and any other security issues on your site, all conveniently wrapped up in one place. Google can now help with the recovery of a hacked website and enables webmasters to locate code injection instances more easily, with examples, rather than just letting you know that something is wrong without further advice. On top of all the new features, Google has also set up a dedicated help portal for hacked sites, with detailed articles explaining each step of the recovery process, including videos.

Now every webmaster will be able to:
• Discover more information about any existing security issues – all in one place.
• Identify the source of the problem using detailed code snippets.
• Request a review for all issues.

It’s truly reassuring to know that Google has been so proactive in helping affected sites. Once you’ve performed a clean-up and ensured that your site’s security will not be compromised again, you can request a review for all issues with one click of a button straight from the Security Issues page.
More information can be found on http://googlewebmastercentral.blogspot.co.uk/2013/10/easier-recovery-for-hacked-sites.html

Google site links for non-branded keywords


Some of you might have noticed that certain websites ranking number one for a specific keyword or search phrase now display an additional set of sitelinks below the meta description. Have a look at the ranking for “bikes” for the www.halfords.com website.

[Image: Halfords “bikes” search result with sitelinks]

You will notice that the sitelinks change depending on the search phrase, e.g. “bikes” triggers a different selection than “kids bikes”. Sitelinks basically list other number-one-ranked pages (in most cases) and are triggered by consistent, high-quality, ethical optimization across your entire site.

[Image: Halfords “kids bikes” search result with sitelinks]

What you can see is the direct result of Google determining that a website has become a truly authoritative site, from an optimization perspective, for a certain subject. In practice, the more pages that are well optimized and consistently come up in high positions in search results for their own unique keyword phrases, the more likely those additional pages will be linked below the website’s primary listing for a particular phrase.

In this way Google attempts to help people doing a search access deeper, more targeted pages that are also related to the primary search phrase.

Google’s Disavow Links Tool

Last week Google’s Matt Cutts introduced the much anticipated Google Disavow Links Tool, which is now available to all webmasters through Google’s Webmaster Tools service.

It’s a follow up to Google’s unnatural links messages sent earlier this year triggered by the evidence of paid links, link exchanges or other link-related schemes violating their quality guidelines.

Google’s Disavow Tool allows webmasters to disavow web addresses linking to their website that they consider malicious or spammy in nature. That action informs Google about links and sites one wishes to be disregarded from the current link profile.
Google’s PageRank system primarily relies on natural links as the basis for page rankings, meaning it uses links between pages to help the search engine determine which web pages are reputable, informative and relevant to most end users.
Google has been advising users to manually remove spammy and/or lower-quality links from affected websites as soon as possible; however, website owners who struggle to remove the links completely are now advised to make use of the new Disavow Links Tool.

As explained by Cutts, the Google Disavow Links Tool works by allowing a website owner to upload a text file of the links to be ignored, with one entry per line, where each entry may describe either a full domain or a single web page. Google promises to review and consider the untrustworthy set of links, which may result in a website slowly regaining some of its lost rankings; however, that process can take several weeks.
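The file format itself is simple – plain text, one entry per line, a “domain:” prefix to disavow a whole site, and # for comments. The URLs below are made up for illustration:

```
# Disavow file sketch (example URLs only)
# Pages spamming links to my site; removal requests went unanswered
http://spam.example.com/paid-links/page1.html
domain:link-farm.example.net
```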

Summarising: it’s great that there is now a tool which can help deal with spammy links. At the same time, it looks like Google will gather a very comprehensive database of spammy websites, so be careful of “bad neighbourhoods” when working on your backlink profile.

Google Penguin Update

Penguin Update

What does that animal want with me?
Google has now confirmed that the Penguin update has been fully rolled out. The official launch was on April 24. According to Matt Cutts, the Penguin update isn’t designed as a penalty; it’s more of a full-on algorithmic change which is supposed to level the field for all websites.
Levelling the field can be read as Google’s attempt to punish pages that have been spamming Google search results. If you’re not familiar with spam, it’s when people do things like “link schemes”, “keyword stuffing” and other methods which artificially attempt to improve one’s ranking and in principle violate Google’s guidelines. To learn more, check the Google Webmaster Guidelines.

Was my website affected?

If your rankings suddenly disappeared then it’s a no-brainer. If you’ve seen changes but can’t nail them down, check your analytics. Assess organic traffic and keyword data pre- and post-24th of April. If you can see sudden drops in traffic which coincide with negative ranking movements, then you have your proof.

How to future proof yourself?

Assess your present web strategy. Do you maximise all your digital channels, or are you mostly reliant on one of them?

Invest in your brand. Building brand authority and citations, concentrating on the user experience, and keeping the technical aspects of your site current will definitely pay off.

Insulate yourself from algorithm changes by choosing the right ways of promoting your website, particularly in the organic search results. Don’t believe in guaranteed positions as some companies tend to sell them – you can follow best practice, which will be rewarded by Google, but no guarantees can be made (those who provide guarantees usually use spammy methods).
Get ready for change.

Create contingency plans. Forecast whether your business model could afford and survive a drop in current rankings. If you were forced to rely on PPC, would you go bankrupt or still do well?

Google Six Minutes Evolution Of Search

A very interesting video presenting the evolution of Google Search in just six minutes – enjoy!

Google announces search algorithm change promoting fresh content

The incredibly fast pace at which information is propagated across the world today constantly increases the demand for the most up-to-date search results.

Google’s engineers constantly strive to keep up with that ever-changing environment and have now introduced new changes to the search algorithm.

Google has announced that the “freshness” update to its algorithm is designed to provide us with the most up-to-date results, determining fresh results for searches around recent events, hot topics, recurring events and frequent updates.

Last year’s Caffeine update was a major infrastructure fix to the web indexing system, which enables Google to crawl and quickly index freshly discovered content. Google indicates that the new, improved algorithm now impacts around 35 percent of searches.

We have yet to see how Google proves the algorithm works correctly while preventing spam at the same time, says SEO Google Expert.

Guest Blogging, Sponsored Posts and Becoming a Blogger

I’m pretty sure you’re absolutely sick of hearing about guest blogs now – heralded as the figurative bamboo to the Panda patrolling the Google pen, they are quickly building in popularity and everyone knows it. Especially the blog owners.

I’ve been doing guest blogging for a while now (since early ’10) and I’ve noticed a disconcerting rising trend in the number of times I am steered towards a sponsored post instead. Amongst the people I’ve been speaking to, there is a split opinion on the value of sponsored posts. Obviously, it’s nice to actually have a link on a strong blog and, in many cases, you don’t even have to write the post yourself – yet, in the grand scheme of things, this is a bought link. Blog owners don’t just want good content now; they (understandably) want an income.

Now, not every blogger will demand cold hard cash for a post – but if they suspect that you’re a grubby SEO just looking for a link, you’d best grab your wallet. SEOMoz contributor, Michael King, did a fantastic and extensive post on relationship management and how to stay sweet with your contacts.

So what instils that trust in the blog owner that you’re not just going to send ‘The Top 5 Reasons Why You Should Visit [COUNTRY]’?

Simply put, it’s owning your own blog.

It’s common knowledge that most companies should have their own regularly updated in-house blog, but simply due to the extra time it takes (or just due to plain naivety), many don’t, immediately putting them on a weaker footing. So if you’re searching for a good blog to guest post on, you have to think like the owners do. By which I mean: stop being lazy and start a blog.

Fair enough, it’s not easy; it’s going to take time and writing skill, and it may cost you another -0.25 of your eyesight, but the end result will be worth it. One tip – stay general, but realistic: blog about things you know about and enjoy, and remember that you can’t be the CEO of a shoe firm one day and the Deputy Manager of Laptop Repairs the next. You are not Mr Benn. You should also refer to Cracked.com’s ‘The 8 Worst Types of Blog on the Internet’ (NSFW).

If you already have a blog and are doing this, then congratulations, and I apologise for wasting your time and bandwidth; but if you haven’t, then seriously, get on it! Of course, this is a long-term strategy and you’re not going to become a celebrity within the blogosphere with a PA 1, DA 1, but give it some time, put in an hour a day, and you’ll be commanding your own authority, boosted by your own quality content and name within the community.

Heck, with Google’s rel=”author” tag, now is a better time than ever to have your own blog – you’ll even get to stick your mugshot next to your posts. This will undoubtedly increase trust in blogger relationships, as they’ll be confident you’re not an evil robot just scraping content from another blog.

There will be a few people to disagree with me here and state that a mention of their ‘team of content writers’ in an e-mail to a blog owner has never failed to get them a posting, or client domain e-mail addresses commanding far more authority than a blog, but I just don’t feel it. I think it’s all about being on a level playing field and proving you’re not just a faceless company after a link – to entirely misquote Batman Begins – “To befriend the blogger, you must become the blogger”.

So there we go, thanks for reading and please share with me your opinions on sponsored posts, guest blogging and how you’d go about it.

Thomas Clark is a member of the SEO department of Manchester SEO agency, Lakestar Media.