Google deprecating AJAX crawling scheme

On Wednesday the 14th of October, Google announced that it is no longer recommending the AJAX crawling proposal it made back in 2009 (http://googlewebmastercentral.blogspot.co.uk/2009/10/proposal-for-making-ajax-crawlable.html).
Rather than requesting the ?_escaped_fragment_= versions of pages, Googlebot will now generally crawl, render, and index the #! URLs directly.

“Times have changed. Today, as long as you’re not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers. To reflect this improvement, we recently updated our technical Webmaster Guidelines to recommend against disallowing Googlebot from crawling your site’s CSS or JS files.”

Questions and answers

Q: My site currently follows your recommendation and supports _escaped_fragment_. Would my site stop getting indexed now that you’ve deprecated your recommendation?
A: No, the site would still be indexed. In general, however, we recommend you implement industry best practices when you’re making the next update for your site. Instead of the _escaped_fragment_ URLs, we’ll generally crawl, render, and index the #! URLs.
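For reference, the deprecated 2009 scheme worked by mapping each #! (hash-bang) URL to a crawler-friendly ?_escaped_fragment_= URL, which the server answered with a static snapshot. A minimal sketch of that mapping in Python (the function name is illustrative, and plain percent-encoding stands in for the exact escaping rules of the original specification):

```python
from urllib.parse import quote

def escaped_fragment_url(url: str) -> str:
    """Illustrative mapping from a hash-bang (#!) URL to the
    ?_escaped_fragment_= form used by the 2009 AJAX crawling scheme.
    (The exact character-escaping rules are defined in the original
    specification; generic percent-encoding is used here.)"""
    if "#!" not in url:
        return url  # no hash-bang fragment to map
    base, fragment = url.split("#!", 1)
    # Append to an existing query string if the URL already has one.
    sep = "&" if "?" in base else "?"
    return f"{base}{sep}_escaped_fragment_={quote(fragment, safe='')}"

print(escaped_fragment_url("https://example.com/page#!state=home"))
# https://example.com/page?_escaped_fragment_=state%3Dhome
```

Under the new guidance, Googlebot simply fetches and renders the #! URL itself, so no such mapping (or snapshot endpoint) is needed.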

Q: Is moving away from the AJAX crawling proposal to industry best practices considered a site move? Do I need to implement redirects?
A: If your current setup is working fine, you should not have to immediately change anything. If you’re building a new site or restructuring an already existing site, simply avoid introducing _escaped_fragment_ URLs.

Q: I use a JavaScript framework and my webserver serves a pre-rendered page. Is that still ok?
A: In general, websites shouldn’t pre-render pages only for Google — we expect that you might pre-render pages for performance benefits for users and that you would follow progressive enhancement guidelines. If you pre-render pages, make sure that the content served to Googlebot matches the user’s experience, both how it looks and how it interacts. Serving Googlebot different content than a normal user would see is considered cloaking, and would be against our Webmaster Guidelines.
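The cloaking point can be sketched in Python: a pre-rendering server should return the same markup to every client rather than branching on Googlebot’s user agent. (The function names below are hypothetical stand-ins, not a real API.)

```python
def prerender(path: str) -> str:
    # Stand-in for a real pre-rendering pipeline
    # (e.g. a headless-browser render of the JavaScript app).
    return f"<html><body>Rendered content for {path}</body></html>"

def handle_request(path: str, user_agent: str) -> str:
    # Serve the identical pre-rendered HTML to bots and users alike.
    # Branching on "Googlebot" in user_agent to serve different
    # content is what the Webmaster Guidelines call cloaking.
    return prerender(path)
```

The user agent is accepted but deliberately ignored: pre-rendering for performance is fine, pre-rendering only for Googlebot is not.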

Source: http://googlewebmastercentral.blogspot.co.uk/2015/10/deprecating-our-ajax-crawling-scheme.html

Google’s Accelerated Mobile Pages Project

A revolution is coming: Google has announced a new open source initiative called Accelerated Mobile Pages (AMP), designed to dramatically improve the performance of the mobile web.

For many, reading on the mobile web is a slow, clunky and frustrating experience – but it doesn’t have to be that way. The Accelerated Mobile Pages (AMP) Project is an open source initiative that embodies the vision that publishers can create mobile optimized content once and have it load instantly everywhere. (https://www.ampproject.org/)

The aim is to enable webpages with rich content such as video, animations and graphics to load instantaneously and to work with smart ads, regardless of the mobile device used. Such a move will further expand Google’s advertising real estate and is certainly a smart one.

Google heralds AMP project as “the start of an exciting collaboration with publishers and technology companies, who have all come together to make the mobile web work better for everyone.”
At the moment Twitter, Pinterest, WordPress.com, Chartbeat, Parse.ly, Adobe Analytics and LinkedIn are among the first group of technology partners planning to integrate AMP HTML pages.

The AMP project will focus on providing functionality in three key areas:

1. Content: The Accelerated Mobile Pages Project will provide an open source approach, allowing publishers to focus on producing great content while using shared components for high performance and a great user experience. The initial technical specification is now available at https://github.com/ampproject/amphtml

2. Distribution: Google has designed a new approach to caching that allows publishers to continue hosting their content while allowing for efficient distribution through Google’s high-performance global cache. Google will follow up by opening its cache servers to be used by anyone, free of charge.

3. Advertising: Google declares they will work with publishers in the industry to help define the parameters of an ad experience that provides the speed we’re striving for with AMP.


Original source: https://googleblog.blogspot.co.uk/2015/10/introducing-accelerated-mobile-pages.html

Google aggressively tackles hacked web spam

On Monday the 5th of October, Google officially confirmed that it has been rolling out updated algorithms geared specifically towards identifying spam in SERPs coming from hacked sites.
With large numbers of legitimate sites being hacked by spammers and used for abusive behaviour, Google has been “forced” to take decisive action. Malware downloads, the funnelling of traffic to low-quality sites, porn, and the marketing of counterfeit goods or illegal pharmaceutical drugs are among the most common offences Google is currently paying attention to.

According to the Webmaster Central Blog, the algorithmic changes will eventually impact roughly 5% of queries, depending on the language. Webmasters are being warned that as Google rolls out the new algorithms, users might notice that for certain queries only the most relevant results are shown, reducing the number of results displayed.
Google also indicated that, due to the large amount of hacked spam being removed and the fine-tuning of its systems to weed out bad content while retaining legitimate results, some SERPs may effectively clear out substantially.

[Image: cleared SERPs]

All these changes place ever more responsibility on webmasters to ensure their site security is kept up to date.