rel="prev"/"next" for Pagination Is Dead: Where Do We Go Next?

In the early hours of March 21st, Google announced that rel="prev"/"next" markup has officially been deprecated as a crawling and indexing signal. This is significant news, as it substantially changes classic strategic considerations for ecommerce catalogues, multi-page articles and blog taxonomies – all page types that make up a considerable portion of the internet.

The rel="prev"/"next" markup was introduced as a guideline to help search engines crawl longer paginated structures. As Google explained on a now-removed page, “This markup provides a strong hint to Google that you would like us to treat these pages as a logical sequence, thus consolidating their linking properties and usually sending searchers to the first page.” (archived here) These markup chains were frequently used in complex blog or ecommerce taxonomies to signal to Google that the paginated series of pages should essentially be treated as a single entity. For more details on the original markup’s attributes, see our original Pagination Guide.
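
For reference, the now-deprecated markup sat in the <head> of each page in the series. A minimal sketch, using hypothetical URLs for the middle page of a three-page series:

  <!-- In the <head> of page 2 of a hypothetical three-page series -->
  <link rel="prev" href="https://www.example.com/boots?page=1">
  <link rel="next" href="https://www.example.com/boots?page=3">
  <!-- The first page carries only rel="next"; the final page carries only rel="prev" -->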

In its statement, the official Google Webmasters Twitter account offered this reasoning for retiring the markup: “As we evaluated our indexing signals, we decided to retire rel=prev/next. Studies show that users love single-page content, aim for that when possible, but multi-part is also fine for Google Search.”

The statement may have been speaking generally to the matter of combining a multi-page article into a single page. However, it does not address cases where pagination is necessary, such as blog taxonomies or ecommerce catalogues, which can create deep and complicated linking structures that Googlebot and other crawlers will not always follow to completion.

It also does not address the fact that the pagination markup was designed to consolidate linking signals across large groups of related pages within a sequence, while indicating to Google that those similar pages should be treated as a single entity when returning results for queries.

Why was this markup deprecated?

Beyond the tweet above, no official explanation has been given for deprecating this markup, but some inferences can be drawn from unofficial communications from Google representatives.

Even in the initial documentation, it was suggested that Google is actually quite capable of processing paginated sequences (URL has since been removed by Google, archived here):

“Do nothing. Paginated content is very common, and Google does a good job returning the most relevant results to users, regardless of whether content is divided into multiple pages.”

Some site owners still found that deeper pages within more complicated sequences were never crawled or indexed. It is very possible that Google’s ability to process these sequences has improved over the years, though you should continue to monitor crawling and indexing at the site level to determine whether this holds true for your specific site.

How these changes should inform your SEO strategy

Before embarking on any of these strategies, it is recommended that you examine current indexing behavior and log files to confirm whether Googlebot is actually failing to crawl through your current paginated sequences. If your sequences are of a reasonable size, these steps may not be necessary.

That said, some of the tips below may still improve the flow of internal link equity into the more important pages of your site, depending on which URLs you most need to rank.

Option 1: Flatten your internal linking structure for the paginated series

As we stated in our initial pagination for SEO guide:

“Your best option is always optimal site design. There are a number of ways that these problems can be prevented before they begin. When planning the design of an ecommerce or similar site, consider the following measures you can take to cut down on large-scale pagination issues:

  1. Increasing the number of categories, which will decrease the depth of each paginated series
  2. Increasing the number of products per page, which will decrease the number of total pages in the paginated series
  3. Linking to all pages within the now manageable paginated series from the first page, which will alleviate any crawl-depth and link authority flow problems”

This option still holds true despite the markup’s retirement. “Flattening” the linking structure of your paginated sequences by providing more category or hierarchy pages gives Googlebot more internal linking opportunities to follow while ideally decreasing the total depth of the paginated sequence.

For a very simple example, you could break a single paginated series about boots into two separate category pages centered on work boots and hiking boots. This also has the benefit of creating more granular, targeted pages that can rank for longer-tail queries. However, it is a double-edged sword: these pages must add value for users in order to rank and draw in traffic of their own accord, and indexing thin or useless content is never advantageous for site performance.
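
As a minimal sketch of this approach, assuming hypothetical /work-boots and /hiking-boots category URLs, the first page of each (now shorter) series could link directly to every page within it:

  <!-- Hypothetical pagination links on the first page of the work boots category -->
  <nav>
    <a href="/work-boots?page=2">2</a>
    <a href="/work-boots?page=3">3</a>
    <a href="/work-boots?page=4">4</a>
  </nav>
  <!-- Every page in the series now sits one click from the category page, keeping crawl depth shallow -->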

Option 2: View-All Page and rel="canonical"

Still a viable option outlined in Google’s initial pagination documentation, using a rel="canonical" tag to point each page within a paginated sequence to a view-all page accomplishes much of what the rel="prev"/"next" chain was previously designed to do. It consolidates link signals onto a single page while also returning a single page in the SERPs for users to access. It is crucial that this view-all page loads within a reasonable amount of time, however, which Google initially defined as 2-4 seconds.
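
As a sketch, assuming a hypothetical view-all URL at /boots/view-all, every page in the paginated sequence would point its canonical there:

  <!-- In the <head> of /boots?page=2, /boots?page=3 and so on -->
  <link rel="canonical" href="https://www.example.com/boots/view-all">

The view-all page itself would typically carry a self-referencing canonical.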

Remember, this is not the same as applying a rel="canonical" tag on every deeper paginated page pointing back to the first page in the sequence. Doing so would not only prevent content on the deeper pages from appearing for queries that target it, but it would also prevent Google from passing internal link equity to those pages or to the internal links contained within them.
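
To make the distinction concrete, the pattern to avoid looks like this (hypothetical URLs), with every deeper page canonicalized back to the first page rather than to a true view-all page:

  <!-- Avoid: in the <head> of /boots?page=2 and deeper -->
  <link rel="canonical" href="https://www.example.com/boots?page=1">
  <!-- Page 1 does not contain the deeper pages' content, so their content and internal links are effectively discarded -->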

In more complex ecommerce scenarios, a combination of options 1 and 2 may prove most effective. For example, in a particularly complex ecommerce catalogue, a good strategy may be to increase the number of categories while also creating view-all pages for each category. This will reduce the number of products for each view-all page, which hopefully improves load times, and will also create new, more granular category pages to rank for long-tail queries. As mentioned, be sure these pages provide a worthwhile experience to users in order to maximize the benefit.

Option 3: Remove your paginated content from the index

If there is no strategic advantage to matching queries against content deeper within a paginated sequence, it may be most efficient to simply apply a <META NAME="ROBOTS" CONTENT="NOINDEX"> tag to these pages.
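
As a sketch of the placement, assuming a hypothetical /boots?page=2 URL, the tag (shown here in its equivalent lowercase form) would sit in the <head> of each deeper page in the series, while the first page remains indexable:

  <!-- In the <head> of /boots?page=2 and deeper; the first page carries no such tag -->
  <meta name="robots" content="noindex">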

However, keep in mind that this will eventually cut off the flow of internal link equity to any pages linked from these deeper pages as well; Google representatives have suggested that pages left noindexed long term are eventually treated as if they were also nofollowed. As a result, this option has somewhat limited use in the real world.

Option 4: Do nothing

As John Mueller has alluded to, it does not appear that this was a recent change to Google’s crawling and indexing processes, despite Google representatives publicly recommending the markup less than a month ago. Therefore, there is a strong chance that making any of the above changes without full consideration of the consequences (or implementing additional low-quality, thin pages solely for the sake of flattening the site’s structure) could have a negative impact.

In addition, removing existing rel="prev"/"next" markup would be premature, given Google’s mixed statements within the last 30 days. It is possible that Google may still use the markup as a guide for crawl patterns, though most statements suggest the markup may not be used at all. Still, the fact that Bing has been utilizing rel="prev"/"next" markup since 2012 (though not for page/link signal consolidation) suggests there may be advantages to holding onto the deprecated markup.

The deed is done

The upside to this shocking news is that there should be little to no negative impact on site performance right now, especially considering that the change appears to have actually been implemented some time ago. This gives the industry as a whole the opportunity to reevaluate how we treat paginated sequences in light of this information, and to improve the crawlability of our sites while potentially creating new, valuable pages for users in the process.