SEO Pagination: How to Detect Possible Mistakes and Set It Up Correctly

Taras  |  28 Nov, 2019

The more your website grows, the more pressing the issue of internal duplicates becomes, along with the risk that search engines will perceive some of your pages as low-quality.

Pagination creates partial or full internal duplicates, and if it is set up poorly, search engines may doubt the value of those pages for users.

In this article, we'll figure out how to detect pagination problems that hurt SEO, how to choose the right settings, and which pagination strategies can be used to promote an online store, a large portal, or a message board.

How to Find Out If Pagination Has Been Set Up Incorrectly For SEO

The main signs that pagination creates duplicates are duplicate page titles and duplicate text within the same site.

Duplicate titles

Here are the main tools to check your site for duplicate titles:

1. Screaming Frog

2. Netpeak Spider or Comparser

Scan the website in question using one of the tools and sort the results by title tag duplicates. Inspect pages with URLs that indicate pagination (/page/2/, /?page=2, and so on).
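If the crawl export is large, a short script can surface the duplicate titles automatically. Here is a rough Python sketch; it assumes a CSV export with "Address" and "Title 1" columns (the names Screaming Frog uses), so adjust the file name and column names if your tool exports them differently.

import csv
from collections import defaultdict

# Group crawled URLs by their <title> text and print the titles shared by
# more than one URL. "internal_html.csv" is a placeholder file name.
pages_by_title = defaultdict(list)
with open("internal_html.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        pages_by_title[row["Title 1"].strip()].append(row["Address"])

for title, urls in pages_by_title.items():
    if len(urls) > 1:
        print(f"Duplicate title {title!r} is used by {len(urls)} URLs:")
        for url in urls:
            print("   ", url)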

Internal duplicate content

Duplicate content usually appears when the same text is repeated across paginated pages, which is not great for SEO.

To check internal duplicate content you can use:

  • Grammarly (copy the text and scan it for plagiarism)
  • Duplichecker (either copy and paste your text, or upload a Docx or Text file from your computer)
  • Manual search (compare the first page with the paginated pages and look for repeats; see the sketch below this list for a way to automate the check)
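For a quick automated comparison, the sketch below downloads the root category page and one paginated page, strips the markup, and measures how similar the remaining text is. The URLs are placeholders, and the requests and beautifulsoup4 packages are assumed to be installed.

import difflib

import requests
from bs4 import BeautifulSoup

def page_text(url):
    # Fetch the page and return its visible text without HTML tags.
    html = requests.get(url, timeout=10).text
    return BeautifulSoup(html, "html.parser").get_text(" ", strip=True)

first = page_text("http://site.com/category-1/")
second = page_text("http://site.com/category-1/?page=2")

# A ratio close to 1.0 means the two pages are near-duplicates.
ratio = difflib.SequenceMatcher(None, first, second).ratio()
print(f"Text similarity between page 1 and page 2: {ratio:.0%}")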

Once the main issues are identified, you can start solving them. Just don't make the typical mistakes listed below.

Bad SEO Pagination Strategies

  • Disallow paginated pages indexing in robots.txt
  • Add rel="canonical" to the first page of a paginated series

These methods may solve the duplicate problem, but they create another one: your products or articles will not be indexed properly.

Reasons your articles or products will not be indexed:

  • Search bots will not visit pages other than the first one due to a disallow in robots.txt
  • Paginated pages are ignored because rel="canonical" tells search engines that pages 2, 3, 4, and so on are copies of the main page.

This problem will be especially relevant for online stores with a large number of products.

Good SEO Pagination Strategies

1. Meta robots

On all paginated pages, put the tag <meta name="robots" content="noindex, follow" />

2. Adding unique titles and descriptions

Every page should have unique Title and Description tags; don't reuse the text from the first page of the category.
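As a simple illustration, metadata for paginated pages can be generated from a template where the page number makes every Title and Description distinct. The category and site names below are placeholders; the exact wording is up to you.

def paginated_meta(category: str, page: int, site: str = "site.com"):
    # Build a unique Title and Description for a given paginated page.
    title = f"{category} - Page {page} | {site}"
    description = f"Browse {category}: page {page} of the catalog on {site}."
    return title, description

print(paginated_meta("Spinning rods", 3))
# ('Spinning rods - Page 3 | site.com', 'Browse Spinning rods: page 3 of the catalog on site.com.')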

3. The "View all" + rel="canonical"

In addition to the paginated pages, a "View All" page is created that displays every single product in the category:

<link rel="canonical" href="http://site.com/category-1/view-all.html" />

And every paginated page points rel="canonical" to the "View All" page.

We purposely don't describe these strategies in more detail, since there are many articles that discuss them at length.

What Do Big Sites Use?

We've put together a list of the SEO pagination strategies that online stores, message boards, and aggregators use, to figure out what works best. We came to the conclusion that most companies use a self-referencing canonical.

Conclusion on SEO pagination on large sites:

  • No one uses the View All strategy.
  • About half of large websites leave paginated pages accessible to search engines and use unique metadata for these pages.
  • The other half keep paginated pages out of the index while still letting search bots crawl them. They also add the prev/next tags to improve the indexing of products on deep pages.
  • Some don't bother and completely disallow indexing (their products are probably not being indexed correctly).

The Perfect SEO Pagination Strategy

1. Robots.txt

Do NOT set a disallow in robots.txt. Otherwise, articles or products on paginated pages will be indexed poorly.

Paginated pages' URLs must be different from the URL of the root page.

2. rel="next" and rel="prev" attributes

You should definitely implement them for proper crawling.

Every paginated page should include rel="next" and rel="prev" HTML attributes to indicate the relationship between component URLs. Thanks to this markup, search engines can determine that the content of these pages is connected in a logical sequence.

Add a tag to the <head> section of the first page (http://website.com/category) that points to the next page, as shown below:

<head>

...

<link rel="next" href="http://website.com/category?page=2"> <a href="http://website.com/category?page=2" > <="" p=""></a>

...

</head>

Since that's the root page, no rel="prev" is needed.

On the second and subsequent pages, add links pointing to both the previous and the next URL. For example, the second page should contain the following lines:

<head>

...

<link rel="prev" href="http://website.com/category">

<link rel="next" href="http://website.com/category?page=3">

...

</head>

On the last page of the series, only rel="prev" is needed, since there is no next URL.
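To avoid writing these tags by hand, a small helper can generate the right pair for any page in the series. The sketch below follows the example URLs used in this section (the first page has no ?page parameter).

def pagination_link_tags(base_url: str, page: int, total_pages: int) -> list[str]:
    # Return the rel="prev"/rel="next" <link> tags for the given page number.
    def page_url(n: int) -> str:
        return base_url if n == 1 else f"{base_url}?page={n}"

    tags = []
    if page > 1:  # every page except the first points back to the previous one
        tags.append(f'<link rel="prev" href="{page_url(page - 1)}">')
    if page < total_pages:  # every page except the last points to the next one
        tags.append(f'<link rel="next" href="{page_url(page + 1)}">')
    return tags

for tag in pagination_link_tags("http://website.com/category", 2, 5):
    print(tag)
# <link rel="prev" href="http://website.com/category">
# <link rel="next" href="http://website.com/category?page=3">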

3. Indexing paginated pages

On the 2nd and subsequent paginated pages, the source code should include the following line inside the <head> tag; it prevents bots from adding the page to the search engines' index but allows them to follow the links:

<head>

...

<meta name="robots" content="noindex, follow" />

...

</head>

4. Sorting pages

Ideally, sorting shouldn't generate new URLs. But if it does, you have to make sure those URLs are not being indexed.

In the source code of a sorting page, add this line inside the <head> tag to prevent it from being indexed by search engines:

<head>

...

<meta name="robots" content="noindex, nofollow" />

...

</head>
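If sorting does produce separate URLs, the noindex, nofollow tag only has to be emitted when a sorting parameter is present. Below is a minimal sketch of that logic, assuming a Flask application and a hypothetical ?sort= parameter; the route and template are illustrative, not a reference implementation.

from flask import Flask, request, render_template_string

app = Flask(__name__)

PAGE = """<!doctype html>
<head>
  {% if noindex %}<meta name="robots" content="noindex, nofollow" />{% endif %}
  <title>Category 1</title>
</head>
<body>...</body>"""

@app.route("/category-1/")
def category():
    # Any sorting parameter turns this URL into a non-indexable variant.
    has_sort = request.args.get("sort") is not None
    return render_template_string(PAGE, noindex=has_sort)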

5. Page titles

They don't have to be unique, since the paginated pages are kept out of the index anyway (see point 3).

6. 1st paginated page duplicate

The first paginated page should not exist! Usually, this is a duplicate of the root page.

Set a 301 redirect from the 1st paginated page to the root page.
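As an illustration, a 301 redirect for this case can be set up either in the web server config or in the application. Here is a minimal Flask sketch; the URL pattern /category-1/page/1/ is only an example, so adapt it to your own pagination scheme (e.g. ?page=1).

from flask import Flask, redirect

app = Flask(__name__)

@app.route("/category-1/page/1/")
def first_paginated_page():
    # Permanently redirect the duplicate first paginated page to the root category.
    return redirect("/category-1/", code=301)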

Successful Pagination: a Mini Case

  • Online store of fishing products in Spain.
  • All paginated pages are open for indexing.
  • Pagination in broad categories (like Fishing): all products of the section, from spinning rods to hooks, are paginated.
  • Pagination of products in small categories (for example, hooks) and in category + brand pages (for example, Rapala wobblers).

Solution

Implement the pagination strategy described above for the online store, and remove pagination completely from root categories such as Fishing (leaving only the logos of the internal categories).
