Pagination is leading to duplicate meta descriptions

Hi,

I’m loving the new pagination feature on the collection pages, but all the SEO analysis tools are flagging errors because each paginated page of the collection now has identical OG/meta tags, etc.

What do you suggest so we don’t see negative SEO consequences due to what Google thinks is duplicate content?

Thanks

2 Likes

Question: If you had the ability to add unique metadata to a paginated page, what would you put there? A differently worded description for page 1, page 2, page 3, and so on?

Paginating a collection is just breaking up a list. If you need unique descriptions for parts of a list, why not break it up into new lists? You could create a new page with a filtered collection list. That will always rank better because it’s specific, and so are the list items. I personally use silos and good pruning with great success. Just my thoughts.

I have come to the forum to research this issue, as it is now my concern.

Here’s what the literature has to say about SEO and duplicate titles.

Duplicate Titles

A title is considered a duplicate if it exactly matches the title of another page. Duplicate titles diminish the quality of a page, since it is unclear which page is more relevant to a given topic. They will also confuse users navigating your site.


With pagination, the exact page where the element is found is rendered again and thus indexed under a new URL with the exact same title. At present there’s no way to vary the titles of the subsequent pages, because the pagination is handled internally.

Is there a way to append the pagination page number to the title of the page it is found on, like you mentioned? Having this done automatically by the internal system would be superb.

Right now, my client’s site is showing hundreds of duplicate titles…which may negatively affect their SEO, and my services to them, if I can’t show the problem resolved.

An example of what the Pagination does to titles:

Original page where pagination is used:
About Dr. Scott Hollander

Paginated Pages (duplicates):
About Dr. Scott Hollander
About Dr. Scott Hollander

So while the URLs are different (as they should be, because they are now indexing as 3 different pages instead of 1), the title for EACH of these pages in SEO SETTINGS > TITLE TAG is exactly the same. Why? Because it’s really only 1 page, and there’s really only 1 place where we can edit the Title Tag. In the example above, the title “About Dr. Scott Hollander” is indexed 3 times.
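Since there is only one Title Tag field, one possible workaround (a sketch, not an official Webflow feature) is a small custom-code script that reads the page-number query parameter that Webflow’s pagination appends to the URL and adds it to the title at runtime. The assumption that the parameter name ends in `_page` (e.g. `?abc123_page=2`, with a per-list prefix) is worth verifying against your own paginated URLs, and note that search engines must render JavaScript to see a title changed this way:

```javascript
// Sketch: derive a unique title for a paginated page from its URL.
// Assumes Webflow's pagination query parameter ends in "_page" —
// verify this against your own site's paginated URLs.
function paginatedTitle(baseTitle, url) {
  const params = new URL(url).searchParams;
  for (const [key, value] of params) {
    const page = Number(value);
    if (key.endsWith('_page') && page > 1) {
      return `${baseTitle} - Page ${page}`;
    }
  }
  return baseTitle; // page 1 (or no pagination) keeps the original title
}

// In the site's before-</body> custom code you could then apply it:
// document.title = paginatedTitle(document.title, location.href);
```

The same function can feed the meta description (append " - Page N" to its `content` attribute) so paginated pages stop being byte-for-byte duplicates.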

I explained that to the best of my abilities. Let me know of a good resolution for this. Thanks in advance!

Edit directly following the post: it works the same way for duplicate descriptions, so we’ll need a workaround or resolution for that as well.

3 Likes

The following Moz article on the subject is well thought out and includes actual recommendations from Google about what you can do. While the article is a little old, the recommendations are not, and much of the well-referenced content is available at the official Google Webmaster Central blog.

There is content on this subject at Search Console Help as well.

On the links you shared, the page content is visibly thin and loosely related. I would start there.

1 Like

Awesome research, thank you!

So from my understanding, we need to add rel="next"/rel="prev" markup to EACH paginated page. Where do we have control over each page of the pagination to do so? And if we did have that control, wouldn’t we just title/describe each one differently in the meta tags?

The other way, then, is to hide specific subpages via robots.txt…which I think is scary lol
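For reference, the robots.txt approach mentioned above would look roughly like the fragment below, assuming the paginated URLs carry a query parameter ending in `_page=` (an assumption to verify on your own site). It is indeed risky: a Disallow rule only blocks crawling, not indexing, and blocked pages can no longer pass link signals.

```
User-agent: *
# Hypothetical pattern: block any URL whose query string contains "_page="
Disallow: /*?*_page=
```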

I think the best way (without access to the HTML inside the paginated subpages) would be the rel="canonical" markup…BUT we then actually LOSE the content on the now non-indexed subpages.
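To illustrate that trade-off: pointing every paginated page’s canonical at page 1 is what de-indexes pages 2+. A minimal sketch of the alternative, a self-referencing canonical injected via custom code, is below. It keeps the pagination parameter (assumed here to end in `_page`, which you should verify) and strips everything else, such as tracking parameters, so each paginated page canonicalizes to itself:

```javascript
// Sketch: build a self-referencing canonical URL for a paginated page.
// Keeps only the pagination parameter (assumed to end in "_page"),
// dropping tracking parameters and other query noise.
function canonicalFor(url) {
  const u = new URL(url);
  const keep = new URLSearchParams();
  for (const [key, value] of u.searchParams) {
    if (key.endsWith('_page')) keep.set(key, value);
  }
  const query = keep.toString();
  return u.origin + u.pathname + (query ? '?' + query : '');
}

// Usage (in before-</body> custom code):
// const link = document.createElement('link');
// link.rel = 'canonical';
// link.href = canonicalFor(location.href);
// document.head.appendChild(link);
```

As with any JavaScript-injected tag, Google only sees it after rendering the page, so confirm the result with the URL Inspection tool.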

In the end, we still need a way to edit the duplicate pages that pagination creates.

Any feedback on my observations?

You might want to ask the question “Why am I paginating 10 results?” Also, expecting a page, or a list of pages, to rank on thin content with no real structure (think Wikipedia) is not realistic.

The SEO purpose of a list is to get pages crawled. The benefit of a list to a user is that it provides helpful related information. Concentrate on the latter. Google can see and track user behavior (via Google Analytics, for example); it is an element of the algorithm.

If this were a big issue, Amazon would not be showing the same title and meta description when paginating. What they do instead is create landing pages that are relevant to the parent link and category, with strong content silos.

That is the best I can offer for free. :slight_smile:

Totally understand the uses and how-tos of SEO. Some of the content is thin simply because my client hasn’t yet added his commentary, as we can’t include large chunks of someone else’s work (we cite the articles under discussion). Beyond that, there are other pages that aren’t as famished. The paginated element is a Symbol and exists on nearly all pages, so that does not help as much.

In any case, it sounds like there’s no definitive resolution other than “don’t use pagination” or “ignore these SEO flags because they don’t matter anyway.”

Thanks!

I’m also running into the issue of duplicate title tags, duplicate descriptions, and even missing lang tags on subsequent pages.

I have a blog that I’ve paginated and I’m getting duplicates with pagination. Same issue as @dchait and @n3tworth

Any thoughts on how to address it? I don’t buy the “don’t use pagination” or “break up your list” solutions. These are not scalable.

Any other solutions for pagination? infinite scroll perhaps?

We’re hitting problems here as well. We are consulting with an SEO company to help drive more visitors to our Webflow blog, and their primary concern for us is that paginated blog content is being seen by Google as duplicate content because each subsequent page has the same meta tags as the original.

Right now it doesn’t appear there’s a scalable solution to this problem. Removing pagination is not an option, because we can’t realistically load 100 blog posts on a single page.

I was hoping to find more information in the Webflow University article on pagination, but it only links to an outdated Google blog post and incorrectly suggests using rel="next" and rel="prev" tags.

2 Likes

Were you guys able to resolve this? Please say yes! @n3tworth @arronjhunt @cmartelo

2 Likes

Yeah, my client is asking about the same issue. Fingers crossed for some answers! :grimacing:

3 Likes

+1 to this, any solution suggestions from the Webflow team would be appreciated.

1 Like

@arronjhunt @askwebflow @allykat87 @milkshaken

You may give our CMS library component a try, guys.

1 Like

Guys, I’m also having this problem.

I use a list on a static page to display all posts, with pagination.
I followed the comments on this post, but I saw that there was still no clear solution.

The client’s CMS has more than 300 posts, so I can’t realistically do without the pagination.

This is an image from WebCEO:

@dram @wmosquini

I implemented the above, took about 5 minutes, works like an absolute charm, and is

  • much, much faster than my previous infinite scroll solution
  • loads everything on the same page and is therefore far superior for SEO purposes.

Thoroughly solid, I highly recommend.
Many thanks.

1 Like

@dram @Finsweet

However, I have discovered a rather major caveat:
The component is not compatible with srcset images, at least not if you want it to work properly in Safari.

I have started a thread for that here:

Just an update to say the issue I reported above has now been resolved to perfection by the @Finsweet team. I highly second the recommendation to use their CMS library, as it resolves the OP’s issue and is a powerful tool with many other advantages too. And it’s free!

Your CMS library component is awesome, but it still uses Webflow’s pagination, and I’m still getting duplicates according to our SEO analytics. :frowning:

@aaronatkinsdesign same here. Did you ever get it working?

There is an option in Google Search Console to prevent duplicate-content issues caused by parameters.

Use the Crawl URL Parameters tool to set up how Google should treat the content: https://www.google.com/webmasters/tools/crawl-url-parameters