Google’s quality threshold – and how it relates to indexing – has become a better-known and more widely discussed topic over the past year.
Several elements contribute to the value proposition of an individual page and domain. But one key concept that Google covers in their Quality Rater Guidelines is “beneficial purpose.” (I cover how beneficial purpose relates to indexing, in Why 100% indexing isn’t possible, and why that’s OK.)
Increasingly, when websites experience performance (and rankings) declines, it’s because:
- The SERPs have changed (and now present users with different value propositions).
- The site has spread value around a specific topic across too many URLs with the aim of ranking multiple URLs for multiple keywords.
When we’ve audited and consolidated these pages (or page elements), we’ve strengthened the value proposition of the target page, seen performance improve, and better aligned the pages with what Google is now choosing to serve on Page 1.
Google has discussed content consolidation, but more in the context of consolidating domains or subdomains that have overlap or compete for the same topics and terms.
By applying the logic of value proposition and beneficial purpose to this, we can do the same for documents existing within a single domain.
What is content consolidation?
Content consolidation is the process of merging different pieces of content, such as blog posts, articles, or landing pages built for SEO, into a single, cohesive article.
This single cohesive piece has a strong value proposition (and main content), and supporting elements that link to associated content (supporting content).
During this process you will also naturally be removing (or correcting) outdated and incorrect content.
This should work in line with your overall content strategy, either generating visibility at the top of the funnel or answering use case questions to direct the user journey toward your conversion-oriented pages.
Consolidation audits should focus on the content itself and, in my opinion, remain agnostic about page type.
For example, there’s no harm in moving elements from blog posts onto commercial pages if it helps enhance the commercial page’s value proposition and rank for more relevant queries.
Definition of “quality”
Section 3.1 of the Quality Rater Guidelines defines a bullet list of the most important factors in determining page quality.
But the first bullet point in this list is the one that relates closely to beneficial purpose, and this is literally “the purpose of the page.”
Each page has a different purpose (e.g., to provide information or sell a product/service). A page quality score is then applied to that page type.
As Google has been augmenting and refreshing SERPs in recent months – some seemingly changing or mixing contrasting intents (covering multiple common interpretations more than before) – “beneficial purpose” has become more pertinent.
When we then start talking about a page having a relevant beneficial purpose, and high quality, we enter the realms of describing pages as having:
- High levels of E-A-T.
- High quality and satisfactory levels of main content (defining the core beneficial purpose).
- Good levels of relevant, supporting content.
By contrast, a page can have a strong beneficial purpose (e.g., to sell or promote a product or service), but if it lacks the other factors, it will struggle. Because Google ranks URLs, distributing a topic and its content (and beneficial purpose) over too many pages dilutes potential value.
Google describes this as:
Low quality pages may have been intended to serve a beneficial purpose. However, Low quality pages do not achieve their purpose well because they are lacking in an important dimension, such as having an unsatisfying amount of MC, or because the creator of the MC lacks expertise for the purpose of the page.
A common situation I see this lead to is Google choosing to rank a blog post, designed as supporting content, ahead of a commercial page for commercial terms (albeit in non-traffic-driving positions). This happens because the blog post has a higher value proposition than the commercial page, whose value has been spread out over a subfolder and multiple URLs.
Several data sources can inform your consolidation efforts.
- Common analytics tools for pageviews, entrances and exits.
- Google Search Console.
- Your rank tracking tool of choice.
With this data, you will be able to identify potential problem areas.
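Combining these sources can be as simple as a script that flags URLs earning search impressions but little engagement. The sketch below is illustrative only – the `PageStats` structure and all thresholds are my own assumptions, and you would tune them per site after joining your analytics and Search Console exports on URL.

```python
from dataclasses import dataclass

@dataclass
class PageStats:
    url: str
    pageviews: int    # from your analytics tool
    clicks: int       # from Google Search Console
    impressions: int  # from Google Search Console

def consolidation_candidates(pages, min_impressions=100,
                             max_ctr=0.005, max_pageviews=50):
    """Flag URLs that Google surfaces (impressions) but that earn
    almost no clicks or pageviews -- thresholds are illustrative."""
    flagged = []
    for p in pages:
        ctr = p.clicks / p.impressions if p.impressions else 0.0
        if (p.impressions >= min_impressions
                and ctr <= max_ctr
                and p.pageviews <= max_pageviews):
            flagged.append(p.url)
    return flagged
```

URLs flagged this way aren’t automatically consolidation targets – they’re the shortlist you then review manually for topical overlap.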
Identifying priority URL targets
When I’ve worked with clients to do this, the first question is “where do we start?”
The best answer lies within Google Search Console.
Within coverage reports, we want to look for pages that have been categorized in the excluded section as:
- Crawled – currently not indexed
- Alternate page with proper canonical tag
- Duplicate without user-selected canonical
- Duplicate, Google chose different canonical than user
- Soft 404
These are all exclusion categories that indicate potential page quality issues, and that the URLs could be valid but are falling below the quality threshold for indexing.
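If you export the Coverage report, filtering for these categories is a one-liner. A minimal sketch, assuming the export yields (URL, exclusion reason) pairs – the exact column layout of a real GSC export may differ:

```python
# Exclusion categories from the article's list that suggest
# quality-threshold problems (strings as GSC labels them).
QUALITY_EXCLUSIONS = {
    "Crawled - currently not indexed",
    "Alternate page with proper canonical tag",
    "Duplicate without user-selected canonical",
    "Duplicate, Google chose different canonical than user",
    "Soft 404",
}

def priority_targets(rows):
    """rows: iterable of (url, exclusion_reason) pairs from a
    Coverage export. Returns URLs worth auditing for consolidation."""
    return [url for url, reason in rows if reason in QUALITY_EXCLUSIONS]
```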
As mentioned earlier in the article, removing good content segments from a blog URL and adding them to a commercial URL (if it makes sense to do so and enhances the value proposition) is a reasonable step to take before removing the dissected page.
On many established websites, you may find that multiple blog articles and other areas of the site (such as support) are all competing for similar terms.
This is a good opportunity to realign these pages. For example, you may have a blog article targeting some top-of-funnel terms, and a support article targeting specific issues.
Here you can de-optimize the blog article so it no longer competes with the support article, since the support article provides a better value proposition and better satisfies the intent of the support queries.
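Spotting this kind of overlap at scale can be done from a Search Console performance export. A hedged sketch, assuming rows of (query, URL, impressions) – the function name and row shape are my own:

```python
from collections import defaultdict

def competing_queries(rows, min_urls=2):
    """rows: (query, url, impressions) tuples from a GSC
    performance export. Returns queries where two or more of
    your URLs earn impressions -- candidates for realignment."""
    by_query = defaultdict(set)
    for query, url, impressions in rows:
        if impressions > 0:
            by_query[query].add(url)
    return {q: sorted(urls)
            for q, urls in by_query.items()
            if len(urls) >= min_urls}
```

Queries returned by this function are where you decide which single URL should own the term, then de-optimize or consolidate the rest.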
There is no set strategy or framework to follow here, as all websites and content strategies are structured differently.
However, your core focus should be on maintaining and improving performance of business metrics (e.g., leads, revenue) – even if it comes at the cost of some pageviews.
Opinions expressed in this article are those of the guest author and not necessarily Search Engine Land.
About The Author
Dan Taylor is head of technical SEO at SALT.agency, a UK-based technical SEO specialist and winner of the 2022 Queens Award. Dan works with and oversees a team working with companies ranging from technology and SaaS companies to enterprise e-commerce.