
Large Site SEO Basics: Faceted Navigation

If you work on an enterprise website — especially in e-commerce or listings (such as a job board) — you probably use some form of faceted navigation. Why wouldn't you? It helps users filter down to their desired set of results relatively painlessly.

While helpful for users, it's no secret that faceted navigation can be a nightmare for SEO. At Distilled, it's not uncommon for us to get a client that has tens of millions of URLs that are live and indexable when they shouldn't be. More often than not, this is due to their faceted nav setup.

There are a number of great articles out there that cover what faceted navigation is and why it can be a problem for search engines, so I won't go into much detail here. A great place to start is this post from 2011.

What I want to focus on instead is narrowing this problem down to a simple question, and offering the possible solutions to that question. The question we need to answer is, "What options do we have to decide what Google crawls/indexes, and what are their pros/cons?"

A quick refresher on faceted navigation

As a quick refresher, we can define faceted navigation as any way to filter and/or sort results on a page by specific attributes that aren't necessarily related — for example, the color, processor type, and screen resolution of a laptop.

Because every possible combination of facets is typically (at least) one unique URL, faceted navigation can create a few problems for SEO:

  1. It creates a lot of duplicate content, which is bad for various reasons.
  2. It eats up valuable crawl budget and can send Google the wrong signals.
  3. It dilutes link equity and passes equity to pages that we don't even want indexed.

But first… some quick examples

It's worth taking a few minutes to look at some examples of faceted navigation that are probably hurting SEO. These are simple examples that illustrate how faceted navigation can (and usually does) become a problem.


Macy's

First up, we have Macy's. I've done a simple site: search for the domain and added "black dresses" as a keyword to see what shows up. At the time of writing this post, Macy's has 1,991 products that fit under "black dresses" — so why are over 12,000 pages indexed for this search term? The answer likely has something to do with how their faceted navigation is set up. As SEOs, we can remedy this.

Home Depot

Let's take Home Depot as another example. Again, doing a simple site: search, we find 8,930 pages on left-hand/inswing front exterior doors. Is there a reason to have that many pages in the index targeting similar products? Probably not. Luckily, this can be fixed with the right combination of tags (which we'll explore below).

I'll leave the examples at that. You can go on most large-scale e-commerce websites and find issues with their navigation. The point is, many large sites that use faceted navigation could be doing better for SEO purposes.

Faceted navigation solutions

When deciding on a faceted navigation solution, you'll have to decide what you want in the index, what can go, and how to make that happen. Let's look at the options.

“Noindex, follow”

Perhaps the first solution that comes to mind is using noindex tags. A noindex tag is used for the sole purpose of letting bots know not to include a particular page in the index. So, if we just wanted to remove pages from the index, this solution would make a lot of sense.

The issue here is that while you can reduce the amount of duplicate content in the index, you'll still be wasting crawl budget on these pages. Also, these pages keep receiving link equity, which is a waste (since it doesn't benefit any indexed page).

Example: If we wanted to include our page for "black dresses" in the index, but we didn't want "black dresses under $100" in the index, adding a noindex tag to the latter would exclude it. However, bots would still be visiting the page (which wastes crawl budget), and the page(s) would still be receiving link equity (which would be a waste).
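As a sketch, a "noindex, follow" directive is just a meta tag in the unwanted page's `<head>` (the URL in the comment is illustrative):

```html
<!-- On the "black dresses under $100" page: keep it out of the index,
     but let bots keep following its links -->
<meta name="robots" content="noindex, follow">
```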


Canonicalization

Many sites approach this problem by using canonical tags. With a canonical tag, you can let Google know that, in a collection of similar pages, you have a preferred version that should get the credit. Since canonical tags were designed as a solution to duplicate content, it would seem this is a reasonable option. Additionally, link equity is consolidated to the canonical page (the one you deem most important).

However, Google will still be wasting crawl budget on these pages.

Example: /black-dresses?under-100/ could have its canonical URL set to /black-dresses/. In this case, Google would give the canonical page the authority and link equity. Additionally, Google wouldn't see the "under $100" page as a duplicate of the canonical version.
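In markup, that example would look something like this (the domain is a placeholder):

```html
<!-- On /black-dresses?under-100/: point Google at the preferred version -->
<link rel="canonical" href="https://www.example.com/black-dresses/">
```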

Disallow via robots.txt

Disallowing sections of the site (such as certain parameters) can be a great solution. It's quick, easy, and customizable. But it does come with some drawbacks. Namely, link equity will be trapped and unable to flow anywhere on your site (even when it's coming from an external source). Another drawback: even if you tell Google not to visit a certain page (or section) of your site, Google can still index it.

Example: We could disallow *?under-100* in our robots.txt file. This would tell Google not to visit any page with that parameter. However, if there were any "follow" links pointing to any URL with that parameter in it, Google could still index it.
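A minimal robots.txt sketch of that rule (the pattern is illustrative; Google honors `*` wildcards in robots.txt paths):

```text
# robots.txt — tell crawlers not to fetch any URL containing the under-100 parameter
User-agent: *
Disallow: /*?under-100
```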

“Nofollow” internal links to undesired facets

One option for solving the crawl budget issue is to "nofollow" all internal links to facets that aren't important for bots to crawl. Unfortunately, "nofollow" tags don't solve the issue entirely. Duplicate content can still be indexed, and link equity will still get trapped.

Example: If we didn't want Google to visit any page that had two or more facets selected, adding a "nofollow" tag to all internal links pointing to those pages would help us get there.
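In markup, a nofollowed internal facet link is just a `rel` attribute on the anchor (the URL and link text are illustrative):

```html
<!-- Internal link to a facet page we don't want bots to crawl -->
<a href="/black-dresses?under-100/" rel="nofollow">Under $100</a>
```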

Avoiding the problem altogether

Obviously, if we could avoid this problem altogether, we should just do that. If you are currently in the process of building or rebuilding your navigation or website, I would highly recommend building your faceted navigation in a way that limits the URL from being changed (this is typically done with JavaScript). The reason is simple: it provides the ease of browsing and filtering products while potentially only generating a single URL. However, this can go too far in the other direction — you will need to manually ensure that you have indexable landing pages for key facet combinations (e.g. black dresses).

Here's a table outlining the above in a more digestible way.

| Option | Solves duplicate content? | Solves crawl budget? | Recycles link equity? | Passes equity from external links? | Allows internal link equity flow? | Other notes |
|---|---|---|---|---|---|---|
| "Noindex, follow" | Yes | No | No | Yes | Yes | |
| Canonicalization | Yes | No | Yes | Yes | Yes | Can only be used on pages that are similar. |
| Robots.txt | Yes | Yes | No | No | No | Technically, pages blocked in robots.txt can still be indexed. |
| Nofollow internal links to undesired facets | No | Yes | No | Yes | No | |
| JavaScript setup | Yes | Yes | Yes | Yes | Yes | Requires more work to set up in most cases. |

But what's the ideal setup?

First off, it's important to understand that there is no "one-size-fits-all" answer. To get to your ideal setup, you will most likely need to use a combination of the options above. I'm going to highlight an example fix below that would work for most sites, but it's important to understand that your solution might vary based on how your site is built, how your URLs are structured, and so on.

Luckily, we can break down how to reach an ideal solution by asking ourselves one question: "Do we care more about our crawl budget, or our link equity?" By answering this question, we can get much closer to an ideal solution.

Consider this: You have a website with a faceted navigation that allows the indexation and discovery of every single facet and facet combination. You aren't concerned about link equity, but clearly Google is spending valuable time crawling millions of pages that don't need to be crawled. What we care about in this scenario is crawl budget.

In this particular scenario, I would recommend the following solution.

  1. Category, subcategory, and sub-subcategory pages should remain discoverable and indexable. (e.g. /clothing/, /clothing/womens/, /clothing/womens/dresses/)
  2. For each category page, only allow versions with 1 facet selected to be indexed.
    1. On pages that have one or more facets selected, all facet links become "nofollow" links (e.g. /clothing/womens/dresses?color=black/)
    2. On pages that have two or more facets selected, a "noindex" tag is added as well (e.g. /clothing/womens/dresses?color=black?brand=express?/)
  3. Determine which facets could have an SEO benefit (for example, "color" and "brand") and whitelist them. Essentially, throw them back into the index for SEO purposes.
  4. Ensure your canonical tags and rel=prev/next tags are set up appropriately.
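The rules above can be sketched as a small policy function. This is a hypothetical helper, not an implementation from any real site: the function name and whitelist are my own, and it assumes standard `?key=value&key=value` query strings rather than the repeated-`?` URLs in the examples above.

```python
from urllib.parse import urlparse, parse_qsl

# Facets we've decided have SEO value and should stay indexable (assumption).
SEO_FACET_WHITELIST = {"color", "brand"}

def facet_policy(url):
    """Return index/follow flags for a faceted URL, per the rules above."""
    facets = dict(parse_qsl(urlparse(url).query))
    if not facets:
        # Category/subcategory page: fully discoverable and indexable.
        return {"index": True, "follow": True}
    if len(facets) == 1 and set(facets) <= SEO_FACET_WHITELIST:
        # One whitelisted facet selected: keep it in the index.
        return {"index": True, "follow": True}
    if len(facets) == 1:
        # One non-whitelisted facet: crawlable, but kept out of the index.
        return {"index": False, "follow": True}
    # Two or more facets: noindex, and links pointing here get nofollowed.
    return {"index": False, "follow": False}
```

A template could call this once per page render to decide which robots meta tag to emit and whether facet links should carry rel="nofollow".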

This solution will (over time) begin to solve our problem of unnecessary pages being in the index due to the site's navigation. Also, notice how in this scenario we used a combination of the possible solutions: "nofollow," "noindex, nofollow," and proper canonicalization, to achieve a more desirable result.

Other items to consider

There are more variables to consider on this topic — I want to address the two that I believe are the most important.

Breadcrumbs (and markup) help a lot

If you don't have breadcrumbs on each category/subcategory page on your website, you're doing yourself a disservice. Please go implement them! Furthermore, if you have breadcrumbs on your site but aren't marking them up with microdata, you're missing out on a huge win.

The reason why is simple: you have a complicated site navigation, and bots that visit your site might not be reading the hierarchy correctly. By adding accurate breadcrumbs (and marking them up), we're effectively telling Google, "Hey, I know this navigation is complicated, but please consider crawling our site in this manner."
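As a sketch, breadcrumbs can be marked up with schema.org's BreadcrumbList vocabulary (JSON-LD shown here; the microdata syntax works too — the names and URLs are illustrative):

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Clothing", "item": "https://www.example.com/clothing/"},
    {"@type": "ListItem", "position": 2, "name": "Women's", "item": "https://www.example.com/clothing/womens/"},
    {"@type": "ListItem", "position": 3, "name": "Dresses", "item": "https://www.example.com/clothing/womens/dresses/"}
  ]
}
```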

Enforcing a URL order for facet combinations

In extreme cases, you can find a site that has a unique URL for every facet combination. For example, if you're on a laptop page and select "red" and "SSD" (in that order) from the filters, the URL might be /laptops?color=red?SSD/. Now imagine you select the filters in the opposite order (first "SSD," then "red") and the URL that's generated is /laptops?SSD?color=red/.

This is really bad because it exponentially increases the number of URLs you have. Avoid this by enforcing a particular order for URLs!
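A minimal sketch of enforcing one canonical facet order: sort the query parameters deterministically before generating the URL, so both selection orders collapse to a single address. This assumes standard `?key=value&key=value` query strings, and the function name is my own:

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def canonical_facet_url(url):
    """Rebuild a faceted URL with its query parameters in one fixed (alphabetical) order."""
    parsed = urlparse(url)
    ordered = sorted(parse_qsl(parsed.query))  # deterministic facet order
    return urlunparse(parsed._replace(query=urlencode(ordered)))
```

Whether the user picks "red" then "SSD" or the reverse, the site would link to (or redirect/canonicalize to) the same URL.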


My hope is that you feel more equipped (and have some ideas) on how to manage your faceted navigation in a way that benefits your search presence.

To summarize, here are the main takeaways:

  1. Faceted navigation can be great for users, but it is often set up in a way that negatively impacts SEO.
  2. There are many reasons why faceted navigation can negatively impact SEO, but the top three are:
    1. Duplicate content
    2. Crawl budget being wasted
    3. Link equity not being used as effectively as it should be
  3. Boiled down further, the question we want to answer to begin approaching a solution is, "What are the ways we can control what Google crawls and indexes?"
  4. When it comes to a solution, there is no "one-size-fits-all" option. There are numerous fixes (and combinations) that can be used. Most commonly:
    1. Noindex, follow
    2. Canonicalization
    3. Robots.txt
    4. Nofollow internal links to undesired facets
    5. Avoiding the problem with an AJAX/JavaScript solution
  5. When trying to think of an ideal solution, the most important question you can ask yourself is, "What's more important to our website: link equity, or crawl budget?" This can help focus your possible solutions.

I would love to hear about any example setups. What have you found that's worked well? Anything you've tried that has impacted your site negatively? Let's discuss in the comments, or feel free to send me a tweet.


