Too Big To Rank?
By Everett Sizemore, Director of Marketing at Inflow
Just because Amazon has 109,000,000 URLs indexed and still ranks well on Google doesn’t mean that having a huge website is a good thing. There’s only one Amazon. Most of the other enormous websites I’ve seen out there are so bloated and weighed down with useless content that they are probably not reaching their full ranking potential.
Here are a few ways your giant website might be harming your search engine rankings:
Too many URLs indexed that don’t need to be.
While writing this article, I randomly selected the promotional products website 4imprint.com to use as an example. I don’t mean to pick on them specifically; most of these customizable promo product websites tend to have similar issues.
Too many product SKUs with low-quality content.
I have yet to find an eCommerce site that has successfully scaled “good” content across tens of thousands of products. Sites get that big by pulling in product data feeds from manufacturers, suppliers, and drop shippers -- which hand that same content to every other merchant.
As is the case with our example site, sometimes the product details consist of only a few bullets, which are repeated across many other products in the catalog.
Search for any bullet in quotes and you’re likely to end up with results like this:
A mere 18 duplicates may not seem like a big deal, but this is one example on a site with hundreds of thousands of similar examples to choose from.
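If you have access to your own product feed, you don’t need to search Google bullet by bullet to find this kind of repetition. Here’s a minimal sketch of that idea: it groups identical bullet text across SKUs so you can see which boilerplate is most widely reused. The catalog data and SKU names are hypothetical, for illustration only.

```python
from collections import defaultdict

def find_shared_bullets(products):
    """Map each bullet string to the set of SKUs that reuse it verbatim."""
    seen = defaultdict(set)
    for sku, bullets in products.items():
        for bullet in bullets:
            # Normalize lightly so trivial whitespace/case changes still match
            seen[bullet.strip().lower()].add(sku)
    # Keep only bullets that appear on more than one product
    return {b: sorted(skus) for b, skus in seen.items() if len(skus) > 1}

# Hypothetical catalog excerpt
catalog = {
    "MUG-100": ["11 oz ceramic mug", "Dishwasher safe"],
    "MUG-200": ["11 oz ceramic mug", "Microwave safe"],
    "PEN-300": ["Click-action ballpoint"],
}

shared = find_shared_bullets(catalog)
# "11 oz ceramic mug" is flagged as shared by MUG-100 and MUG-200
```

Run against a full feed export, the bullets with the longest SKU lists are your best candidates for a rewrite.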
Technical SEO issues causing indexation bloat.
The page below is an eCommerce category from the same example site. As you can see, this page doesn’t provide much value to visitors on the site. And it definitely doesn’t help visitors from the search results!
How big of a problem is this? A quick search reveals another 2,840 URLs that have no reason to be indexed by Google:
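Once you’ve identified junk URL patterns like these, you can verify that your robots.txt rules actually cover them before relying on them. This sketch uses Python’s standard-library robots.txt parser; the disallowed paths are hypothetical examples, not rules from any real site. (Note that robots.txt only stops crawling -- URLs that are already indexed typically need a noindex tag, which requires the page to remain crawlable until it drops out.)

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules covering the kinds of junk URL patterns found above
RULES = """\
User-agent: *
Disallow: /search/
Disallow: /filter/
"""

parser = RobotFileParser()
parser.parse(RULES.splitlines())

# can_fetch() returns False for paths the rules block
search_blocked = not parser.can_fetch("*", "https://example.com/search/blue-pens")
product_allowed = parser.can_fetch("*", "https://example.com/products/pen-123")
```

Running this kind of check against a list of known-bad URLs is a cheap safety net before (and after) every robots.txt change.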
Why This Matters
OK, so these sites have a lot of URLs indexed. What’s the big deal? Below are three of the main reasons to address “indexation bloat” on sites that are too big to rank.
Wasted Crawl Budget
Googlebot isn’t going to spend much time on this site crawling junk page after junk page. Crawl budget will shrink as a result, and even the good pages will suffer. Learn more about crawl budget optimization here.
Wasted Link Equity
Every link is a vote. You have control over this election. Use it! Learn how big of an impact limiting internal links on huge eCommerce websites can have in this talk by Alex Stein, SEO at Wayfair, presented at MozCon 2016 (you’ll need to be logged in on Moz or purchase the video bundle).
Lower Sitewide Quality Perception
If Google can tell whether a single page is high quality or low quality, it stands to reason that this metric (let’s say Quality Score) can be extrapolated out to the domain as a whole. In other words, Google knows whether the site is low or high quality, and removing the lowest quality pages is likely to benefit the site as a whole. Learn more about the concept of pruning in this Moz.com post.
What You Should Do Now
If you have thousands of pages indexed by Google but suspect that a large portion of them aren’t sending any traffic, or may be considered low quality, the next step is to perform a content audit, which is described here. There’s also a tutorial here.
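The core of a content audit is cross-referencing each URL’s content against its performance and flagging pages that fail on both counts. Here’s a minimal sketch of that triage step, assuming you’ve already exported per-URL word counts (from a crawler) and organic sessions (from analytics) into one structure. The field names, thresholds, and sample URLs are all hypothetical.

```python
def flag_for_pruning(pages, min_words=150, min_sessions=1):
    """Return URLs that are both thin AND getting no organic traffic.

    Pages failing only one test deserve a human look (improve or keep);
    pages failing both are the prune/noindex candidates.
    """
    flagged = []
    for url, data in pages.items():
        thin = data["word_count"] < min_words
        no_traffic = data["organic_sessions"] < min_sessions
        if thin and no_traffic:
            flagged.append(url)
    return sorted(flagged)

# Hypothetical audit export
inventory = {
    "/products/mug-123": {"word_count": 420, "organic_sessions": 37},
    "/search/blue-mugs": {"word_count": 40, "organic_sessions": 0},
    "/filter/red":       {"word_count": 12, "organic_sessions": 0},
}

candidates = flag_for_pruning(inventory)
# → ["/filter/red", "/search/blue-mugs"]
```

Treat the output as a shortlist for manual review, not an automatic delete list -- thresholds like 150 words are starting points to tune per site.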
Everett Sizemore is Director of Marketing at Inflow, an eCommerce marketing agency based in Denver, Colorado. He has over a decade of experience in technical SEO, is a Moz Associate, and speaks regularly at industry conferences. Learn more at www.GoInflow.com.