How to Deal with Internal Crawl Errors and Broken Redirects

November 18th, 2015 by Everett Sizemore

Internal crawl errors and unnecessary redirects hurt both user experience and crawlability – and thus your rankings. Checking for these issues is an important part of reviewing past SEO efforts to ensure they are up to par with modern search engine optimization techniques.

Crawl Errors: How to Assess

There are a few ways to check your site for internal crawl errors. The most thorough approach is to use multiple resources and tools, such as the three below.

3 Ways to Check Your Site’s Crawl Errors

1. Google Search Console (formerly known as Google Webmaster Tools): The crawl errors report in GSC has multiple tabs, but the one to focus on here is the Not Found tab (see screenshot below). This report contains URLs that Googlebot has determined to be non-existent (they return a 404 error code).

Graph with site URL errors

You can get more information by clicking an individual URL. Doing so opens a popup window with several tabs. The "Error details" tab shows when the error was first detected and when the URL was last crawled. If the URL is found in a sitemap, you'll see an "In sitemaps" tab with a link to the sitemap containing the bad URL. The "Linked from" tab shows the pages on the site that were found linking to the 404 page.

error detail

Action Items

1) Ensure your sitemaps do not contain URLs returning a 404 error
2) Remove all internal links to 404 pages
3) In many cases, URLs returning a 404 should be 301-redirected to the most related/similar page on the site. If there is no similar or related page, it is OK to let it return a 404 error. Executing No. 1 and No. 2 above should ensure the URL does not get crawled in the future. The exception is if there are quality external links pointing to the page returning a 404. If this is the case, the URL should be redirected to retain link equity.
4) After completing steps 1-3, use the “Mark as Fixed” feature in the Crawl Errors report. Then, check back periodically to see if the issues have truly been fixed or if new crawl errors are found.
5) To force Google to "see" that redirects have been put in place, use the Fetch as Google feature: first enter the URL, click "Fetch," then click "Submit to Index." This forces Google to recrawl the URL, so it will see the 301 directive. You can submit up to 500 individual URLs per month with this feature.
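At scale, action item No. 1 above can be scripted. Here is a minimal sketch in Python (standard library only); the status checker is injected as a function so the sketch stays runnable offline, but in practice it would issue HTTP HEAD requests against each URL:

```python
# Sketch of action item No. 1: pull every URL out of a sitemap and flag
# any that return a 404. The sitemap namespace below is the standard
# sitemaps.org protocol namespace.
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def sitemap_urls(sitemap_xml):
    """Return every <loc> value in a sitemap.xml document."""
    root = ET.fromstring(sitemap_xml)
    return [loc.text.strip() for loc in root.iter(SITEMAP_NS + "loc")]

def broken_urls(urls, status_of):
    """status_of(url) -> HTTP status code; return the URLs that 404."""
    return [u for u in urls if status_of(u) == 404]
```

Any URL this reports should either be dropped from the sitemap or 301-redirected per action item No. 3.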


2. Bing Webmaster Tools: Similar to GSC, Bing Webmaster Tools has a Crawl Information report that helps you identify pages returning various status codes. The 404 errors show up in the tab labeled "400-499 (Request errors)."

Site crawl information

Action Items

The action items are the same as for Google Search Console: keep URLs that return a 404 out of your sitemaps, remove all internal links to 404 pages, and 301-redirect any 404 URL that has a close substitute on the site or quality external links pointing at it.


3. Screaming Frog SEO Spider: Screaming Frog can be used for free for up to 500 URLs; crawling more requires a paid license. The two methods above are sufficient to identify, monitor, and solve most, if not all, crawl-error issues. If you would like another method, check out this post on the Screaming Frog site: How to Find Broken Links Using Screaming Frog.
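For a feel of what crawlers like Screaming Frog do under the hood, here is a rough, standard-library-only Python sketch of the first step: pulling the links out of a page's HTML so each target can be status-checked. A real crawler also resolves relative URLs, obeys robots.txt, deduplicates, and so on.

```python
# Extract every <a href> from an HTML document, in document order.
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of each anchor tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return every href found in anchor tags on the page."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

Checking the status code of each extracted link, page by page, is exactly how a crawler builds the broken-internal-links report.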

Redirect Chains: How to Assess

Over the lifetime of a website, migrations, architecture changes, and changes in products and services all lead to pages being redirected. In some cases, sites end up with redirect chains: URL A redirects to URL B, which in turn redirects to URL C. Ideally, a redirected URL should redirect exactly once, straight to its final destination. As a marketer, it is important to understand this and know how to spot redirect chains when they occur.

Thankfully, Screaming Frog makes this easy. If you are going through a site migration, you can follow the process here, How to Audit Redirects Using Screaming Frog.

If you are not going through a migration, there may still be redirects in place on the site. An easy way to look for redirect chains is to pull the redirect rules from the .htaccess file and use the method in the above article (upload the URLs in list mode) to verify each redirect points to the right place. If you are on a Windows server, there's a good chance you are using the URL Rewrite Module; you should be able to find a list of all redirected URLs in your web.config file. Again, use the same method to look for redirect chains.
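The chain check itself is simple to sketch. In this hypothetical Python example, the hop-following logic is separated from the HTTP fetching: `fetch` would be backed by a real client issuing requests with automatic redirects disabled and returning each response's status code and Location header.

```python
# Follow redirects one hop at a time and flag chains.
# fetch(url) must return (status_code, location_header_or_None).

def follow_redirects(url, fetch, max_hops=10):
    """Return the list of URLs visited, starting with `url`."""
    hops = [url]
    for _ in range(max_hops):
        status, location = fetch(hops[-1])
        if status in (301, 302, 303, 307, 308) and location:
            hops.append(location)
        else:
            break
    return hops

def is_chain(hops):
    # The origin plus more than one redirect hop is a chain worth fixing.
    return len(hops) > 2
```

Feed it the source URLs from your .htaccess or web.config rules; any URL where `is_chain(...)` is true should be re-pointed directly at its final destination.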

These fixes to broken URLs and redirects can have a huge impact on your bottom line. If you're ready to dig even deeper into your eCommerce site's SEO health, you can access our free workbook for a thorough assessment.

Pinterest Has Made It Easier To Create Pin Buttons

October 14th, 2015 by Chris Crum

Pinterest announced that it has updated its developer site to make it easier for people to build their own Pin It buttons.

There’s a simple widget builder that lets you create buttons/widgets to add to your site or app. You can quickly grab code for the Pin It Button, the Follow Button, the Board Widget, the Profile Widget, and the Pin Widget.

In Preparation For The iOS 9, Facebook Created A New SDK For Developers

September 16th, 2015 by Chris Crum

Last month, Facebook announced a new iOS SDK in beta to help developers prepare for Apple’s iOS 9, which launches on September 16. On Thursday, Facebook announced the release of the final versions, v4.6 and v3.24, of this SDK.

Any developers who use Facebook Login, App Events, Analytics for Apps, Sharing across Facebook and Messenger, App Invites, App Links or Native Like will have to take the necessary steps.

Android 6.0 Is Now Called Marshmallow

August 19th, 2015 by Chris Crum

Google announced the final developer preview for the M release of Android, which was first launched in May. Today, it introduced the official Android 6.0 SDK and opened Google Play to publishing apps that target the new API level 23. Oh, and it’s called Marshmallow.

Go Client Library is Now Available for Cloud Bigtable

July 22nd, 2015 by Chris Crum

Google said on Friday that the Go client library for Google Cloud Bigtable is now available.

Cloud Bigtable is Google’s scalable NoSQL database product, which drives many of its own products like Gmail, Google Earth and YouTube. The Go programming language is gaining popularity, particularly among those building Cloud apps.

Facebook Page Includes New Plugin, With New Features

June 17th, 2015 by Chris Crum

Earlier this year at its F8 developers conference, Facebook announced the Page plugin, which is replacing the Like Box for websites. It’s a way for sites to embed and promote a Facebook Page, and it enables site visitors to like or share the page directly from the website itself.

Reading Tip – 5 Free Resources for Coders or Would-Be Coders

May 20th, 2015 by Glenn Letham

Some weekend or late-night reading tips here, along with several resources that a coder or would-be coder might find handy to have in the library… Enjoy!

Facebook Is Deprecating Its Like Box Feature, Will Now Use Its Page Plugin

April 22nd, 2015 by Chris Crum

A few years back, Facebook launched the Like Box plugin as one way for websites to generate more engagement from Facebook users by showing content from their Facebook pages, and showing them other people who have liked the page. The company is now shutting the plugin down.

A couple of Google Webmaster Tools tips and tricks

March 25th, 2015 by Chris Abraham

Too few webmasters move beyond Google Analytics, which is a pity. Google Webmaster Tools is all you really need to become a Google-whisperer. Why spend all that time and money on Moz and Woorank when you could hear it directly from the horse’s mouth?

I have been using it a lot lately, since I have been doing a lot of SEO strategy work, and I have had some serious fun working through the tools that Google basically begs you to adopt in order to make things easier for itself, and therefore better for you, the site owner or manager.

This week, we’ll discuss International Targeting and HTML Improvements.

Location, location, location (International Targeting)

Be realistic. I know the Internet is global and there may well be a distant client on foreign shores, but it is worth getting as geographically specific as possible. Do it for Google. There are two ways to signal your language and location to Google: hreflang tags and Country Targeting. That said, if you happen to purchase a clever domain on an Icelandic top-level domain (.is), Google doesn’t give you much of a choice.
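For reference, hreflang annotations are `<link>` elements placed in the `<head>` of every version of a page, with each version listing all the alternates plus itself (the URLs below are purely hypothetical):

```html
<!-- Hypothetical example: one page with a US-English and an Icelandic version -->
<link rel="alternate" hreflang="en-us" href="https://example.com/" />
<link rel="alternate" hreflang="is" href="https://example.is/" />
<!-- x-default tells Google which version to show everyone else -->
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Country Targeting, by contrast, is set per-property under Search Traffic > International Targeting in the Webmaster Tools interface, and only for generic (non-country) domains.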


Whoops. I plan to shift my primary domain from to in order to be able to target the US in my Google Webmaster Tools. While I admire the fine, fair, high-cheekboned people of Reykjavik, I don’t know if they’re my ideal geographic target.

Google Just Wants to Help You Help It (HTML Improvements)

I feel like folks are spending time and money on both free and paid tools like Moz and Woorank when Google is pretty explicit about what it wants and needs from you.

Google does care about Title and Description duplication and is happy to help you work through it.

Google is looking for Goldilocks titles and descriptions, searching for content that is not too short, not too long, but just right.

Since Google explores your site using robots, spiders, and bots, its science is inexact.

Sometimes duplication isn’t duplication at all but has to do with Canonical URLs, aliases, and things such as trailing slashes and URL variations.

Strangely enough, it’s 2015 and Google still requires us to submit both our plain domain and our URL with the www subdomain. So sometimes duplication isn’t simply duplicate content but an issue with how your site’s domain is set up.
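On an Apache server, one common way to resolve that www/non-www duplication is a 301 rewrite rule in .htaccess. A hypothetical sketch (swap in your own host names, and the non-www direction if you prefer it):

```apache
# Hypothetical .htaccess rules: 301-redirect the bare domain to the
# www version so Google sees a single canonical host.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

With a rule like this in place, you can still verify both variants in Webmaster Tools and then declare your preferred domain there.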

I have three more subjects to discuss next week: Google Only Speaks Structured Data (Sitemap.xml), You Can Lead a Google to Structure (Data Highlighter), and Are You Fast Enough? (PageSpeed Insights). I look forward to continuing our exploration of Google Webmaster Tools.

Good luck! Go git ‘em, Tiger!


Twitter Rolls Out First Official WordPress Plugin!

February 26th, 2015 by Ritu Sharma

Twitter announced the release of its first official WordPress plugin, giving WordPress publishers easier access to a host of Twitter’s features and functionality.