
What if you didn’t move the page but simply deleted it, and you don’t have a page that carries the same information as the old one? Or what if the page never existed in the first place, and the broken link was simply the result of someone else’s error during a copy and paste? You have a few options to choose from, and the top three are:
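Whichever option applies, the server-side mechanics usually reduce to one of two honest responses: a 301 redirect to the closest equivalent page, or a real 404 when no equivalent exists. A minimal sketch in Python, assuming a hypothetical hand-maintained redirect map (the paths here are made-up examples):

```python
# Hypothetical map of dead URLs to their closest surviving equivalents.
REDIRECT_MAP = {
    "/old-article": "/new-article",   # a page that moved
    "/deleted-guide": "/guides/",     # closest remaining equivalent
}

def respond(path):
    """Return (status, location) for a request to a missing page."""
    if path in REDIRECT_MAP:
        # Permanent redirect passes visitors and link credit along.
        return 301, REDIRECT_MAP[path]
    # No equivalent page exists: say so honestly with a real 404.
    return 404, None
```

For example, `respond("/old-article")` yields `(301, "/new-article")`, while a URL that never existed gets `(404, None)` and can be served your custom 404 page.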
7 Comments


from vanessafox 3169 Days ago #
Votes: 4

Rae is absolutely right that you should reclaim all your broken links. If sites are linking to you but they make a mistake and the link is broken, you want to provide a great user experience for anyone clicking on that link, and you want the SEO credit as well.

Of course, a great way to find broken links to your site is to check Google’s webmaster tools. The 404 report lists all pages that Googlebot tried to crawl and received a 404 response. This situation generally happens when Googlebot follows a link (from your own site or another site) that is broken.

Your first response to that report should be to cross-reference it with the internal links report also provided by webmaster tools, so you can fix any internal links that are broken. For external links, while you can try to contact the owners of those links, you can’t always get them fixed. Personally, I recommend redirecting the broken links to the right page, but not implementing the catch-all non-404 redirect outlined in option 3.

One problem with redirecting all requests that would ordinarily return a 404 response with a 301 to a 200 is that search engine bots will think all non-existent pages are real pages, and lots of badness can happen because of that. As Rae notes, your 404 pages will get indexed. Search engine bots may end up spending lots of time crawling non-existent pages, which may mean they don’t have time to crawl the pages you really care about being indexed. If you don’t have a robots.txt file, a request for it will return a 200... Soft 404s can just cause trickiness that I wouldn’t suggest setting up.
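The soft-404 problem described above can be checked mechanically: request a path on your site that definitely does not exist and look at the status code returned. The classification logic is the point of this sketch; in practice you would fetch a random gibberish URL with `urllib.request` and pass the response status in:

```python
def classify_missing_page(status):
    """Classify the HTTP status a site returns for a URL that
    definitely does not exist (e.g. a random gibberish path)."""
    if status in (404, 410):
        return "hard 404"    # correct: search engines drop the URL
    if status in (301, 302):
        return "redirect"    # acceptable if it targets a relevant page
    if status == 200:
        return "soft 404"    # the catch-all setup warned against above
    return "other"
```

A site whose nonexistent pages come back as `"soft 404"` is telling crawlers that every possible URL is a real page, which is exactly the crawl-budget and indexing badness described in the comment.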

from andyhat 3169 Days ago #
Votes: 0

Ah, that’s a good insight, cross-referencing the two Google Webmaster Tools reports to find the offending pages. You should do a post on "advanced functions in Webmaster Tools" like this that aren’t immediately obvious to people who didn’t necessarily build it ;)

from lightseo 3168 Days ago #
Votes: 0

The coolest thing is finding an old, broken URL that had a few external links. Redirect heaven...

from johnandrews 3167 Days ago #
Votes: 1

I suggest everyone use Vanessa’s comment to make this article an "are you advanced" test. Despite all the silly debate about "advanced SEO", it is really very simple: if you read that question about handling 404s and knew absolutely how to deal with them and why, in accord with Vanessa, you are "advanced". If you can name two reasons why Vanessa is correct about option 3, including addressing the risk/reward ratio, you are advanced. And if your own sites already include pages qualified as on-topic enough for just about any incoming 404 likely to exist, simply because the presence of such "semantically transitional" pages is a consequence of your SEO, you are advanced.

from g1smd 3117 Days ago #
Votes: 0

I prefer to use Xenu LinkSleuth to check internal links and outgoing external links for errors. I do use Google WMT to check for broken incoming links. It is very useful for that, and several other things (short/long/duplicated titles and meta descriptions, etc.).

For broken external incoming links, I usually set up a 301 redirect to the correct URL. However, do note that it takes Google a long time to spot that the URL previously returning 404 now returns 301. I have recently mentioned that several times over at WMW.
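The kind of link checking Xenu LinkSleuth does can be sketched in a few lines with the standard library: extract every anchor `href` from a page, then fetch each one and record anything that returns a 404. This is only the extraction half, as a self-contained illustration:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collect href values from anchor tags, link-checker style."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def extract_links(html):
    """Return all anchor hrefs found in an HTML string."""
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links
```

In a full checker you would then request each extracted URL (for example with `urllib.request`) and flag any that come back 404, which is essentially what the desktop tool automates.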

from JohnHGohde 3117 Days ago #
Votes: -2

I use Xenu LinkSleuth to check for broken links. I would never 301 redirect a broken external link under any circumstances, as Google balks at redirections almost as much as it does at broken links, especially at redirection chaining. I try to avoid moving internal URLs as much as possible. Ergo, I fix 301 redirections just as if they were broken links.

For me, the WebArchive usually comes to the rescue. When that fails, I simply remove the link. Sites that remove themselves from the WebArchive are nuts, IMHO.

My natural health website averages 14 external hyperlinks per page, all of which must be periodically maintained. So, I speak from experience.

from JohnHGohde 3116 Days ago #
Votes: -1

The post by Rae Hoffman was actually pretty weak. The correct solution is a combination of redirection and a catch-all custom 404 error page. You should leave the redirects in place long enough to allow Google to clean up its index and for other webmasters to maintain their websites. Yes, boys and girls, websites actually require maintenance, which is more time-consuming and tedious than boring. Then you should periodically remove the redirections in order to avoid redirection chaining.

Any number of little technical IT details can adversely affect a website’s standing in the SERPs. Link building won’t help websites that have disappeared completely from the Web index. Nor will filing reinclusion requests help when the real problem is one of those boring little IT details: a lack of website maintenance.
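The chaining problem mentioned in the comments above has a simple maintenance fix: instead of letting A redirect to B and B redirect to C, rewrite every source to point straight at its final destination. A sketch of that flattening step, with hypothetical paths:

```python
def flatten_redirects(redirects):
    """Collapse redirect chains (a -> b, b -> c) so every source
    points straight at its final destination (a -> c)."""
    flat = {}
    for src in redirects:
        dest, seen = redirects[src], {src}
        # Follow the chain until it ends or would loop.
        while dest in redirects and dest not in seen:
            seen.add(dest)
            dest = redirects[dest]
        flat[src] = dest
    return flat
```

For example, `{"/a": "/b", "/b": "/c"}` flattens to `{"/a": "/c", "/b": "/c"}`, so a crawler never has to hop through an intermediate URL. Running this periodically over a redirect map is one way to do the maintenance the comment describes.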
