
Alan does a good job of laying out the two sides of the coin. A good read for folks confused about the relative importance of W3C compliance as it relates to SEO.
41 Comments

Moderator
from Sebastian 1697 Days ago #
Votes: 1

I so wanted to submit this one. ;-(



Moderator
from Jill 1697 Days ago #
Votes: -1

I love Alan even more now than I already did!



Moderator
from hugoguzman 1697 Days ago #
Votes: 4

Sorry that I beat you to it, Sebastian. Code validation as it relates to SEO is one of the biggest stumbling blocks for inexperienced marketers (and for some experienced SEOs out there).



Moderator
from Sebastian 1697 Days ago #
Votes: 2

No hard feelings, j/k. I was going to start a kind of series with Edward's take (http://sphinn.com/story/143886) and was waiting for Dave's post; Alan's piece just fits perfectly.



from AlanBleiweiss 1697 Days ago #
Votes: 3

Hugo,  thanks for Sphinning this.  I really did my best to be fair and balanced while staying true to my core beliefs.

OMG Jill :-)  [blushing]



Moderator
from hugoguzman 1697 Days ago #
Votes: 1

You're welcome, sir! I also threw it up on my fledgling LinkedIn group:

http://www.linkedin.com/groups?gid=2813501&trk=myg_ugrp_ovr




Moderator
from Sebastian 1697 Days ago #
Votes: 1

This new submission adds fuel to the fire: http://sphinn.com/story/144077 - worth a read.



from AlanBleiweiss 1697 Days ago #
Votes: 1

Sebastian - see my Sphinn comment on that one



from pageoneresults 1697 Days ago #
Votes: 1

Where do I begin? How does one sit here and argue in support of crap code? I don't understand that concept. Nor do I understand where the ROI argument comes into play. You paid for a website design, you paid for development, you should have received well formed valid markup. Or at least you would have, had you dealt with professionals in the process.

If you're sitting there now staring at 100s or even 1,000s of markup errors, then I'd say you paid for an inferior product. There were a set of instructions (protocols) that the designers/developers were supposed to follow when building your website. If they failed to follow those instructions and delivered a website to you that is rife with markup errors, you should request a refund of your monies. Why? Because at some point you're going to pay to clean those errors up.

What's the ROI? How closely do you monitor 400/500 errors? What's that you say? That's not an SEO responsibility? While that may be true for many of you, it is part of a Professional SEO's responsibility to make sure that this type of monitoring is in place. If it isn't, you're flying in the dark. I'm sure you're going to ask where the ROI is on monitoring 400/500 errors?

Take all the markup errors present, look at all the 400/500 errors, fix them, and you've addressed a very large portion of the leaks in the foundation that you've been hired to promote. Now you can get down to some visible SEO, you know, that stuff you see in the browser. The stuff I see most of you discussing when it comes to validation.

I know, I don't stand a chance against an industry that doesn't care about their craft at this level. That's okay. The Developers will soon be filling your positions, you're really not required anymore. SEO at this level is dead and has been for quite some time. ;)

Just remember this: the only thing keeping most of you afloat right now is links. What happens when the dependency on links is lessened and Google finally admits that well formed valid code is preferred? What about Microformats, RDFa, etc.? They all require well formed valid markup to perform as expected. What will you do when you're faced with those challenges? It's really difficult to recommend hCard, hProduct, hReview, hCalendar, etc. when the existing code is tag soup. In fact, I do believe it won't work if you don't use well formed markup. That's the kind of markup I like, the kind that only works one way, the right way.
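The microformat point above is easy to see with a toy consumer. Below is a purely illustrative sketch in Python (the `HCard` class and property list are mine, not how any real hCard parser is implemented): it pairs each microformat class with the text inside its element, and that pairing only works when open and close tags actually match up.

```python
from html.parser import HTMLParser

# Toy hCard consumer, illustrative only: it relies on properly
# nested open/close tags to know which text belongs to which
# property. Tag soup breaks exactly this kind of pairing.
class HCard(HTMLParser):
    PROPS = ('fn', 'org', 'tel')

    def __init__(self):
        super().__init__()
        self.stack = []   # microformat class open at each depth
        self.fields = {}

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get('class', '')
        self.stack.append(cls if cls in self.PROPS else None)

    def handle_endtag(self, tag):
        if self.stack:
            self.stack.pop()

    def handle_data(self, data):
        if self.stack and self.stack[-1]:
            self.fields[self.stack[-1]] = data

card = HCard()
card.feed('<div class="vcard"><span class="fn">Jane Doe</span>'
          '<span class="org">Example Inc</span></div>')
print(card.fields)  # {'fn': 'Jane Doe', 'org': 'Example Inc'}
```

Drop a close tag anywhere in that snippet and the stack bookkeeping drifts, which is why unbalanced markup and microformats mix so badly.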

Fix your websites!

On a side note, WOW! Look at all the work that has gone into Sphinn to cleanse markup. Bravo and Kudos to those slaving back there. I'm impressed! :)



from AlanBleiweiss 1697 Days ago #
Votes: 3

Edward,


A Truly passionate reply, I must say.  And yes, you have a valid position in terms of the notion that sites built on crap code really deserve a refund.

Except the type of clients who hire me are owners of sites that have cost them tens of thousands or quite often hundreds of thousands of dollars.  Usually they've evolved over years of time long before I came along.

And if I were to tell the client - you need to spend thousands of dollars on your site to resolve issues that absolutely, unequivocally have nothing to offer in terms of SEO, and only THEN will I offer you immediate actionable items that themselves, REGARDLESS of those non SEO items, WILL get your business a DRAMATIC increase in total visits, quality of visits, total conversions, quality of conversions, and in turn, increased revenue, I assure you - my clients will more often than not, explain to me that they do not have the LUXURY to explain, within their corporate structure, that their IT department FAILED MISERABLY at their web initiative and that, instead of seeing APPRECIABLE return on current marketing budgetary allocation within 30, 60 or 90 days, it's probably going to be SIX MONTHS or A YEAR.

Because oh, excuse me, but it's going to take 90 days just to get the IT team to admit they screwed up, IF they own up to it. AFTER they push back in self defense.  Repeatedly.  Out of fear of embarrassment and losing their jobs.

Now that's not the ONLY scenario I face, but it is quite common.

And the foundational issues there ARE the reality I face with the vast majority of my clients.  Overwhelming pressure from on high to get results sooner rather than later.  So it's ALREADY an uphill battle for me when I go into these situations, to help them understand THE BASIC CONCEPT of how SEO will help increase their bottom line financially.  Regardless of validation issues NOT directly related to SEO.

You have a serious DISDAIN for the majority of the SEO industry.  That's sad.

You push MOST of us into the "you only survive on links" category.  That is pathetic, baseless attack mentality, and I will NOT tolerate such nonsense.  Because personally, I DO NOT obtain the results I do based on link building.  In fact, I happen to focus 95% of MY work strictly with ON-SITE SEO.  And I refer my clients out to OTHER people for link building.  Valid, Real, Legitimate, Long-Term, Slow, Painstaking QUALITY link building.

As for Microformats, JUST THIS WEEK my latest audit for a MAJOR western United States financial institution, a FIFTEEN PAGE document with almost THIRTY actionable items, happened to include an entire SECTION on Microformats, their VALUE, and their IMPLEMENTATION.

So please, Edward.  Before you bash ME and the MAJORITY of us who disagree with you, check your motives.  Because they just muddy your already blatant unwillingness to acknowledge the challenges we face.



from AlanBleiweiss 1697 Days ago #
Votes: 1

And Edward,

Just for the record, I am HAPPY that you believe it's important to communicate to YOUR clients that obtaining 100% W3C validation is so critical.  Really. I am.  Because it shows you are truly passionate about seeing the web evolve.  And you care about your clients.  But please, wake up to the fact that YOUR beliefs and YOUR methods in regard to the topic of validation are NOT necessarily the most PLAUSIBLE for every situation given the challenges we face as an industry where we ourselves do not have direct control over the code base.  And it would be a MIRACLE if you were to also wake up to the concept that it is JUST as valid, if not more so, that many of us choose to focus on those things that serve the needs our clients communicate to us, as opposed to an ideal.



from andymurd 1697 Days ago #
Votes: 0

After watching P1R & Alan slug it out on Twitter, I had to weigh in here with my 2c...

It's my understanding that crawlers run a process like HTML Tidy on the pages they download in order to better parse them. Yahoo states that it uses HTML Tidy for YQL, so it's likely to be used elsewhere too.

HTML Tidy is good but it can't deal with complete garbage. If your code is that messy, crawlers aren't likely to be able to parse your pages and you just won't make it into the index.

Most of the time, invalid markup can (and will) be cleaned but the issue here is what happens during the cleaning process:

  • What happens when your affiliate link anchors have two "rel" attributes?
  • No alt tags? Good luck with image search!
  • Your logo wasn't closed properly? Better hope that G doesn't mistake your content for boilerplate.
  • Two id attributes on your headings? I doubt you're going to see "Jump To" in your snippets.

I don't think that there's a check in Google's algorithm that penalises invalid code, but that doesn't mean that validation is not useful for SEO.
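A rough way to see the first bullet in action: Python's stdlib html.parser (purely illustrative, not what any crawler actually runs) passes duplicate attributes straight through, leaving it to the consumer to decide which one counts.

```python
from html.parser import HTMLParser

# Illustrative only: stdlib html.parser reports every attribute
# it sees, duplicates included, so an anchor with two rel
# attributes reaches the consumer still ambiguous.
class AttrDump(HTMLParser):
    def __init__(self):
        super().__init__()
        self.seen = []  # list of (tag, [(name, value), ...])

    def handle_starttag(self, tag, attrs):
        self.seen.append((tag, attrs))

p = AttrDump()
p.feed('<a href="/deal" rel="nofollow" rel="external">buy now</a>')
print(p.seen[0])
# ('a', [('href', '/deal'), ('rel', 'nofollow'), ('rel', 'external')])
```

Whether a crawler keeps the first rel, the last, or neither is its own business, which is exactly the ambiguity the cleanup step introduces.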




from AlanBleiweiss 1697 Days ago #
Votes: 2

It's agreed that certain aspects of valid code are required.  I'm not arguing that.  I never have.  Obviously if a site doesn't follow core principles, it's not going to rank well for on-site factors.  Those are the issues I personally address in my own audits.  I will not, however, insist a client site pass 100% validation because many "errors" do not cause such problems.  It's a balancing act that has to be integrated into the bigger picture of business process.



from btard 1697 Days ago #
Votes: 2

What andy said 2 up - this is automated. One day a bot will bork at bad code. It's kinda inevitable and something 'important' will be missed.

Good job we have all these tools to rectify these problems.

W3C Validator, webmaster tools etc etc etc.



from pageoneresults 1697 Days ago #
Votes: -1

Except the type of clients who hire me are owners of sites that have cost them tens of thousands or quite often hundreds of thousands of dollars. Usually they've evolved over years of time long before I came along.

Alan, those types of clients are usually the ones that benefit the most from cleaning up their markup.

And if I were to tell the client - you need to spend thousands of dollars on your site to resolve issues that absolutely, unequivocally have nothing to offer in terms of SEO.

Wow, you seem really sure of this? Thousands of dollars? Nothing to offer in terms of SEO? I've written more than a handful of articles that absolutely prove you wrong and I'm tired of holding your hand in all of this. I've lost count of the number of contradictions that are now appearing in your responses. I would love to get my hands on one of your client's sites and do a Case Study. If you are that nonchalant about writing well formed valid code, I'm positive I can show them an ROI on cleaning things up. I'm unequivocally, absolutely, positively, sure that I can. ;)

And only THEN will I offer you immediate actionable items that themselves, REGARDLESS of those non SEO items, WILL get your business a DRAMATIC increase in total visits, quality of visits, total conversions, quality of conversions, and in turn, increased revenue, I assure you - my clients will more often than not, explain to me that they do not have the LUXURY to explain, within their corporate structure, that their IT department FAILED MISERABLY at their web initiative and that, instead of seeing APPRECIABLE return on current marketing budgetary allocation within 30, 60 or 90 days, it's probably going to be SIX MONTHS or A YEAR.

I thought I was guilty of using excess words to express a point. Read that again! What did you just say?

Oh, I get it. Your clients want all the pretty reports that show them the natural progression of their website whether you were there or not? They're the corporate types who love to see all the pretty bar charts that in reality, are just one big cluster of items that are usually not very actionable. And I've seen what corporates do with those 20, 30, and 50 page reports. All the managers use them for job security. It's usually one big cluster you know what. :)

Because oh, excuse me, but it's going to take 90 days just to get the IT team to admit they screwed up, IF they own up to it. AFTER they push back in self defense. Repeatedly. Out of fear of embarrassment and losing their jobs.

And you're there to pat them on the back and tell them to keep producing crap code that IS going to interfere with your overall marketing objectives? That makes a lot of sense.

And the foundational issues there ARE the reality I face with the vast majority of my clients. Overwhelming pressure from on high to get results sooner rather than later. So it's ALREADY an uphill battle for me when I go into these situations, to help them understand THE BASIC CONCEPT of how SEO will help increase their bottom line financially.

Dude, what kind of clients are you working with? You sound stressed. They sound stressed. Overwhelming pressure from on high to get results sooner rather than later. That's the type of environment that will always produce less than optimal results. I'm glad it's you and not me. :)

Regardless of validation issues NOT directly related to SEO.

I've run validation routines on 300+ SEO websites for the last 5 months. I can tell you Alan, that a large percentage of validation errors on SEO websites are directly related to SEO. There is this cascading effect that takes place with parsers.

You have a serious DISDAIN for the majority of the SEO industry. That's sad.

No, I have a serious disdain for the majority of the SEO industry who talk the talk but don't walk the walk. It's an old school cliche but it surely applies to our industry.

You push MOST of us into the "you only survive on links" category.

I know, the truth hurts. Don't worry though, the backup crew will agree with me, links are it. That is all you need. You don't even need an SEO really.

That is pathetic, baseless attack mentality, and I will NOT tolerate such nonsense.

Heh, did I hit a nerve? Just like I won't tolerate the nonsense being slung about when discussing well formed and valid markup. Get the picture? Most SEOs are link whores anyway, there's no secret about that? ;)

In fact, I happen to focus 95% of MY work strictly with ON-SITE SEO.

Okay.

And I refer my clients out to OTHER people for link building. Valid, Real, Legitimate, Long-Term, Slow, Painstaking QUALITY link building.

Off topic.

As for Microformats, JUST THIS WEEK my latest audit for a MAJOR western United States financial institution, a FIFTEEN PAGE document with almost THIRTY actionable items, happened to include an entire SECTION on Microformats, their VALUE, and their IMPLEMENTATION.

Cool! Now the question becomes, will they act on it? Or, will they cherry pick like most corporates? If they cherry pick, they've wasted their money. It is very difficult to implement Microformats in tag soup.

So please, Edward. Before you bash ME and the MAJORITY of us who disagree with you, check your motives.

The ONLY folks disagreeing with me are those who don't understand it. I've not seen one person here yet who follows standards who disagrees. So for me, I'm outnumbered because our industry just hasn't gotten it yet. Or most of it anyway. I know there are more than a handful who Sphunn this who do get IT. ;)

Because they just muddy your already blatant unwillingness to acknowledge the challenges we face.

Alan, I've been there. You don't hear me whining about them, do you? No! That's because the clients I work with understand the importance of all this and make the suggested changes, all of them. They're not allowed to cherry pick either. There is no a la carte menu. It is all or nothing. If they take all, excellent. If they take nothing, there are thousands of SEOs that I can refer them to for assistance.



from onreact 1696 Days ago #
Votes: -3

Brushing your teeth is not a ranking factor but I still brush my teeth nonetheless!



from AlanBleiweiss 1696 Days ago #
Votes: 2

onreact, that's good to know.  Because uh, someone here might not otherwise want to have a face to face conversation with you.


But as far as SEO goes, if you are not considering the financial reality of having to help clients choose which things to tackle in what priority then the concept you put forth is irrelevant.  So too goes the reality that I don't address how much they sell their products for, even though that too can have an impact on visitor conversions.



from Badams 1696 Days ago #
Votes: 6

I'll repeat my comment on the article here as well as I think it'll add to this discussion:

I honestly think that SEOs in the ‘your website must be 100% W3C compliant’ camp are stunningly ignorant on how information retrieval works. Search engine crawlers do NOT render a page – they retrieve the HTML code and parse it, but they never render it as a browser does. For SEO purposes the HTML code needs to be suitable for parsing, NOT for rendering in a browser. Thus W3C compliance really isn’t an issue. The code just needs to be clean enough for a search engine to parse it and distinguish content, navigation, and style.
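The parse-versus-render distinction is easy to demonstrate. Here is a sketch using Python's stdlib html.parser (not any engine's real pipeline): text extraction succeeds on markup that is nowhere near W3C-valid.

```python
from html.parser import HTMLParser

# Sketch: pull readable text out of markup that would fail W3C
# validation (unclosed <li>s, an unquoted attribute, a stray
# </b>). Parsing for content is far more forgiving than
# validation.
class TextGrab(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []

    def handle_data(self, data):
        if data.strip():
            self.chunks.append(data.strip())

messy = '<ul><li>Home<li>About</ul><p class=intro>Hello</b> world</p>'
t = TextGrab()
t.feed(messy)
print(' '.join(t.chunks))  # Home About Hello world
```

None of the validation errors stop the parser; the content, navigation text and all, still comes out, which is all an indexer needs.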



from AlanBleiweiss 1696 Days ago #
Votes: 1

Barry,

That's a most excellent way to clarify where the importance of focus needs to be from a purely SEO perspective!



from pageoneresults 1696 Days ago #
Votes: -1

SEO does NOT require 100% validation.

I had to go back through ALL of my Tweets to see where I specifically say 100% validation, I couldn't find one. Alan, of course I've followed every single reply to this topic both here, on Blogs and on Twitter. I see where you're going with this. I've watched you shift course from the first moment I started responding. You've gone from NOT caring about validation to now caring about it for those items which may affect SEO. You're either for it or you're not, which is it. You can't flip in the middle of this discussion and expect anyone to take you seriously.

It's agreed that certain aspects of valid code are required. I'm not arguing that.

That's not true. You've been arguing it ever since you misquoted Matt Cutts. You've already stated that you're not a developer, so how do you know which aspects of valid code are required? You're now agreeing to specific things as you see the injustice you're doing based on peer comments. That's the way these discussions normally progress; I've been involved in many of them over the years. Developers like to do this too after they see the light, then they clean up their code.

I will not, however, insist a client site pass 100% validation because many "errors" do not cause such problems.

Alan, after getting involved with this debate, I'm certain you're not in a position to cherry pick markup errors and let the client know which ones may prevent proper parsing. No one is insisting 100% validation, that is a phrase you've injected in all of this which allows you an escape once things get cooking in here. I'm going to nip that one right now. In all the years I've promoted Standards and Compliance, I've not once said that 100% validation is required. Although if you want, I think with certain members of this group, I can safely say that you're much better off being 100% valid than not.

Brushing your teeth is not a ranking factor but I still brush my teeth nonetheless!

Thank you for those wise words of wisdom. Now, do you have anything of value to add to the discussion? If not, sit back and watch. No wait, let's discuss the 44+ errors you're toting around on your site?

http://www.SEOConsultants.com/validation/history/onreact.com

Yes, that is what we should be doing. Reviewing all the errors everyone in this topic has. I see some really juicy ones too. For example, I notice Jill was quick to hop on this bandwagon. Not so fast young lady...

http://www.SEOConsultants.com/validation/history/highrankings.com

Now that you love Alan even more, have him explain to you the tag soup you've got at HighRankings.com. Have him explain how those 4 missing alt attributes are a sure sign of not knowing. Have him explain how those 46 multiple ID instances are having a dramatic impact on the elements they are assigned to. Who knows, there could be some really important Fragment IDs in there. After reviewing the really obvious ones, then we can do a line by line review of the 100+ other cascading errors. [Shakes head.]

But as far as SEO goes, if you are not considering the financial reality of having to help clients choose which things to tackle in what priority then the concept you put forth is irrelevant.

You keep falling back on this after-the-fact scenario. It is not, Alan. What financial reality? Dude, it may take about 3-4 hours to clean a site with 1,000+ errors. It just takes a coordinated effort and once done, you move on. Oh, you also monitor from that point forward. An extra task for the publishers. They have to make sure their document is valid prior to publishing. That shouldn't be too difficult if the environment they are working in is conducive to producing well formed valid markup.

I honestly think that SEOs in the ‘your website must be 100% W3C compliant’ camp are stunningly ignorant on how information retrieval works.

I honestly think that SEOs in the 'validation doesn't matter' camp are stunningly ignorant on how IR works. What about semantics? Since you mention document retrieval, is there a semantic analysis performed during the IR phase?

Search engine crawlers do NOT render a page – they retrieve the HTML code and parse it, but they never render it as a browser does. For SEO purposes the HTML code needs to be suitable for parsing, NOT for rendering in a browser.

A search engine crawler is a UA (User Agent) correct? A browser is a UA, correct? There really isn't much difference when it comes down to the bottom line, what's in the code. Both the crawler and the browser are expecting certain things. If those are not present, exceptions come into play, error recovery. I'd rather be proactive in this area and make sure that the UAs don't have to go through that error recovery, they shouldn't have to. It is my responsibility to make sure they don't.

Thus W3C compliance really isn’t an issue. The code just needs to be clean enough for a search engine to parse it and distinguish content, navigation, and style.

Okay, and how does one know if the code is clean enough to parse and distinguish? What, you just look at it in the browser and say everything looks fine? Or, are you a genius type and can view source and immediately determine that the 500+ errors present are no cause for concern?

I have a message for clients following this topic. Have your designer, developer, whomever is responsible for your markup put in writing that the markup errors present ARE NOT and NEVER WILL BE a cause for concern. Just be sure to document that now. You'll want that piece of paper at a later date when one of the Search Engines announces that compliant code is a benefit. Oh wait, Bing just did that recently!

Bing Recommends W3C Compliant Code For Better Indexing http://www.SERoundtable.com/archives/021773.html

I guess you're going to say; "who cares about Bing?" Ya, I saw that coming, save your breath. For most of you, there is only one UA - Googlebot. That is one intelligent bot too, nothing compares to that little devil! ;)



from Badams 1696 Days ago #
Votes: 1

What about semantics? Since you mention document retrieval, is there a semantic analysis performed during the IR phase?

Sure, but code with a couple of faults won't hinder an indexer's semantic analysis - most code is stripped from the content at this stage anyway.

A search engine crawler is a UA (User Agent) correct? A browser is a UA, correct? There really isn't much difference when it comes down to the bottom line, what's in the code.

Do you seriously believe a crawler behaves exactly like a browser? If so, you have bigger issues as an SEO than your rather stubborn stance on W3C compliance.

Okay, and how does one know if the code is clean enough to parse and distinguish? What, you just look at it in the browser and say everything looks fine? Or, are you a genius type and can view source and immediately determine that the 500+ errors present are no cause for concern?

Yes, actually, I am a genius, thanks for noticing. I'll send you a scan of my Mensa membership card if you so desire. And please stop using straw-man arguments. Where did I say that sites with 500+ errors are a good thing? If you want a serious debate, please stop resorting to logical fallacies.

Working towards compliance is good practice, but striving for 100% compliance is sheer idiocy. It's unnecessary and usually quite resource-intensive. Just make sure your code is mostly compliant and renders properly in all browsers, and you really don't have to worry about those last few handfuls of validation errors.



from pageoneresults 1696 Days ago #
Votes: 0

Sure, but code with a couple of faults won't hinder an indexer's semantic analysis - most code is stripped from the content at this stage anyway.

That would be incorrect. It would be all dependent on the severity of the faults we are referring to. When you say most code is stripped from content at this stage, please do expand. Which code is stripped and which is left for making heads or tails of the document content?

Do you seriously believe a crawler behaves exactly like a browser?

That's not what I said. Please do quote me correctly. I said they are both UAs. Those UAs also have a set of guidelines to follow; they too are published by the W3. I know you've read those since you seem prepared to do battle. Let us begin... :)

If so, you have bigger issues as an SEO than your rather stubborn stance on W3C compliance.

So my stance is stubborn because I'm one of the few who are vocal about the lack of professionalism and attention to detail in this industry? Okay, I'm a stubborn ole schmuck! I like that title. :)

Yes, actually, I am a genius, thanks for noticing. I'll send you a scan of my Mensa membership card if you so desire.

No need to, I'm convinced.

And please stop using straw-man arguments. Where did I say that sites with 500+ errors are a good thing? If you want a serious debate, please stop resorting to logical fallacies.

I will if you will. "Sure, but code with a couple of faults won't hinder an indexer's semantic analysis."

Working towards compliance is good practice, but striving for 100% compliance is sheer idiocy.

Wow! You see, that's the type of mentality I'm up against. "Sheer idiocy."

It's unnecessary and usually quite resource-intensive.

Really? Why is it resource intensive? Oh wait, I know why. Because everyone involved in the process to date has been doing it wrong from the beginning and now they have to learn how to do it right. Ya, that can be resource intensive. Definitely not worth the ROI if you ask me. ;)

Just make sure your code is mostly compliant and renders properly in all browsers, and you really don't have to worry about those last few handfuls of validation errors.

A bit of an oxymoron based on this discussion, don't you think? You're suggesting that folks make sure their code is compliant, yet their knowledge of that code is minimal, basic at best. It doesn't add up. They'll run a validation routine, see 100+ errors and run. They know that it works in their browser and that other browser they tested in. Good job!

For all of you following along, do me a favor please? Take your document and run it through this tool from the W3...

http://www.W3.org/2003/12/semantic-extractor.html

Here's what I predict happening. Many of you will most likely receive an error. Why? Well, that tool is rather strict and follows specific protocols for semantic analysis, similar to those used by the developers of User Agents, which cover Browsers, Crawlers, etc.

"This tool, geared by an XSLT stylesheet, tries to extract some information from a HTML semantic rich document. It only uses information available through a good usage of the semantics defined in HTML."

It's very difficult to extract information if the semantics defined in the HTML are broken or otherwise INVALID, wouldn't you say so?



from pageoneresults 1696 Days ago #
Votes: 0

Using bad language, comment arrested. Proceed to the story.

That's 2 for 2. I'm guessing I'm on some sort of pre-mod status? Okay, I can take a hint.

First arrest because of the word crap? What did I get arrested for this time? I bet it was the word stripped?

You would think a paid membership would allow me to say words like crap and stripped when referring to markup. Speaking of which, I don't see many here with paid memberships. What, you don't believe in giving back to that which you use on a regular basis for promotion? Something isn't right there. ;)



Administrator
from Michelle 1696 Days ago #
Votes: 1

@pageoneresults - no pre-moderating happening - just getting trapped in filters. I've released the comment that was caught, and saw the others that were first caught, but with revisions, you were able to post. so all of your comments on this topic are live - if you find comments getting trapped again, feel free to ping or dm me on twitter, or drop a note to mods@sphinn or admin@sphinn.com and we'll look into it straight away :)



from NickWilsdon 1696 Days ago #
Votes: 1

Hi Edward

Yep, nothing personal, but it seems you tripped the bad word filter. Released those comments now.



from AlanBleiweiss 1696 Days ago #
Votes: 2

Edward,

In your latest rant, you go on to say

I've watched you shift course from the first moment I started responding. You've gone from NOT caring about validation to now caring about it for those items which may affect SEO. You're either for it or you're not, which is it. You can't flip in the middle of this discussion and expect anyone to take you seriously.

Excuse me Edward.  This ENTIRE debate sprang up from my original tweet:

W3C compliance is NOT an SEO factor to Google #MattCuttsQuote #SMX

And my article yesterday morning, before all of the other nonsense you prompted, is filled with references to 100% compliance as being the bellwether difference between your camp and the rest of the SEO community.  So I'm not quite sure why you think I've never cared about any compliance at all.

But for whatever the reason, let's go over this one last time.  I care whether a site's got functional markup.  But only to the degree that the search engines need it to properly index and rank a site.  It just so happens that having some things within a site's code structure be properly functional ALSO means that site is at least partially compliant.

For the record, I am quite capable of cherry picking markup issues to the degree that I need to for the purpose of auditing sites for my clients.  Here's why:

I say that I am not a developer because that is not what I do for a living.  I don't code sites all day.  I HAPPEN to have, over the course of the past fifteen years, learned enough programming to have built, from the ground up, about 50 web sites.  Some small, one a fully functional ecommerce site that to this day generates about a million dollars a year for the site owner.  But I do NOT claim to be a developer because my skills in that arena are based on part-time, occasional learning and use.  So my personal programming ability comes more as a hobby than a profession.

So once again your bashing me is shown to be baseless and lacking in factual understanding of who I am, what my true background is or what my skills are.  And thus your claims about me are so off the mark as to be laughable.


At one point you stated:

The Developers will soon be filling your positions, you're really not required anymore. SEO at this level is dead and has been for quite some time.

If you think developers, who quite often have no grasp of marketing, corporate politics, or corporate budgetary constraints, will actually be able to rule this industry some day, good luck with that.

In addition to the bit of programming I've done, I've been in business management for more than twenty years, across multiple industries.  So I actually comprehend the bigger picture about this.

Here's one of my favorites you wrote:

No, I have a serious disdain for the majority of the SEO industry who talk the talk but don't walk the walk. It's an old school cliche but it surely applies to our industry.

Don't walk WHAT walk? Your walk? Edward, your walk is not one most in the industry walk, simply because your path is too radically off focus when held up against the broader business reality of needing to help our clients achieve their goals. And if you think the likes of Greg Boser, Jill Whalen, Aaron Wall and myself don't walk the walk of true SEO professionals, that's not our problem.

We do quite well in the industry delivering the real goods to real clients regardless of whether those sites meet YOUR criteria for optimization and regardless of whether our methods meet YOUR criteria for responsible delivery of the services we're hired for.

And one last thing, Edward:

Why is it that your very own directory discriminates against at least some segment of society that just may be visually impaired? Or that may have security constraints not allowing them to have their browser implement scripts?

I'm referring to the fact that your site has a "noscript" feature that spits out a blank GIF file (no.gif is the file name). And in that "noscript" code, that file is the ONLY thing that gets displayed. AND, oh look, there's a blank ALT attribute with it.

Now don't go further ripping my site apart as a counter to that.  We already know my sites are rife with problems.  YOU, however, sir, are the one who has been throwing some pretty hefty stones about this subject.  So I'm just curious.

Of course, if you'd like, you're free to fix that minor problem on your site.  And once you do, feel free to come back and bash us some more.  I have several other issues with your own work to point out if needed to further show you have no right to bash others in this industry for the very things you fail on.




Avatar
from pageoneresults 1696 Days ago #
Votes: 0

I care whether a site's got functional markup. But only to the degree that the search engines need it to properly index and rank a site. It just so happens that having some things within a site's code structure be properly functional ALSO means that site is at least partially compliant.

Alan, it's finally nice to see you come around. If I can get you to at least say partially, I've achieved my initial goals.

I don't code sites all day.

Neither do I, but I do find myself traversing that code quite frequently.

So once again your bashing me is shown to be baseless and lacking in factual understanding of who I am, what my true background is or what my skills are. And thus your claims about me are so off the mark as to be laughable.

Alan? Loosen your collar, let some steam out, and take a deep breath. Now, have a sip of this Kool-Aid and relax. ;)

You are correct that I am unfamiliar with your background, and I'm not sure why, either, since I've been somewhat active in the community for a few years. No one is questioning your credentials. What I am questioning is the argument that "W3C compliance is NOT an SEO factor to Google". We know that is not exactly what Matt said, and it is one of the reasons we are having this discussion now. Our industry tends to misquote Matt quite a bit. If there is something in his words that can be used to justify something, they will be reconstituted to do just that, e.g. "W3C compliance is NOT an SEO factor to Google", to which I responded with...

http://www.SEOConsultants.com/Google/Validation

If you think developers, who quite often have no grasp of marketing, corporate politics, or corporate budgetary financial constraints will actually be able to rule this industry some day, good luck with that.

I do believe we have some job titles getting mixed up in the process here. We were discussing the impact that W3C compliance may have on SEO. We're discussing on-page SEO, not some corporate mumbo jumbo, okay? Can we keep this topic on target? No more credential rolls, etc.? I mean, with me being an 8th grade dropout and everything, I'm starting to get a complex. :|

Don't walk WHAT walk? Your walk? Edward, your walk is not one most in the industry walk simply because your path has been determined to be too radically off focus when held up against the light of needing to help our clients achieve their goals from that broader business reality.

Whew! Now I'm really wishing I had some credentials to roll with. Let me just send you back to the instructions that were in the box with the website.

And if you think the likes of Greg Boser, Jill Whalen, Aaron Wall and myself don't walk the walk of true SEO professionals, not our problem.

Ooops, did you just mention Greg Boser? Let me ping him and see what his thoughts are. As soon as I post this, I'll shoot him a Tweet and we'll see if I can drag his old arse over here. We'll share a few stories with you. ;)

I'd be real careful about dragging anyone else's name into this topic; they don't want anything to do with it, I know.

Why is it that your very own directory discriminates against at least some segment of society that just may be visually impaired? Or that may have security constraints not allowing them to have their browser implement scripts? I'm referring to the fact that your site has a "noscript" feature that spits out a blank GIF file - no.gif is the file name. And in that "noscript" code, that file is the ONLY thing that gets displayed. AND oh look - there's a blank ALT attribute with that.

Holy crap! How did that get there? It must be there for a reason; I wonder why? Any ideas, Alan? Did you dig any further? Did you take a look at our .htaccess file? Did you notice that the image has a null alt attribute, since it carries no meaning? There are over 1,000 valid documents there; you're going to have to come up with something much more worthy than that. And be sure to think about .htaccess before opining, okay? ;)
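For readers following along: the distinction being argued here — a missing alt attribute versus an intentionally empty one — can be checked mechanically. The sketch below uses only Python's standard library; the markup snippet is a hypothetical reconstruction from the thread, not the actual site source. An empty alt="" is valid for purely decorative images, while a missing alt attribute is an accessibility and validation problem.

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Distinguish images with a *missing* alt attribute (an
    accessibility and validation problem) from images with an
    intentionally empty alt="" (valid for decorative images)."""
    def __init__(self):
        super().__init__()
        self.missing = []      # img tags with no alt attribute at all
        self.decorative = []   # img tags with an explicit empty alt

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        src = attrs.get("src", "?")
        if "alt" not in attrs:
            self.missing.append(src)
        elif attrs["alt"] == "":
            self.decorative.append(src)

# Hypothetical reconstruction of the markup under discussion:
snippet = '<noscript><img src="no.gif" alt=""></noscript>'
auditor = AltAuditor()
auditor.feed(snippet)
print(auditor.missing)      # []
print(auditor.decorative)   # ['no.gif']
```

The empty alt lands in the "decorative" bucket rather than the "missing" one, which is exactly the point being made above.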

Now don't go further ripping my site apart as a counter to that. We already know my sites are rife with problems. YOU, however, sir, are the one who has been throwing some pretty hefty stones about this subject. So I'm just curious.

Pebbles actually, I haven't reached into my bag of stones yet. Then come the really BIG rocks. It will continue.

Of course, if you'd like, you're free to fix that minor problem on your site.

Nothing to fix. Strike 1. Now waste some more of your valuable time trying to find any major flaws. And remember, we're discussing well-formed, valid markup and its potential impact on one's SEO initiatives.

I have several other issues with your own work to point out if needed to further show you have no right to bash others in this industry for the very things you fail on.

OOH-RAH! Bring it on sir. Do you think I'd come out of the gate not prepared for this type of stuff? Let's do it! My fingers are trembling and I have this REALLY BIG SMILE on my face. I'm excited!



Avatar
from AlanBleiweiss 1696 Days ago #
Votes: 1

Ed,

Thanks for setting me straight on the noscript issue. I was mistaken, and I have no problem admitting that. Clearly, at the moment I saw that, I failed to properly consider the depth of reasoning behind it, and that was a result of ego. So I apologize for that one. And I do appreciate the dialogue. Even if you've been having it for years with others in the industry, I think the main issue here is important enough to warrant this latest go-round.



Avatar Moderator
from Sebastian 1696 Days ago #
Votes: 3

A few comments more and this thread will outrank the W3C for code validation, and Al-Qaeda for fundamentalism as well. ;-)

More seriously, decades of experience taught me that complex things tend to come with more than two sides.

Validation in itself is just black and white, coz either a page validates or it doesn't. A validator's output is either true|false, if you count a crash caused by inputting way too much garbage as false.

Consensus is that white is good.

Standard compliant code in the context of SEO on the other hand deals with a boatload more variables, both on the input as well as on the output side of things. That is, we're dealing with 256 or more shades of gray in between of black and white.

That's the point where, usually, pragmatism pays off.

Say a client's root index page produces 1k validation errors, and various templates, scripts, includes, subroutines and whatnot in lower levels are good for a gazillion more. How do we deal with that?

Every professional SEO review includes code validation. So you tell the client that fixing those bugs is a prerequisite for any SEO task. Like, will a dandified manager visit a brothel located in an unfindable cave near Tora Bora, where the hookers are 102 years old and won't lift their burka? You can't make much money from the few weirdos with a fetish like that out there on the Interwebs, so where's the point in generating wooer traffic?

If, and that's a bold IF, the client refuses to take --and implement-- this advice, do you do a runner? Nope. First, you aren't a coward, and second, you can't afford it. I say that's a big IF because, from years and years of training developers, and from talking to decision makers in companies of all sizes who're open to common sense, I do know that if you can resist offending the client with geek speech and a know-it-all manner, you will make your point.

You'll even create a budget out of nowhere when you not only avoid pissing off anyone involved, but sell code sanitizing not as a bug fix but as a feature, erm, added value, for next to nothing. If necessary, underprice it, or even offer it for free. You'll get your profit later on. You're acquiring a long-lasting, maybe everlasting, relationship.

Of course this approach can fail every once in a while. Write it off then. You can't be happy with a client from hell.

Or you proceed, warning the client that SEOing a piece of shit is worth exactly that. Just collect your fees upfront then.
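As a concrete illustration of that triage step, here's a rough stdlib-Python sketch of the kind of cheap pre-check you can run on a handful of pages before deciding whether a full validation pass is worth the budget fight. It is emphatically not a W3C validator; it only flags unclosed and stray tags, the sort of structural breakage most likely to confuse a parser:

```python
from html.parser import HTMLParser

# Elements that never take a closing tag in HTML.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    """Crude triage tool: flags unclosed and stray tags only.
    Nowhere near full W3C validation, but cheap enough to run
    on a sample of pages before any budget discussion."""
    def __init__(self):
        super().__init__()
        self.stack = []
        self.errors = []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_startendtag(self, tag, attrs):
        pass  # self-closing syntax such as <br/>: nothing to track

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Pop implicitly closed tags, flagging each one.
            while self.stack[-1] != tag:
                self.errors.append(f"unclosed <{self.stack.pop()}>")
            self.stack.pop()
        else:
            self.errors.append(f"stray </{tag}>")

checker = TagBalanceChecker()
checker.feed("<div><p>Hello <b>world</p></div>")  # <b> never closed
checker.close()
print(checker.errors)   # ['unclosed <b>']
```

If a page comes back with zero findings from even a crude check like this, a thousand pedantic validator warnings are probably not what's blocking the crawl.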


As for crawlers vs. browsers ... there are a lot of ignorant assumptions and guesses findable in this thread. Not addressing anyone in particular, I'd suggest that the concept of treating crawlers like browsers is not a bad idea. Some crawlers are way more sophisticated than any browser out there. And exactly those crawlers can ruin your most important rankings if you annoy them long enough. And yes, that goes for client-side coding, even Ajax'ed stuff, too.



Avatar
from AlanBleiweiss 1696 Days ago #
Votes: 1

I really do think the biggest discrepancy in all this discussion lies in how much validation checking we perform, and what's the "right" amount.

If a test of a few pages shows only a handful of errors, and none of those are errors that I believe will harm the crawlability, that's all I do. Because I really do have bigger fish to fry.

Of course, if a site is not being indexed properly for primary site pages, I definitely dig into the cause. For me however, those situations tend to be that the site was built in Flash, or they had a flawed robots.txt file, or fundamental content type SEO flaws. And I've more than several times, had to inform clients that they need to rebuild from the ground up.

I do not, nor will I EVER run W3C validation on every page of most client sites, since I typically deal with sites that have thousands of pages these days.

Is it possible that this is perceived as a disservice by some?  Yes, it quite well could be, if it turns out that I've missed some glaring and complex issues deep in the site.

Yet part of my process also includes examining how many pages on the site are actually indexed compared to how many exist in reality. If it turns out that many pages aren't, I will check validation on a few of those, but again, only a few. Because inevitably, when the pages I've checked turn out not to have serious crawl-blocking errors, I've found one or more other SEO-oriented issues instead (such as duplicate content, link problems, etc.).

And this approach has never failed to help client site indexing and ranking.

Since my audit/action plan documents are filled with caveats and disclaimers, I definitely cover the bases as far as making sure the client understands that future audit work may be required after 1st phase recommendations are completed and deployed.

Except that, to this day, I've never had a client situation where it's been necessary to get to that point. Total number of pages indexed goes up. SERP positions rise. Traffic increases. Conversions increase.

And all while the client still has countless other actionable items to take on.




Avatar Moderator
from Sebastian 1696 Days ago #
Votes: 1

Alan, you could just have said that pragmatism pays off. ;-)

I don't do code validation of thousands of pages either. It's easier to identify the scripts. Validating one page per script is enough.
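That one-page-per-template approach can be sketched roughly like so. The grouping heuristic here (first path segment identifies the template) is purely an assumption for illustration; on a real site you'd key off whatever actually distinguishes the scripts or templates:

```python
import random
from urllib.parse import urlparse

def sample_by_template(urls, seed=0):
    """Pick one representative URL per page template for validation.
    Heuristic assumption: the first path segment identifies the
    template (/product/..., /category/...); real sites may need a
    smarter grouping rule."""
    groups = {}
    for url in urls:
        path = urlparse(url).path.strip("/")
        template = path.split("/")[0] or "home"
        groups.setdefault(template, []).append(url)
    rng = random.Random(seed)  # seeded so the sample is repeatable
    return sorted(rng.choice(candidates) for candidates in groups.values())

urls = [
    "https://example.com/",
    "https://example.com/product/red-widget",
    "https://example.com/product/blue-widget",
    "https://example.com/category/widgets",
]
print(sample_by_template(urls))  # one URL per template, three in total
```

Validate the three sampled pages and you've effectively covered every page those templates generate, which is the whole point of the shortcut.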




Avatar
from DavidBlizzard 1696 Days ago #
Votes: 1

Sebastian, we have actually turned down some optimization jobs because the tag soup was too much to bear and they didn't have the budget to fix it. Most of the time we do just as you say, and down the road we get a site redesign because we agreed to wade through the messy code. Make them happy with what they want, gain confidence, and then get the budget to build what you really wanted in the first place. This practice is in line with what many SEOs who don't build sites do. They don't want a site design, so why bother selling one? I'm sure that makes Edward cringe, and I understand his point of view: at some point someone needs to be held accountable, and there is SEO value. With us, we hope we get the opportunity to clean it up later if we can't sell it up front. To sell it up front we have to explain semantics, future-proofing, and, most important to us, that it's easier to work with (because we understand it and did not give up on education and the new standards).

I read the comments on the 1 year old version of this topic and was surprised to see statements like "I understand basic HTML". I'm guessing since then many SEOs have started realizing a new dawn is ahead and they might have to delve into semantic markup.

I think some of the SEOs who don't keep up with new standards are going to find that their ability to pay the bills is failing in the next few years. Those who don't know what I'm talking about should just search for HTML5 and SEO. The future is bright.

Props to Edward (PageOneResults); because of him we are beefing up our efforts on all new jobs. He has opened my eyes. Hint: microformats.



Avatar
from AlanBleiweiss 1696 Days ago #
Votes: 1

Sebastian

If I had kept it that short, it would have compromised my diatribe reputation. What good would that be? I take my online diatribe reputation seriously.


Thanks for mentioning the common script aspect. Given that most of the sites I deal with have templates that pull from common header, footer and other element-specific includes, that's one of the things that makes me confident in my methods.


David,

You bring up a good point about the sequence of events and initially taking the path of least resistance, with the mindset that there's going to be opportunity down the road to revisit things that initially go untouched. It's a valid business model in a realistic world.



Avatar
from NateSchubert 1696 Days ago #
Votes: 1

You know, I'm not a pure SEO. I'm an Ecommerce Marketing Manager who spends virtually all of my time doing the work of the employees I had to let go in September after our worst year ever. Answering phones, emailing price quotes, blogging, optimizing, paid search-ing, everything.

I absolutely take the time to make sure my website is as compliant as I can make it because it makes sense that when you undertake a task, you do it correctly, thoroughly, and you don't take short cuts.

With specific regard to search engines: if they're not taking the cleanliness of your code into account now, they will. Do search algorithms get less complex as the internet continues to grow in size? I don't think so. How many of us will be surprised when Google adds a website's level of compliance as another factor in ranking? I won't.

I generally feel like a little fish in an ocean filled with sharks, in that you've all been at this a long time, have more time to devote to this facet of the internet, and you're successful. I'm just starting out. For once I'm actually happy to say that, because I don't have the problem of relearning my skills to include being compliant.

It's threads like these that bring new members. It sure did for me! Great words from everyone, great insights.



Avatar
from AlanBleiweiss 1696 Days ago #
Votes: 2

Nate,


Welcome! I think you're wise to do the very best you can personally to ensure your own code is compliant. Long ago, I was managing full-on web development projects with a team of developers, and learned the value of clean code compliant with accepted standards, if for no other reason than the pain that doing otherwise would induce in future changes, whether from the same developer(s) or from a new team that had to come in.


I only wish it were that simple, cut and dried in the scenarios I deal with.



Avatar
from IncrediBILL 1695 Days ago #
Votes: 0

Search engine crawlers do NOT render a page – they retrieve the HTML code and parse it, but they never render it as a browser does.

That may be true but they do try to identify recurring data elements such as headers, footers and menus that repeat page after page in order to separate the page layout from the actual content.

Therefore, broken code, such as malformed tables, frames and divs, that may even display properly in MSIE quirks mode, can theoretically make an SE belch trying to properly interpret the page.

However, the obsessive compulsion to squeeze everyone through the W3C's validator is nonsense.

Visitors don't care, they don't look at your source and go "Holy Cow! Look at this garbage code!" and bail from your site.

The SEs just keep sending traffic.

When the SEs stop sending traffic to sites that don't validate, then, and only then, will everyone have a legit reason to get their sites into compliance.

Last but not least, a lot of the 3rd-party plug-in garbage doesn't even validate, so your site could be squeaky clean except for the analytics code embedded in your page.
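For the curious, the "separate the layout from the content" idea Bill describes up-thread can be illustrated with a toy example: text blocks that repeat on every page are treated as template furniture, and whatever's left is the unique content. Real engines use far more sophisticated, DOM-aware methods; this only conveys the intuition:

```python
def split_boilerplate(pages):
    """Toy illustration of template detection: text blocks repeated on
    every page are treated as layout (header, footer, menu), the rest
    as unique content. Real engines use far more sophisticated,
    DOM-aware methods."""
    blocks = [set(p.splitlines()) for p in pages]
    layout = set.intersection(*blocks)
    content = [sorted(b - layout) for b in blocks]
    return content, sorted(layout)

page_a = "ACME logo\nHome | About\nRed widgets rock\nCopyright ACME"
page_b = "ACME logo\nHome | About\nBlue widgets rule\nCopyright ACME"
content, layout = split_boilerplate([page_a, page_b])
print(layout)    # lines shared by every page: the template
print(content)   # what's left: the per-page content
```

Which is why markup so broken that it scrambles this kind of segmentation is a real risk, even if no human visitor ever notices.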







Avatar
from AlanBleiweiss 1695 Days ago #
Votes: 0

Bill - that's a good one -


a lot of the 3rd party plug-in garbage doesn't even validate so your site could be squeaky clean except for the analytics code embedded in your page.


Quite often when performing speed tests, the single slowest element is Google's own Analytics code.  And more than a few times, I've found conflicts on client sites caused by that same code.  In one case we had to dump GA altogether as the only way to resolve a scripting conflict.



Avatar Moderator
from Sebastian 1694 Days ago #
Votes: 0

Too old to sphinn it, but really worth a read: Web site code optimisation - Does valid code matter?



Avatar
from billmarshall 1694 Days ago #
Votes: 0

Was reading through this as I also read the article pointed to, and saw that Sebastian's link seems to be broken. I believe he was trying to point to my old post at Web site code optimisation. Thanks Sebastian



Avatar
from AlanBleiweiss 1694 Days ago #
Votes: 0

Bill - that is a good article. "Elegant code" - when I was early in my career, overseeing a twelve-person web team, we had two senior developers handling the biggest client projects.


They used to battle constantly about how something should be written. When I got curious one day about how to write code to perform a specific function (me NOT being a developer, having only previously worked with BASIC 15 years before that, and with HTML as far as web development goes), I couldn't even comprehend what one of them tried to tell me. It felt so convoluted in my brain as to be arcane.


Then the 2nd developer sat me down.  And before he ever showed me how to go about it, he gave me a lesson on elegant code.

And from there it was like magic.  Things made so much more sense, and I was able to follow along fairly effortlessly.




Avatar
from billmarshall 1693 Days ago #
Votes: 0

Thanks Alan. Elegant code is so much easier for everyone concerned once it's in place and also so much easier to debug for cross-browser compatibility. Now if only I could write PHP as elegantly as HTML ;-)

Working with programmers and trying to get them to output elegant html from their coding can be a trying process initially but once you manage it things get a lot simpler.


