
Purpose of standards, is there an ulterior motive?


1DMF

Programmer
I've been taking part in a forum discussion relating to Matt Cutts from Google announcing that G might be adding code validation to Google WMT.

It started the usual debate over valid code having nothing to do with SERPs: should or shouldn't G do this, and are they already doing it when crawling a site?

I made the point that valid code and SERPs have to go hand in hand: you won't get far in Google without a TITLE tag, and your page won't validate without one either, so to me there seems, at a basic level, to be a correlation between valid code and SERPs. But I got the following response to that train of thought...
Of course I accept that the title tag is important! The reason that not having one is a massive problem is not that the W3C's validator doesn't think you're a good coder; it's that you have left out the best way of quickly telling a search engine (And the user!) what the page is about.

So the question is, as per the subject: are some of the standards based purely on search engine requirements?

Is the only reason the title tag is required that Google likes it, and not that it is part of semantics, accessibility and cross-browser compatibility?

Can this statement really be true: that some semantics / standards posed as the right thing to do are a mere smoke screen for helping multi-billion-dollar companies such as Google or Yahoo?

But then again, if so, that also proves a correlation between standards / validation and SEs, doesn't it?

Does the W3C even care whether you get into an SE or not?

Not all code is for the public domain; it could be a members-only or intranet site, etc. Surely the W3C just cares whether your code is valid, not whether it gets high SERPs.

Other people's opinions on this are much appreciated.

The full forum thread can be viewed here...
"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
So the question is, as per the subject: are some of the standards based purely on search engine requirements?

IMHO, it's the other way around... I just think that the response you got was badly worded.

Dan





Coedit Limited - Delivering standards compliant, accessible web solutions

[tt]Dan's Page [blue]@[/blue] Code Couch
[/tt]
 
lol - thanks Dan, not as bad as the response I got after that!

Either way, there is an indisputable link between standards and SEs / SERPs.

Whether the standards pave the way and the SEs then use certain tags, or certain tags are required for validation by the standards because of the way SEs use them.

So I stand by my statement:
I can 100% guarantee you that invalid code will drastically affect your Google ranking

Not having a title tag is invalid markup and will also affect your Google ranking; how is that not an accurate statement?

Oh well, onwards and upwards; I've got a shedload of copy text to write. Cheers, Dan.

"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
I agree with Dan that it's the other way around. There isn't a conspiracy by the big search engine companies to dictate standards - I believe they are using the fact that there are standards to assist in their own parsing of pages (as common sense would dictate).

The standards are there to give the browser developers a known "baseline" of expected behaviour (otherwise each browser would offer a completely different experience from the next). And they provide a straightforward avenue for web page developers to guarantee their code renders the same across browsers.

As we have all seen, if a page doesn't conform to a doctype, then browsers will handle its rendering differently. If a page does conform to a doctype, then it will render the same across browsers (OK, that's more an ideal - but you get the idea).
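
For example, even a bare-bones skeleton like the one below (XHTML 1.0 Strict here purely as an illustration - pick whichever doctype suits your project) gives every browser the same starting point to render from:

[tt]
<!-- illustrative skeleton only - the doctype, charset and text are placeholders -->
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN"
  "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8" />
  <title>What this page is about</title>
</head>
<body>
  <p>Content goes here.</p>
</body>
</html>
[/tt]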

So if you throw any old crap on a page you can expect your page to appear differently from browser to browser... which makes it less useful to the end user (remember it's the end user that the thing is actually designed for - not the searchbot).

The search engines may decide that "less useful to the end user" means that they are not all that interested in the data - it's lower quality. They may decide that validating pages are "more useful to the end user" and therefore place more weight on that particular page.

I think it's a good thing because it will encourage more quality web pages - which means more work for me... and less cowboy SEO hackers making a mess of things.

I like Matt's blog... it's a good read.

Cheers,
Jeff

[tt]Jeff's Blog [!]@[/!] CodeRambler
[/tt]

Make sure your web page and css validates properly against the doctype you have chosen - before you attempt to debug a problem!

FAQ216-6094
 
Thanks Jeff,

I was getting a bit confused and worried that my understanding, gained mainly from you guys, was being shredded, and it upset me - lol

Anything that makes pages cross-compatible and easier to read by anything, be it a bot, screen reader or human, has to be a good thing, right?

You guys have taught me that semantic / standards-compliant code can help with ranking. The other forum has differing views, but with Google making this announcement, my money is with you guys.

It seems even Google thinks it makes a difference, otherwise why would they bother? I'd only use their validator to check G was happy with my page to ensure correct indexing, but if the change they wanted conflicted with the W3C (I know, unlikely), then the W3C would win in my book.

That makes me think of a few other things, actually. Google announce they might bring out a few more tags to help with SERPs; if the W3C doesn't adopt them, they become invalid markup, don't they?

Also, I noticed that with Google Website Optimizer, to implement the sections you have to add </noscript> end tags without starting tags, so again I see a conflict between standards and Google functionality.
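
From memory the section markers look something like this (paraphrased, so don't quote me on the exact function name), which is obviously never going to validate:

[tt]
<!-- roughly what the Website Optimizer section markers look like - paraphrased from memory -->
<script>utmx_section("Headline")</script>
  <h1>Original headline</h1>
</noscript>  <!-- a closing tag with no opening <noscript> = invalid markup -->
[/tt]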

Hmm, a hard one: screw the W3C and get ROI, or screw G and have, well, no money but a valid page!

Looks like the old "can't have your cake and eat it" syndrome.


"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
Web standards have absolutely nothing to do with search engine visibility.

As Dan said, you code to standards so you know your pages are going to render as you expect them to.

What search engines do is entirely up to them.

<honk>*:O)</honk>

Tyres: Mine's a pint of the black stuff.
Mike: You can't drink a pint of Bovril.
 
Come on Foamy, surely you can accept that having invalid code can affect your SERPs.

I'm not saying one is guided by the other, I'm saying one can affect the other.

And I guess Matt's example of a headless page shows that the code will render the same in every browser (I checked IE6/7, FF, Safari & Opera).

But it won't index well and it is invalid.



"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
Leaving out a title tag will certainly affect your SERPs, but this has nothing whatsoever to do with web standards adherence and everything to do with the title tag being a highly relevant piece of content.

Not closing a paragraph will mean your page fails to validate, but it will make no difference to your SERPs.

The title tag existed long before web standards came along. The standard says we should use it, and it just so happens that SE algorithms put an amount of emphasis on what the tag contains. Nothing more than that.
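
To put both points in one contrived fragment:

[tt]
<!-- contrived fragment: the <title> text feeds the ranking algorithm,
     while the unclosed first <p> only bothers the validator -->
<title>Blue widgets - hand made blue widgets, free UK delivery</title>

<p>Our blue widgets are hand made in the UK.
<p>Free delivery on orders over £20.</p>
[/tt]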

I don't think we should confuse web standards with search engine visibility and search engine optimisation.

If search engines decide to use web standards as a yardstick then great. But it does not follow that one must code to standards to achieve good SERPs, and I doubt it ever will, nor should it.

<honk>*:O)</honk>

Tyres: Mine's a pint of the black stuff.
Mike: You can't drink a pint of Bovril.
 
We can only surmise what an SE's bot does with a page that is tag soup and gobbledygook. Missing end tags COULD play a massive part, and to me it seems logical that they might, especially with the implications for what is relevant to SEO and SERPs.

A missing end </p> tag, maybe not, but a missing end </a> anchor tag? Surely that has implications.

A missing end </script> tag, again I can see how this COULD affect SEs.
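
e.g. made-up snippets of the kind of breakage I mean:

[tt]
<a href="/widgets.html">Cheap widgets
<!-- closing anchor tag missing, so a parser may fold the next paragraph into the link text -->
<p>Hand made in the UK, free delivery.</p>

<script type="text/javascript">
var trackingId = 1;
// closing script tag missing, so everything after this point may be read as script, not page text
[/tt]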

But enough said; I think I'll go crawl back under my rock on this one. I seem to be the only one who sees a correlation between the two, however that link is made.

[worm]



"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
I believe that invalid code can affect your SERPs in the sense that SERP rankings are partially defined by hits on a specific page.

If you have invalid code, your page can load more slowly and cause fewer people to visit your page regularly, thereby dropping your ranking.

I'm not sure how widespread this is, but I've found that invalid (non-standards-compliant) code loads more slowly than valid code.

[monkey][snake] <.
 
I will fully back up Foamcow on this one. 1DMF, you're really reading too much into this. Of course omitting the title tag is invalid and will hinder your SE rating, but not quoting attributes is invalid as well and it will not affect the SE at all. Similarly, a hidden paragraph of keywords is perfectly valid, but it shoots down your SE rating considerably.
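
For instance (contrived snippets): the first line won't validate (as XHTML at least) yet the SE couldn't care less, while the second block validates fine and is exactly the sort of thing that hurts you:

[tt]
<!-- unquoted attributes: invalid markup, harmless to the SE -->
<img src=logo.gif alt=logo width=100 height=50>

<!-- hidden keyword stuffing: perfectly valid markup, and exactly what gets a page penalised -->
<div style="display: none;">
  cheap widgets blue widgets widget sale widgets widgets widgets
</div>
[/tt]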

I know that in a way you can connect the two. But similarly I can connect building highways and selling Ferraris (hey, the latter do perform better on a highway than on a mountain road, right?), the clothing industry and us actually wearing something, and so on, and even having tattoos and smelling bad in the armpits (mind you, a test at Sparknotes proves that to be true). They all apply, but one should not correlate them, as the logical link between them is too weak.

___________________________________________________________
[small]Do something about world cancer today: PACT[/small]
 
True, Vragabond; I don't disagree that there are other ways of writing invalid code, nor that there are valid-code ways of destroying your SERPs.

The reason I even use the title tag as an example is that before I started down the route of semantic / valid code writing, loads of my pages never had a title. The page worked, and I didn't care that it only said 'Microsoft Internet Explorer' in the top bar; the bar was of no consequence to me, and I hadn't even heard of Google, let alone SEO.

I taught myself with no training, no book, no school, just AltaVista and Notepad. I copied someone's code, pulled it apart and worked out how to make a table, then put some text in, etc., etc.

It's easily done when you're young and starting out, teaching yourself as you go along, and anything that can stop people making simple mistakes like I did, and clean up the crap code on the net, has to be a good thing.

It's not me giving too much weight to the argument; it is everyone else, applying their current good-practice knowledge, who is failing to see the importance and relevance of the argument.

I speak from FACTUAL experience, so I know for a FACT I'm right!

It was mere coincidence that I learnt the importance of the title tag: when I started to use them, I started to appear in SEs, and then the penny dropped!

If only I had known sooner.

But I know that we don't build highways (roads) for Ferraris.

The Romans invented roads before cars even existed; I thought everyone knew that ;-)







"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
monksnake said:
If you have invalid code, your page can load more slowly and cause fewer people to visit your page regularly, thereby dropping your ranking.

You are mixing up loading with rendering.
SE indexers do NOT render the code, so the time it takes to appear in a browser will make absolutely no difference.

The number of visitors to a page has absolutely NO bearing on ranking.

Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
The number of visitors to a page has absolutely NO bearing on ranking.
lol - I'm going to have to disagree with you on that one, Chris.

More visitors to your site means your site is popular; popular sites tend to get linked to more; more backlinks mean higher rankings.

You all tell me on the HR SEO forum...

1. content is king
2. you'll get more visitors with great content
3. people will value your content and it becomes popular
4. valuable, popular content is linked to more
5. thus giving you the quality backlinks required to achieve higher rankings.

Then you make a comment that visitors have absolutely NO bearing on ranking; you can't have it both ways!

"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
1dmf said:
So the question is, as per the subject: are some of the standards based purely on search engine requirements?
No.

1dmf said:
Is the only reason the title tag is required that Google likes it, and not that it is part of semantics, accessibility and cross-browser compatibility?
The title tag is used by all kinds of document indexers, so the interested parties who make up the W3C's working committee decided it was a required element.

1dmf said:
Can this statement really be true: that some semantics / standards posed as the right thing to do are a mere smoke screen for helping multi-billion-dollar companies such as Google or Yahoo?
The W3C working committee may have SE representatives on it, but the standards have NOTHING to do with what SEs want or need.

1dmf said:
But then again, if so, that also proves a correlation between standards / validation and SEs, doesn't it?
Nope.

1dmf said:
Does the W3C even care whether you get into an SE or not?
Nope.

1dmf said:
Not all code is for the public domain; it could be a members-only or intranet site, etc. Surely the W3C just cares whether your code is valid, not whether it gets high SERPs.
Nope.


Valid code != SERPs
SERPs != Valid code


It wouldn't matter IN THE SLIGHTEST to the SEs if you didn't even validate to HTML 3.2.

Invalid, broken code is one thing; code that simply doesn't validate is not a barrier to search engine ranking.

There are thousands upon thousands of pages out there that don't have a doctype, don't have a character set, and have multiple <head> & <body> tags because of included files.
And many are going to rank despite this supposed "handicap".
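
e.g. the sort of thing a template stuffed with include files churns out every day (illustrative fragment, not a real site):

[tt]
<!-- no doctype, no character set, and a second <head>/<body> dragged in by an include -->
<html>
<head><title>Widget Co - Home</title></head>
<body>
<head><title>menu.inc</title></head>
<body>
<ul><li><a href="/">Home</a></li><li><a href="/about.html">About</a></li></ul>
[/tt]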

As has been said many, many, many times, valid code is useful for many reasons. Search engine results are NOT one of them.


Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
More visitors to your site means your site is popular; popular sites tend to get linked to more; more backlinks mean higher rankings.

Yes, popular sites get linked to more often.

BUT that scenario relies on one vital point which is missing in the real world.

not everybody who visits your site has a website/page they could place a link on

It seems that so many webmasters are insulated from the real world by so many degrees that they forget about it.

So shall we rephrase the point:

popular sites get linked to more often by other webmasters

So if other webmasters are NOT your target audience, all that great content won't gain you a single link. Good conversions, maybe; links? No.


Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
Thanks Chris, I'd give you another star if I could :)

"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
not everybody who visits your site has a website/page they could place a link on

Perhaps that's one way of looking at it, but these days who doesn't have a blog / MySpace / Facebook / forum account, etc.?

It's not quite a true statement either, unless you are now going to tell me that links from these places don't count.



"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 
Once again we wander into the realms of mixing up cause and effect.

Coding to W3C standards has naught to do with search engine ranking.

Creating a page without a title tag, for example, will probably mean it won't perform as well in a search engine as it would if it had a title. But this is not because the page doesn't validate; it is because the search engine algo uses what it finds in the title tag to determine a degree of relevance against a search query. If the tag isn't there, or if it's not closed, that doesn't mean the SE won't index the page content. It just sees data, after all, and doesn't really care one way or another.

It follows that if you write invalid code you *may* create issues as far as SERPs are concerned. But this has nothing at all to do with the code being valid or invalid by W3C standards; it is because the SE may not gain any meta info from your markup (assuming it would anyway).

To a search engine your page is just a big lump of text data. It doesn't care if you didn't close a tag. It will just guess in the same way that a browser won't explode if you don't close a tag. It might not give the expected result, but the text is there even if only visible in the source.

Sure, it may look for particular marked-up text for use in relevance calculations, but that's rather trivial. The page content will still be indexed regardless.

So ultimately, yes, creating a mangled lump of code might affect ranking position, but not really because of failure to meet any standards; rather, because the SE may not know what you meant. It will still 'read' and index the data it finds.
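
To put it another way, every word in a contrived mess like this still reaches the indexer:

[tt]
<!-- contrived example: mis-nested and unclosed tags, but all the text is still there to be indexed -->
<title>Blue widgets</title>
<b>Hand made <i>blue widgets</b>, free UK delivery
<p>Order before 3pm for next-day dispatch
[/tt]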


<honk>*:O)</honk>

Tyres: Mine's a pint of the black stuff.
Mike: You can't drink a pint of Bovril.
 
Which is why it is a good idea to validate your code, as it *may* / *could* affect SERPs.

And why it *might* be a good idea for Google to have this facility. :p

"In complete darkness we are all the same, only our knowledge and wisdom separates us, don't let your eyes deceive you."

"If a shortcut was meant to be easy, it wouldn't be a shortcut, it would be the way!
 