Need help: ASP not found in Google save by site title


MakeItSo (Programmer)
Hi friends,

this one is giving me headaches:
Our company website needs to run in (currently) 5 languages.

The original design (1 language) was done by a DTP agency and consisted of a grand total of 7 single HTML files.
For one language, I could live with it, although a typo in one of the keywords had to be corrected in 7 files...

So when it came to making the site multi-lingual it was pretty obvious that this was not the way to do it.
So I used the design frame of the original index.htm to create a single index.asp. The content is now loaded dynamically into this page, depending on browser language settings (first load) and then on session cookie values.
The content itself is stored in subfolders (de, en, fr, es, it).
The content snippets themselves are plain HTML fragments with no head or body tags - just text wrapped in the required HTML formatting tags.
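
For illustration, here is a minimal sketch of how such a selection block could look in Classic ASP; the variable names, the cookie name and the "home.html" file name are assumptions made up for this example, not the actual code:
Code:
<%
' Sketch only: pick the language from a cookie, or on the first visit from
' the Accept-Language header, then stream the matching HTML snippet from
' its language subfolder (de, en, fr, es, it).
Dim lang, accept, fso, snippet
lang = Request.Cookies("lang")
If Len(lang) < 2 Then
    accept = LCase(Request.ServerVariables("HTTP_ACCEPT_LANGUAGE"))
    lang = Left(accept, 2)                     ' "de-DE,de;q=0.9,en" -> "de"
End If
If Len(lang) < 2 Or InStr("de,en,fr,es,it", lang) = 0 Then lang = "de"  ' default: German
Response.Cookies("lang") = lang

' Read the static snippet for the requested page and write it into the body.
Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set snippet = fso.OpenTextFile(Server.MapPath(lang & "/home.html"), 1)  ' 1 = ForReading
Response.Write snippet.ReadAll
snippet.Close
%>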

The only page containing the meta tags "keywords, content, description, and title" is the Index.asp.

Here's the head structure of the Index.asp:
Code:
<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
<%
'Here follows some asp code for setting variable values
%>
<meta http-equiv="content-language" content="<%=language%>">
<meta http-equiv="content-type" content="text/html; charset=ISO-8859-1">
<meta name="date" content="2008-06-01">
<meta name="Language" content="<%=Sprache%>">
<meta name="author" content="[original author]">
<meta name="publisher" content="[original author company]">
<meta name="copyright" content="[original author company]">
<meta name="asp-programmer" content="[myself]">
<meta name="page-topic" content="<%=Dienst%>">
<meta name="abstract" content="[our company]">
<meta name="title" content="[a set of important keywords]">
<meta name="description" content="[a short summary]">
<meta name="content" content="[subset of most important keywords]">
<meta name="keywords" content="[keywords]">
<title>[site title]</title>
</head>

When we still had the separate HTML files, we could be found via Google, but now we can't - and that's baaaad!

Is this caused by me only using one page with a head?
I have also learned that most search engines don't even look at the keywords anymore.
However, I cannot "stuff" our pages with keyword nonsense. The website shall make the reader curious and answer his first questions, not overload him with information!
For that we have personal talks!

Please give me some advice on how to get our page into Google.

Thanks a lot!
Andy

[navy]"We had to turn off that service to comply with the CDA Bill."[/navy]
- The Bastard Operator From Hell
 
What content do you return if the client does not accept cookies, and does not report a language preference? That's what the Google spider will do, and you need to be sure it's getting the right response.

Furthermore, your approach runs the danger of being labelled as cloaking - since you're serving up different content for the same URL, depending on browser settings. You're not doing so with any malicious intent, but the googlebot isn't always able to tell the difference.

To be safe, I suggest you abandon the clever browser-settings-and-cookies approach, and just have different URLs for each language version - example.com/en, example.com/de, example.com/fr, etc. would be one way of doing it. You can still use ASP to dynamically build up each page, of course.
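
For example, each language directory could contain a thin index.asp that does nothing but fix the language and pull in a shared template (the path /shared/template.asp is made up for this sketch):
Code:
<%
' Illustrative sketch only: this copy lives in /en/, so it just fixes the
' language; the sibling copies in /de/, /fr/, etc. differ only in this value.
Dim lang
lang = "en"
%>
<!--#include virtual="/shared/template.asp" -->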

The website shall make the reader curious and answer his first questions, not overload him with information!
I disagree. I think you should put as much information as you can on the site, albeit not overloading any one page. If I'm looking for a service and can't get all my questions answered from their website, I might mail them to find out, but I'm more likely to look elsewhere. "Personal talks" are all very well, but they take time that I might not be willing to spend.

Consider also the case where I'm looking to compare several similar services before committing to one. If I can't find enough info on your site to make that comparison, you're out of the game. I don't want to contact you and lay myself open to sales patter, junk mail and whatever else.

Maybe that's just me though.

-- Chris Hunt
Webmaster & Tragedian
Extra Connections Ltd
 
What content do you return if the client does not accept cookies, and does not report a language preference? That's what the Google spider will do, and you need to be sure it's getting the right response.
In that case, German is the default. Language switching won't work then.
Is there an alternative other than separate pages?
Furthermore, your approach runs the danger of being labelled as cloaking - since you're serving up different content for the same URL, depending on browser settings. You're not doing so with any malicious intent, but the googlebot isn't always able to tell the difference.
Actually, I doubt that. It's not an uncommon CMS approach, although most sites do it via PHP or JSP rather than ASP.
Still, no content is cloaked.

To be safe, I suggest you abandon the clever browser-settings-and-cookies approach, and just have different URLs for each language version - example.com/en, example.com/de, example.com/fr, etc. would be one way of doing it. You can still use ASP to dynamically build up each page, of course.
Abandoning the browser setting would be one option, although I really don't know what difference that would make.
Abandoning the cookie approach: I can live with that, provided I learn a different, equally efficient approach that does not consist of separate pages.

I'll be glad for any proposal.

The website shall make the reader curious and answer his first questions, not overload him with information!
I disagree. I think you should put as much information as you can on the site, albeit not overloading any one page. If I'm looking for a service and can't get all my questions answered from their website, I might mail them to find out, but I'm more likely to look elsewhere. "Personal talks" are all very well, but they take time that I might not be willing to spend.

I was not referring to only giving "a bone to gnaw on" and otherwise leaving the reader in the dark.
I am in the localisation business, i.e. translation, internationalisation, and everything that comes with it:
CMS, optionally interfaces to customers' CMSes, DTP, technical consultation etc.
Giving a good summary is about all the information we can reasonably give. Our profession simply has too many special cases and ever-new tasks; covering it all would overwhelm any reader...

Consider also the case where I'm looking to compare several similar services before committing to one. If I can't find enough info on your site to make that comparison, you're out of the game. I don't want to contact you and lay myself open to sales patter, junk mail and whatever else.

Maybe that's just me though.
Nope, that's not just you. That is common customer practice.
But we've checked out our competitors and we're quite fine with our approach.
We also get lots of feedback, and so far we've only been congratulated on our web presence.
:)

Besides, I've now done something I should have done a lot earlier: I put the site through the W3C validator and found some HTML errors, which I've fixed.
In doing so, I also found some IMGs that did not have ALT texts yet.
I have also moved the META tags "content", "title", and "description" to the HTML content snippets, and adapted their contents to match the respective page content.
It is still too early to tell whether this will have any positive influence, as Google obviously takes a while to re-spider it...

Thanks a lot for feedback!

Cheers,
Andy

[navy]"We had to turn off that service to comply with the CDA Bill."[/navy]
- The Bastard Operator From Hell
 
Still no change despite the fine-tuning...
Still nothing coming up in Google...

[navy]"We had to turn off that service to comply with the CDA Bill."[/navy]
- The Bastard Operator From Hell
 
7 pages with 5 languages?

Why not forget the cleverness and just make 35 pages?
Yes, it will take time (everything does), but your current solution has taken time to conceive and build, and now you're spending more time trying to fix it.

If there are common elements then by all means use includes in some form to pull them into your pages, but actually make the pages for each language rather than making a single page with changing content.

I took this approach with a site of my own (although there are currently only 2 languages, with more awaiting translation). It's a small site that just doesn't need anything fancy. The only remotely clever thing I've done is to set a cookie so the user is returned to the correct language when they come back to the site; if no cookie is present they are shown English by default.
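
Roughly, that landing-page logic might look like the sketch below (written in ASP to stay consistent with the rest of the thread; the cookie name and directory names are invented for the example):
Code:
<%
' Sketch of a remembered-language landing page: if a language cookie exists,
' send the visitor straight to that version; otherwise show English.
Dim prefLang
prefLang = Request.Cookies("sitelang")
Select Case prefLang
    Case "fr": Response.Redirect "/fr/"
    Case "de": Response.Redirect "/de/"
    Case Else: Response.Redirect "/en/"   ' no (or unknown) cookie -> English default
End Select
%>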

Each page in each language has its own URL, so searches on regional search engines now show the localised pages and link directly to them.


<honk>*:O)</honk>

Tyres: Mine's a pint of the black stuff.
Mike: You can't drink a pint of Bovril.
 
I think I've finally got the solution:

In another forum, someone told me to make use of the Google Webmaster Tools. I did and:
Apparently, a Google sitemap was all I needed.
*sigh*

Yes, for now it is only 7 pages in 5 languages. But what if we decide to add 2 more languages?
All you do with 35 (or later 49?) separate pages is multiply your workload.

Of course this approach took a bit of time, especially now with the optimisation - but it'll pay off later.
I'd rather spend more time in the initial phase than over and over again later.
:)

I am just a bit disappointed that I had to Google for such a long time to find out what Google needs...

[navy]"We had to turn off that service to comply with the CDA Bill."[/navy]
- The Bastard Operator From Hell
 
Of course this approach took a bit of time, especially now with the optimisation - but it'll pay off later.
I'd rather spend more time in the initial phase than over and over again later.

I hear what you're saying, but ultimately I'd say that kind of approach only becomes necessary when there are many pages, not just many languages.

When you add a new language you're still going to be writing content into the site somehow so you aren't really saving any time by dynamically loading content into a single page.

Take my similar site: to add a new language I just dupe a directory and then copy/paste the supplied copy and mark it up. Granted, this takes time, but it really doesn't take long to mark up body text for a few pages. I also add a case to a PHP switch statement to handle the cookie/redirect thing.

Where my approach might come unstuck is if the client suddenly added 100 pages to the site. Even then, I would only need to create the English version, then copy the files to the other language directories and update the text in them. I'm safe though; it's not that kind of site.

Still, glad it's working for you. There's always more than one way to crack a nut after all. I've just learned over time to really consider the simple solution vs the technical one. Often the time savings in doing it the technical way just aren't that great.

Have the pages now been indexed by Google - since you added the site map?

If Google hasn't managed to crawl it naturally, then you need to check that there is no problem with the way you link to pages.


<honk>*:O)</honk>

Tyres: Mine's a pint of the black stuff.
Mike: You can't drink a pint of Bovril.
 
Sorry for replying so late; been offline for the weekend. ;-)
Have the pages now been indexed by Google - since you added the site map?

If Google hasn't managed to crawl it naturally, then you need to check that there is no problem with the way you link to pages.
My first attempt failed; Google complained that the sitemap was in an invalid format.
It was created according to the sitemap standard, in UTF-8 encoding and with escaped ampersands. :-?

I then found out that Google apparently wants the sitemap not just as plain XML but as GZIPped XML... *sigh*

So what I did was simply take Google's sample sitemap, adjust it with my site's links, make sure all non-standard characters were entity-escaped, and GZIP the file...

I have now uploaded the new file; it has been accepted, and all URLs have been transmitted (although not indexed yet).
I have made sure my site follows Google's guidelines for dynamic sites, and the text browser Lynx also displays each page for the respective link correctly (and in the correct language).

FYI: this is what the sitemap looks like:
Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="[URL unfurl="true"]http://www.google.com/schemas/sitemap/0.84">[/URL]
   <url>
      <loc>[URL unfurl="true"]http://www.mysite.de/index.asp?[/URL][b]page=index&amp;lang=de[/b]</loc>
      <lastmod>2009-03-02</lastmod>
   </url>   <url>
      <loc>[URL unfurl="true"]http://www.mysite.de/index.asp?page=Leistungen&amp;lang=de</loc>[/URL]
      <lastmod>2009-03-02</lastmod>
   </url>   <url>
...
...
      <loc>[URL unfurl="true"]http://www.mysite.de/index.asp?page=index&amp;[/URL][b]lang=en[/b]</loc>
      <lastmod>2009-03-02</lastmod>
   </url>   <url>
      <loc>[URL unfurl="true"]http://www.mysite.de/index.asp?page=Leistungen&amp;[/URL][b]lang=fr[/b]</loc>
      <lastmod>2009-03-02</lastmod>
   </url>   
...
...

</urlset>
The individual links work fine, even with cookies deactivated...
:)


I'll report back later on whether the site has been indexed or not.

Thanks!
Makey

[navy]"We had to turn off that service to comply with the CDA Bill."[/navy]
- The Bastard Operator From Hell
 
I also have a localised site of my own -
The landing page is bilingual; the visitor chooses the English or Italian version. After that, everything is in either the /eng/ or /ita/ directories (apart from CGI scripts, which get a lang= parameter to tell them which language to use).

However, the individual page files only have the main content of the page in them. The page header, the menu bar down the side, and other common content is populated by server-side includes, which look at which directory the page they're called from is in, and serve up the appropriate language. They also look to see if there's a file of the same name in the other directory, and link to it in the sidebar if it exists (some pages are currently English-only).
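
As an ASP-flavoured sketch of the same idea (my site uses ordinary server-side includes, so this is only an analogy, and the names below are invented): a shared include can work out the language from the path of the page that called it.
Code:
<%
' Sketch: a shared header include that infers the language from the
' directory of the calling page (/eng/... or /ita/...).
Dim scriptPath, pageLang
scriptPath = LCase(Request.ServerVariables("SCRIPT_NAME"))   ' e.g. "/ita/storia.asp"
If InStr(scriptPath, "/ita/") > 0 Then
    pageLang = "ita"
Else
    pageLang = "eng"                      ' English is the fallback
End If
' ...emit the header, menu bar and other common content for pageLang...
%>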

If I were to add another language, I'd have to update the SSI stuff once, to do the common content, and translate each page that I wanted to appear in the new language. That's the minimum amount of work (cos that's the kinda guy I am!)

There are other ways of doing it, but this one works for me.

-- Chris Hunt
Webmaster & Tragedian
Extra Connections Ltd
 