
What more can I do to optimize my site?

Status
Not open for further replies.

ilovestickers
Programmer · Jul 2, 2005 · CA
Hi there!

I've been trying since the end of January to optimize my site, but nothing seems to work. My goals are to get a good search engine position (specifically on Google) and a reasonable PageRank. However, as it stands I am not even in the top ONE THOUSAND Google results, and my PageRank has been stagnant at 2 for half a year and several PR updates.

I have got lots of links, worked on my meta tags, title, etc., and improved my keyword ratio, but it never makes the slightest difference.

This is my first website and I know I must be doing something wrong (or not doing something), as I have yet to see an improvement - especially not even being in the top 1000 results! I also have an AdWords account.

I would very much appreciate anyone's advice!! Thank you all very much!

Krissy
 
Use of includes won't affect search engines one way or the other - the component files get "included" by your web server before the client (be it a person browsing or a web spider) sees the page. The page will just look like regular HTML to them.

If you persevere and get them working, includes can make your life as a webmaster easier though, particularly when you're making sweeping changes to your site's layout. Suppose you've come up with this initial design:

Code:
[green]<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
  <title>I Love Stickers: [/green]Home Page</title>
[green]  <meta name="author" content="Chris Hunt">
  <link rel="stylesheet" type="text/css" href="stickers.css" />
</head>
<body>
  <table>
  <tr>
  <td><h1>I Love Stickers</h1></td>
  </tr>
  <tr>
  <td>[/green]
    <h2>Home Page</h2>
    <p>Buy your stickers here!</p>
[green]  </td>
  </tr>
  </table>
</body>
</html>[/green]

The code in green is the same for every page of the site, only the black code is unique. You could put all the green code into includes, but I'd stop short of that - it can be good if your main file remains a valid HTML document when the includes aren't present. Here's how I'd split up the above page:

Code:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
  <title>I Love Stickers: Home Page</title>
[green]<?php include 'head.txt'; ?>[/green]
</head>
<body>
[green]<?php include 'top.txt'; ?>[/green]
    <h2>Home Page</h2>
    <p>Buy your stickers here!</p>
[green]<?php include 'bottom.txt'; ?>[/green]
</body>
</html>
The three included files, head.txt, top.txt and bottom.txt are, respectively:
Code:
  <meta name="author" content="Chris Hunt">
  <link rel="stylesheet" type="text/css" href="stickers.css" />
Code:
  <table>
  <tr>
  <td><h1>I Love Stickers</h1></td>
  </tr>
  <tr>
  <td>
Code:
  </td>
  </tr>
  </table>
Note that the three included files are not themselves complete HTML documents, or even well formed bits of XML - they're just fragments of a page.

Now, suppose you decide to abandon tables-for-layout. It's easy, you write your new CSS and change top.txt & bottom.txt:
Code:
  <div id="banner">
    <h1>I Love Stickers</h1>
  </div>
  <div id="content">
Code:
  </div>
The new structure is instantly employed in every page of your site! Of course the same principle applies to less drastic changes (and to more complex designs!)

Now, as a non-PHP user, I don't use the PHP include command myself. But I have used the above approach extensively with Server Side Includes, which perform the same function for (otherwise static) HTML documents. You can read more about SSI in faq253-3309 and faq253-2000.
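For comparison, the SSI version of the split-up page above swaps each PHP include for an SSI directive. This is a sketch and assumes your server has SSI enabled for the file (e.g. a .shtml extension, or Apache's Options +Includes):

```html
<head>
  <title>I Love Stickers: Home Page</title>
  <!--#include virtual="head.txt" -->
</head>
<body>
  <!--#include virtual="top.txt" -->
    <h2>Home Page</h2>
    <p>Buy your stickers here!</p>
  <!--#include virtual="bottom.txt" -->
</body>
```

The three fragment files stay exactly the same; only the include syntax in the main document changes.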


-- Chris Hunt
Webmaster & Tragedian
Extra Connections Ltd
 
Hi krissy,
Just been skimming through some of the suggestions. I've been working on raising our site in the search engines for a while now and have found a little bit of everything works, but Google lags way behind - a six-month lag is not uncommon.

Can't emphasize good code enough. It feeds into everything. Linking to and from other sites is an ongoing build. Build accessibility into the pages. Standards, standards, standards. It soon becomes second nature. Standards = quality, quality = visitors - one hopes, anyway.

My personal solution to Google is not to use it - kind of like choosing where you spend your money. It's a long way before Google gets worried, but if it's all about usage and traffic, it's a start!

 
Wow, this thread is like the Energizer Bunny....

But one of my observations is that you are still using the JavaScript menu system. If you must use JavaScript, I would make each URL in it a full URL (h**p://w**.urlgoeshere.com/page.htm - asterisks added to kill autolinking). Google and other engines now read the URLs in a script, but absolute URLs make it easier for them to know exactly where each link points.

I feel that Google and other SEs like strong internal linking (i.e. similar to what blogs have done, or even Tek-Tips) - that is, longer descriptive anchor text, not just keyword anchor text. I would not expect the anchor text in a scripted menu like that to be fully understood by the engines, so it would not carry any anchor-text benefit for SEO purposes. (This is my belief right now; I don't have statistical proof.)

xtendscott
Home Improvement Watch | Cryosurgery | Walla Walla Portal | Walla Walla Martial Arts
 
SEs may be reading the URIs in script (on the page only) but they don't follow them. Only an <a href="..."> link will be followed.

It's a simple test for JS menus and SEs:

Open the page in a browser with javascript turned off. If you can still click a link, it's fine.
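The surest way to pass that test is to start from ordinary anchors and layer the script on top - spiders and script-less browsers follow the plain links, everyone else gets the fancy menu. A sketch with made-up page names:

```html
<!-- Plain links that work with JavaScript turned off.
     A script can add dropdown behaviour to this list later,
     without the links themselves ever depending on it. -->
<ul id="menu">
  <li><a href="/index.htm">Home</a></li>
  <li><a href="/stickers.htm">Stickers</a></li>
  <li><a href="/contact.htm">Contact</a></li>
</ul>
```

Because the hrefs are real links in the markup, there is nothing for an engine to decipher - it just crawls them like any other page.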

Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
SEs may be reading the URIs in script (on the page only) but they don't follow them. Only an <a href="..."> link will be followed.
That is incorrect, at least for Google. I put some pages up over a year ago to test URLs in JavaScript, and only linked to them within a script. I was using the complete URL though. A week later Google had them cached.

SEs do not "run" the code, but they do look for URLs within it to find pages. Those can count as backlinks, but I would say the engines would not be able to decipher any "anchor text" for the links.

xtendscott
Home Improvement Watch | Cryosurgery | Walla Walla Portal | Walla Walla Martial Arts
 
Got to say, in the tests I've done for what links get followed, nothing other than a fully clickable link has been followed.
Even a fully qualified http address that wasn't wrapped inside <a...></a> tags on a .asp page wasn't followed.
I've tested JavaScript links using every combination of methods and never had the linked pages crawled.

Scott, post the code you used - I would definitely be interested in testing it. I've seen it said a couple of times on different SE forums that this had happened, but it has usually turned out that an alternative route existed to the linked page.
Being a sceptic I like to see and try things for myself.


Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
Hmm, coming up with the exact code from over a year ago... I'll have to do some looking.

I am pretty sure it was the basic scripts that Dreamweaver put out at the time. What brought me to test it was a local Chamber website that for the longest time had very few pages indexed because of its scripted menu system, and then one day it had a lot.

Code:
fw_menu_2_1.addMenuItem("Art","location='http://www.wwchamber.com/relocation/comm_services/art.htm'");
That is what I pulled up from archive.org; you can look at the code yourself. If I recall, I used something pretty similar with DW, but it looks as if their code is from Fireworks.

They have recently done a design change, so I had to get it from the Wayback Machine. Thanks again to the Wayback Machine.
Code:
function mmLoadMenus() {
  if (window.mm_menu_0012204316_0) return;
  window.mm_menu_0012204316_0 = new Menu("root",37,14,"Verdana, Arial, Helvetica, sans-serif",8,"#000000","#ffffff","#3399ff","#66cc99","left","middle",3,0,1000,-5,7,true,true,false,0,true,true);
  mm_menu_0012204316_0.addMenuItem("page&nbsp;1","location='http://www.domain.com/page1.htm'");
  mm_menu_0012204316_0.addMenuItem("page&nbsp;2","location='http://www.domain.com/page2.htm'");
  mm_menu_0012204316_0.addMenuItem("page&nbsp;3","location='/page3.htm'");
  mm_menu_0012204316_0.hideOnMouseOut=true;
  mm_menu_0012204316_0.menuBorder=1;
  mm_menu_0012204316_0.menuLiteBgColor='#cccccc';
  mm_menu_0012204316_0.menuBorderBgColor='#3366ff';
  mm_menu_0012204316_0.bgColor='#555555';

  mm_menu_0012204316_0.writeMenus();
} // mmLoadMenus()

I hope this helps your testing. I can't recall exactly whether "/page3.htm" was found or not, but I know page1 and page2 were.

xtendscott
Home Improvement Watch | Cryosurgery | Walla Walla Portal | Walla Walla Martial Arts
 
You guys are AWESOME! I am honestly so appreciative of all the help, suggestions, and ideas - more than a thank-you note can convey. I have kept this same thread going a) so that everyone knows the history of what's been suggested and what I've tried, and b) to show my commitment to working on this issue (that I am not just looking for a quick fix) :)

I am currently working on all your suggestions. I managed to put a lot of my long JavaScript into .js files (with no error messages!) :) and now I'm working on everything else suggested.

Thank you again! I'll write an update post soon when I implement some more changes!

Sticker and Sticker Clubs

Enter our Contest to win 1 of 5 packs of 10,000 Stickers every month!
 
Hi again!

I've been working very hard on the site and have implemented your suggestions :) Since my last post I have worked on the following:

- trying to get more links (I have more than before), but I still need many more - I find it hard to get good quality links
- checked each and every page for errors and fixed the coding (this took a LONG time, but I highly suggest everyone puts in the effort to do it to their sites if they never have)
- moved my large JavaScript blocks into external .js files
- moved hundreds of images into a separate images folder
- improved my robots.txt file
- put full URLs in the JavaScript menu AND
- also implemented a noscript menu
- built a site map (and a second Google Sitemap)
- using H1, H2, H3 tags
- made sure all images have alt tags
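For anyone following along, a basic robots.txt for a small site only needs a couple of lines. This is a sketch - the disallowed path is a made-up example:

```
User-agent: *
Disallow: /cgi-bin/
```

Everything not matched by a Disallow line is crawlable by default, so an empty `Disallow:` (or no file at all) means "crawl everything".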

I have a few questions please:
- Do I need to put <meta name="robots" content="index,follow"> on every page, or just the index page?
- Do you see anything else I have neglected that I should work on or add?

Chris - I saw someone with your same name on another forum from Blackpool - is this you? I was born in Blackpool too so it would be a small world if it is you :)

Thank you again Chris, Scott and everyone else for all your help and suggestions! I hope all this work eventually pays off!

Krissy

Sticker and Sticker Clubs

Enter our Contest to win 1 of 5 packs of 10,000 Stickers every month!
 
Sounds like it's all coming together steadily. Well done.
Do I need put <meta name="robots" content="index,follow"> on every page or just the index page?
Neither. The robots meta is an exclusion protocol, so index,follow is the default - it's only required when you want to block an action.
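Put another way (a sketch of the two cases):

```html
<!-- Redundant: this is already what engines do by default -->
<meta name="robots" content="index,follow">

<!-- Useful: only add the tag when you want to BLOCK something -->
<meta name="robots" content="noindex,nofollow">
```

So you can drop the tag from every page unless a particular page needs to be kept out of the index.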

made sure all images have alt tags
Surely you mean alt attributes [smile]

Yep it'll be me, definitely a small world

Chris.

Indifference will be the downfall of mankind, but who cares?
Woo Hoo! the cobblers kids get new shoes.
People Counting Systems

So long, and thanks for all the fish.
 
ChrisHirst,
Scott, post the code you used I would definitely be interested in testing it. I've seen it said a couple of times on different SE forums that this had happened, but it has usually turned out that an alternative route existed to the linked page.
Being a sceptic I like to see and try things for myself.

I'm interested to know whether you have tested the code yet for SEs following JavaScript links?

xtendscott
Home Improvement Watch | Cryosurgery | Walla Walla Portal | Walla Walla Martial Arts
 
ageing delay? sandbox? new domain penalties?

I'm helping my brother with his site... It is a new domain (3 1/2 months live) that now has a PR4, is indexed, and has first-page SERPs on Google for many relevant searches.

It is a low-competition, low-traffic niche. But there was no delay, and it's rising in the SERPs almost daily.

The site is far from perfect. Other sites of mine have done this too.



Kevin
 
Cool to know. I have not read that, but it does make sense from what I have read. We have not yet reached into the more competitive phrases for his segment. Naturally, with product reviews, very specific phrases (model names and numbers) are doing very well so far. I'm trying to get him to write some more general articles as well as reviews.

Kevin
 
