Site Map


SteveAudus

Technical User
Oct 4, 2001
A client has requested a site map page on their site, listing all of the pages available to a user.

I personally think this is unnecessary, because if the site's navigation is good enough a user should be able to find anything they are looking for anyway.

But that's what they want, so I was wondering how other designers have approached this problem.

Is this just going to be a page full of text links to pages around the site?

Any suggestions?

Cheers,
Steve

 
It certainly helps the search bots...

--Chessbot

There is a level of Hell reserved for probability theorists in which every monkey that types on a typewriter produces a Shakespearean sonnet.
 
Yes, generally speaking, a site map should contain a well-structured list with links to all the pages of the website, so that a user can see an overview of the entire contents at a glance.
Unless you are using some sort of content management system for your pages, it's unlikely that you will be able to approach this programmatically, so a manual job is probably the only way.
Take a look around at some websites with site maps to give you some ideas (you could start with one of mine - the site map isn't amazing, but it will give you an idea of the sort of thing you need).
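Something along these lines is usually all it takes - a plain nested list that mirrors the sections of the site (the file name and page names below are only placeholders, substitute your own):

Code:
<!-- site-map.html : a hand-maintained overview page (all names here are placeholders) -->
<h1>Site Map</h1>
<ul>
  <li><a href="/index.html">Home</a></li>
  <li><a href="/products/index.html">Products</a>
    <ul>
      <li><a href="/products/widgets.html">Widgets</a></li>
      <li><a href="/products/gadgets.html">Gadgets</a></li>
    </ul>
  </li>
  <li><a href="/support/index.html">Support</a>
    <ul>
      <li><a href="/support/faq.html">FAQ</a></li>
      <li><a href="/support/contact.html">Contact</a></li>
    </ul>
  </li>
</ul>

Keep the nesting in step with your navigation, and updating the map simply becomes part of adding any new page.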

Nick (Webmaster)

 
Hi there,
Just got the same assignment (to create a site map).
Does one have to slog through it manually, or is there a tool that will scan down the tree for you?

tia -
 
Hi

I have thought about this problem too, but my conclusion was that the automated solutions are too dependent on the directory structure, so I prefer to maintain it manually. Besides that, it is hard to exclude multiple pages with the same content, for example tables sorted in different orders. And of course, you must have good page titles to extract as link text. For an automated solution I would use this:
Code:
lynx -traversal http://example.com/
sort -u traverse2.dat | awk -F$'\t' '{print "<a href=\"" $1 "\">" $2 "</a>"}'
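(Note that -traversal does not print to the standard output : it writes its results into files in the current directory, and, if I remember well, traverse2.dat is the one holding each visited URL and its title, tab separated - that is what the sort/awk line reads. The result is just a flat list of <a> links, so the grouping into sections still has to be done by hand. And http://example.com/ is only a placeholder for the real site, of course.)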

Feherke.
 