
Define CMS


wiser3 (Programmer), Jun 3, 2003
How would you define what a CMS is? A database like MySQL? An HTML editor like Dreamweaver or GoLive? Both?
 
There are only about 5 threads in this forum, and at least one of my answers speaks to your post... read first, then post.

--gabe
 
ganast,

Your posts have been very well thought out and informative, yet you have not been awarded a star, so I figured I would jump in and give you one; you certainly deserve it. Thanks for your input in this forum.


craig1442@mchsi.com
"Whom computers would destroy, they must first drive mad." - Anon
 
wiser3, this has been taken directly from my website, which might help:
"...
Features of a CMS Package:

An organisation typically requires a Content Management System (CMS) when it creates and publishes large volumes of content, or when the publishing process is too time-consuming and inefficient. A CMS is also considered the white knight when there are so many publishers that the existing system has no standardised approach for publishing, storing and organising content efficiently for user consumption.

Many problems, including "information overload" and search engine ineffectiveness, explain why content management continues to be one of the most pressing and important issues facing Internet and intranet site managers.

In most organisations, content management can combat inherent inefficiencies and significantly improve the content publishing, storage and retrieval process.

Specifically, a CMS can offer many benefits and tools, including:
* User-friendly, Web-based access and use
* Decentralised authoring, allowing many authors in multiple locations
* Document version and time controls
* Content approval workflow
* Database and template creation
* Dynamic page generation
* Link management
* Document conversion
* Personalisation
* Access control and built-in security
* Usage analysis
* Template and design standardisation

Not all systems offer all of these benefits, and some offer more or different ones. Determining which systems and tools will best benefit your company therefore depends on your specific requirements.

One feature that ties all of the above together is Dynamic Page Generation, which effectively creates the web pages "on-the-fly".
..."

Hope this helps.

Pete.


Web Developer & Aptrix / IBM Lotus Workplace Web Content Management (ILWWCM) Specialist
w: e: Pete.Raleigh(at)lclimited.co.uk
 
WarTookMan, excellent summation:

One feature that ties all of the above together is Dynamic Page Generation, which effectively creates the web pages "on-the-fly".

I agree with this totally, as it is necessary, but I have also seen CMS applications that create a static ".htm" page from the data for users to view. This puts less stress on the server than dynamically generating the page for each visitor, and it can be more search engine friendly. If a page is updated, the static page is recreated with the new content. (Within the static page there may still be dynamic elements using scripting languages such as ASP, PHP, Perl or CFM.)
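As a rough illustration only (the names are made up, and a real publisher would also handle templates and link rewriting), the regenerate-on-update step could look something like this:

[code]
# Sketch of "publish static": after an update, rewrite the static
# .html file so the web server can serve it without database work.
# File layout and function names here are hypothetical.
from pathlib import Path

def publish_static(doc_id, title, body, out_dir="public"):
    """Recreate the static page for a document after its content changes."""
    html = ("<html><head><title>%s</title></head>"
            "<body><h1>%s</h1>%s</body></html>" % (title, title, body))
    out = Path(out_dir) / ("%d.html" % doc_id)
    out.parent.mkdir(parents=True, exist_ok=True)
    out.write_text(html, encoding="utf-8")   # visitors are served this file
    return out
[/code]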
 
A form of "offline" caching, perhaps? That is a common feature of most CMS packages, and it avoids the stress on the server (the regeneration is often run as a scheduled agent at, say, 1am).

Aptrix / LWWCM for Domino writes the information as blocks of pregenerated code to the webpage. When the page is retrieved, it checks to see if it is "cached". If it is, then it outputs the blocks of (effectively static) HTML. Otherwise, it creates the necessary dynamic code.
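That check is easy to picture; this is not the actual Aptrix / LWWCM implementation, just a sketch of the idea using an in-memory cache:

[code]
# Sketch of the cache check: serve pregenerated output if we have it,
# otherwise run the dynamic code and keep the result for next time.
_cache = {}  # page id -> pregenerated (effectively static) HTML

def get_page(page_id, generate):
    """Return cached HTML for page_id, building it on a cache miss."""
    cached = _cache.get(page_id)
    if cached is not None:
        return cached               # cached: output the static blocks
    html = generate(page_id)        # not cached: run the dynamic code
    _cache[page_id] = html          # pregenerate for the next request
    return html
[/code]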

Pete.


Web Developer & Aptrix / IBM Lotus Workplace Web Content Management (LWWCM) Specialist
w: e: Pete.Raleigh(at)lclimited.co.uk
 
xtendscott said: "more search engine friendly"

I would suggest that this is more FUD than fact. Regardless of how a page is generated "on the fly", the web client (or search engine crawler, etc.) is only interested in the HTML that is delivered.

When a server processes data from a database and delivers the resulting HTML code to the client, there is no difference between that code, and a "static" variant of the output (but stored as a .html file).

I would even go so far as to suggest that a dynamically generated site is potentially more search engine friendly - since you have the ability to deliver different content based on the client (assuming it (correctly) identifies itself - which some no longer do).

Not to mention the ability to drop in a simple search facility that merely searches against the back-end database... providing local searching without the need to engage the services of 3rd parties.
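Something along these lines, say; the schema is hypothetical, and a production search would want proper indexing and escaping beyond a bare LIKE:

[code]
# Sketch of a local site search run straight against the content
# database; "documents" and its columns are hypothetical.
import sqlite3

def search_content(db_path, term):
    """Return (id, title) pairs whose title or body contain the term."""
    like = "%" + term + "%"
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            "SELECT id, title FROM documents"
            " WHERE title LIKE ? OR body LIKE ?",
            (like, like),
        ).fetchall()
    finally:
        conn.close()
[/code]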

That's my 2 pence...
Jeff
 
Last week I attended a search engine optimization conference in Toronto, Canada. All the SEOs and the reps from the major search engines agreed that search engines have difficulty with dynamic site URLs that have 3 or more parameters. They also have trouble with things like session IDs, because they create a different URL for every visit. The search engines see these different URLs as duplicate content and penalize you for it. There was an hour-and-a-half discussion of how to mod_rewrite your URLs to get rid of all the URL parameter problems.

They presented evidence that dynamic sites can outrank static sites - AS LONG AS the dynamic sites avoid the URL-related problems.
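For anyone who missed the session, the technique boils down to Apache mod_rewrite rules like the following; the /articles path and the qid parameter are just illustrative, not any specific site's setup:

[code]
# .htaccess sketch: expose a clean, parameter-free URL and map it
# internally onto the real parameterised script.
RewriteEngine On
# /articles/42  ->  /viewthread.cfm?qid=42
RewriteRule ^articles/([0-9]+)$ /viewthread.cfm?qid=$1 [L,QSA]
[/code]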
 
I think there is a slight misunderstanding between dynamic content created by Content Management Systems and dynamic content created by passing parameters to a server-side script.

An example of dynamic content created by a CMS might be a list of related documents. When a page containing a menu is requested, the CMS performs a search for similar content and returns the results. If a new document is created, the CMS "picks it up" and adds it to the returned list; similarly, when a document is deleted, it is removed from the returned results.
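As a sketch (the schema and the idea of a shared "topic" column are hypothetical), that menu lookup amounts to a query run at request time, which is why additions and deletions show up automatically:

[code]
# Sketch of a "related documents" menu: query at request time, so a
# new document appears in the list and a deleted one drops out.
import sqlite3

def related_documents(db_path, topic, limit=5):
    """Return (id, title) of current documents on a topic, newest first."""
    conn = sqlite3.connect(db_path)
    try:
        return conn.execute(
            "SELECT id, title FROM documents"
            " WHERE topic = ? ORDER BY modified DESC LIMIT ?",
            (topic, limit),
        ).fetchall()
    finally:
        conn.close()
[/code]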

Dynamic content returned by passing parameters might be something like Yahoo! mail (or even the page you are viewing right now, which passes parameters to "viewthread.cfm"), where the username, session variable, mailbox view and document ID are all passed via a URL. A search engine probably won't be able to index this type of content, since the session variable (session ID) is generated upon a successful login and expires either after inactivity or when the browser is closed.

A CMS such as Aptrix / LWWCM uses database views to return the required document, so unique DocTitles must be used when creating content. You can still pass parameters if the page requires them (such as page 2 of a menu of items); however, most are straight URLs with no parameters.

Dynamic content (in my experience) results in a better ranking in search engines, because it can be "kept fresh" by adding dynamic code to the page, tricking some of the spiders that visit. For example, if you have a meta tag which outputs the date/time of generation (i.e. right now), then the page will be deemed "fresher" than a piece of content with exactly the same number of words / matches.

Google (if you read all of the theories) apparently gives "fresher" pages a higher ranking, since they are deemed to be more current.
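For what it's worth, the "freshness" stamp I mean is nothing more than this; whether any given engine still rewards it is theory, not guarantee:

[code]
# Sketch: stamp the generated page with its generation time so it
# appears newly modified to spiders that read such tags.
from datetime import datetime, timezone

def freshness_meta():
    """Return a meta tag carrying the page's generation timestamp."""
    now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
    return '<meta name="date" content="%s">' % now
[/code]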

Just my 2 pence worth. :)

Pete.


Web Developer & Aptrix / IBM Lotus Workplace Web Content Management (LWWCM) Specialist
w: e: Pete.Raleigh(at)lclimited.co.uk
 