
Database design, suggestions?

Status
Not open for further replies.

kasuals

Programmer
Apr 28, 2002
100
US
I'm creating a database that will archive links. I would like to avoid placing all the links into one large table, since it will be receiving quite a few hits and I don't want to have to scan a large table by date too often. What I was thinking was creating tables named in yymm format (i.e., Nov 2003 being 0311) and placing the corresponding November 2003 links into that table and accessing them that way.

Then I'd have a master table, for admin use only, that references the month/year tables (in case I need to find a specific link without knowing the month). That table would only contain the unique ID and the table name for each link.
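A minimal sketch of the per-month scheme described above, using Python's sqlite3 for illustration — the table and column names (links_YYMM, link_index) are hypothetical, not from the original post:

```python
import sqlite3

# Sketch of the per-month table design: one links_YYMM table per month,
# plus a master link_index table mapping each link ID to its table.
# All table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Master table for admin lookups: which month table holds each link ID.
cur.execute("CREATE TABLE link_index (id INTEGER PRIMARY KEY, table_name TEXT)")

def month_table(yymm):
    """Create (if needed) and return the per-month table name, e.g. links_0311."""
    name = f"links_{yymm}"
    cur.execute(f"CREATE TABLE IF NOT EXISTS {name} "
                "(id INTEGER PRIMARY KEY, url TEXT, yymm TEXT)")
    return name

def add_link(link_id, url, yymm):
    name = month_table(yymm)
    cur.execute(f"INSERT INTO {name} (id, url, yymm) VALUES (?, ?, ?)",
                (link_id, url, yymm))
    cur.execute("INSERT INTO link_index (id, table_name) VALUES (?, ?)",
                (link_id, name))

def find_link(link_id):
    """Admin lookup: find a link without knowing its month."""
    row = cur.execute("SELECT table_name FROM link_index WHERE id = ?",
                      (link_id,)).fetchone()
    if row is None:
        return None
    return cur.execute(f"SELECT url FROM {row[0]} WHERE id = ?",
                       (link_id,)).fetchone()[0]

add_link(1, "http://example.com/a", "0311")
add_link(2, "http://example.com/b", "0312")
print(find_link(2))  # -> http://example.com/b
```

The master table is what keeps the admin side workable: without it, finding a link by ID would mean probing every monthly table in turn.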

Is this an efficient way of going about it? In my sleep-deprived state, am I even explaining myself properly? lol.

- "Delightfully confusing..." raves The New York Times

-kas
 
What counts as "quite a few hits", and roughly how many links do you estimate will be archived?


I have built my own forum in PHP, and it has gained more than 250,000 posts in a year. Concurrent users are sometimes more than 10
(concurrent meaning that 10 people have reloaded the main page within 2 minutes)

and I have everything in one table!

I think you will make things quite difficult for yourself at the start if you don't yet know whether splitting is absolutely necessary.

You can always add a character field holding the year and month parts, and put an index on it when performance drops.

And even then you can still decide to transfer the data to separate tables later.
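A sketch of that single-table suggestion, again via Python's sqlite3 — one table with a yymm character column, and the index added only when needed (names hypothetical):

```python
import sqlite3

# Sketch of the single-table approach: one links table with a yymm
# character column; an index can be added later with no schema redesign.
# All table/column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE links (id INTEGER PRIMARY KEY, url TEXT, yymm TEXT)")

cur.executemany("INSERT INTO links (url, yymm) VALUES (?, ?)", [
    ("http://example.com/a", "0310"),
    ("http://example.com/b", "0311"),
    ("http://example.com/c", "0311"),
])

# When performance drops, add the index -- existing queries keep working.
cur.execute("CREATE INDEX idx_links_yymm ON links (yymm)")

rows = cur.execute("SELECT url FROM links WHERE yymm = '0311'").fetchall()
print(len(rows))  # -> 2
```

The point of the sketch: date filtering is just a WHERE clause, and the index is a one-line change rather than a table-per-month rewrite.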


 
Well, this is for a major site, and God only knows how many hits it receives in any given day. I don't want to guesstimate, and I don't have the figures... but it's a LOT of traffic.

And the links are going to be in the range of 20-150 per day.

- "Delightfully confusing..." raves The New York Times

-kas
 
I mean, if you think one table can handle being queried under disgustingly large numbers of page generations, I'm all for it. It would save me a lot of unnecessary code.

- "Delightfully confusing..." raves The New York Times

-kas
 
You could perhaps also work with one active table and an archive table. I would guess that most hits will land on the most recent links, so after half a year you transfer that month's links to an archive table.
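A minimal sketch of that active/archive pattern in Python's sqlite3 — the table names and cutoff logic are hypothetical, for illustration only:

```python
import sqlite3

# Sketch of the active/archive pattern: hot reads hit the small active
# table; rows older than a cutoff month are moved to an archive table.
# Table and column names are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE links_active "
            "(id INTEGER PRIMARY KEY, url TEXT, yymm TEXT)")
cur.execute("CREATE TABLE links_archive "
            "(id INTEGER PRIMARY KEY, url TEXT, yymm TEXT)")

cur.executemany("INSERT INTO links_active (url, yymm) VALUES (?, ?)", [
    ("http://example.com/old", "0305"),
    ("http://example.com/new", "0311"),
])

def archive_before(cutoff_yymm):
    """Move rows older than cutoff_yymm from the active to the archive table."""
    cur.execute("INSERT INTO links_archive "
                "SELECT * FROM links_active WHERE yymm < ?", (cutoff_yymm,))
    cur.execute("DELETE FROM links_active WHERE yymm < ?", (cutoff_yymm,))

archive_before("0306")
active = cur.execute("SELECT COUNT(*) FROM links_active").fetchone()[0]
archived = cur.execute("SELECT COUNT(*) FROM links_archive").fetchone()[0]
print(active, archived)  # -> 1 1
```

Note the string comparison on yymm works here because zero-padded same-century values sort lexicographically in date order; a real date column would be the more robust choice.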

But databases handle this quite efficiently, and 150 records a day is not that much: that's only about 50,000 a year. I've read posts here from people whose databases hold millions of records ;)

I know there are also load-testing programs available that let you exercise an application/website with simulated virtual users.
 
