
Planning for the future


TamedTech

Morning Guys,

I'm looking for some advice on planning for the future growth of my database. A couple of programmers I spoke to suggested that the administration section was the best place to ask for this kind of advice.

About my setup

My database currently sits on a dedicated server with my ISP, running SQL Server 2005 Workgroup Edition. SQL Server is sharing the box with my IIS web server and Adobe ColdFusion MX7. I know this is by no means ideal, but for the moment having them on the same box is both convenient and economical for me.

The server itself is nothing particularly special: a dual-core 2GHz AMD processor, 2GB of RAM and 300GB of disk in a RAID 1 configuration. We partitioned the disks so we have a dedicated data partition for the SQL data and logs, all of this running on Windows Server 2003 Standard SP2.

We have plenty of room to grow the server: a spare processor slot, support for about 16GB of RAM and a larger disk array, not to mention splitting onto several boxes with a load balancer to manage the cluster.

My Concerns

My major concerns lie with the large amount of traffic the database could potentially be under once we're in full production; by far the most strained element of the database will be the application logging. The database not only records logs from applications based on the server, but also receives logs from other applications we have installed on remote clients/units in the field. These logs are posted back to the server as CSV files and imported using a BULK INSERT from a stored proc.
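To give you an idea, below is a simplified sketch of the kind of import proc I mean (the proc name, target table and options are just placeholders for this post, not our real schema). Since BULK INSERT won't accept a variable for the file path, the statement is built as dynamic SQL:

CREATE PROCEDURE dbo.usp_ImportUnitLog
    @LogFile NVARCHAR(260)  -- full path to the CSV file posted back by a remote unit
AS
BEGIN
    SET NOCOUNT ON;

    -- BULK INSERT needs a literal file name, so splice the path into dynamic SQL
    DECLARE @sql NVARCHAR(MAX);
    SET @sql = N'BULK INSERT dbo.ApplicationLog
        FROM ''' + REPLACE(@LogFile, N'''', N'''''') + N'''
        WITH (
            FIELDTERMINATOR = '','',
            ROWTERMINATOR   = ''\n'',
            FIRSTROW        = 2,   -- skip the CSV header row
            TABLOCK                -- take a bulk-update lock on the target table
        );';

    EXEC sp_executesql @sql;
END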

The regularity of these imports is difficult to determine, but the flow is likely to be pretty constant, as it's more of a 'real-time' environment than a 'batch' process run every night at midnight or something; the logs are processed as they arrive on the server via the application. This is something I want to maintain, as the immediacy of our reporting is a real USP for our products.

My cautious calculations show that within the next year we could potentially be looking at around 2 million new records imported every week. I have no idea how this shapes up in the grand scheme of things, but having never worked with data anywhere near that size, it's becoming a little daunting to say the least.
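As a very rough back-of-the-envelope check, that 2 million a week works out to something like:

  2,000,000 rows / 7 days        ≈ 285,000 rows per day
  285,000 rows / 86,400 seconds  ≈ 3-4 rows per second on average
  2,000,000 rows x 52 weeks      ≈ 104 million rows per year

Of course the arrivals will be bursty rather than evenly spread, so peak rates will be well above that average.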

At that point in time we'd have more than enough capital to invest in new equipment to form clusters and the like to spread that load if need be, and also to hire in a couple of consultants to sort this stuff out properly, but I felt that researching this a little myself would be a good idea, considering I'm currently in the middle of a rebuild of the database schema.

That’s about it

I'm just after some friendly advice and reassurance on this stuff, and tips on how to plan the scaling up of the project. These projected figures are very rough: we could go bust and not have any logs, or we could go bang and be doing 20 or 30 million a week; it's really hard to determine at this point in time. Always planning for the worst-case scenario is probably the best route, I guess.

Thanks guys for any advice you can offer.

Rob
 