
Difficult Problem 1

Status
Not open for further replies.

kiwiCoder (Technical User, NZ) — Aug 2, 2001
I have been assigned the task of automating backups without the use of SQL Server Agent. So far I have been able to create and successfully call a VB app (via xp_cmdshell) which does the backups with SQL-DMO. Now I need to create some kind of trigger which will run only when more than 100 changes have been made to a database. I have no idea if I can do this from a system database or if there is a way of doing it from master. Any help greatly appreciated.
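For anyone following along, a minimal sketch of that xp_cmdshell call — the path and executable name are placeholders of mine, not the poster's actual app:
Code:
-- Launch the external VB backup app from T-SQL.
-- 'C:\Tools\BackupApp.exe' is a hypothetical path/name.
EXEC master.dbo.xp_cmdshell 'C:\Tools\BackupApp.exe MyDatabase', no_output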
 
100 changes! Isn't that going to happen every day? What scenario is it that allows this to make sense? If the db isn't used very often, then what's so terrible about a weekly or monthly backup? To put an actual trigger in every table of the database would be insane.
Maybe you mean table instead of database.
-Karl

[red] Cursors, triggers, user-defined functions and dynamic SQL are an axis of evil![/red]
[green]Life's uncertain...eat dessert first...www.deerfieldbakery.com[/green]
 
Yes it will happen every day. Our clients have multiple databases, all of which are mission critical. I plan to do a full db backup Monday, an incremental backup once daily and a logfile backup after x (100) changes to the db. From what I have been able to understand, the only way to schedule backups by time, i.e. every 15 minutes, is with SQL Server Agent, which we do not want to use, so I thought to execute after x number of changes to the db. I totally agree with you that a trigger in each db is stupid; open to any other suggestions.
 
Set up the database for replication; that should give you the layer of protection you're looking for. Check out transactional replication in BOL.
-Karl

 
If you HAVE to do it your way....

I think I would create a table to hold update information. Have two columns: DBName and UpdateCount.

Put a FOR UPDATE trigger on each database, the trigger should check this new table and compare the UpdateCount for the database. If the UpdateCount is less than 99, it increments the UpdateCount by 1. If UpdateCount = 99 (meaning this update is #100), then it would do the backup and reset the UpdateCount to 0.
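A rough sketch of that scheme — all object names here are illustrative, and note one wrinkle: BACKUP is not allowed inside a trigger's transaction, so the trigger would have to shell out to something like the poster's VB app instead of issuing BACKUP directly:
Code:
CREATE TABLE dbo.UpdateTracking (
    DBName      sysname NOT NULL PRIMARY KEY,
    UpdateCount int     NOT NULL DEFAULT 0
)
GO
-- One of these would go on every table in the database.
CREATE TRIGGER trg_CountUpdates ON dbo.SomeTable
FOR UPDATE
AS
BEGIN
    UPDATE dbo.UpdateTracking
       SET UpdateCount = UpdateCount + 1
     WHERE DBName = DB_NAME()

    IF (SELECT UpdateCount FROM dbo.UpdateTracking
         WHERE DBName = DB_NAME()) >= 100
    BEGIN
        UPDATE dbo.UpdateTracking
           SET UpdateCount = 0
         WHERE DBName = DB_NAME()
        -- BACKUP can't run in a trigger, so call the external
        -- backup app instead (path is hypothetical):
        EXEC master.dbo.xp_cmdshell 'C:\Tools\BackupApp.exe MyDatabase', no_output
    END
END
Bear in mind the xp_cmdshell call runs synchronously, so update #100 would stall until the backup finishes.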

-SQLBill
 
Is there a trigger for an entire database?
-Karl

 
okay, picky picky. :D

I should have been more specific and said each table in the database.

Oh well, I hope the poster got the idea. Either way it's a lot of work to set up, but once it is set up it should be unnoticeable.

-SQLBill
 
Cheers for the responses, guys. Am also looking at using the scheduling agent, but this leaves possibilities for error, which is not acceptable. Like SQLBill said, a lot of work to set up, but foolproof after that.
 
can you not use WAITFOR in some way, i.e.
Code:
WHILE 1 = 1
BEGIN
    BACKUP LOG MyDatabase TO DISK = 'C:\Backups\MyDatabase.trn'  -- names illustrative
    WAITFOR DELAY '00:15:00'
END

[bandito] [blue]DBomrrsm[/blue] [bandito]
 
What's so disastrous about doing a backup every 15 minutes? Is there some magic in 100 updates?
 
Magic, hmmmm. What if one of the databases hasn't been updated at all for 2 or 3 weeks (which happens with our clients)? Not a great deal of sense in backing it up every 15 minutes, is there?
 
Now I'm not convinced you need to make extra work. If 100 updates isn't a hard requirement, then just doing Transaction Log (TL) backups should work. My understanding is a TL backup contains the log records written since the last TL backup (or, for the first one, since the full backup). So, you do a:

FULL on Monday (call it midnight)
DIFF on Tuesday
DIFF on Wednesday, etc.
and you set TL backups every hour. The Tuesday 0100 AM TL backup will only be of changes since the full backup was done, and if there weren't any changes then next to nothing is backed up. And so on for each TL backup. Or at least that's my understanding.
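That schedule, sketched in T-SQL — database and file names are placeholders, and each statement would be fired from its own scheduled task:
Code:
-- Monday, midnight: full backup
BACKUP DATABASE MyDatabase TO DISK = 'C:\Backups\MyDatabase_full.bak' WITH INIT
-- Tuesday, Wednesday, ...: differential backup
BACKUP DATABASE MyDatabase TO DISK = 'C:\Backups\MyDatabase_diff.bak' WITH DIFFERENTIAL, INIT
-- Every hour: transaction log backup
BACKUP LOG MyDatabase TO DISK = 'C:\Backups\MyDatabase_log.trn'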

-SQLBill
 
Replication.
-Karl

 
Poor choice of words on my part to say magic. I just wanted to know why the drive for 100 transactions.

Will you be keeping each backup in a separate location? Or will you be overwriting the previous one?

I would say there's far more sense in backing something up every 15 minutes than in writing a ton of code to figure out how to do it every 100 updates. That's creating a maintenance headache! What actual real problem are you solving by doing this?

If you use full recovery model and schedule just a log backup for every 15 minutes, you won't be wasting much. Storage space is cheap enough nowadays that there's no point in conserving some small amount.

Alternatively, there has got to be some other way to determine that some update occurred in the database. Instead of doing a backup every 15 minutes, run some query every 15 minutes (or even an external procedure) which determines whether a backup is necessary. Then the backup fires only if it is.

If you had to build a car which could check its own oil, would you put all sorts of gears and wheels and mechanics attached to each part of the engine, so it could check the oil level every 10000 revolutions of the engine? Or would you just add a module that verified the oil level was correct every X number of minutes, even when the car was just sitting there not running? Which would be less likely to break down? Which would be easier to modify and maintain?
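One hedged way to do that "check first, then back up" poll, assuming the tables carry timestamp/rowversion columns (@@DBTS advances whenever any such column in the database changes; the BackupCheck table is my own invention):
Code:
-- Run this every 15 minutes; only back up if something actually changed.
DECLARE @last binary(8), @now binary(8)
SELECT @last = LastDBTS FROM dbo.BackupCheck WHERE DBName = DB_NAME()
SET @now = CAST(@@DBTS AS binary(8))
IF @last IS NULL OR @now <> @last
BEGIN
    BACKUP LOG MyDatabase TO DISK = 'C:\Backups\MyDatabase_log.trn'
    UPDATE dbo.BackupCheck SET LastDBTS = @now WHERE DBName = DB_NAME()
END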
 
Replication does solve the "keep another copy in case the thing blows up" problem. But it doesn't protect against evil queries, like "DELETE Customers." Now the replicated database is missing all its customers, too.

So, I'm saying that replication is a great solution, but just pointing out one of its weaknesses. In fact, I can see using BOTH methods. Replication to keep a working copy running, transaction log backups to be able to step to any point in time. Unless I don't know what I'm talking about, which is always a distinct possibility.
 