OK, I'm reasonably new to SSIS and I was wondering if you could help.
I have a large database from which I need to pull large amounts of data; the rows easily number in the several millions.
I need to pull the data out, aggregate and transform it, and pump it into a new database. This needs to run as an overnight load, and given the amount of changing data it seems most reasonable to truncate and reload each evening rather than performing a merge.
The problem is that I have 16 of these tables to load, and running each one as a piece of SQL takes about 40 minutes to an hour on the server we have.
First question: as all of these aggregate tables are populated by SQL pulling from the same source tables, am I correct in assuming it's a bad idea to try to run them in parallel?
Secondly, my concern is that if one of these 16 tables fails, I can't afford to restart the whole lot in the morning when I receive the notification, because that would cause an outage of the entire data warehouse for a full day given how long the load takes. That is unacceptable for the business.
Ideally, I'm looking for a solution that runs all the tasks/packages in such a way that the rest still load even if one fails.
Is there a way to get a task to continue on failure, just sending an alert to let me know, so I can run that failed task separately rather than failing the entire package?
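To make the desired behaviour concrete, here is a minimal sketch of the control flow I'm after, written in plain Python rather than SSIS (the table names and the loader function are placeholders, not real objects from my environment): each table load runs independently, one failure doesn't stop the others, and the failures are collected so a single alert can go out at the end.

```python
# Hypothetical sketch: 16 independent truncate-and-reload steps where
# one failure must not abort the remaining loads.

TABLES = [f"agg_table_{i:02d}" for i in range(1, 17)]  # placeholder names

def load_table(table):
    """Placeholder for one truncate-and-reload step (one SSIS task)."""
    print(f"loading {table}")

def run_nightly_load(tables, loader):
    """Run every load; record failures instead of stopping the batch."""
    failures = []
    for table in tables:
        try:
            loader(table)
        except Exception as exc:  # one failure must not stop the rest
            failures.append((table, str(exc)))
    return failures  # alert on these, then rerun only the failed tables

if __name__ == "__main__":
    failed = run_nightly_load(TABLES, load_table)
    if failed:
        print("failed loads:", [name for name, _ in failed])
```

In SSIS terms, I assume this would correspond to making each table load its own task or child package so the others aren't connected to it by a failure path.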