Can't thank you enough, Hans63. Your comment that the nested loop is the problem gave me the confidence to go ahead and play around with my newly acquired knowledge of optimization. Even before I read your latest reply, I had reduced the time from 6 hours to 45 minutes with a hash join, and with the Merge hint to 10...
The problem here is that DBAs think it is the best strategy Oracle can choose in the situation, and no one is working on ways to avoid nested loops. Until now I never bothered to look into the optimizer world, and now I find myself digging into the Oracle performance manual.
Anyway, thanks Hans63 -...
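For anyone finding this thread later, a minimal sketch of the kind of hint that swaps a nested loop for a hash join. The table and column names (tab1, tab2, trans_date) are my own illustrations, not the actual schema from this thread:

```sql
-- Sketch only: force a hash join between the two tables instead of a
-- nested loop. USE_MERGE(t1 t2) would request a sort-merge join instead.
SELECT /*+ USE_HASH(t1 t2) */
       t1.id,
       t2.trans_date
FROM   tab1 t1
JOIN   tab2 t2
  ON   t2.tab1_id = t1.id
WHERE  t2.trans_date BETWEEN DATE '2004-01-01' AND DATE '2004-01-31';
```

Hints like this override the optimizer's join-method choice for the named aliases; they are worth testing case by case rather than applying blindly.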
Thanks to all of you for your suggestions. It looks like we can get extra storage to stage all the tables as they are, and I would be able to use some of these suggestions.
Hans63 - I checked my query, and it is using the proper index and is being driven by the table that has the filter on dates. Still it...
Thanks Hans63,
How long? For one month's worth of data, where I moved just a few fields from all the tables (not all the fields that I would need in the actual load), the mapping took 5 and a half hours to finish and moved 5 million rows. I used date BETWEEN d1 AND d2 on tab2 to get one month. I will try...
In my company we are looking for the best strategy for our initial load to the data warehouse. I would appreciate it if people could share the different strategies they have used to accomplish this goal. We are moving 3 years' worth of data, and we have something like this:
Tab1 - 21 Million
Tab2 - 28...
We are in the process of designing a data mart with a nightly-load ETL that would be developed in Informatica. Management wants us to think about the possibility of making it a real-time data warehouse. Having experience in batch ETL, and already having the ETL architecture to support it, I am finding it...
Are you sure? I think it works even for this one. No matter how many inactive records there are in the period, there will be only one with an end date greater than the range end date, right?
I am sorry if I am missing something. I hope you find a better solution than this. Good luck.
I do not think the whole design is applicable to you, just the part where it says
"A crucial piece of the design is the second date/time entry: transaction end date/time"
I guess it is the date range that is creating the confusion.
I am trying to understand where the problem with the date range is..
Id...
Flurk,
Sorry, I don't see the problem with the date range..
For historical analysis you will have to ignore the active flag; you will, in effect, construct the active flag on the fly from the date ranges, by comparing the query's date range appropriately with the status date and end date.
Since you are updating...
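To make the "construct the active flag on the fly" idea concrete, here is a sketch of a point-in-time query. The table and column names (sale_status, status_date, end_date) and the :as_of bind variable are illustrative, not from the thread:

```sql
-- Sketch: which status was active for each sale as of a given moment?
-- A row is "active" at :as_of if it started on or before that moment
-- and either has no end date yet or ended after it.
SELECT s.sale_id,
       s.status
FROM   sale_status s
WHERE  s.status_date <= :as_of
AND    (s.end_date IS NULL OR s.end_date > :as_of);
```

Because each sale has at most one row whose range covers any given moment, this returns exactly one status per sale, which is what makes the stored active flag unnecessary for historical analysis.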
Did you consider the technique of having an active flag? That is, along with the status date, have an
end date for when that status was active. As soon as the sale goes through a status again, the previous status
becomes inactive. This way a sale will have only one status active at a time.
Id Status...
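A minimal sketch of the end-dating step described above. Again, the table and column names and the bind variables are assumptions for illustration only:

```sql
-- Sketch: when a sale enters a new status, close out the currently
-- active row (the one with no end date) and insert the new one.
UPDATE sale_status
SET    end_date = SYSDATE
WHERE  sale_id  = :sale_id
AND    end_date IS NULL;   -- only the active row has a NULL end date

INSERT INTO sale_status (sale_id, status, status_date, end_date)
VALUES (:sale_id, :new_status, SYSDATE, NULL);  -- new active status
```

Both statements would run in one transaction so the sale never has zero or two active rows.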
Hans63,
Thanks for describing your "ODS". After designing my data mart and understanding the source systems, my design instincts were telling me to have something like what you call an "ODS" to feed my data mart, but I was a little doubtful. When I started researching to learn the best practices in...
Thanks Riverguy and Hans63,
I think the key to your ETL architecture is the "ODS". However, I just want to know what you mean by "ODS" in this context. I am sorry, but "ODS" really confuses me, and there are 1001 definitions out there.
Does the "ODS" as you mentioned above have data from all the sources...
Hans63, have a star for your reading suggestions and for letting the secret of your DW expertise out. :-)
Thanks for sharing your staging strategy too. This makes things a little bit clearer. Can you say something about staging and delivering dimensions? How do you handle them in your environment...
I am designing the ETL and am really confused about the whole staging architecture. Please provide whatever links you can; I am ready to read anything. My questions are as follows.
(1)
Is the staging area between the source and the data mart (the "back room", as they call it) supposed to be a persistent...
Thanks Hans63, what you have described is exactly the situation I have in hand.
Sizes
-------------
We should have around 40 million records in inventory transactions in a year. We will have 2 years of data to begin with. Cost transactions can be roughly 5-6 times that, as one transaction can...