The modeling of the schema is not related to the input format of the incoming data. Most ETL tools can handle just about any format that is out there: fixed-length mainframe files, delimited ASCII files, XML files, etc...
The schema for the sales history is one of the most common and well...
The purpose of a bridge table is to resolve the many-to-many relationship between fact and dimension. Given that scenario, you can either report out at the lowest level or the highest level. Assigning a weighting factor (based on the number of occurrences of values in the group on the bridge table)...
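A minimal sketch of the weighting-factor idea: each member of a bridge-table group gets an equal share so the weights within a group sum to 1.0 and facts are not double-counted when rolled up. The group and member names here are hypothetical, not from the original post.

```python
from collections import Counter

# Hypothetical bridge rows: (group_key, member_key) pairs for a
# many-to-many relationship, e.g. multiple customers on one account.
bridge = [
    ("group1", "cust_a"),
    ("group1", "cust_b"),
    ("group2", "cust_c"),
]

# Count how many members each group has.
group_sizes = Counter(group for group, _ in bridge)

# Weight = 1 / group size, so weights within a group sum to 1.0.
weighted_bridge = [
    (group, member, 1.0 / group_sizes[group]) for group, member in bridge
]
```

Multiplying a fact's measure by this weight before summing across members yields correctly allocated totals at the lowest level.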
I guess I was reading too much into your problem. I thought you were talking about handling retroactive changes. None of the methods described above will help you with that, because you will have already linked your fact records to the image of the dimension at the point in time that the fact...
Since you are storing the attribute as a degenerate key in the fact table (and assuming you want this change to involve the least amount of work), adding a date of purge will serve your purpose, which is to be able to see the history of an order by constraining on dates >= the first date posted to...
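A small sketch of the date-constrained history query described above, using an in-memory SQLite table. The table and column names (order_fact, date_purged) are assumptions for illustration only.

```python
import sqlite3

# Hypothetical fact table with a degenerate order key and a purge date.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE order_fact (
        order_id    TEXT,
        post_date   TEXT,
        date_purged TEXT   -- NULL until the order is purged
    )
""")
conn.executemany(
    "INSERT INTO order_fact VALUES (?, ?, ?)",
    [("A1", "2004-01-05", None),
     ("A1", "2004-02-10", None),
     ("A2", "2004-01-07", "2004-03-01")],
)

# History of order A1: all rows on or after its first posting date.
history = conn.execute("""
    SELECT post_date
    FROM order_fact
    WHERE order_id = 'A1'
      AND post_date >= (SELECT MIN(post_date)
                        FROM order_fact
                        WHERE order_id = 'A1')
    ORDER BY post_date
""").fetchall()
```

The purge date then lets you filter out (or include) rows whose orders have since been purged without deleting history.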
This is an interesting scenario, since all sessions would be working off one named persistent cache. I have not seen a document from Informatica that covers this. We use named persistent caches but have not needed to recache during the process. I would suggest trying it out with 2 or 3...
We have used Vality and Trillium, and of the two, Trillium is the product of choice. Data cleansing means many things to many people; these products are particularly good for name and address standardization and deduping. Some ETL tools now have cleansing functionality built in, such as First...
Moving to an incremental load strategy will require analysis of the dimensional attributes to decide if and how to track the changes. Some attribute changes matter and others may not. You also need to look at the timestamping of the rows you capture to ensure there is a continuous...
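The timestamp-based capture mentioned above can be sketched as a simple high-water-mark extract: pull only rows modified since the previous load, then advance the watermark. The column names (`id`, `last_modified`) are assumptions, not from the original post.

```python
from datetime import datetime

# Watermark recorded at the end of the previous successful load.
last_load = datetime(2004, 6, 1)

# Source rows as they might arrive from the extract query.
source_rows = [
    {"id": 1, "last_modified": datetime(2004, 5, 20)},
    {"id": 2, "last_modified": datetime(2004, 6, 3)},
]

# Incremental capture: only rows touched after the watermark.
changed = [r for r in source_rows if r["last_modified"] > last_load]

# Advance the watermark only after the load commits, so that a failed
# run leaves no gap in the continuous timestamp coverage.
new_watermark = max(r["last_modified"] for r in source_rows)
```

Whether a changed row then creates a new dimension version or overwrites in place depends on which attributes you decided to track.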
You should read the data model resource books by Len Silverston. They show an overall model that could be used to build a third-normal-form (3NF) enterprise data warehouse layer. If you are looking to build an enterprise data warehouse by building marts and snapping them together à la Kimball, you...
I need to produce both standard reports and ad-hoc queries that will contain a calculated field based on groups of records selected by date range and other criteria. The calculation for the field is complicated and will be implemented via a stored procedure (written in Java in a UDB...