
DecisionStream


tpm (MIS), Mar 2, 2000
I'd like to hear feedback from anyone using DecisionStream. We're considering the product for a small, single-source data mart project. Any info on experiences with the product would be appreciated.
 

We've used Decision Stream to process about 30 million rows of data in Oracle. The build process was run from a UNIX script. We created 8 to 9 dimension tables.

We use the output tables in Transformer to create Power Cubes.

We've had the product for a year.

Let me know what you think about DS.
Thanks
 
JUF,

That sounds a lot like a project we are considering DecisionStream for, with an Oracle back-end and potentially a high number of rows (clickstream and customer data). How did you like the Cognos tools? Would you recommend them?

dbjstein
 
I have used DecisionStream on a large DM project with SQL Server 2000 and Windows 2000 servers.

It has some very nice features and some limitations you need to be careful about, as does any other tool on the market.

Be sure to provide their Technical Pre-sales people with the following:

1. Largest expected size of your input files in GB.
2. Anticipated sizes of your dimensions and facts in rows.
3. Complexity of your transformations.
4. DBMS platform.
5. HW Platform(s).

Try to get written verification that the product can handle the above.

Chuck
 
My company specializes in building Data Warehouses using Cognos Decision Stream. As a whole this is a good product. It was developed around Ralph Kimball's star schema methodology. It has a nice feature for handling surrogate keys and Type II slowly changing dimensions.
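
For anyone not familiar with the terms: a surrogate key is a generated key the fact table points to instead of the source system's business key, and a Type II dimension keeps a new row for each change so history is preserved. Below is a rough, Oracle-flavoured sketch of the pattern; the table, column, and sequence names (CUST_DIM, CUST_SK, CUST_SEQ, and so on) are made up for illustration and are not what Decision Stream generates.

    -- Hypothetical Type II customer dimension: fact rows reference the
    -- surrogate key (CUST_SK); CUST_ID is the natural key from the source.
    CREATE TABLE CUST_DIM (
        CUST_SK     NUMBER       NOT NULL,   -- surrogate key, e.g. from a sequence
        CUST_ID     VARCHAR2(20) NOT NULL,   -- business key from the source system
        CUST_NAME   VARCHAR2(60),
        REGION      VARCHAR2(30),
        EFF_DATE    DATE         NOT NULL,   -- date this version became effective
        END_DATE    DATE,                    -- NULL = current version
        CURRENT_FLG CHAR(1)      NOT NULL    -- 'Y' for the current version
    );

    -- When a tracked attribute (say REGION) changes, the old row is closed off
    -- and a new row with a new surrogate key is inserted, preserving history.
    UPDATE CUST_DIM
       SET END_DATE = SYSDATE, CURRENT_FLG = 'N'
     WHERE CUST_ID = '1001' AND CURRENT_FLG = 'Y';

    INSERT INTO CUST_DIM (CUST_SK, CUST_ID, CUST_NAME, REGION, EFF_DATE, END_DATE, CURRENT_FLG)
    VALUES (CUST_SEQ.NEXTVAL, '1001', 'Acme Ltd', 'EMEA', SYSDATE, NULL, 'Y');

The nice feature mentioned above is that Decision Stream takes care of this kind of bookkeeping for you rather than leaving you to hand-code it.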

Stan
 
Hi, I'm still in the process of studying Decision Stream, but I already have a business problem to solve. We need to generate a report that retrieves data from an Oracle database (snowflake schema). The report lists all countries grouped by region, along with their GDP growth rate. To get the growth rate, I need each country's current-year GDP and previous-year GDP. How do I derive the previous year's GDP? This may be simple for experts, but as a Decision Stream beginner I'm at a loss. I could easily have done this with a SQL statement, but how do I implement it in Decision Stream? Please help. Thanks!
 
I'm a little confused. Are you writing a report, or delivering data to a table in which you are going to pre-calculate and store the GDP growth? DecisionStream is an ETL tool, so it normally delivers data to a target.

Anyway, if you can easily write a SQL report to calculate the result, use that report as your data source in a DS build, then simply output the result.

You can perform calculations in a data source (i.e. SQL query that feeds the build) or within the build as a derivation. The choice is yours.
Good luck. :)
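
To make that concrete, here is the sort of source query you could feed the build, assuming a fact table keyed by country and year. The table and column names (GDP_FACT, COUNTRY_DIM, and so on) are guesses at your snowflake schema, so adjust to suit; the idea is simply a self-join of the GDP fact to the prior year's row.

    -- Illustrative source query: join each country's GDP row to the prior
    -- year's row so both values (and the growth rate) come back in one pass.
    SELECT c.REGION_NAME,
           c.COUNTRY_NAME,
           cur.GDP_YEAR,
           cur.GDP_AMOUNT AS GDP_CURRENT,
           prv.GDP_AMOUNT AS GDP_PREVIOUS,
           (cur.GDP_AMOUNT - prv.GDP_AMOUNT) / prv.GDP_AMOUNT AS GDP_GROWTH_RATE
      FROM GDP_FACT    cur,
           GDP_FACT    prv,
           COUNTRY_DIM c
     WHERE prv.COUNTRY_KEY = cur.COUNTRY_KEY
       AND prv.GDP_YEAR    = cur.GDP_YEAR - 1
       AND c.COUNTRY_KEY   = cur.COUNTRY_KEY
     ORDER BY c.REGION_NAME, c.COUNTRY_NAME, cur.GDP_YEAR;

You could just as easily return only the two GDP columns from the query and do the division as a derivation inside the fact build; either way works.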
 
Sorry, I guess I didn't explain it very well. Anyway, the ultimate objective is to create a report. However, we feel that Impromptu can't perform the required calculation, so we decided to stage the data first -- that is why Decision Stream came into the picture.

We're doing the staging now. We're creating a fact build for this GDP data. We have other fact builds to do, and since we're new to Decision Stream it's taking quite a while. Any tips on the product? Any relevant sites? :)

Thanks a lot for responding to my inquiry.
 
Well, hopefully you are taking product training. I've generally heard that DecisionStream is easier than most other ETL tools to learn, but ETL by its nature is non-trivial.

Assuming you are a supported Cognos customer, try the web site. There are lots of answers to specific questions there, but for general stuff the course is probably the best place. I know the course provides lots of "best practice" material too.

Also, read Ralph Kimball's books. DS was created with his design philosophy in mind (but is not limited to it).

I've got lots of tips on the product, but not enough time or space here. :) If you have specific questions, just ask and I'm sure everyone will chip in where they can.

Good luck
:)
 
Tip 1. Use fact builds (a.k.a. databuilds) to stage your data.
Tip 2. Use a naming standard so you can make sense of your Decision Stream databuilds.
Tip 3. Write your SQL outside of Decision Stream first, then use the fact build wizard, since it does the mapping for you (see the sketch after this list).
Tip 4. Save often.
Tip 5. Buy the DSCatalogTools utility ($50) to move data from your development Decision Stream catalog to test or production. This product will pay for itself in one day.
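
As a small illustration of Tip 3, this is the kind of standalone query you might get working in SQL*Plus (or whatever query tool you use) before pointing the fact build wizard at it. The ORDERS/ORDER_LINES tables and their columns are invented for the example; the point is only that the wizard can then pick up the query's columns and do the mapping for you.

    -- Hypothetical staging query, tested on its own first: one row per order
    -- line, carrying the natural keys the build will later map to dimensions.
    SELECT o.ORDER_DATE,
           o.CUSTOMER_ID,
           l.PRODUCT_ID,
           l.QUANTITY,
           l.QUANTITY * l.UNIT_PRICE AS LINE_AMOUNT
      FROM ORDERS o,
           ORDER_LINES l
     WHERE l.ORDER_ID = o.ORDER_ID
       AND o.ORDER_DATE >= TO_DATE('2002-01-01', 'YYYY-MM-DD');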
 
I am also trying to use Decision Stream, to develop a regional data warehouse. I have tried to study the user manual, but it seems quite difficult.

Does anyone know about the Decision Stream training? Is it useful?

Thanks!
 
There is a lot of functionality in the Decision Stream product. I would suggest you read Ralph Kimball's Data Warehouse Toolkit book to understand how the tool is used to build data warehouses.
 
I would highly recommend the DecisionStream training course. It is a wonderful and powerful product, but it has LOTS of options. Even if you are very familiar with data warehousing and other ETL tools, I'd still recommend taking the course.

Good luck,
Matt :)
 
Hi Matt! Oh, it's been a month; I got so busy I forgot to check Tek-Tips. We were able to use Decision Stream to create fact builds (fact tables which served as the data source for our reports).

We're actually distributors of Cognos products, but unfortunately I didn't have the chance to attend the DS product training. The one who attended just resigned. :( Anyway, I'm still not satisfied with how we used Decision Stream, because we basically worked around our problem with "SQL scripting".

 
Hi Chris,
Well, as soon as you get the chance, take the class. One of the things people like best about the product is that they don't have to do much "SQL scripting". I often tell folks that if they are writing a lot of code, they probably aren't using the product as effectively as they could.

Good luck,
Matt :)
 