
Search results for query: *

  1. vmcburney

    DataStage Status

    Bocaburger is spot on: under the 7.5.1 release, DataStage parallel jobs can use the DataStage TX Map stage. It has options such as Target Map Directory, Create map from input and output links, select map, and map files. A DataStage job can then receive or load flattened data to and from this...
  2. vmcburney

    Multiple ETL tools writing to DW

    It helps if you can schedule the jobs of both tools through the one scheduling tool, this lets you build some interdependence between the two loads. For example ETL tool 1 loads inventory tables and ETL tool 2 has to wait for those to finish before running purchasing loads. Automatic...
  3. vmcburney

    Transformation limitation

    I've had a problem with the server edition aggregation stage in a much older release. In one project I abandoned it entirely and pushed my data into universe tables and used group by select statements from those tables. It may be that on unsorted data the aggregation stage has to wait for...
  4. vmcburney

    Run Datastage Project Job reports from code

    If you are talking about job documentation, the best output is a html report with a context sensitive bitmap of the job and html job properties. This can be generated with a batch script on your client PC to repetitively call up the DataStage Designer, a sample script has been uploaded to...
  5. vmcburney

    How do I generate DataStage job reports?

    This is a large topic and you are well advised to search through the forum archives for threads on the options discussed below. DataStage Hawk combines the DataStage and MetaStage repositories and products into one so most of these options may be redundant in that release. Process metadata...
  6. vmcburney

    Transformation limitation

    If your statement is that big you may be better off moving it into a routine where it can be properly tested for all combinations. It will also let you switch to a case statement, which may be easier to manage. Regards, Vincent. An Expert's Guide to WebSphere Information Integration...
  7. vmcburney

    Routine that appends to SEQ File

    A transformer can turn a single input row into multiple output rows by putting end-of-line characters into a field, e.g. char(10). That way the routine can return to the transformer a string that contains an array of values with char(10) between the records. The transformer would output it...
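
The pack-and-split idea above can be sketched outside DataStage. This is an illustrative Python sketch, not DataStage BASIC; the function names are made up for the example:

```python
# Hypothetical sketch of the technique described above: a routine packs
# several values into one string with char(10) (newline) separators, and
# the downstream logic splits the string back into individual rows.

def routine_result(values):
    """Mimic a DataStage routine returning several values as one string."""
    return "\n".join(values)          # char(10) between the records

def split_into_rows(packed):
    """Mimic the transformer splitting the packed string into output rows."""
    return packed.split("\n")

packed = routine_result(["row1", "row2", "row3"])
rows = split_into_rows(packed)        # one input row becomes several outputs
```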
  8. vmcburney

    OLE/XML manipulation of jobs ?

    DataStage does support dynamic link libraries for transformation routines, though they are rarely used. The suite is moving towards SOA, as is Microsoft, so you will see increasing interaction between real-time DataStage services and the Office suite. DataStage can read from or write to Excel...
  9. vmcburney

    Call QualityStage jobs from DataStage

    Does your DataStage login user have permissions to the QualityStage project folders? The QualityStage plugin is the best way to run a QualityStage job as it lets DataStage input and output data to QualityStage jobs. You can try to run QualityStage jobs as shell executes from a DataStage...
  10. vmcburney

    Parallel Routine Compiles, but job fails

    Sorry, can't help you. There are probably not a lot of enterprise edition coders on this forum, try over on www.dsxchange.com instead.
  11. vmcburney

    Job Master (What is?) (How can I program it using DataStage 6.01)

    Any third-party scheduling tool, such as JobMaster, calls a DataStage job via a script that runs the dsjob program. Have a look at the Server Job Developer's Guide for the section "Command Line Interface". It provides the documentation for running DataStage jobs from the command line or from a...
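
As a hedged illustration of the approach above, a scheduler wrapper might assemble the dsjob invocation like this (the project and job names are placeholders; -run and -jobstatus are the options documented in the Server Job Developer's Guide):

```python
# Sketch of how a third-party scheduler script could launch a DataStage job.
# The command is only built here, not executed, since dsjob needs a live
# DataStage server; a real wrapper would pass it to subprocess.call() and
# branch on the returned exit status.

def build_dsjob_cmd(project, job):
    """Assemble the dsjob command line for running a job and waiting on it."""
    # -run starts the job; -jobstatus makes dsjob wait and reflect the
    # job's finishing status in its exit code, which a scheduler can test.
    return ["dsjob", "-run", "-jobstatus", project, job]

cmd = build_dsjob_cmd("MyProject", "LoadInventory")
```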
  12. vmcburney

    c routine

    You can use BASIC routines in a BASIC Transformer in a parallel job. If you don't see the BASIC Transformer in your list of stage shortcuts, look for it in the repository window by displaying stage types and browsing through the parallel stage types. This lets you move the functionality of...
  13. vmcburney

    How to get the date of a file

    You can do this in an operating system script, such as a Unix shell script: DataStage would pass the script the date to be checked, and the script would return a success or failure status that tells DataStage whether to proceed. If you wanted to code it inside DataStage you could pass the date...
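
A minimal sketch of the operating-system-script approach, in Python rather than shell (the file path and cutoff date are placeholders):

```python
import datetime
import os

# Illustrative sketch: succeed only if the file was modified on or after
# the cutoff date the calling job passes in. A real script would end with
# sys.exit(0) on success and sys.exit(1) on failure so the DataStage job
# sequence can branch on the exit status.

def file_is_fresh(path, cutoff):
    """Return True if the file's modification date is at or after cutoff."""
    mtime = datetime.datetime.fromtimestamp(os.path.getmtime(path))
    return mtime >= cutoff
```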
  14. vmcburney

    Best/Easiest way to execute a SQL Command

    With Oracle I find it easy to run the DSExecute command from a DataStage routine to execute a SQLPLUS command. You build the command by calling sqlplus followed by the SQL to be executed. It can then be called up from sequence jobs, and you can put the command status and command output into the...
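
A sketch of how such a command string might be assembled before being handed to DSExecute (the connection string and SQL here are placeholders; -s is SQL*Plus silent mode, which suppresses the banner so the captured output stays clean):

```python
# Hypothetical helper: build the shell command that pipes a SQL statement
# into sqlplus. A DataStage routine would pass the resulting string to
# DSExecute and capture the command status and output.

def build_sqlplus_cmd(connect, sql):
    """Return a shell command that runs one SQL statement via SQL*Plus."""
    return 'echo "{0}" | sqlplus -s {1}'.format(sql, connect)

cmd = build_sqlplus_cmd("scott/tiger@orcl", "TRUNCATE TABLE stage_tmp;")
```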
  15. vmcburney

    Help with a Hash-file

    The hash file stage only accepts exact matches on the key fields, making a < or BETWEEN clause impossible. You can read the hash file using a Universe database stage instead, as a hash file is a Universe table. The Universe stage does accept < clauses. Search the forum at...
  16. vmcburney

    Unable to retrieve column from link (Trying to load file)

    This usually means your input stage is trying to read a row but has already used up all the characters in that row before all the columns have been filled: your fixed-width lengths are out of whack. As for the blank row problem, the easiest way to handle it is to get rid of the blank rows! Can...
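
To see why the error occurs, here is an illustrative Python sketch with made-up column widths and sample data: a row shorter than the sum of the declared widths fails the same way the stage does, and blank rows are filtered out before parsing:

```python
# Made-up fixed-width layout for illustration: part code (4), name (6),
# quantity (3). If a row has fewer characters than the widths add up to,
# the parser runs out of characters before all columns are filled.
WIDTHS = [4, 6, 3]

def parse_fixed_width(row, widths=WIDTHS):
    """Slice one fixed-width row into its columns, or fail if it is short."""
    if len(row) < sum(widths):
        raise ValueError("row shorter than declared column widths")
    cols, pos = [], 0
    for w in widths:
        cols.append(row[pos:pos + w])
        pos += w
    return cols

rows = ["A001Widget010", "", "A002Gasket005"]
parsed = [parse_fixed_width(r) for r in rows if r.strip()]  # drop blank rows
```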
  17. vmcburney

    How to call an Oracle stored procedure

    Try user-defined SQL with something like the following: call ProcedureName (:1, :2). Values :1 and :2 get bound to the stage input columns.
  18. vmcburney

    get all jobs from a workflow

    We put all jobs belonging to the same workflow into a single Category folder within the DataStage repository. We then use the ETLSTATS package to generate operational metadata on that folder after the workflow has been run. This is a set of passive metadata collection jobs that retrieve stats...
  19. vmcburney

    Question Re: Using an Excel file as Input to DataStage

    Use a standard Microsoft Excel ODBC driver. When importing the metadata into DataStage make sure the "System Tables" check box is checked as the driver reports each worksheet as a system table. Another option is to output the Excel file into a delimited format file.
  20. vmcburney

    Migrate or transform from powercenter to datastage

    Good question. There is no official tool that will do it, and the two products have completely different repositories. I think there is an unofficial tool that Ascential developed when it migrated a large number of PeopleSoft EPM clients across from Informatica to DataStage Server Edition. They...
