What will happen when I give a connection value (for the target) in the Mapping tab and a different connection value for the same target in my parameter file?
And what will the output be if the parameter file is empty, where I have defined a mapping parameter at the mapping level and no initial value is defined?
I got the answer from the post by MacLeod72 on running totals:
Port 1: KEY Input
Port 2: VAL Input
Port 3: V_TOTAL Variable Exp: IIF(KEY != V_KEY, VAL, V_TOTAL + VAL)
Port 4: V_KEY Variable Exp: KEY
Port 5: RUNNING_TOTAL Output Exp: V_TOTAL
Thanks to MacLeod72
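For readers outside Informatica, the variable-port logic above can be sketched in Python. The key point is evaluation order: V_TOTAL is computed before V_KEY on each row, so V_KEY still holds the previous row's key when the comparison runs (input is assumed sorted by key, as the mapping expects):

```python
def running_totals(rows):
    """Simulate the variable-port logic: KEY, VAL in; RUNNING_TOTAL out.

    rows: iterable of (key, val) pairs, assumed sorted by key.
    """
    v_total = 0
    v_key = None
    for key, val in rows:
        # Port 3 (V_TOTAL): IIF(KEY != V_KEY, VAL, V_TOTAL + VAL)
        v_total = val if key != v_key else v_total + val
        # Port 4 (V_KEY): remember this row's key for the next row
        v_key = key
        # Port 5 (RUNNING_TOTAL): output V_TOTAL
        yield key, val, v_total

print(list(running_totals([("A", 10), ("A", 20), ("B", 5)])))
# [('A', 10, 10), ('A', 20, 30), ('B', 5, 5)]
```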
Source
No Al_No Amt
X1 1 20
X1 2 30
X1 3 40
X1 4 50
X2 1 60
X2 2 20
I have a source as above.
Data that should be populated in the target:
No Al_No Amt Prev_buffer_Amt
X1 1 20 0 (should be)
X1 2 30 20
X1 3 40 50 (sum...
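The expected column is a running sum of the *previous* amounts that resets on each new No group, which can be sketched in Python (column names are taken from the post; input is assumed sorted by No and Al_No):

```python
def prev_buffer(rows):
    """For each row, emit the sum of the previous Amt values in the same No group.

    rows: (no, al_no, amt) tuples, assumed sorted by No and Al_No.
    """
    out = []
    prev_no, running = None, 0
    for no, al_no, amt in rows:
        if no != prev_no:
            running = 0              # new No group: buffer restarts at 0
        out.append((no, al_no, amt, running))
        running += amt               # accumulate for the next row
        prev_no = no
    return out

source = [("X1", 1, 20), ("X1", 2, 30), ("X1", 3, 40),
          ("X1", 4, 50), ("X2", 1, 60), ("X2", 2, 20)]
for row in prev_buffer(source):
    print(row)
# ('X1', 1, 20, 0), ('X1', 2, 30, 20), ('X1', 3, 40, 50),
# ('X1', 4, 50, 90), ('X2', 1, 60, 0), ('X2', 2, 20, 60)
```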
I have a SQL Server source that has no keys defined anywhere in the database. I have to migrate that data to Oracle.
The source master table has more than 52 lac (5.2 million) records, and reading the data takes hours as there is no key defined on the source.
I think partitioning the source would be the best option to read the...
The target table (Oracle 10g) has a datatype Addr in which six columns are embedded.
Addr contains (address1, address2, city_name, district, state, pincode).
We have to migrate data from SQL Server, where the above are six different columns of varchar datatype, with pincode being numeric.
When...
You can refer to my thread on converting signed overpunch values, which may be helpful for you.
Also refer to http://en.wikipedia.org/wiki/Signed_overpunch
The site above may also help you.
And the thread is " converting .c3 values.......
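As a rough illustration of what the Wikipedia article describes, here is a decoder for the conventional signed-overpunch scheme, where the last character carries both the final digit and the sign ('{' through 'I' for positive, '}' through 'R' for negative):

```python
# Conventional signed-overpunch map: the last character encodes the final
# digit plus the sign.
POSITIVE = {"{": 0, **{chr(ord("A") + i): i + 1 for i in range(9)}}  # A=+1 .. I=+9
NEGATIVE = {"}": 0, **{chr(ord("J") + i): i + 1 for i in range(9)}}  # J=-1 .. R=-9

def decode_overpunch(s):
    """Decode a signed-overpunch numeric string, e.g. '12C' -> 123, '12L' -> -123."""
    last = s[-1].upper()
    if last in POSITIVE:
        return int(s[:-1] + str(POSITIVE[last]))
    if last in NEGATIVE:
        return -int(s[:-1] + str(NEGATIVE[last]))
    return int(s)  # plain digits carry no overpunch

print(decode_overpunch("12C"), decode_overpunch("12L"))  # 123 -123
```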
We have got a new project in which we have to use Informatica 8.1 to migrate all the data from SQL Server to Oracle.
So, for migrating the data, should we import each and every table into Informatica and build separate mappings for all the tables present in SQL Server?
What about...
I have filtered the maxout and Lifetime amounts in an Expression transformation, record by record, by applying the condition
maxout!=0 and maxout!=-maxout
and similar conditions for Lifetime.
These are now passed to an Aggregator transformation,
where I need to check the max date and max claimline and select maxout at the point where date and claimline are max...
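The Aggregator step can be sketched in Python: within a group, pick the record with the highest (date, claimline) and take its maxout. Field names are illustrative, based on the post; dates are ISO strings so they compare correctly as text:

```python
def maxout_at_latest(records):
    """Return maxout from the record with the highest (date, claimline).

    records: dicts with 'date', 'claimline' and 'maxout' keys
    (hypothetical field names based on the post).
    """
    latest = max(records, key=lambda r: (r["date"], r["claimline"]))
    return latest["maxout"]

claims = [
    {"date": "2011-01-15", "claimline": 1, "maxout": 100},
    {"date": "2011-03-01", "claimline": 1, "maxout": 175},
    {"date": "2011-03-01", "claimline": 2, "maxout": 250},
]
print(maxout_at_latest(claims))  # 250
```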
When you are working with a fixed-width flat file, check the line endings.
If your line endings are DOS line endings, then add an extra column CRLF with precision 1 to your flat-file source and to the Source Qualifier, and don't pass CRLF any further.
I had a similar problem when working with fixed width...
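Outside Informatica, the same fix amounts to discarding the stray carriage return left behind when a reader splits DOS-ended lines on the newline alone; a minimal sketch:

```python
def parse_fixed_width(line, widths):
    """Split a fixed-width record into fields, discarding a trailing CR.

    A DOS line ends with CR+LF; a reader that splits on LF alone leaves
    the CR glued to the last field, which is what the extra 1-character
    column in the flat-file definition soaks up.
    """
    line = line.rstrip("\r")
    fields, pos = [], 0
    for width in widths:
        fields.append(line[pos:pos + width])
        pos += width
    return fields

print(parse_fixed_width("AB123\r", [2, 3]))  # ['AB', '123']
```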
IIF(instr(TO_CHAR(DeductibleAmt_V0),'.')>0,TO_CHAR(DeductibleAmt_V0*100),TO_CHAR(DeductibleAmt_V0))
I have used the above expression,
and amounts like 160150 are displayed correctly, but the value 4352 is supposed to be 435200,
yet it is displayed as 4352 only.
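The likely cause is that the expression scales the amount only when the string contains a decimal point, so whole-number amounts like 4352 fall through unchanged. A Python sketch of the same conditional logic shows the gap, alongside an unconditional version:

```python
def scale_amount(s):
    """Mimic the posted expression: scale by 100 only when a '.' is present."""
    if "." in s:                       # instr(TO_CHAR(x), '.') > 0
        return str(int(round(float(s) * 100)))
    return s                           # whole amounts fall through unscaled: the bug

def scale_amount_always(s):
    """Scale unconditionally so whole-dollar amounts are handled too."""
    return str(int(round(float(s) * 100)))

print(scale_amount("1601.50"))        # '160150' -- looks right
print(scale_amount("4352"))           # '4352'   -- expected '435200'
print(scale_amount_always("4352"))    # '435200'
```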
My input is to add two amounts; the datatype is decimal with precision 19 and scale 4 for both amounts.
The output should be length 9, left-padded with zeroes.
Example:
benededuct (benededuct:Double:): "1500.000000000000"
maxoutaccrual (maxoutaccrual:Double:): "3600.000000000000"...
Input precision is 19 and scale is 4.
I am taking it as precision 9 and scale 2.
The output then comes as, say, 1239.9000 or 13456.000, in which case I should get the numbers 000123990 and 001345600.
Including all digits, it takes the number up to length 9.
In this case how can we multiply...
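Assuming the goal is to shift a scale-2 amount into whole cents and left-pad to 9 characters, a sketch using Python's Decimal (to avoid float rounding) would be:

```python
from decimal import Decimal

def to_padded(amount, width=9):
    """Shift a scale-2 amount into whole cents and left-pad with zeros.

    Decimal avoids binary-float rounding on values like 1239.90.
    """
    cents = int(Decimal(str(amount)).scaleb(2))  # 1239.90 -> 123990
    return str(cents).zfill(width)

print(to_padded("1239.9000"))  # 000123990
print(to_padded("13456.00"))   # 001345600
```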
I have an Amount field with values such as
148.90, 148.99
The requirement is to remove the decimal and left-pad with zeros; precision is 9.
The value 148.99 is converted correctly as 000014899, but 148.90 is displayed as 000001489, while the testers claim it should be
000014890
When we convert...
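The testers are right, and the problem is string-based decimal removal: when 148.90 is converted to a string, the trailing zero is dropped ('148.9'), so deleting the '.' yields 1489 instead of 14890. Scaling numerically before formatting avoids this, as a quick Python check shows:

```python
# str(148.90) drops the trailing zero, so removing the '.' loses a digit:
bad = str(148.90).replace(".", "").zfill(9)
print(bad)   # 000001489

# Scale numerically first, then format; the cents digit survives:
good = str(int(round(148.90 * 100))).zfill(9)
print(good)  # 000014890
```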
I have defined my ports as you mentioned, but it still gives the same result.
I have a group by on enrollid and accumid.
Can I have a sum condition like this:
sum(accumvalue1, count(accumid)>1)? If we use count on accumid, will it count based on the group-by ports (enrollid, accumid) or will it group the entire...
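In an Aggregator, aggregate functions are evaluated per group defined by the group-by ports, so count(accumid) would be the count within each (enrollid, accumid) group rather than over the whole input. The intended logic, summing accumvalue1 only for groups with more than one row, can be sketched in Python:

```python
from collections import defaultdict

def conditional_group_sum(rows):
    """Sum accumvalue1 per (enrollid, accumid) group, keeping only groups
    with more than one row.

    rows: (enrollid, accumid, accumvalue1) tuples.
    """
    groups = defaultdict(list)
    for enrollid, accumid, value in rows:
        groups[(enrollid, accumid)].append(value)
    # the count check runs per group-by key, not over the whole input
    return {key: sum(vals) for key, vals in groups.items() if len(vals) > 1}

rows = [("E1", "A1", 10), ("E1", "A1", 20), ("E1", "A2", 5), ("E2", "A1", 7)]
print(conditional_group_sum(rows))  # {('E1', 'A1'): 30}
```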