
Insert Into Production Table from Temp Table - Exclude Duplicate Records 3

Status
Not open for further replies.

BxWill (MIS)
I perform a monthly insert of multiple text files into a SQL Server 2012 database.

I bulk insert the text files into the respective temporary table and then use INSERT INTO to load from the temporary table into the production table after performing various validations.

The text files have duplicate records that I want to exclude from being loaded from the temporary table into the production table.

Duplicate records are defined as any records in the text file that have the same value for [JurisdictionCode], [AccountNo], [SequenceNumber], and [SaleDate]. Per review, there are relatively few duplicate records - approximately 3% of the total records in any given text file. (Thought - why let this relatively small number of records prevent the establishment of the composite primary key constraint? Load the valid data and keep moving!)

By removing the duplicate records, I will be able to create a primary key constraint based on the aforementioned four fields.

The plan is to load all of the valid data, then investigate the invalid records and the records with duplicate data later.

Using the code below, I have not been able to successfully load the data from the temporary table into the production table while excluding the duplicate records.

There are 0 records in the query result.

I am currently troubleshooting but have not been able to locate the reason why there are no records in the query result.

[b]
What modifications are needed to load the data from the temporary table and exclude the duplicate records that are within the text file?
[/b]



Code:
INSERT INTO Equipment_2014 
SELECT RTRIM(LTRIM([JurisdictionCode])),
RTRIM(LTRIM([AccountNo])),
RTRIM(LTRIM([SequenceNumber])),
.
.
.

CASE
   when  len([SaleDate]) = 8
   and isdate([SaleDate]) = 1
   and rtrim(ltrim([SaleDate])) like '[0-9][0-9][0-9][0-9][0-1][0-2][0-3][0-9]'
   then convert(date,[SaleDate], 112) 
   else cast(null as date)
end as [SaleDate],
)
from #EquipmentData1
where 
[SaleDate] between convert(date, '20140101') and convert(date, '20141231')  
AND RTRIM(LTRIM([SaleAmount])) is not null
AND RTRIM(LTRIM([SaleDate])) is not null 
AND RTRIM(LTRIM([JurisdictionCode])) is not null 
AND RTRIM(LTRIM([AccountNo])) is not null  
AND NOT EXISTS(
SELECT t1.* FROM #EquipmentData1 t1
JOIN (
    SELECT [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
    FROM #EquipmentData1
    GROUP BY [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
    HAVING COUNT(*) > 1
) t2 ON t1.[JurisdictionCode] = t2.[JurisdictionCode] 
AND t1.[AccountNo] = t2.[AccountNo] 
AND t1.[SequenceNumber] = t2.[SequenceNumber] 
AND t1.[SaleDate] = t2.[SaleDate]
order by [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
)

GO

Another thought - is it possible/feasible to bulk insert the text file into a staging table (rather than a temporary table) so I can track the records that were not loaded into the production table and facilitate future review of these "leftover" records? [b]Further, I could have a field that designates the problem with the record - e.g. "Not Valid Sale Date", "Duplicate Record", etc.[/b]

Initially, I created a "Duplicate Records Table" and loaded the duplicate records into it. Then I attempted to join the temporary table #EquipmentData1 with DuplicateRecordsTable and delete the duplicate records from the temporary table. Finally, I was to load the validated data, excluding the duplicate records, from the temporary table into the production table.

I was not able to get the following code to work.

Code:
DELETE FROM #EquipmentData1 A
INNER JOIN DuplicateEquipmentData B
B.[JurisdictionCode] = A.[JurisdictionCode] 
AND B.[AccountNo] = A.[AccountNo] 
AND B.[SequenceNumber] = A.[SequenceNumber] 
AND B.[SaleDate] = A.[SaleDate]
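
For reference, T-SQL's DELETE-with-JOIN form repeats the alias after DELETE and uses an ON clause; a minimal sketch of that syntax, using the tables named above, might look like:

Code:
-- Sketch: T-SQL deletes via a join by repeating the alias after DELETE.
-- Whether to delete all copies or keep one depends on how
-- DuplicateEquipmentData was populated.
DELETE A
FROM #EquipmentData1 A
INNER JOIN DuplicateEquipmentData B
   ON  B.[JurisdictionCode] = A.[JurisdictionCode]
   AND B.[AccountNo]        = A.[AccountNo]
   AND B.[SequenceNumber]   = A.[SequenceNumber]
   AND B.[SaleDate]         = A.[SaleDate]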

Appreciate any additional insight regarding best practices to handle duplicate records in the source text files.

 
The pattern to apply to bulk text file imports is

TEXT FILE -BULK LOAD-> Staging Tables
Staging Tables -MERGE-> Production Tables

What you do is nothing like that.

You decide about duplicates in the MERGE stage, not before.

The details depend on whether you want the data coming from the text files to update the existing data or not. If something comes in from the text for the same JurisdictionCode etc. (all the criteria identifying an already existing record), then that is handled as an update; all rows with a non-duplicate id are new.

And that's it.

SQL Server has a command that does the merge: MERGE.

For a simple case of a production and staging table both in the form (ID, Data) the merge statement is simply:

Code:
MERGE ProductionTable AS target
USING StagingTable AS source (ID, Data)
    ON (target.ID = source.ID)
WHEN MATCHED THEN
    UPDATE SET Data = source.Data
WHEN NOT MATCHED THEN
    INSERT (ID, Data)
    VALUES (source.ID, source.Data);

And the same holds true if you need more than one column for the id and for the data; the statement just becomes more complex.

Before going into merge, you might need to do your data cleansing within the staging tables.

The way the merge changes if you want to skip data already in production is simply to drop the WHEN MATCHED portion:
Code:
MERGE ProductionTable AS target
USING StagingTable AS source (ID, Data)
    ON (target.ID = source.ID)
[s]WHEN MATCHED THEN
    UPDATE SET Data = source.Data[/s]
WHEN NOT MATCHED THEN
    INSERT (ID, Data)
    VALUES (source.ID, source.Data);

Bye, Olaf.
 
First some comments on your code
Code:
INSERT INTO Equipment_2014 
SELECT RTRIM(LTRIM([JurisdictionCode])),
RTRIM(LTRIM([AccountNo])),
RTRIM(LTRIM([SequenceNumber])),
.
.
.

CASE
   when  len([SaleDate]) = 8
   and isdate([SaleDate]) = 1
   and rtrim(ltrim([SaleDate])) like '[0-9][0-9][0-9][0-9][0-1][0-2][0-3][0-9]'
   then convert(date,[SaleDate], 112) 
   else cast(null as date)
end as [SaleDate],
)
from #EquipmentData1
where 
[COLOR=#EF2929][b] -- this is wrong as saledate at this point is not yet a date
 -- if it is indeed a date at this point then the case statement above is not required[/b][/color]
[SaleDate] between convert(date, '20140101') and convert(date, '20141231')

[COLOR=#EF2929][b]-- trims below are not required to check for nulls 
-- unlike Oracle, in SQL Server an empty string (the result of rtrim/ltrim on a string containing only spaces) is not converted to a null value[/b][/color]
AND RTRIM(LTRIM([SaleAmount])) is not null
AND RTRIM(LTRIM([SaleDate])) is not null 
AND RTRIM(LTRIM([JurisdictionCode])) is not null 
AND RTRIM(LTRIM([AccountNo])) is not null  
AND NOT EXISTS (SELECT t1.*
                FROM #EquipmentData1 t1
                INNER JOIN ( SELECT [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
                            FROM #EquipmentData1
                            GROUP BY [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
                            HAVING COUNT(*) > 1
                           ) t2
                ON t1.[JurisdictionCode] = t2.[JurisdictionCode] 
                   AND t1.[AccountNo] = t2.[AccountNo] 
                   AND t1.[SequenceNumber] = t2.[SequenceNumber] 
                   AND t1.[SaleDate] = t2.[SaleDate]
[COLOR=#EF2929][b]-- remove order by from here as it is invalid code[/b][/color]
--order by [JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate]
              )

GO

As you wish to both insert only the "good" records into the final table and insert the "bad" records into an error table, I would advise you to do as follows instead.

Code:
-- create another intermediary temp table to hold all the records, good and bad,
-- including the ones that fail the SaleDate validation (which is wrong in your code)

if object_id('tempdb..#validate_Equipment') is not null
   drop table #validate_Equipment;

select rtrim(ltrim(ed1.JurisdictionCode))
     , rtrim(ltrim(ed1.AccountNo))
     , rtrim(ltrim(ed1.SequenceNumber))
       ...
       ...
       ...
     , case
       when len(ed1.SaleDate) = 8
        and isdate(ed1.SaleDate) = 1
        and rtrim(ltrim(ed1.SaleDate)) like '[0-9][0-9][0-9][0-9][0-1][0-9][0-3][0-9]'
            then convert(date, ed1.SaleDate, 112) 
       else cast(null as date)
       end as SaleDate
-- fields related to validation
     , ed1.SaleDate as Input_SaleDate
     , dup.count_dups
into #validate_Equipment
from #EquipmentData1 ed1
left outer join (select JurisdictionCode
                      , AccountNo
                      , SequenceNumber
                      , SaleDate
                      , count(*) as count_dups
                 from #EquipmentData1
                 group by JurisdictionCode
                        , AccountNo
                        , SequenceNumber
                        , SaleDate
                ) dup
   on dup.JurisdictionCode = ed1.JurisdictionCode
   and dup.AccountNo = ed1.AccountNo
   and dup.SequenceNumber = ed1.SequenceNumber
   and dup.SaleDate = ed1.SaleDate

the above sql will give you all records from table #EquipmentData1, with SaleDate already validated, and with a count of duplicates associated with each entry (or null if one of the 4 fields is null)


Then insert into final table the valid ones

insert into Equipment_2014
select ve.JurisdictionCode
     , ve.AccountNo
     , ve.SequenceNumber
       ...
       ...
       ...
     , ve.SaleDate
from #validate_Equipment ve
where ve.SaleDate between convert(date, '20140101') and convert(date, '20141231') -- note that this will also ignore the records that are null
 and ve.JurisdictionCode is not null
 and ve.AccountNo is not null
 and ve.SequenceNumber is not null
 and ve.count_dups = 0 -- note that this will also ignore the records that are null

and then insert into an errors table the ones that failed for any reason.

insert into Equipment_2014_errors
select ve.JurisdictionCode
     , ve.AccountNo
     , ve.SequenceNumber
       ...
       ...
       ...
     , ve.SaleDate
-- fields related to validation also added to errors table so they can be queried easily
     , ve.Input_SaleDate
     , ve.count_dups

from #validate_Equipment ve
where not (ve.SaleDate between convert(date, '20140101') and convert(date, '20141231') -- note that this will also ignore the records that are null
       and ve.JurisdictionCode is not null
       and ve.AccountNo is not null
       and ve.SequenceNumber is not null
       and ve.count_dups = 0 -- note that this will also ignore the records that are null
          )

note that in the above I "negated" the valid condition to make it easier to code.
An alternative to the above where clause would be

where ve.SaleDate is null 
   or ve.SaleDate not between  convert(date, '20140101') and convert(date, '20141231') 
   or ve.xx is null
   or ve.xx is null
   or ve.xx is null
   or ve.count_dups is null 
   or ve.count_dups > 0


Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
Would the use of MERGE allow for the exclusion of duplicate data within the source text files, or is it primarily for handling duplicates defined as conflicts between data within the text file and pre-existing data in the production table?

I want to exclude the duplicate data that is within the text file as "effortlessly" as possible.

My understanding is that if I set a primary key constraint on the four fields, no duplicate data will be allowed into the production table. However, I believe that the INSERT INTO statement will fail because there is duplicate data that violates the constraint.

For example, say I have a monthly text file with 500,000 records, of which 1,000 records have the same values in the four previously mentioned fields.

Will the MERGE statement allow for the import of the 499,000 "good" records whereas INSERT INTO will not, or will the MERGE statement allow me to import all 500,000 records as long as there is no "conflict" with any pre-existing records in the table?
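
One option that may address the constraint concern above is SQL Server's IGNORE_DUP_KEY setting on a unique index, which discards key-violating rows with a warning instead of failing the whole INSERT. A sketch, assuming the four columns are made non-nullable; the index name is illustrative:

Code:
-- Sketch: with IGNORE_DUP_KEY = ON, rows that would violate the unique
-- key are skipped with a warning rather than aborting the INSERT.
-- Index name is illustrative; assumes the four columns are NOT NULL.
CREATE UNIQUE INDEX UX_Equipment_2014_Key
ON Equipment_2014 ([JurisdictionCode], [AccountNo], [SequenceNumber], [SaleDate])
WITH (IGNORE_DUP_KEY = ON);

Note that this keeps whichever duplicate row arrives first, which may or may not be acceptable.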

 
MERGE only compares source and target row by row, that's right. Why would there be duplicate data in the text files? Or does this only come in because you import multiple text files first? Then just change to processing file by file.

To remove duplicates based on an ID, even on a compound ID, you will need one or more columns denoting which record is the one to keep, and you have to define what the record to keep is: the chronologically first? The last? The one with the highest amount stored in it? And if that is undecidable, you really have to go back to the source and correct the way these text files are generated in the first place, because then the problem is not solvable except by randomly picking a row, which nobody can expect from you.
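
Once such a keep rule is chosen, ROW_NUMBER can express it; a sketch, assuming a hypothetical LoadTimestamp column under an assumed "latest wins" rule:

Code:
-- Sketch: keep one row per compound key. LoadTimestamp is a hypothetical
-- tie-breaking column; "latest wins" is an assumed rule.
;WITH Ranked AS (
    SELECT *,
           ROW_NUMBER() OVER (
               PARTITION BY [JurisdictionCode], [AccountNo],
                            [SequenceNumber], [SaleDate]
               ORDER BY LoadTimestamp DESC) AS rn
    FROM #EquipmentData1
)
DELETE FROM Ranked WHERE rn > 1;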

Bye, Olaf.



 
Error received upon running the portion of the sql script that inserts into #validate_Equipment from #EquipmentData1:

Code:
An object or column name is missing or empty. For SELECT INTO statements, verify each column has a name. 
For other statements, look for empty alias names. Aliases defined as "" or [] are not allowed. Change 
the alias to a valid name.

Did create the field "Count_Dups" as the last field (with an integer data type) on the temporary table, #validate_Equipment.

However, the error remains.

Troubleshooting continues...

Will aliasing the fields in the SELECT portion resolve the error?

Code:
select rtrim(ltrim(ed1.JurisdictionCode))
     , rtrim(ltrim(ed1.AccountNo))
     , rtrim(ltrim(ed1.SequenceNumber))
       ...
       ...
       ...
     , case
       when len(ed1.SaleDate) = 8
        and isdate(ed1.SaleDate) = 1
        and rtrim(ltrim(ed1.SaleDate)) like '[0-9][0-9][0-9][0-9][0-1][0-9][0-3][0-9]'
            then convert(date, ed1.SaleDate, 112) 
       else cast(null as date)
       end as SaleDate
-- fields related to validation
     , ed1.SaleDate as Input_SaleDate
     , dup.count_dups
into #validate_Equipment
from #EquipmentData1 ed1
left outer join (select JurisdictionCode
                      , AccountNo
                      , SequenceNumber
                      , SaleDate
                      , count(*) as count_dups
                 from #EquipmentData1
                 group by JurisdictionCode
                        , AccountNo
                        , SequenceNumber
                        , SaleDate
                ) dup
   on dup.JurisdictionCode = ed1.JurisdictionCode
   and dup.AccountNo = ed1.AccountNo
   and dup.SequenceNumber = ed1.SequenceNumber
   and dup.SaleDate = ed1.SaleDate


 
yes - I forgot to put them in.

should be
select rtrim(ltrim(ed1.JurisdictionCode)) as JurisdictionCode
, rtrim(ltrim(ed1.AccountNo)) as AccountNo
, rtrim(ltrim(ed1.SequenceNumber)) as SequenceNumber
...

Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
The error persists.

Comparing each field within the two temporary tables, #validate_Equipment and #EquipmentData1.

Also added the column "count_dups" to the temp table, #EquipmentData1.

Is there an alternative approach that accomplishes the objective of loading the valid data into the production table?
 
Upon highlighting the line "into #validate_Equipment" within the section displayed below, I receive "No column was specified for column 1 of #validate_Equipment."

Code:
into #validate_Equipment
from #EquipmentData1 ed1

Therefore, it appears that a column alias for the Left Join is needed.

However, "dup" is already specified as the alias for the Left Join.
 

Did resolve the error by modifying the data types for several fields on the temporary table #validate_Equipment to coincide with the data types for those fields on the temp table #EquipmentData1.

Therefore, data was successfully loaded into the #validate_Equipment temporary table when I highlight the "Insert into #validate_Equipment" section.

This leads to another question: should all of the fields on the temporary tables mentioned above be declared NOT NULL?

Or maybe it does not matter as long as the data types coincide, and "NULL" versus "NOT NULL" is more of a concern for the tables in production?

Now, upon highlighting the "insert into Equipment_2014" section of the sql, no records were displayed. Stepping through the sql script by selecting just the select statement and excluding the WHERE section still did not result in any rows in the query result.

Only when modifying the line displayed below did rows begin to appear in the query result set.

Code:
From
where ve.SaleDate between convert(date, '20140101') and convert(date, '20141231')

To
where ve.SaleDate between convert(date, '2014-01-01') and convert(date, '2014-12-31')

Then, re-running the "Insert into Equipment_2014" section of the sql script and highlighting successive lines in the WHERE section, there were fewer and fewer records displayed, as can be expected because "and" precedes each line. However, highlighting the full sql - including all of the lines preceded by "and" - results in no records in the query result. Hence, no records will be imported into the table Equipment_2014.


Code:
insert into Equipment_2014
select ve.JurisdictionCode
     , ve.AccountNo
     , ve.SequenceNumber
       ...
       ...
       ...
     , ve.SaleDate
from #validate_Equipment ve
where ve.SaleDate between convert(date, '20140101') and convert(date, '20141231') -- note that this will also ignore the records that are null
 and ve.JurisdictionCode is not null
 and ve.AccountNo is not null
 and ve.SequenceNumber is not null
 and ve.count_dups = 0 -- note that this will also ignore the records that are null


If a field is null on a record, I would still like that record to be loaded into the database. I believe that the previous recommendation of preceding the lines in the WHERE section with "and" will result in some records not appearing in the database.

Any additional insight is appreciated.

Note, as this is my first time setting up a database, the learning curve is not as steep as it was previously, and hopefully the questions are not redundant.

 
Spoke too soon.

Overlooked the statement that "ve.count_dups = 0" already considers the null records.

Therefore, it appears that I need only include this line and not the previous lines prefaced with "and".
 
regarding "where ve.SaleDate between convert(date, '20140101') and convert(date, '20141231')"
- this is an error that I had not noticed in your original code and didn't correct - it should have been
"where ve.SaleDate between convert(date, '20140101', 112) and convert(date, '20141231', 112)"

regarding "ve.count_dups = 0" should be "ve.count_dups = 1" - my error on this.
(or "or ve.count_dups > 1" instead of "or ve.count_dups > 0" on the alternate version of the code)

as for having nulls in your database, as you mentioned - your original code was excluding them, so that is what I did. You will need to look at the data yourself and see whether any of the records in table Equipment_2014_errors should be considered (after you make the change above).




Regards

Frederico Fonseca
SysSoft Integrated Ltd

FAQ219-2884
FAQ181-2886
 
Did make the revisions but am not able to get past the "Insert into Equipment_2014_Errors" section.

Error received is

Code:
Operand type clash: date is incompatible with int


Using sql like the following:

Code:
insert into Equipment_2014_Errors
select 
E.[JurisdictionCode],
E.[AccountNumber],
E.[SEQUENCE_NUMBER],
E.[SALE_AMOUNT],
E.[SALE_DATE] as Input_SALE_DATE

from #validate_EquipmentData E  

where not (E.SALE_DATE between convert(date, '20140101',112) and convert(date, '20141231',112)  
       and E.JurisdictionCode is not null
       and E.AccountNumber is not null
       and E.SEQUENCE_NUMBER is not null
       and E.count_dups = 1 
	   )

Note, when I query the table #validate_EquipmentData, the format of the Sale Date is like 2014-01-01.

Code:
select distinct top 10 SALE_DATE from #EquipmentData1  --format YYYYMMDD
select distinct top 10 SALE_DATE from Equipment_2014  --format YYYY-MM-DD
select distinct top 10 SALE_DATE from #validate_EquipmentData  --format YYYY-MM-DD
select distinct top 10 SALE_DATE from Equipment_2014_Errors  --format ??


Re-visited the dates throughout the sql script:
1. All fields declared with the date data type
2. Dates enclosed in apostrophes

Also data types across all temporary tables are consistent.

[b]
Any clue as to the cause and resolution of this error?
[/b]
What am I missing?

Tried the following modifications to the filter on the Sale Date but the error persists.

Code:
where E.SALE_DATE between convert(date, '20140101') and convert(date, '20141231') 
where E.SALE_DATE between '20140101' and '20141231'   
where E.SALE_DATE between '2014-01-01' and '2014-12-31'   
where E.SALE_DATE between convert(date, '2014-01-01') and convert(date, '2014-12-31')  
where E.SALE_DATE between convert(date, '20140101',112) and convert(date, '20141231',112)
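
One thing worth noting: an INSERT without an explicit column list maps the SELECT positionally onto the table's columns, so a date value can land in an int column (such as a sale-amount column) without any name being wrong. A sketch with the target columns listed explicitly - the column names here simply follow the SELECT above and are illustrative:

Code:
-- Sketch: listing the target columns makes positional mismatches visible.
-- Column names follow the SELECT above; adjust to the actual table definition.
insert into Equipment_2014_Errors
       ([JurisdictionCode], [AccountNumber], [SEQUENCE_NUMBER],
        [SALE_AMOUNT], [Input_SALE_DATE])
select E.[JurisdictionCode], E.[AccountNumber], E.[SEQUENCE_NUMBER],
       E.[SALE_AMOUNT], E.[SALE_DATE]
from #validate_EquipmentData E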



 
I didn't read all the posts and this might be the clunkiest way, but I use a CTE in a similar situation (bulk copy and then manipulate).

I left all the column names just to show how granular the criteria can be:

SQL:
WITH CTE AS(
   SELECT
CreateDate,
ServiceVendorNumber, 
ServiceVendorName, 
PolicyNumber, 
BSN,
PolicyName, 
TaxID,
PolicyStatusEffectiveDate, 
PolicyStatus,
ReportingPeriodStartDate,
ReportingPeriodEndDate,
ManualClass, 
ManualClassType,
ManuaClassDescription, 
BWCCustomerIDforincludedIndividuals, 
IndividualFirstName,
IndividualMiddleName,
IndividualLastName,
IndividualTaxID,
ManualClassRate,
ReportingType, 
NumberofEmployees, 
Payroll,
RN = ROW_NUMBER()OVER(
PARTITION BY
CreateDate,
ServiceVendorNumber, 
ServiceVendorName, 
PolicyNumber, 
BSN,
PolicyName, 
TaxID,
PolicyStatusEffectiveDate, 
PolicyStatus,
ReportingPeriodStartDate,
ReportingPeriodEndDate,
ManualClass, 
ManualClassType,
ManuaClassDescription, 
BWCCustomerIDforincludedIndividuals, 
IndividualFirstName,
IndividualMiddleName,
IndividualLastName,
IndividualTaxID,
ManualClassRate,
ReportingType, 
NumberofEmployees, 
Payroll
ORDER BY PolicyNumber)
FROM BWC_RateFileBulkCopy
)
DELETE FROM CTE WHERE RN > 1
 
Re-visiting this once more...

Any insight as to how I can set up the monthly bulk inserts so that I do not have to keep adding the following sections to the end of the sql script within SSMS (it is now over 10,000 lines!) and running it every time I receive monthly data?

Currently reading about the use of a stored procedure or SSIS. However, I would like to first attempt the use of a stored procedure then proceed to the use of SSIS only if necessary. Based on the reading so far, it appears that I will need parameters for
@FilePath
@FileNameMask


[b]
As I am not familiar at all with stored procedures, I would appreciate any insight concerning best practices for using a stored procedure with several parameters. Are there any disadvantages or nuances/intricacies of stored procedures that I should be aware of?[/b]
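
One caveat worth knowing up front: BULK INSERT does not accept a variable for the file name, so a parameterized stored procedure typically builds the statement as dynamic SQL. A minimal sketch under that assumption (the procedure and parameter names are illustrative; @FileName stands in for whatever @FileNameMask resolves to):

Code:
-- Sketch only: BULK INSERT requires a literal file name, so the
-- statement is built and executed as dynamic SQL. Names are illustrative.
CREATE PROCEDURE dbo.LoadEquipmentFile
    @FilePath  nvarchar(260),
    @FileName  nvarchar(260)   -- e.g. resolved from @FileNameMask
AS
BEGIN
    DECLARE @sql nvarchar(max) =
        N'BULK INSERT #EquipmentData
          FROM ''' + @FilePath + @FileName + N'''
          WITH (FIELDTERMINATOR = ''|'',
                ROWTERMINATOR   = ''\n'',
                FIRSTROW        = 2);';
    EXEC (@sql);   -- #EquipmentData must already exist in this procedure/session scope
END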




For example, for the October 2017 data, I am using something similar to the following:

Code:
USE EquipmentDatabase

If object_id ('tempdb..#EquipmentData') is not null
BEGIN
    DROP Table #EquipmentData
END
GO


Create Table #EquipmentData
.
.
.
GO


Bulk Insert #EquipmentData
	From 'C:\Data\TextFilesToLoad\Equipmentdata\EquipmentData_201710.txt'
	With
	   (
		Fieldterminator = '|',
		Rowterminator = '\n',
		Firstrow = 2
	   );
Go



if object_id('tempdb..#validate_EquipmentData') is not null
BEGIN
   DROP table #validate_EquipmentData;
END
GO


Create Table #validate_EquipmentData
.
.
.
[Count_Dups] int
GO


Insert into #validate_EquipmentData 
select rtrim(ltrim(ed1.JurisdictionCode))
     , rtrim(ltrim(ed1.AccountNo))
     , rtrim(ltrim(ed1.SequenceNumber))
       ...
       ...
       ...
     , case
       when len(ed1.SaleDate) = 8
        and isdate(ed1.SaleDate) = 1
        and rtrim(ltrim(ed1.SaleDate)) like '[0-9][0-9][0-9][0-9][0-1][0-9][0-3][0-9]'
            then convert(date, ed1.SaleDate, 112) 
       else cast(null as date)
       end as SaleDate
-- fields related to validation
     , ed1.SaleDate as Input_SaleDate
     , dup.count_dups
from #EquipmentData ed1
left outer join (select JurisdictionCode
                      , AccountNo
                      , SequenceNumber
                      , SaleDate
                      , count(*) as count_dups
                 from #EquipmentData
                 group by JurisdictionCode
                        , AccountNo
                        , SequenceNumber
                        , SaleDate
                ) dup
   on dup.JurisdictionCode = ed1.JurisdictionCode
   and dup.AccountNo = ed1.AccountNo
   and dup.SequenceNumber = ed1.SequenceNumber
   and dup.SaleDate = ed1.SaleDate


/********************************************************************************* 
 one time setup; do not include in stored procedure

if object_id('dbo.Equipment_2017','U') is not null
BEGIN
   DROP table dbo.Equipment_2017;
END
GO

Create Table Equipment_2017 (
JurisdictionCode varchar (5) NULL,
AccountNo	varchar (45) NULL,
SequenceNo	int NOT NULL,
SaleDate	date NULL,
SaleAmount	int NULL,
constraint FK_CapitalTax2 Foreign Key (JurisdictionCode, AccountNo, SequenceNbr) REFERENCES CapitalTax2 (JurisdictionCode, AccountNumber, SequenceNbr)
***********************************************************************************************/

Insert into Equipment_2017
select
ve.JurisdictionCode,
ve.AccountNo,
ve.SequenceNumber,
.
.
.
from #validate_Equipment ve
where ve.SaleDate between convert(date, '20170101') and convert(date, '20171231') 
 and ve.JurisdictionCode is not null
 and ve.AccountNo is not null
 and ve.SequenceNumber is not null
 and ve.count_dups = 0


if object_id('Equipment..Equipment_201710_Errors') is not null
BEGIN
   DROP table Equipment_201710_Errors;
END
GO

Create Table Equipment_201710_Errors (
JurisdictionCode varchar (5) NULL,
AccountNo	varchar (45) NULL,
SequenceNo	int NOT NULL,
.
.
.
SaleDate	date NULL,
SaleAmount	int NULL
)
GO


insert into Equipment_201710_Errors
select ve.JurisdictionCode
     , ve.AccountNo
     , ve.SequenceNumber
       ...
       ...
       ...
     , ve.SaleDate
from #validate_Equipment ve
where not (ve.SaleDate between convert(date, '20170101') and convert(date, '20171231') 
       and ve.JurisdictionCode is not null
       and ve.AccountNo is not null
       and ve.SequenceNumber is not null
       and ve.count_dups = 0 
          )
 