Tuesday, 25 December 2018

SSIS Incremental Load Example - SQL Server Integration Services

Incremental loads in SSIS are often used to keep data between two systems in sync with one another. They are used when source data is loaded into the destination on a repeating basis, such as every night or throughout the day.


Below are all the steps, done in SQL Server Integration Services (SSIS).


We do not use the Lookup transformation because, when working with a large number of records, you must take care of SSIS performance: Lookup fetches all the records and then compares them against your condition.

In this solution, we instead consider only the records that are the latest in the live tables.

1. Store the max timestamp value

insert into ETL_Table_Last_Entry
   select 'G_L_Entry' as TableName, B.Company,
          isnull(max([timestamp]), 0x0000000000000000) as [TimeStamp]
   from dbo.[G_L_Entry] B
   group by B.Company

2. Find the last value and store it in a variable
SELECT dbo.fnTS2VC([TimeStamp]) AS t
FROM ETL_Table_Last_Entry
WHERE (Company = 1) AND (TableName = 'G_L_Entry')
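
The helper dbo.fnTS2VC used above is assumed to convert the binary timestamp (rowversion) value into its hex string form, without the 0x prefix, so it can be concatenated into the dynamic query in step 4. A minimal sketch of such a function (hypothetical; your actual implementation may differ):

create function dbo.fnTS2VC (@ts binary(8))
returns varchar(16)
as
begin
    -- style 2 converts binary to a hex string without the 0x prefix
    return convert(varchar(16), @ts, 2)
end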


3. Truncate temporary table
    
     truncate table tmp_GL_Entry

4. Add a Data Flow Task to load the latest records

"Select * from dbo.[" +  @[User::Company_Name] + "$G_L Entry]
where timestamp > 0x"  +  @[User::PV1]  + " order by timestamp"

In the above query we take only those records whose timestamp is greater than our stored value. We store the query in a variable so it can be executed dynamically for multiple tables.
In the Source we execute the query from that variable.





Our Destination configuration

5. Delete duplicate records


"Delete a from G_L_Entry a inner join tmp_GL_Entry b on a.[Entry No_]=b.[Entry No_] and a.Company=b.Company where a.Company =  " +  @[User::Company_No]

We store the above query in a variable and add it to an Execute SQL Task.

6. Reinsert the remaining records into the actual table.
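
No query is shown for this step in the original post; a minimal sketch, assuming tmp_GL_Entry has the same column layout as the actual table:

insert into dbo.[G_L_Entry]
    select *
    from tmp_GL_Entry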



7. Rebuild the index on the actual table.
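
A minimal sketch for this step, rebuilding every index on the target table in one statement (adjust to rebuild specific indexes if preferred):

alter index all on dbo.[G_L_Entry] rebuild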

8. Update the stored timestamp so that the next run reads only new records.
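
No query is shown for this step in the original post; a sketch, assuming ETL_Table_Last_Entry from step 1 is refreshed per company with the new maximum timestamp:

update e
set e.[TimeStamp] = t.MaxTS
from ETL_Table_Last_Entry e
inner join (
    select Company, max([timestamp]) as MaxTS
    from dbo.[G_L_Entry]
    group by Company
) t on e.Company = t.Company
where e.TableName = 'G_L_Entry'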



Now execute the package and check your output.


Thanks for reading.
