Using with (NOLOCK) and rows per batch?
I have a package with something like this in an OLE DB Source task:

select [P].[HEIGHT] from dbo.PEOPLE [P] left join dbo.CURRENTHEIGHT [C] on [P].[ID] = [C].[ID] where [C].[HEIGHT] <> [P].[HEIGHT];

I then have an OLE DB Destination task that inserts the results into dbo.CURRENTHEIGHT. On that task, "Rows per batch" is set to 100k and "Maximum insert commit size" is set to 100k, and Table Lock and Check Constraints are both turned on (checked).

If dbo.PEOPLE has more than 100k records, my package locks. So I added with (nolock) to the left join, and everything seems to work fine.

What I am wondering, though, is this: since I am reading from and inserting into dbo.CURRENTHEIGHT in the same data flow, will the select statement be able to pull back rows that were newly inserted? I am not sure how SSIS chunks things into 100k-row batches and reads them during the OLE DB Source task, so I am not sure whether with (nolock) will cause issues in this case, as I do not want to read the newly inserted rows. Hopefully that all makes sense.
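For reference, the source query with the hint added looks roughly like this:

-- The OLE DB Source query with WITH (NOLOCK) on the CURRENTHEIGHT side of the join.
-- NOLOCK performs a dirty read, so uncommitted rows (including rows the
-- destination may have inserted but not yet committed) can become visible.
SELECT [P].[HEIGHT]
FROM dbo.PEOPLE AS [P]
LEFT JOIN dbo.CURRENTHEIGHT AS [C] WITH (NOLOCK)
    ON [P].[ID] = [C].[ID]
WHERE [C].[HEIGHT] <> [P].[HEIGHT];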
November 7th, 2012 3:10pm

The SSIS runtime engine calculates the number of rows it can keep in a buffer based on parameters such as the estimated row size, DefaultBufferMaxRows, and DefaultBufferSize. So, depending on your data size, the SSIS buffer manager adjusts the number of rows that are read and kept in buffer memory at a time. There is a whitepaper by Hitachi Consulting on Data Flow task performance tuning; read that for a better understanding of buffer memory management.

Your current design does raise some doubts for me; I am afraid it might cause deadlocks. Would it be a difficult task to alter the design as follows? (A sketch of step 2 follows below.)

1. Select data from the PEOPLE table.
2. Add a Lookup transformation and, using a partial cache, identify all records that have a matching ID and a different height.
3. Redirect the rows that do not match the criteria to an OLE DB Destination to save the data.

Let me know your thoughts.

Regards, Dinesh
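As a rough illustration of the buffer math: the engine keeps about min(DefaultBufferMaxRows, DefaultBufferSize / estimated row size) rows per buffer, so with the defaults of 10,000 rows and 10 MB, a 100-byte row still caps out at 10,000 rows per buffer. For step 2, a hypothetical reference query for the Lookup (column names assumed from your original query; the alias avoids colliding with the incoming HEIGHT column) might be:

-- Hypothetical Lookup reference query (partial cache mode), assuming
-- dbo.CURRENTHEIGHT has the ID and HEIGHT columns shown in the question.
SELECT [ID], [HEIGHT] AS [CURRENT_HEIGHT]
FROM dbo.CURRENTHEIGHT;

Under those assumptions, one way to wire it up: send the Lookup's no-match output (new IDs) straight to the destination, and run matched rows through a Conditional Split with an expression like HEIGHT != CURRENT_HEIGHT before writing them.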
November 7th, 2012 3:57pm
