TransactionOption=Required fails when two SQL tasks run simultaneously - why?
Greetings, I have a package with a sequence container that has TransactionOption = Required. Within the sequence container are two Execute SQL Tasks that each call a different stored procedure. Neither stored procedure starts or commits a transaction. The SQL tasks run at the same time by design - they are lengthy, independent processes and I want them running in parallel. However, when I do this the package fails with a message that DTC has cancelled the transaction. DTC is configured properly on my machine, and both stored procedures run successfully if the other task is disabled. They also both run successfully if I use a precedence constraint to force them to run sequentially instead of simultaneously. I have tried isolation levels of ReadCommitted and Serializable and the outcome is always failure. Do you know what is causing the parallel executions to fail, or what I could try to overcome this problem? Gracias, BCB
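For reference, each procedure is shaped roughly like the placeholder sketch below (names and body are made up, not the real code); the point is that there is no transaction handling anywhere inside it:

    -- Placeholder only, not the real procedure; there is no BEGIN TRAN /
    -- COMMIT anywhere, so SSIS (via DTC) owns the transaction.
    CREATE PROCEDURE dbo.usp_Long_Running_Load
    AS
    BEGIN
        SET NOCOUNT ON;
        INSERT INTO dbo.Target_Table (KeyCol, Amount)
        SELECT KeyCol, Amount
        FROM   dbo.Source_Table;
    END;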
April 7th, 2011 5:13pm

Do the stored procedures read or make changes to the same tables? Russel Loski, MCITP Business Intelligence Developer and Database Developer 2008
April 7th, 2011 6:10pm

The two procedures read some of the same tables. Each of the procedures updates a separate table. Thanks-
April 7th, 2011 6:55pm

Hi, do you have the RetainSameConnection property on your database connection set to TRUE? My understanding is that if RetainSameConnection=TRUE and TransactionOption=Required you cannot run parallel database tasks. You are basically asking two queries to run on the same database connection at the same time. Craig Bryden - Please mark correct answers
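If you want to verify that, one rough way (assuming you can add a temporary diagnostic to each procedure, and assuming a logging table like the placeholder below) is to record the session id from inside each procedure and compare:

    -- Temporary diagnostic only; dbo.Session_Log is a placeholder table.
    -- If both procedures log the same session_id while running in parallel,
    -- they are sharing a single connection.
    INSERT INTO dbo.Session_Log (Proc_Name, Session_Id, Logged_At)
    VALUES (OBJECT_NAME(@@PROCID), @@SPID, GETDATE());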
April 7th, 2011 9:40pm

No, the RetainSameConnection property is set to False. I've tried it set to both True and False and the SSIS package fails either way with the identical "DTC" error message. I am using SS 2008. Gracias-
April 8th, 2011 9:22am

Can we see the exact wording of the error reported by the "DTC"? Arthur My Blog
April 8th, 2011 9:38am

One of the tasks fails with this error, which I believe is the "real" error: [Execute SQL Task] Error: Executing the query "EXEC dbo.usp_Prepare_Ext_..." failed with the following error: "The Microsoft Distributed Transaction Coordinator (MS DTC) has cancelled the distributed transaction.". Possible failure reasons: Problems with the query, "ResultSet" property not set correctly, parameters not set correctly, or connection not established correctly. The other task also fails and reports this error: [Execute SQL Task] Error: Failed to acquire connection "EW_OLEDB". Connection may not be configured correctly or you may not have the right permissions on this connection. Thanks...
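For diagnosis, a query like the following, run while the package is executing, shows whether a distributed transaction is actually enlisted before DTC cancels it (general-purpose DMVs available in SQL Server 2008, nothing specific to this package):

    -- Show active transactions and which sessions are enlisted in them.
    -- is_local = 0 indicates a distributed (DTC) transaction.
    SELECT at.transaction_id,
           at.name,
           at.transaction_begin_time,
           st.session_id,
           st.is_local
    FROM   sys.dm_tran_active_transactions AS at
    JOIN   sys.dm_tran_session_transactions AS st
           ON st.transaction_id = at.transaction_id;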
April 8th, 2011 9:42am

Does the failure occur right away or after some time? The DTC may be the issue. Arthur My Blog
April 8th, 2011 10:03am

The failure is immediate. Is it possible to run two simultaneous SQL tasks within a transaction that is controlled by SSIS? No one has yet said that they have done this, so I'm wondering if the technique can work.
April 8th, 2011 10:18am

I do, though without the DTC: I simply do not connect the tasks, and I have several DFTs running in parallel, that is all. But in your case you need to undo one SP or the other if either fails, so you may be better off using the DTC without the sequence container, or the reverse. Do you have a BEGIN DISTRIBUTED TRANSACTION clause added in your SPs? Arthur My Blog
April 8th, 2011 11:34am

Arthur - I guess I don't understand your reply. I don't have BEGIN DISTRIBUTED TRANSACTION because I'm relying upon SSIS to perform the transaction management due to my specification of TransactionOption = Required at the sequence container level. Also, what is a DFT? Thanks--
April 8th, 2011 3:01pm

I was thinking you could experiment with adding BEGIN DISTRIBUTED TRANSACTION rather than using the sequence container. DFT stands for Data Flow Task. Arthur My Blog
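Something like the sketch below for each task's SQL statement (a sketch only: the procedure name is a placeholder, you would set the tasks' TransactionOption back to Supported so SSIS does not also start a DTC transaction, and note that this gives each task its own transaction rather than one transaction covering both):

    -- Sketch only; dbo.usp_Load_Table_A is a placeholder name.
    BEGIN DISTRIBUTED TRANSACTION;
    BEGIN TRY
        EXEC dbo.usp_Load_Table_A;
        COMMIT TRANSACTION;
    END TRY
    BEGIN CATCH
        IF @@TRANCOUNT > 0
            ROLLBACK TRANSACTION;
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        RAISERROR(@msg, 16, 1);  -- re-raise so the Execute SQL Task fails
    END CATCH;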
April 8th, 2011 3:10pm

Check this post; not sure if it helps: http://social.msdn.microsoft.com/Forums/en-US/sqlintegrationservices/thread/566515fd-0c98-414e-ab1a-c7aeda6b162a/
April 8th, 2011 3:25pm

The approach that worked for me was to stop using stored procedures and use a Data Flow Task instead. I created pairs of OLE DB Source and OLE DB Destination components; each pair replaces one of the stored procedures that formerly did the insert. I now have TransactionOption = Required on the parent sequence container and TransactionOption = Supported on the Data Flow Task, and everything appears to work properly within a transaction. One puzzling aspect is that I had to replace a TRUNCATE TABLE dbo.xxx with DELETE FROM dbo.xxx in a SQL task that precedes the Data Flow Task. Using the TRUNCATE appeared to cause the package to freeze because of competition for lock resources. I never figured out the exact reason, but the solution was to replace the TRUNCATE.
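If it helps anyone hitting the same freeze, my guess (and it is only a guess) is the lock type: TRUNCATE needs a schema-modification lock on the table, while DELETE takes ordinary DML locks, which seem to coexist better with the DTC transaction. The change itself was just:

    -- Pre-step as it originally stood; appeared to block under the DTC
    -- transaction, presumably while waiting for a schema-modification lock:
    -- TRUNCATE TABLE dbo.xxx;

    -- Replacement that let the package run (ordinary DML locking, fully logged):
    DELETE FROM dbo.xxx;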
April 11th, 2011 4:29pm

One more piece of information to tie up loose ends. In the completed Data Flow Task I was inserting data into 21 destination tables. This occurs in a large multi-TB data warehouse environment, and processing the 21 tables can take 6 hours depending on the amount of new data to be processed in a given run. I was disappointed to find in my testing that my original approach, using a stored procedure, ran in a shorter time than the Data Flow Task. The stored procedure inserted rows into the 21 tables one at a time (sequentially) rather than in parallel as is done in the DFT. The sources for the 21 sets of data are views, many of which read data from the same tables, so perhaps the competition for the same table resources caused the DFT to run longer. My tests were performed with a small amount of data relative to our production warehouse, so it is also possible that the DFT would scale better than the single-threaded stored procedure approach with large data loads. Still, the initial test results were so disappointing that we have gone back to using the stored procedure to load the 21 tables sequentially. This is the sad recap:
1. Multiple stored procedures running simultaneously within a sequence container doing the transaction management (TransactionOption = Required) failed with a DTC error.
2. A Data Flow Task processing the 21 tables ran successfully but was slower than the original approach of having a single stored procedure do 21 inserts sequentially.
3. We went back to the original approach of having a single stored procedure do the 21 inserts sequentially. Surprisingly, this was faster than the Data Flow Task when tested with a modest amount of data.
BCB
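For anyone comparing the approaches, the sequential load we went back to is just the obvious pattern below (a sketch with placeholder object names; the real procedure loads 21 tables from 21 views):

    -- Sketch of the sequential-load procedure we returned to.
    -- Table and view names are placeholders, not the real warehouse objects.
    CREATE PROCEDURE dbo.usp_Load_Warehouse_Tables
    AS
    BEGIN
        SET NOCOUNT ON;

        INSERT INTO dbo.Fact_Table_01 (KeyCol, Amount)
        SELECT KeyCol, Amount FROM dbo.vw_New_Fact_01;

        INSERT INTO dbo.Fact_Table_02 (KeyCol, Amount)
        SELECT KeyCol, Amount FROM dbo.vw_New_Fact_02;

        -- ...repeated for each of the 21 destination tables
    END;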
April 27th, 2011 8:54am
