SQL Server Integration Services Transactions without MSDTC

By: Koen Verbeeck   |   Updated: 2013-10-15


Problem

You want to incorporate transactions in your SQL Server Integration Services packages. For example, a data flow inserts data into a table and, after the data flow has finished, an update statement is issued against the same table. If this update statement fails, you want to roll back the data inserted by the data flow. Normally you would enable the built-in transaction model of SSIS, which relies on the Distributed Transaction Coordinator (MSDTC), but for certain reasons you cannot use the MSDTC in your environment. How can you still enable transactions in your SSIS package? Check out this tip to learn more.

Solution

Luckily the solution is pretty straightforward: as long as every task involved uses the same database connection, we can rely on plain database transactions. Let's find out how to make this work.

Sample SSIS package

As an example for this tip, we have a very basic SSIS package. It reads employee data from a source and stores the new employees in the employee dimension.

Basic sample package

After the data flow has finished, an Execute SQL Task issues an update statement against the dimension to maintain the parent-child relationship. In other words, it sets the foreign key of each child row so that it points to the primary key of its parent row. The update statement also handles the case where an employee gets a new manager, for example.
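To make this concrete, a minimal sketch of such an update could look like the one below. The table and column names (dbo.DimEmployee, EmployeeKey, ParentEmployeeKey, ManagerBusinessKey, EmployeeBusinessKey) are assumptions for illustration and are not taken from the actual package.

-- Hypothetical parent-child update: point each child row's foreign key
-- at the surrogate key of the row holding its manager's business key.
-- All object names below are placeholders, not from the original package.
UPDATE child
SET    child.ParentEmployeeKey = parent.EmployeeKey
FROM   dbo.DimEmployee AS child
       INNER JOIN dbo.DimEmployee AS parent
           ON child.ManagerBusinessKey = parent.EmployeeBusinessKey
WHERE  child.ParentEmployeeKey IS NULL
       OR child.ParentEmployeeKey <> parent.EmployeeKey;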

The requirement is that when the update statement fails, the inserts from the data flow are rolled back. Since we cannot use MSDTC, we need an alternative.

Using SQL Server Database Transactions

If the tasks that need the transaction all use the same connection manager, we can leverage database transactions to achieve our requirement. Let's start by putting an Execute SQL Task before the data flow and connecting the two with a Success precedence constraint (the green arrow). This new task uses the same connection manager as the data flow and the existing Execute SQL Task, and it specifies one simple SQL statement:

BEGIN TRANSACTION;

This statement starts an explicit transaction on the current connection. After the Execute SQL Task with the update statement, we add another Execute SQL Task, again connected with a Success precedence constraint and again using the same connection manager as the other tasks. This task has the following SQL statement:

COMMIT TRANSACTION;

This statement will commit the explicit transaction on the current connection if the Execute SQL Task with the update statement has finished successfully.
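To see why sharing the connection matters, it helps to picture the statements travelling over that connection as one script executed in a single session. The following is only a sketch: the INSERT stands in for the rows loaded by the data flow, and all table and column names are assumed for illustration.

BEGIN TRANSACTION;                            -- first Execute SQL Task

-- The data flow inserts the new employees; this INSERT is a placeholder
-- for the rows the data flow actually loads.
INSERT INTO dbo.DimEmployee (EmployeeBusinessKey, EmployeeName, ManagerBusinessKey)
VALUES (1001, N'New Employee', 42);

-- The next Execute SQL Task maintains the parent-child relationship
-- (a simplified stand-in for the update sketched earlier).
UPDATE dbo.DimEmployee
SET    ParentEmployeeKey = 7
WHERE  EmployeeBusinessKey = 1001;

COMMIT TRANSACTION;                           -- final Execute SQL Task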

Sample package with transactions

But what if the update statement fails? In that case the SSIS package fails because an error occurred, and it stops running. Any connections that are still open are closed when the package shuts down, and the SQL Server database engine rolls back the open transaction when its connection is closed. If you prefer more control over when the transaction is rolled back, you can add another Execute SQL Task to the control flow, connected with a Failure precedence constraint (the red arrow). This task explicitly rolls back the transaction with the following statement:

ROLLBACK TRANSACTION;

This approach is more robust and gives you a place to add any clean-up code you might need.
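As a side note, if you want the rollback task to succeed even when SQL Server has already rolled the transaction back behind the scenes, a common defensive pattern in T-SQL (standard practice, not specific to this package) is to check @@TRANCOUNT first:

IF @@TRANCOUNT > 0          -- only roll back when an open transaction actually exists
    ROLLBACK TRANSACTION;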

Sample package with transactions and rollback

SQL Server Integration Services Connection Manager

However, when we run the package, the following error is returned by SSIS:

Executing the query "COMMIT TRAN;" failed with the following error: "The COMMIT TRANSACTION request has no corresponding BEGIN TRANSACTION."


This is caused by the fact that the connection manager doesn't actually create one single connection. When the package runs, multiple connections are created for the different tasks: you can look at the connection manager as a little factory, creating connections for all the tasks. This means that when SSIS reaches the final Execute SQL Task, the connection from the first Execute SQL Task - the one that started the transaction - is already closed. Hence, SQL Server doesn't find the corresponding BEGIN TRANSACTION and returns an error.

How can we solve this issue? The connection manager has a very useful property called RetainSameConnection. By default this property is set to False. When we change it to True, we tell the connection manager to create only a single connection, which is shared amongst the different tasks.

Setting the RetainSameConnection property

The package now completes successfully - unless an error occurs, of course, in which case everything is rolled back nicely.
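If you want to verify that the tasks really do share a single connection, a simple diagnostic (not part of the package logic itself) is to run the query below from each Execute SQL Task and compare the results: with RetainSameConnection set to True, every task should report the same session id.

SELECT @@SPID      AS SessionId,          -- identical in every task when the connection is shared
       @@TRANCOUNT AS OpenTransactions;   -- 1 between BEGIN TRANSACTION and COMMIT/ROLLBACK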


About the author
Koen Verbeeck is a seasoned business intelligence consultant at AE. He has over a decade of experience with the Microsoft Data Platform in numerous industries. He holds several certifications and is a prolific writer, contributing content about SSIS, ADF, SSAS, SSRS, MDS, Power BI, Snowflake and Azure services. He has spoken at PASS, SQLBits and dataMinds Connect, and delivers webinars on MSSQLTips.com. Koen has been awarded the Microsoft Data Platform MVP award for many years.





Comments For This Article




Thursday, March 29, 2018 - 12:14:46 PM - CHOCO

It looks like when the DFT starts, it creates a new connection and commits the inserts on completion instead of using the existing connection that the Execute SQL Task is using.

Does this logic work only with the Execute SQL Task?

I am using OLE DB sources, destinations and connections in the DFT.


Tuesday, March 20, 2018 - 5:59:10 AM - Koen Verbeeck

What destinations are you using? If you are using fast load, what are the options you configured?


Friday, March 16, 2018 - 3:55:14 PM - CHOCO

Sorry for the late response... The data flows are in series.

 


Friday, March 16, 2018 - 8:21:12 AM - Koen Verbeeck

 

You can maybe try a package connection manager instead of a project connection manager (although that shouldn't be much of a difference).
Are the data flows in parallel or in series?


Thursday, March 15, 2018 - 9:39:03 AM - CHOCO

Thank you for your response, Koen.

Yes, I did set RetainSameConnection to true.

I am using the PROJECT connection manager.

Yes, there is only one connection in the connection manager.

But no luck with the data flow tasks.

 


Wednesday, March 14, 2018 - 3:00:12 AM - Koen Verbeeck

 

Hi Choco,

did you set RetainSameConnection to true on the connection manager? Are you using a project or package connection manager?
Are all of the data flows using the same destination?

 

Koen


Tuesday, March 13, 2018 - 4:26:39 PM - CHOCO

The issue I am running into is that I have multiple data flow tasks (DF1, DF2, DF3 and DF4) and all four data flow tasks insert into different tables. After implementing the above logic, if DF4 fails then the inserts that happened in DF1, 2 and 3 should all ROLLBACK, but that is not happening currently. The ETL executes through to the rollback task, but I still see the inserted records in the destination tables of DF1, 2 and 3. I also tried keeping all the data flows in one SEQUENCE container, still no luck. Please suggest if there is a way to roll back the DF1, DF2 and DF3 data changes that happened during the same transaction. Thank you.

 


Tuesday, March 13, 2018 - 4:05:25 PM - CHOCO

Hi, nice article. I tried to implement this in my ETL with data flow tasks and it did not work as expected, but it worked great if I used Execute SQL Tasks. How can I make it work with data flow tasks? Any thoughts would be really appreciated. Thank you in advance.

 


Wednesday, October 11, 2017 - 10:17:21 AM - Marius

It is an OLE DB destination, data access mode is set to "Table or view - fast load", Max insert commit size is the maximum (2147483647), Rows per batch is not set; I understand it will be ignored since Max insert commit size is set...

SSIS can divide a data flow into multiple parallel flows; I wonder if that's the reason why - maybe a parallel (synchronous) data flow finished successfully?

As you said, the only guarantee is to use transactions.

Thanks! 

Marius


Tuesday, October 10, 2017 - 2:00:44 AM - Koen Verbeeck

Hi,

If you're using the normal load options - instead of the default fast load - rows are inserted row-by-row so your scenario is not that exotic in that case.


Monday, October 9, 2017 - 10:53:50 AM - Marius

I was surprised there were only 7 rows, so I thought maybe this is default behaviour when aborting...

Thank you for your reply!

 


Monday, October 9, 2017 - 2:26:55 AM - Koen Verbeeck

Hi Marius,

it depends :) The destination component (the OLE DB one at least) has some settings for the batch size and the maximum insert commit size. With the default settings (batch size empty and a large number for the commit size), theoretically all rows should be committed in one single batch. However, this is not guaranteed. It is possible - as you have seen - that some batches are committed earlier. To be 100% sure, you can either use transactions or you can do some sort of clean-up if the data flow fails.


Sunday, October 8, 2017 - 12:01:52 AM - Marius

Hi,

I have a question. I always thought data flows run as a single transaction (even without using MSDTC transactions or what you taught us here), so they are rolled back if there's an error. But recently the data flow encountered a different datatype in a column and aborted. To my surprise, the first 7 rows that were OK were left in the destination table... So it looks like if you want all or nothing from a data flow you pretty much have to use transactions?

Thanks,

Marius

 


Monday, June 12, 2017 - 9:40:16 AM - Ivana

I went with explicit transactions, thank you!

Regards,
Ivana.


Friday, June 9, 2017 - 9:41:56 AM - Koen Verbeeck

Hi Ivana,

explicit transactions are a database feature and the SSIS transactions rely on MSDTC.
I found that explicit transactions are easier to configure (you don't rely on other admins), but you just have to be careful you're not causing any locking or blocking.

MSDTC - from what I've heard - is not so easy to configure and troubleshoot.


Friday, June 9, 2017 - 5:15:09 AM - Ivana

Hello Koen, 

I'm working on a project where data is collected from databases on 3 different servers and inserted in a single database on my server.
To save some time, first thing I do is truncate my table, and then insert their data into it (couple of Data Flow Tasks in a sequence container).
Transactions are needed in case something goes wrong with Data Flow Tasks and my table ends up empty.

So, would you say Explicit Transactions are better than TransactionOption in SSIS, and why?

Thank you in advance for your response,
Ivana. 


Thursday, March 9, 2017 - 5:56:36 AM - Koen Verbeeck

Hi Eric,

what do you mean by "the error doesn't show a duplicate record"? Can you elaborate?

Koen


Tuesday, March 7, 2017 - 1:31:43 PM - Eric

I noticed when the package fails or rolls back, the error doesn't show a duplicate record. Do you need to adjust some other property in your control or data flows?

 


Wednesday, November 16, 2016 - 9:11:45 AM - Koen Verbeeck

Hi Qusai,

you can reach me at koen dot verbeeck at outlook dot com.


Wednesday, November 16, 2016 - 5:24:00 AM - Qusai Dalal

Hi Koen,

Update on my earlier comments. I was able to roll back the inserts in the table if I used a local connection manager (package level) instead of project level, which is a standard we use for all our master & child packages. But I'm not sure why the updates done as part of the data flow are not getting rolled back.

We have configured all the packages of a single project to be called in parallel through a master package using For Each iteration. Hence we are using project level deployment as well as common connection managers across all packages, so using a package level connection manager is not a feasible option for us.

Also request you to share the major limitations of using MSDTC, as that's the option I would like to go for if a project shared connection cannot be used for the above approach.


Wednesday, November 16, 2016 - 12:59:28 AM - Qusai Dalal

Hi Koen,

The problem stated earlier with executing the rollback or commit task has been solved. But even after executing the rollback SQL task once the error occurs, it's not actually rolling back the 5 new records inserted, or the old versions of the same 5 records that were updated to mark them as expired as part of the data flow. Not sure why the data rollback is not happening.

The connection "DWpresentation" is a shared connection for which I have set the RetainSameConnection property to True, and that's where the destination table resides.

Please share your email address so that I can mail you the screenshot of the executed package.


Monday, November 14, 2016 - 5:23:53 PM - Koen Verbeeck

Hi Qusai Dalal,

would it be possible to share a screenshot of your package?

Regards,
Koen


Monday, November 14, 2016 - 10:31:49 AM - Qusai Dalal

Hi Koen,

Thanks, nice solution to use transactions without using DTC. But when I replicated the steps above for my package, I saw that on an error the package fails, but even with the failure precedence it does not go to the Rollback Transaction task. Also, on success the Commit Transaction task is not executed, but changes are committed by default. Not sure why it's not working for me. Are there any other properties of the package or connection manager to be changed, apart from RetainSameConnection as true and TransactionOption as Supported for all tasks and the package, which is the default value?

 


Tuesday, June 7, 2016 - 4:40:56 AM - Kumar

Wonderful..!

 


Friday, September 18, 2015 - 1:37:24 PM - Koen Verbeeck

Hi Scott,

do both tables use the same connection manager?


Thursday, September 17, 2015 - 4:14:09 AM - scott

I am trying to use a SQL transaction in a foreach loop where I am passing 1 ID at a time. I want to roll back only the current ID.

The 1st task in the foreach is an Execute SQL Task with BEGIN TRAN, and next is a data flow task with 2 sources and destinations for 2 different tables (if either one fails, then roll back from all tables). After the data flow is another Execute SQL Task with ROLLBACK TRAN on failure (the logical condition on the pipe is OR). The rollback works, but it only rolls back 1 table, although I want to roll back the current iteration ID of the foreach from both tables.

Any idea how to achieve this?

Thursday, August 21, 2014 - 6:17:19 AM - Neelam

Excellent ! :)


Wednesday, August 6, 2014 - 12:22:24 PM - Koen Verbeeck

Hi kwymore,

thanks for reading!

You are certainly correct. This was indicated in the article by the generic "Do some clean-up" task :D

Usually I don't have a failure path in such packages, but only the commit. If an error occurs, the package stops and SQL Server will automatically roll back. Then SQL Server Agent can do its thing.


Wednesday, August 6, 2014 - 11:35:37 AM - kwymore

Nice article Koen. I always enjoy your write-ups.

One other thing to add here. If you are setting up a package to roll back a transaction on failure, then you will want to set up an event handler to fire off an email notifying the administrators that the task failed even though the package succeeded. Otherwise you might expect an update/insert/delete to be occurring, because SQL Agent does not show failures when in fact the package is rolling back every time!


Tuesday, October 15, 2013 - 1:44:17 AM - Vashistar

Thanks for this article, good one.

 














