SQL Server 2014: "transaction log for database is full due to ACTIVE_TRANSACTION" error

Hi, we are using a SQL Server 2014 database and SSIS to load data into tables. The volume of data to load varies from 2 million to 17 million records. While loading approximately 6 million records into a table, I run into this error every once in a while: "The transaction log for database is full due to ACTIVE_TRANSACTION".

The database is in the SIMPLE recovery model and I am not exactly sure what would cause this. What approach should I take to free up the transaction log from time to time?

As a workaround I tried issuing a CHECKPOINT, but even that did not resolve the issue.

Thanks in advance.

July 28th, 2015 2:49pm

  • General practice is to keep your t-log growth restricted; otherwise you will end up with disk space issues.
  • Don't shrink the log while transactions are running against that database.
  • If you think it is growing frequently, put more frequent t-log backups in place (for this you need to change to the FULL recovery model).

If the database is in the SIMPLE recovery model you cannot perform t-log backups (the log is truncated automatically at checkpoints, which fire when it rises above roughly 70% full, unless some transaction is still ongoing), and you also won't be able to recover the database to a point in time because you have no log backups.

If you can't change it to FULL, then you would probably need to add space on the drive as the final solution. But I would say change it to FULL recovery and schedule t-log backups based on the frequency of your transactions (every 30 minutes to 1 hour).
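A minimal sketch of such a log backup, assuming a database named LoadDB and a backup path of X:\Backups (both hypothetical; adjust to your environment) and assuming the recovery model has already been switched to FULL:

-- Schedule this (e.g. via a SQL Server Agent job) every 30-60 minutes
BACKUP LOG LoadDB
    TO DISK = N'X:\Backups\LoadDB_log.trn'
    WITH INIT, CHECKSUM;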

Also, you can try the BULK_LOGGED recovery model, but again it all depends on the criticality of your data.

July 28th, 2015 3:05pm

Most likely the log filled up due to your SSIS load process. While there is an active transaction on the database, the log cannot be freed up until the transaction is finished.

Are you batching your SSIS load into the SQL tables? If not, I would definitely set Rows per batch and Maximum insert commit size on your SSIS destinations to help manage the t-log. The sketch below shows a quick way to watch log usage while the package runs.
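This is a standard DBCC command, shown here only as a monitoring aid:

-- Reports log size and percent of log space used for every database
DBCC SQLPERF(LOGSPACE);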

July 28th, 2015 4:22pm

Thanks all for your responses. I am using a stored procedure in an SSIS Execute SQL Task to insert records, and I do not have access to change the database recovery model.

Regards.

July 28th, 2015 5:12pm

Well, you don't need to change your recovery model since you are already in SIMPLE.

If you are using an Execute SQL Task, take a look at the link below; it may be helpful, as it pertains to Trace Flag 610.

https://technet.microsoft.com/en-us/library/Dd425070(v=SQL.100).aspx
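A minimal sketch of the kind of minimally logged insert that guide describes, assuming a hypothetical staging table dbo.StageRows and target table dbo.TargetRows (with hypothetical columns Id and Payload), and assuming the target table and recovery model meet the guide's prerequisites for minimal logging:

-- Trace flag 610 (per the linked guide) extends minimal logging to inserts into indexed tables; requires sysadmin
DBCC TRACEON(610, -1);

-- A TABLOCK hint on the target is one of the usual prerequisites for a minimally logged insert
INSERT INTO dbo.TargetRows WITH (TABLOCK)
SELECT Id, Payload
FROM dbo.StageRows;

DBCC TRACEOFF(610, -1);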

Let me know if it helps

July 28th, 2015 5:29pm

The log will never truncate past the oldest open transaction, so the log filling up could be due to one of:

1) An orphaned transaction, for instance in a query window in SSMS. Run DBCC OPENTRAN in your database to see if there are any open transactions (see the sketch after this list).

2) The load you do from SSIS is itself such a big transaction that the log fills up. I don't know much about SSIS, but it is a good idea to load data in batches.
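A minimal sketch of the check in point 1, with LoadDB as a hypothetical database name:

USE LoadDB;
-- Reports the oldest active transaction (SPID and start time) in the current database, if there is one
DBCC OPENTRAN;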

And, oh, check the autogrow setting for the log; it may be out of whack. If it says that the log should grow by 50 GB, and there is only 49 GB free on the disk, the autogrow will fail.
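A quick way to inspect the log file's size and growth setting, using the standard sys.database_files catalog view (LoadDB is a hypothetical name):

USE LoadDB;
SELECT name,
       size * 8 / 1024 AS size_mb,   -- size is stored in 8 KB pages
       growth,                       -- increment, in 8 KB pages or percent
       is_percent_growth             -- 1 = growth is a percentage
FROM sys.database_files
WHERE type_desc = 'LOG';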

July 28th, 2015 5:46pm

This is a common issue. On one side you are loading millions of records, and on the other side you expect the log not to bloat; that is a little unrealistic. If you think the transaction log cannot grow under the SIMPLE recovery model, you are wrong: SIMPLE is almost the same as FULL, just with different transaction log truncation semantics.

A long-running open transaction can hold the log and prevent it from truncating.

Changing to BULK_LOGGED might not help either, as only the supported minimally logged operations would actually be minimally logged.

I can only say: look for open transactions, include multiple checkpoints in the data load, and try to reduce the amount of data you insert at a time. Don't insert it in one go; instead, insert it in multiple batches, as in the sketch below.
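A minimal sketch of such a batched load with checkpoints, assuming a hypothetical staging table dbo.StageRows (key column Id, data column Payload) and a target dbo.TargetRows; each batch commits on its own, so under SIMPLE recovery the log can be reused between batches:

DECLARE @batch_size int = 100000,
        @rows       int = 1;

WHILE @rows > 0
BEGIN
    -- Copy one batch of rows that are not yet in the target; the statement commits on its own
    INSERT INTO dbo.TargetRows (Id, Payload)
    SELECT TOP (@batch_size) s.Id, s.Payload
    FROM dbo.StageRows AS s
    WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetRows AS t WHERE t.Id = s.Id);

    SET @rows = @@ROWCOUNT;

    CHECKPOINT;   -- under SIMPLE recovery this lets the committed part of the log be reused
END;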

Also look at the autogrowth setting of the log file; it should not be in percent.

July 29th, 2015 12:53am

The appropriate response to a full transaction log depends partly on what condition or conditions caused the log to fill. To discover what is preventing log truncation in a given case, use the log_reuse_wait and log_reuse_wait_desc columns of the sys.databases catalog view.
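For example (LoadDB is a placeholder name; an ACTIVE_TRANSACTION value in log_reuse_wait_desc matches the error you are seeing):

SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = N'LoadDB';   -- substitute your database name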

Alternatives for responding to a full transaction log include:

  • Backing up the log.

  • Freeing disk space so that the log can automatically grow.

  • Moving the log file to a disk drive with sufficient space.

  • Increasing the size of a log file (see the sketch after this list).

  • Adding a log file on a different disk.

  • Completing or killing a long-running transaction.
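A minimal sketch of the "increasing the size of a log file" and "adding a log file on a different disk" options, assuming a hypothetical database LoadDB whose log file is logically named LoadDB_log and a second drive L: with free space (adjust names, paths, and sizes to your environment):

-- Grow the existing log file
ALTER DATABASE LoadDB
    MODIFY FILE (NAME = LoadDB_log, SIZE = 20GB);

-- Or add a second log file on another drive
ALTER DATABASE LoadDB
    ADD LOG FILE (NAME = LoadDB_log2, FILENAME = N'L:\SQLLogs\LoadDB_log2.ldf', SIZE = 10GB);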

>> As a workaround I tried issuing a CHECKPOINT, but even that did not resolve the issue.

You've got a long-running transaction that is preventing the checkpoint from truncating the log.

Run the following query in SSMS to find the long-running session:

SELECT last_request_start_time, * FROM sys.dm_exec_sessions WHERE status <> 'sleeping' ORDER BY last_request_start_time;

(or try DBCC OPENTRAN)

July 29th, 2015 1:39am
