Data Flow task buffer failure error
I am getting a "Data Flow task buffer failure" error while doing a lookup. The lookup is on address columns, which take about 2 GB of space for a full cache. My question is how I can allocate more memory to the running process. When run through DTExecUI the package completes in 20 minutes, but when scheduled through SQL Agent it fails with the buffer error. I have 8 cores and 16 GB RAM on a Windows 2008 / SQL Server 2008 64-bit system. I limited SQL Server max memory to 10 GB, leaving 6 GB for everything else. When running through the Agent I can see 4 GB of free memory, yet it still fails with buffer space issues. So how can I let the process use more of the available memory to avoid the buffer issue? I tried partial cache, and the package then takes 4 hours through SQL Agent. Thanks -- Anju

The ProcessInput method on component "Lkp - Address" (151) failed with error code 0xC0209029 while processing input "Lookup Input" (164). The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure. End Error
Error: 2010-12-18 13:50:51.86 Code: 0xC02020C4 Source: Flag Bill Records on Service Address SourceQuery - SADD Bill Records [1] Description: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
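For reference, a minimal T-SQL sketch of the memory cap described above (the 10 GB value is the one mentioned in the post; requires sysadmin rights):

-- Cap SQL Server's memory at 10 GB (10240 MB), leaving the remaining
-- RAM on the box for the SSIS runtime and the OS.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 10240;
RECONFIGURE;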
December 20th, 2010 9:22am

Are you searching for addresses in the lookup? Perhaps a better technique to employ is T-SQL (for example, EXISTS): a stored procedure can do the lookup so your code executes entirely inside SQL Server, and thus you are not affected by the Agent process. Arthur (My Blog)
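A minimal sketch of that idea, assuming hypothetical table and column names (dbo.StagingBillRecords, dbo.ServiceAddress, AddressKey, IsBilled); this would typically live in a stored procedure called from the package:

-- Flag source rows whose address already exists in the reference table,
-- entirely inside SQL Server, with no SSIS Lookup cache involved.
UPDATE s
SET    s.IsBilled = 1
FROM   dbo.StagingBillRecords AS s
WHERE  EXISTS (SELECT 1
               FROM   dbo.ServiceAddress AS a
               WHERE  a.AddressKey = s.AddressKey);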
December 20th, 2010 9:51am

I totally agree with using T-SQL, but my boss always wants everything in the SSIS flow. His question is: if it runs in DTExecUI, why can't it run on a schedule? Thanks --Anju
December 20th, 2010 2:08pm

The SQL Server Agent runs asynchronously; this is the difference. How about using the Execute SQL Task? Arthur (My Blog)
December 20th, 2010 2:25pm

I didn't understand the difference between SQL Agent running asynchronously and other runs being synchronous, but I guess the available memory differs between runs. Could you shed some light on it, and where can I get some notes on it (maybe some links)? Since this is an update-and-insert data flow, I wrote a MERGE query to update/insert the records, which runs faster, but I am losing the argument with our team (who are converting the old stored-procedure-based process to GUI-based ETL). Thanks --anju
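For context, a minimal sketch of such a MERGE upsert, with hypothetical table and column names standing in for the real ones:

-- Upsert equivalent of the update/insert data flow described above.
MERGE dbo.ServiceAddress AS tgt
USING dbo.StagingBillRecords AS src
      ON tgt.AddressKey = src.AddressKey
WHEN MATCHED THEN
    UPDATE SET tgt.BillFlag = src.BillFlag
WHEN NOT MATCHED BY TARGET THEN
    INSERT (AddressKey, BillFlag)
    VALUES (src.AddressKey, src.BillFlag);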
December 20th, 2010 2:53pm

The shortest and best note on this I could find is by Todd: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/79a76fcf-bf58-4c83-b1da-3b3174480ada Arthur (My Blog)
December 20th, 2010 3:17pm

Thanks, I got your point about the Agent's asynchronous run, but that doesn't relate to my runtime memory issue. To reiterate my issue in simple terms: how come the package runs well in BIDS but fails with Data Flow task buffer errors when run from SQL Agent (both times it is running on the same server)?
December 20th, 2010 4:43pm

Hmm, I am starting to suspect you need to revise the settings in your Data Flow task: set FastLoadOptions to TABLOCK, CHECK_CONSTRAINTS, ROWS_PER_BATCH=1000 and set FastLoadMaxInsertCommitSize to 1000. Arthur (My Blog)
December 20th, 2010 4:53pm

I tried with and without those FastLoadOptions, without any luck. The only way it runs is with partial cache, and then the package takes 4 hours to complete; with full cache in BIDS it takes 25 minutes. So how can I make it use more of the memory available on the server for those buffers? I guess the data flow's max buffer size is 100 MB. With a full cache of 1.5 GB for the lookup, does the lookup use buffers separate from the data flow's? If so, how does the allocation differ between the BIDS runtime and the SQL Agent runtime? I think I should contact Microsoft about this issue; we have Premier Support, so I will pursue it with them. Anyway, thanks for your time. Anju
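One way to sanity-check how large that full cache really is: measure the bytes of only the columns the Lookup actually caches. A rough sketch, assuming hypothetical dbo.ServiceAddress columns (the cache adds per-row overhead on top of this raw figure):

-- Approximate raw size of the data a full-cache Lookup would hold.
SELECT COUNT(*) AS RowsCached,
       SUM(ISNULL(DATALENGTH(AddressLine1), 0)
         + ISNULL(DATALENGTH(AddressLine2), 0)
         + ISNULL(DATALENGTH(City), 0)
         + ISNULL(DATALENGTH(PostalCode), 0)) / 1048576.0 AS ApproxDataMB
FROM   dbo.ServiceAddress;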
December 20th, 2010 5:41pm

Anju, DTExecUI.exe is a 32-bit utility. Are you using the same server when you talk about running the package via DTExecUI and via the Agent? Can you provide details about the SQL Server Agent step that runs the package? Make sure the 'Use 32 bit runtime' option on the Execution options tab of the step is unchecked. Also, what else is in that data flow? Any blocking transformations like sorts or aggregations? Help Others! Don't forget to mark your thread as answered
December 20th, 2010 10:42pm

I have faced such errors in the past; I can suggest a few approaches for you to try:
1) Replace the full-cache Lookup with a LEFT JOIN in the source query (see the sketch below). If your source is a file, dump it into a staging table first and do the lookup there.
2) Don't increase the max buffer size unless you have worked out your optimum memory size (by trial and error); based on various tests we now use 25 MB.
3) Point BufferTempStoragePath at fast drives (unless your TEMP and TMP variables already point to them).
Cheers, Varun
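A minimal sketch of option 1, with hypothetical table and column names; this becomes the source query of the data flow, so the join happens in the database instead of in a full-cache Lookup:

-- Push the address lookup into the source query as a LEFT JOIN.
SELECT b.BillRecordID,
       b.AddressKey,
       CASE WHEN a.AddressKey IS NULL THEN 0 ELSE 1 END AS AddressFound
FROM   dbo.StagingBillRecords AS b
LEFT JOIN dbo.ServiceAddress AS a
       ON a.AddressKey = b.AddressKey;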
December 21st, 2010 9:46am
