Out of Memory Error (Buffer Swapping) - SSIS
Hi, need some quick-fix help. I have been trying to load data from AS400 to DB2 (Windows) using an ADO.NET connection in a DataReader source and an OLE DB destination (IBM OLE DB provider). The files I'm trying to load have more than 15 million rows. On execution of the package I get an Out of Memory error (see below). My destination box is a 4 GB+ RAM, 4-CPU box. There seems to be some buffer- and swapping-related issue which I'm not able to figure out. It says that the system is unable to allocate memory. Please help me with the same.

Thanks in advance,
Amit S

SSIS package "ABCDE 1.dtsx" starting.
Information: 0x4004300A at ABCDE 2003 to 2004, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at ABCDE 2003 to 2004, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at ABCDE 2003 to 2004, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at ABCDE 2003 to 2004, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at ABCDE 2003 to 2004, DTS.Pipeline: Execute phase is beginning.
Error: 0xC0202009 at ABCDE 2003 to 2004, OLE DB Destination [12]: An OLE DB error has occurred. Error code: 0x8007000E. An OLE DB record is available. Source: "Microsoft Cursor Engine" Hresult: 0x8007000E Description: "Out of memory.".
Error: 0xC0047022 at ABCDE 2003 to 2004, DTS.Pipeline: The ProcessInput method on component "OLE DB Destination" (12) failed with error code 0xC0202009. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running.
Error: 0xC0047021 at ABCDE 2003 to 2004, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0202009.
Error: 0xC02090F5 at ABCDE 2003 to 2004, DataReader Source [61]: The component "DataReader Source" (61) was unable to process the data.
Error: 0xC0047038 at ABCDE 2003 to 2004, DTS.Pipeline: The PrimeOutput method on component "DataReader Source" (61) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at ABCDE 2003 to 2004, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Information: 0x40043008 at ABCDE 2003 to 2004, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x40043009 at ABCDE 2003 to 2004, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at ABCDE 2003 to 2004, DTS.Pipeline: "component "OLE DB Destination" (12)" wrote 289188 rows.
Task failed: ABCDE 2003 to 2004
Warning: 0x80019002 at ABCDE 1: The Execution method succeeded, but the number of errors raised (6) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Executing ExecutePackageTask: C:\Documents and Settings\Administrator\My Documents\Visual Studio 2005\Projects\Integration Services Project1\Integration Services Project1\ABCDE 2.dtsx
Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Pre-Execute phase is beginning.
Information: 0x4004300C at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Execute phase is beginning.
Information: 0x4004800D at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The buffer manager failed a memory allocation call for 10484320 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.
Error: 0xC0047012 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: A buffer failed while allocating 10484320 bytes.
Error: 0xC0047011 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The system reports 63 percent memory load. There are 4294660096 bytes of physical memory with 1548783616 bytes free. There are 2147352576 bytes of virtual memory with 227577856 bytes free. The paging file has 6268805120 bytes with 3607072768 bytes free.
Error: 0xC02090F5 at ABCDE 2005_04 to 2005_11, DataReader Source [61]: The component "DataReader Source" (61) was unable to process the data.
Error: 0xC0047038 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: The PrimeOutput method on component "DataReader Source" (61) returned error code 0xC02090F5. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.
Error: 0xC0047021 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "SourceThread0" has exited with error code 0xC0047038.
Error: 0xC0047039 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.
Error: 0xC0047021 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Thread "WorkThread0" has exited with error code 0xC0047039.
Information: 0x40043008 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Post Execute phase is beginning.
Information: 0x40043009 at ABCDE 2005_04 to 2005_11, DTS.Pipeline: Cleanup phase is beginning.
Information: 0x4004300B at ABCDE 2005_04 to 2005_11, DTS.Pipeline: "component "OLE DB Destination" (12)" wrote 0 rows.
Task failed: ABCDE 2005_04 to 2005_11
Warning: 0x80019002 at ABCDE: The Execution method succeeded, but the number of errors raised (7) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
Executing ExecutePackageTask: C:\Documents and Settings\Administrator\My Documents\Visual Studio 2005\Projects\Integration Services Project1\Integration Services Project1\ABCDE 3.dtsx
Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.
Information: 0x4004300A at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Validation phase is beginning.
Information: 0x40043006 at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Prepare for Execute phase is beginning.
Information: 0x40043007 at ABCDE 2005_11 to 2006_04, DTS.Pipeline: Pre-Execute phase is beginning.
February 27th, 2007 8:18pm

Hi, is anyone aware how to get rid of this memory-related error? The package continues with these warning messages and then fails after some time. RAM is enough in the system. Has this something to do with buffer swapping? Any clue on this?

Information: 0x4004800D at ABCDEF, DTS.Pipeline: The buffer manager failed a memory allocation call for 10485696 bytes, but was unable to swap out any buffers to relieve memory pressure. 3 buffers were considered and 3 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked.
Error: 0xC0047012 at ABCDEF, DTS.Pipeline: A buffer failed while allocating 10485696 bytes.
Error: 0xC0047011 at ABCDEF, DTS.Pipeline: The system reports 43 percent memory load. There are 1601118208 bytes of physical memory with 912363520 bytes free. There are 2147352576 bytes of virtual memory with 246427648 bytes free. The paging file has 3852472320 bytes with 2329001984 bytes free.
Error: 0xC02090F5 at ABCDEF, DataReader Source [477]: The component "DataReader Source" (477) was unable to process the data.
Error: 0xC0047021 at ABCDEF, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.
Error: 0xC0047021 at ABCDEF, DTS.Pipeline: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0047039. There may be error messages posted before this with more information on why the thread has exited.
March 2nd, 2007 7:35am

Were you able to solve this problem? I'm having the same issue and I'm not sure what to do. The interesting thing in my case is that I was able to load up to 4 million records before, but now it errors out after 400k records. Please let me know if you found a solution for this. Thanks.
March 7th, 2007 7:18pm

If the package is running on the same box as the destination SQL Server, the two may be competing for memory. I had this issue and found a thread on this forum that advised restricting the amount of memory SQL Server uses (by default SQL Server will use as much available memory as it wants). I set a limit on how much RAM SQL Server can grab and it solved the problem.
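For anyone who wants to try this, the cap is the 'max server memory' server option. A sketch of the T-SQL (the 2048 MB figure is an arbitrary example; pick a value that leaves enough headroom for SSIS on your box):

```sql
-- Enable advanced options so 'max server memory' is visible, then cap it.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Value is in megabytes; 2048 is only an example.
EXEC sp_configure 'max server memory', 2048;
RECONFIGURE;
```

The setting takes effect without a restart, though SQL Server releases already-committed memory only gradually.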
March 8th, 2007 4:23pm

Hi Mark, I tried changing the max RAM value for SQL Server, but that did not work out either. I also tried running SQL Server on a different machine, but no change. What I have found out in the last few days is that the data load fails once the DB2 table reaches its 1 GB maximum (it's an SMS type). If I load data in chunks that stay within the 1 GB limit, the tasks won't fail; beyond that they start failing. This is the workaround I have for the time being, but loading the data in 10 chunks and then putting them together is not a good way. shah
March 12th, 2007 3:01pm

This is my scenario: I'm moving data from SQL Server to Oracle. Initially I was using the Microsoft driver for Oracle, and that's when I was running out of memory; but then I switched to the Oracle driver and that made a HUGE difference. The memory usage is very low now and the process moves way faster. I hope this helps.
March 12th, 2007 7:24pm

From the numbers in the error message, I'd expect the issue is heap fragmentation. That is, while there's 900-something megabytes free in total, DTS's request for 10 megabytes of contiguous memory can't be satisfied. You might be able to work around the problem by reducing the buffer size; that will make DTS request smaller blocks of memory, and hopefully they'll fit in an available free block.
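In SSIS the knob for this is the Data Flow task's DefaultBufferSize property, which defaults to 10 MB (which lines up with the ~10,484,320-byte allocations in the logs above), alongside DefaultBufferMaxRows (default 10,000). A rough sketch of the arithmetic, with a made-up 700-byte row width:

```python
# Sketch of SSIS-style buffer sizing: rows per buffer are capped by both
# the buffer size in bytes and the max-rows setting. The 700-byte row
# width below is an invented example, not from the thread.

DEFAULT_BUFFER_SIZE = 10 * 1024 * 1024  # DefaultBufferSize: 10 MB
DEFAULT_BUFFER_MAX_ROWS = 10_000        # DefaultBufferMaxRows

def rows_per_buffer(row_width_bytes, buffer_size=DEFAULT_BUFFER_SIZE,
                    max_rows=DEFAULT_BUFFER_MAX_ROWS):
    """Rows that fit in one pipeline buffer."""
    return min(max_rows, buffer_size // row_width_bytes)

row_width = 700  # hypothetical bytes per row
full = rows_per_buffer(row_width)                               # 10 MB buffer
half = rows_per_buffer(row_width, buffer_size=5 * 1024 * 1024)  # 5 MB buffer
print(full, half)
```

Halving DefaultBufferSize halves each contiguous block the buffer manager has to find, which is exactly what the fragmentation workaround relies on; the trade-off is more, smaller buffers flowing through the pipeline.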
March 12th, 2007 7:53pm

Hi, thanks for addressing the issue, but my problem is still stuck at the same level. Once the 1 GB limit of the DB2 destination table is reached, the package fails. I can't insert more than 1 GB of records into a DB2 table in a single go using an SSIS package (DB2 internally can store more than that). And I'm not able to try out the Microsoft OLE DB provider for DB2, as I'm using Standard Edition of SQL Server 2005, which does not support that driver. The workaround I have right now is to load data into my destination DB2 tables in chunks of less than 1 GB each, and after the complete data arrives in the various parts (set1, set2, set3, ... of the main destination table), I load them into the main destination table. But this is a temporary workaround. I need to know how to get rid of this 1 GB issue for the DB2 tables. Please help!
March 20th, 2007 7:21pm

Dear All, I get the same 0xC02090F5 buffer error(s) when SSIS works with large amounts of data (5 to 7 million rows). The error occurs in DTS.Pipeline.1 or "Data Flow Task" at inconsistent components of the pipeline. The error has occurred: 1) when a DataReader ({BF01D463-7089-41EE-8F05-0A6DC17CE633}) bound to the ODBC Informix driver writes values into the buffer; 2) when a SQL Destination ({C3BF62C8-7C5C-4F85-83C3-E0B6F6BE267C}) bound to a SQL Server table attempts to send a row to SQL Server.

Environment: Windows XP SP2, SQL 2005 Developer Edition SP2, IBM Informix Connect 2.90.

Observations: the MsDtsSrvr .NET CLR Memory performance counters show a very high # Total Reserved Bytes for the capacity of the workstation. I do not have the tools to do an analysis of this heap. I suspect there may be some memory fragmentation problems during long runs with large amounts of data which lead to the memory error.

Questions: Is there any way to force a CLR GC between large Data Flow Tasks, or a checkpoint in a large table copy? Are there any techniques when working with large amounts of data which can optimize CLR fragmentation and lower CLR heap resources? Rob
April 3rd, 2007 4:20pm

I assume you have a blocking component, such as a Sort, which tries to load all of these rows in memory at once? Otherwise all the rows should flow through the Data Flow and use very little memory.

A quick fix that sometimes yields surprising amounts of memory is to compare the external metadata of the source with the actual data. SSIS automatically adopts the external metadata, so if you have an external column defined as 255 chars but it only uses 50, then SSIS is wasting a lot of memory in anticipation of a 255-char data value. Multiply that by 5 million rows and you're talking real memory.

Lookups can also use a lot of memory. You should specify a SQL query with only the columns you need instead of loading an entire table. Casting the reference columns down to the minimal datatype can also help with conserving cache memory.

If a dataflow can't be optimized to reduce memory usage, you can break it up into multiple flows using raw files to persist the pipeline from one flow to another.
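To put numbers on the metadata point: the pipeline reserves buffer space for the declared width of a Unicode string column (DT_WSTR is 2 bytes per declared character), whether the data uses it or not. A back-of-the-envelope sketch, reusing the 255-vs-50 widths from above and the 5 million rows mentioned earlier in the thread:

```python
# Sketch: buffer bytes reserved for an over-declared DT_WSTR column.
# DT_WSTR stores 2 bytes per *declared* character, used or not.

BYTES_PER_WCHAR = 2
ROWS = 5_000_000  # row count mentioned earlier in the thread

def column_bytes(declared_chars, rows=ROWS):
    """Pipeline bytes reserved for one DT_WSTR column across all rows."""
    return declared_chars * BYTES_PER_WCHAR * rows

declared = column_bytes(255)  # column declared as 255 chars
actual = column_bytes(50)     # only 50 chars really used
wasted = declared - actual
print(wasted)  # padding bytes that never hold data
```

That works out to about 2 GB of reserved-but-unused buffer space for a single over-declared column, which is why trimming external column widths can matter so much.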
April 3rd, 2007 6:18pm

JayH, thanks for the ideas. If the error returns I might give the vertical-partition suggestion a try. Requirements restrict the option to change metadata. I did find a solution that helped: I added a precedence constraint between each large table. Serializing the load sequence lowered the memory utilization; however, serialization of the large tables lengthened the processing time.

Inside the Data Flow Task, I observe that the DataReader on an ADO.NET System.Data.Odbc.OdbcConnection is performing a translation of character data to Unicode. The Informix server and driver are configured for a code page of en_US.819. The SQL Server and database have the collation SQL_Latin1_General_CP1_CI_AS, which makes the character data binary compatible. The SSIS Data Flow Task pipeline configuration adds extra code-page translations for the character data. The result is increased CPU processing and wasted buffer space: the pipeline allocates space for the original en_US.819 data, the Unicode copy (which is 2 times the size of en_US.819), and a SQL_Latin1_General_CP1_CI_AS copy to feed into the SQL Destination. If there were a way to turn off the translation of character data to Unicode in the ADO.NET System.Data.Odbc.OdbcConnection, the buffer could be shrunk by a factor of 4. Do you know how to configure the ADO.NET reader in SSIS not to do the translation of character data? Rob
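Rob's factor-of-four estimate can be sketched as follows. The accounting (one single-byte source copy, one two-byte Unicode copy, one single-byte destination copy) follows his description; the 100-character column width is an invented example:

```python
# Sketch of the buffer cost of the extra code-page translations described above.
CHARS = 100  # hypothetical declared width of one character column

ansi_original = CHARS * 1  # en_US.819 source copy, 1 byte per char
unicode_copy = CHARS * 2   # pipeline Unicode copy, 2 bytes per char
ansi_dest = CHARS * 1      # SQL_Latin1_General_CP1_CI_AS copy, 1 byte per char

total = ansi_original + unicode_copy + ansi_dest
print(total // ansi_original)  # blow-up factor relative to the source data
```

With all three copies in flight, the pipeline holds four times the source bytes for that column, which matches the shrink-by-a-factor-of-4 claim in the post.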
April 4th, 2007 1:57pm

Hi, I'm also facing the PrimeOutput issue. We are transferring data from a Siebel server to SQL Server 2005. The error "Unable to process data" occurs near the Execute SQL Task that retrieves data from the Siebel DB, and after this comes the "PrimeOutput error". Is there anyone who can help me with this? We have been facing this issue for the past month and have not been able to fix it.
April 12th, 2012 3:42pm

This topic is archived. No further replies will be accepted.
