Facing the issue with SQL Server Destination: Unable to prepare the SSIS bulk insert for data insertion
Hi All,
We have a flat file as the source and SQL Server as the destination in our data flow task (there are five separate sources and destinations). The package runs through SQL Server Agent and fails
with the error below, but when we restart the job it does not fail.
Error:
Microsoft (R) SQL Server Execute Package Utility Version 9.00.3042.00 for 32-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Code: 0xC0202071
Source: "DATA FLOW TASK"
Description: Unable to prepare the SSIS bulk insert for data insertion.
Code: 0xC004701A Source: "DATA FLOW TASK" DTS.Pipeline
Description: component "DATA FLOW TASK" failed the pre-execute phase and returned error code 0xC0202071
I have also checked this link:
http://support.microsoft.com/kb/2009672, but the issue does not occur when we start the job again.
NOTE:
Multiple jobs run on the server at the same time.
Please let me know if anyone has any thoughts on this.
Thanks
Chandan
January 7th, 2011 8:32am
Is it possible there are temp objects, or some validation happening, that requires "ValidateExternalMetadata" to be set to False?
I haven't looked up that exact error, and without details on the package that would be my first area of focus, given that the job runs the second time it is executed (meaning an object such as a temporary table may have been created). Ted Krueger
Blog on lessthandot.com @onpnt on twitter
Please click the Mark as Answer button if a post solves your problem!
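For reference, the "ValidateExternalMetadata" property Ted mentions is set per component in the data flow designer, and it shows up directly in the package's .dtsx XML. A minimal sketch of what the attribute looks like (the component name here is hypothetical, and other attributes of the element are omitted):

```xml
<!-- Fragment of a .dtsx pipeline definition; the name is illustrative only. -->
<component name="SQL Server Destination D1"
           validateExternalMetadata="False">
  <!-- inputs, outputs, and connection references omitted -->
</component>
```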
January 7th, 2011 9:22am
A possible cause is the SQL Server Destination. IMHO it is better to use OLE DB connections than SQL Server. The performance difference is negligible, but the benefits far outweigh it. Are you using the SQL Server Destination or OLE DB? Todd C - MSCTS SQL Server 2005 - Please mark posts as answered where appropriate.
January 7th, 2011 9:39am
Hi,
Thanks for your reply.
We are not creating any temp objects that would require "ValidateExternalMetadata" to be set to False. Sometimes the same package will run for three consecutive days and then fail on the fourth with the errors
mentioned above.
Not much idea why it is failing...
Thanks
Chandan
January 7th, 2011 9:46am
Hi Todd
I am using the SQL Server Destination. The issue is not related to OLE DB vs. SQL Server Destination, because once the package fails and we rerun it, it works fine. So I cannot say it is happening because of the SQL Server Destination.
My thought is that it is happening due to a memory issue.
Changing from the SQL Server Destination to an OLE DB Destination is a huge amount of work, because there are many transformations, and there is no certainty that it will not fail again.
Can you give your thoughts on this?
Thanks
Chandan
January 7th, 2011 10:01am
"Changing from the SQL Server Destination to an OLE DB Destination is a huge amount of work, because there are many transformations, and there is no certainty that it will not fail again."
What do you mean by "many transformations are there"? As in many field mappings from your data flow pipeline to fields in the destination? Todd C - MSCTS SQL Server 2005 - Please mark posts as answered where appropriate.
January 7th, 2011 10:08am
Hi Todd,
I mean that inside the data flow task we have many sources with their respective destinations, and all of the source/destination pairs are independent of each other.
example:
(Source)S1-(Destination)D1
(Source)S2-(Destination)D2
(Source)S3-(Destination)D3
(Source)S4-(Destination)D4
I believe all the flows run in parallel. Please let me know if the issue is still not clear.
Thanks in advance
Chandan
January 7th, 2011 12:16pm
Hi Chandan,
Based on my research, the issue "component "<component name>" failed the pre-execute phase and returned error code 0xC0202071" is generally caused by low available memory. When the issue occurs, there is usually another error like:
The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 16 buffers were considered and 16 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes
are using it, or too many buffers are locked.
In this case, as you mentioned, multiple jobs run on the server at the same time. If the other jobs consume too much memory, this package may fail because it cannot obtain the memory it requests.
Also, this package contains many data flow tasks. If one or more of them need to transfer a large amount of data from source to destination, and use many partially blocking or blocking transformations (such as Sort or Aggregate), the package needs to request memory
that is a multiple of the source data size. This can cause a memory issue, and ultimately cause the package to fail.
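The buffer arithmetic behind this can be sketched roughly as follows. The constants are the SSIS 2005 defaults (DefaultBufferSize = 10 MB, DefaultBufferMaxRows = 10,000); the row width and flow count are invented for illustration, not measured from this package:

```python
# Rough sketch of how the SSIS pipeline sizes its data flow buffers.
# DEFAULT_BUFFER_SIZE and DEFAULT_BUFFER_MAX_ROWS are the SSIS 2005
# defaults; everything else here is an illustrative assumption.

DEFAULT_BUFFER_SIZE = 10 * 1024 * 1024   # bytes (10 MB)
DEFAULT_BUFFER_MAX_ROWS = 10_000

def rows_per_buffer(row_width_bytes):
    """Rows that fit in one buffer: capped by both byte size and max rows."""
    return min(DEFAULT_BUFFER_MAX_ROWS, DEFAULT_BUFFER_SIZE // row_width_bytes)

def estimate_memory(parallel_flows, buffers_per_flow=5):
    """Very rough upper bound on buffer memory for N parallel flows,
    assuming each flow keeps a handful of full buffers in flight."""
    return parallel_flows * buffers_per_flow * DEFAULT_BUFFER_SIZE

# Example: a 500-byte row hits the max-rows cap before the size cap.
print(rows_per_buffer(500))             # 10000
print(estimate_memory(5) / 2**20)       # 250.0 (MB for five parallel flows)
```

The point of the sketch is only that several flat-file flows running in parallel, each holding multiple 10 MB buffers, add up quickly when other jobs are competing for the same memory.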
"but this issue does not occur when we start the job again" - did you mean starting the job manually? If so, the reason it works then is that enough memory is available at that time to process the package.
In order to solve the issue, I would suggest the following steps:
1. Apply the latest hotfix for SQL Server Integration Services (SSIS). This is a general suggestion, as I notice the SSIS in your environment is still SP2 (build 3042).
2. Follow this article to tune the performance: http://technet.microsoft.com/en-us/library/cc966529.aspx
3. Add more physical memory to the machine.
4. Or, use Performance Monitor counters to track memory usage until the issue happens again.
For more information, please see:
Monitoring the Performance of the Data Flow Engine:
http://msdn.microsoft.com/en-us/library/ms137622.aspx
If you have any more questions, please feel free to ask.
Thanks,
Jin Chen - MSFT
January 10th, 2011 12:36am
Hi Jin,
Thanks for your reply. We are working on the points mentioned above. Meanwhile, as a quick fix, can we use the MaxInsertCommitSize and Timeout properties of the SQL Server Destination component? As you know, I am using a flat file as the source and the SQL Server Destination
as the destination.
I mean, if I increase the Timeout property to 5 minutes and set MaxInsertCommitSize to 500, can you please suggest what the impact of changing these properties will be? Will it stop the job from failing?
Note: once we start the job, we TRUNCATE all the tables before loading any data.
Thanks
Chandan
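For context on the question above: MaxInsertCommitSize controls how many rows the bulk load commits per batch, where 0 means the whole load is committed as one batch. The trade-off can be sketched with simple batch arithmetic (the row counts here are invented examples):

```python
import math

def commit_batches(total_rows, max_insert_commit_size):
    """Number of commits the bulk insert issues.
    0 means 'commit the entire load as a single batch'."""
    if max_insert_commit_size == 0:
        return 1
    return math.ceil(total_rows / max_insert_commit_size)

# Example: a 1,000,000-row flat file with MaxInsertCommitSize = 500
# issues 2,000 small commits - lighter on memory and log space per
# batch, but with per-commit overhead repeated 2,000 times.
print(commit_batches(1_000_000, 500))   # 2000
print(commit_batches(1_000_000, 0))     # 1
```

So a small MaxInsertCommitSize mainly trades load speed for a smaller per-transaction footprint; it does not by itself address a pipeline-wide memory shortage at pre-execute time.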
January 13th, 2011 1:22pm
Hi Chandan,
There is a similar question to yours. I would suggest you have a look at it:
http://social.msdn.microsoft.com/forums/en-US/sqlintegrationservices/thread/f9c1f89d-9d3e-4abc-a8e6-484f7b00d3c3
Thanks,
Jin Chen - MSFT
January 19th, 2011 3:03am
Hi Jin,
I still have not found a solution to this issue.
I have tried changing the things below.
1. Changed the SQL Server Destination to an OLE DB Destination.
2. Changed the Timeout property from 30 seconds to 0 in the SQL Server Destination.
3. Changed the commit size.
After doing this, the issue is still not resolved, and I am not sure why the error is occurring. We are not doing any partially blocking or blocking transformations; it is just a flat file as the source and an OLE DB or SQL Server Destination as the destination.
If anyone can give us another idea for loading the data, it would be great. The SQL Server version is
Standard Edition. Thanks Chandan
March 30th, 2011 10:45am