SSIS, Cube Processing Tasks and SQL Agent
Hi All,

So my setup is as follows:

1. A master package containing:
   a) A sequence of SSIS sub-packages
   b) An Analysis Services Processing Task to process our cubes, linked to a) by a precedence constraint
2. A SQL Agent job which runs the master package

It all worked fine until recently (!!!), then we started getting the following error:

Microsoft (R) SQL Server Execute Package Utility Version 10.0.2531.0 for 64-bit
Copyright (C) Microsoft Corp 1984-2005. All rights reserved.
Started: 06:00:01
Error: 2010-02-17 06:42:27.20 Code: 0x00000000 Source: Process NETADATA_AS cubes Analysis Services Processing Task Description: End Error
DTExec: The package execution returned DTSER_FAILURE (1).
Started: 06:00:01 Finished: 06:42:52 Elapsed: 2571.53 seconds.
The package execution failed. The step failed.

Based on the fact that if I log straight onto the server and process the cubes they run every time, we have tried the following changes to the layout:

1. Taking the cube processing out of the SSIS package and putting SSAS commands directly into the job
   >>> This seems to work, but we don't know for certain why
2. Splitting the master package into two, one for the SSIS sub-packages and one containing just the cube processing, with the job running them as two steps
   >>> This doesn't work. I thought that if the first package was holding memory, it might be released once the first step was complete, but no.

So ultimately the question is: what is responsible for the memory allocation/availability for each step in each of the three configurations, and how can we manage this better so the SSIS work doesn't have an adverse effect on cube processing?

Thanks
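[Editor's note: "putting SSAS commands into the job" above means a SQL Agent step of type "SQL Server Analysis Services Command" containing an XMLA batch. A minimal sketch, assuming the SSAS database is the NETADATA_AS named in the error output and that a full rebuild is wanted:]

```xml
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process>
      <Object>
        <!-- DatabaseID as shown in the SSAS database properties -->
        <DatabaseID>NETADATA_AS</DatabaseID>
      </Object>
      <!-- ProcessFull drops and rebuilds all data;
           ProcessDefault processes only what is out of date -->
      <Type>ProcessFull</Type>
    </Process>
  </Parallel>
</Batch>
```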
February 17th, 2010 9:01pm

Have you checked that your cube processes cleanly through BIDS with the same data? SSIS just invokes the cube processing via the Analysis Services Processing Task, so I suspect there is some problem with the data. Try processing your cube in BIDS with the same data and let us know what you observe.

Let us TRY this | Don't forget to mark the post(s) that answered your question
http://quest4gen.blogspot.com/
February 17th, 2010 10:23pm

It'd also be helpful to enable logging for your package, so that you can report the exact error message from SSIS. There should be more detail about the actual error.

John Welch | www.varigence.com | www.agilebi.com | ssisUnit.codeplex.com
February 18th, 2010 1:49am

Hi Guys,

ETL - yes, the data is fine.

John - logging is enabled; I've been using the following to monitor the situation:

select starttime as x, *
from dbo.sysssislog
--where (message like '%swap buffers have failed%'
--    or message like '%buffer manager failed%')
order by starttime desc

And there are quite a lot of messages like these:

"The buffer manager has allocated 65520 bytes, even though the memory pressure has been detected and repeated attempts to swap buffers have failed."

"The buffer manager failed a memory allocation call for 65536 bytes, but was unable to swap out any buffers to relieve memory pressure. 1690 buffers were considered and 1489 were locked. Either not enough memory is available to the pipeline because not enough are installed, other processes were using it, or too many buffers are locked."

And what else does the log say around failure time, you ask? This is all I get, verbatim, with everything being logged:

2010-02-18 06:55:32.000 - OnProgress - Process XXX cubes - Finished building aggregations for the 'Vw Fact Sales Detail Pre 2007' partition.
2010-02-18 06:55:47.000 - OnError - Process XXX cubes - (No Message)
2010-02-18 06:55:48.000 - Script Task - Populate the User variable "ErrorMessages" with Cube build error messages - (No Message)

It then proceeds to send the error email and end package execution. Presumably the lack of message/datacode/databytes is why the emailed error says "Code: 0x00000000", i.e. "I haven't a clue what's going on, please accept my apologies, love Microsoft"!

So there's a lot of buffering going on, which we are tuning up along the way. But the overarching question on my mind is: why isn't this memory being managed better?
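[Editor's note: a slightly tighter variant of the monitoring query above, pulling only errors and warnings in the run-up to the most recent failure. A sketch against the same dbo.sysssislog table (SQL Server 2008 naming); the 30-minute window is an arbitrary choice:]

```sql
-- Errors/warnings in the 30 minutes before the most recent OnError
DECLARE @lastError datetime =
    (SELECT MAX(starttime) FROM dbo.sysssislog WHERE event = 'OnError');

SELECT starttime, event, source, message
FROM dbo.sysssislog
WHERE event IN ('OnError', 'OnWarning', 'OnTaskFailed')
  AND starttime >= DATEADD(minute, -30, @lastError)
ORDER BY starttime;
```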
What is the difference between:
a) the SSIS master package finishing the sub-packages and then starting the cube processing, and
b) the whole thing having failed and me running the cube processing manually at the end?

I noticed that Buffers Spooled was remaining high even after the master package had failed, but that this eventually sorts itself out with no intervention - presumably cleaners are at work? Are the sub-packages holding onto memory and therefore not allowing the cube build to complete, but once the job is complete the memory is available again?

Is there anything I should look out for, e.g. SQL Agent does X with memory but SSIS does Y? Or maybe the manual cube build runs in a different memory space than when it's run from either SSIS or SQL Agent?

I've done quite a lot of trial and error and now I'm looking for theory please.

Thanks Guys
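[Editor's note: one theory-level point worth checking here. The relational engine (sqlservr.exe), SSAS (msmdsrv.exe) and the DTExec process hosting the SSIS pipeline are three separate processes competing for the same physical RAM, and only the first two have configurable caps; DTExec takes whatever is left. A sketch of capping the relational engine so SSAS and DTExec have headroom - the 16384 MB figure is purely illustrative and must be sized for the actual server:]

```sql
-- Cap the relational engine's memory (illustrative value only)
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 16384;
RECONFIGURE;
```

On the SSAS side, the Memory\LowMemoryLimit and Memory\TotalMemoryLimit server properties play the equivalent role (values below 100 are read as a percentage of physical RAM).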
February 18th, 2010 12:54pm

Hi All, Spent Friday tuning up all the SSIS sub-packages and ran the full build. There was no sign of any memory buffering, but the problem has not gone away. Reverting back to previous incarnations of the solution to find where this came from. Thanks
February 21st, 2010 8:51pm

Hi, Have you checked that the deployment settings on your SSIS project are correct, so that it is the same package you run from BIDS and from SQL Agent?

Kind Regards,
Søren

Søren D. Jensen - BI Consultant
February 23rd, 2010 11:06am

Hi sdj_dk, It's running from the file system, so there's really no ambiguity potential. Thanks anyway
February 23rd, 2010 3:54pm

I had this same problem for months but I turned off Replication this weekend and the package started processing the cube.
July 2nd, 2011 3:43pm

This topic is archived. No further replies will be accepted.
