IS issues when running the Analysis Services Processing Tasks. It doesn't seem to be processing the dimensions, so the cubes fail
I have an Integration Services package, and in it I have Analysis Services Processing Tasks.
I have a task for each dimension, rather than having them all in one, to update the dimensions.
Then 2 Processing Tasks for the cube updates.
The dimensions run successfully, but the two cubes fail because the attribute keys can't be found, etc. It is as if the cubes have been run without the dimensions being processed first, which is incorrect, because I have run the dimensions.
I then run it in Analysis Services: dimensions first, then the cubes, and it is successful.
I then try again in the IS package and it's also successful, presumably because it's been processed in Analysis Services.
So clearly, even though Integration Services says that the dimensions have run successfully, that isn't true.
Can anyone give me any tips on this? Unless I manually process the cubes and dimensions I can't be sure it's working. I can't do overnight processing, which is, well, rubbish.
BTW, everything is on Processing Options = Update. I took this to mean process updates only, to save time. I am really at a loss here at the moment.
Debbie
June 11th, 2012 11:25am
Seems it is a refresh thing. Can you post the error message in its entirety?
Arthur My Blog
June 11th, 2012 11:47am
Debbie--
Try ProcessUpdate on the dimensions first, then Process on the rest ...
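As a sketch of what that ordering looks like in XMLA (the database, dimension and cube IDs below are placeholders, not Debbie's real object names; generate the real ones by right-clicking the object in SSMS and choosing Process, then Script):

```xml
<!-- Sketch only: "SENDatabase", "DimSpecialNeeds" and "SENCube" are placeholder IDs.
     Commands in a Batch that are not wrapped in <Parallel> execute in document order,
     so the dimension update completes before the cube is processed. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>SENDatabase</DatabaseID>
      <DimensionID>DimSpecialNeeds</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
  <Process>
    <Object>
      <DatabaseID>SENDatabase</DatabaseID>
      <CubeID>SENCube</CubeID>
    </Object>
    <Type>ProcessFull</Type>
  </Process>
</Batch>
```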
I am wondering if you deployed an initial version of the project to the target SSAS server before processing it? I'm a little fuzzy on the details, but I think with SSAS there is a concept of processing (or cycling through an update) and a concept of cube deployment. If you haven't deployed your cube to the server, try that, then process the cube to see if the updates succeed from SSIS.
June 11th, 2012 6:47pm
I'm sorry for the delay; I thought I had email updates on this one, but I'd forgotten to tick the box. Whoops.
Right
I've changed everything to Process Full as per some other advice. I had a few dimension errors come through, which I have sorted.
It processes quite happily in Analysis Services (I always run the dimensions on Full first and then the two cubes on Full).
In IS I have a task for each dimension and then the cubes. Between each task is a script that records that it's been done and sets the duration time.
The dimensions run quite happily through, and so do the cubes straight after an Analysis Services process.
I will try to get the error messages the next time it happens (which it will)
I'm interested in "If you haven't deployed your cube to the server, try that, then process the cube to see if the updates succeed from SSIS."
Is there an Analysis Services task I can use at the beginning of the whole thing to deploy the cube before processing, which might help? That's the one thing that happens in Analysis Services which is maybe missing in the IS package.
June 13th, 2012 8:05am
Debbie--
Two-stage this: generate the cube on your dev box, then generate the deployment script with the Deployment Wizard, and use that script inside the SSAS task.
Here's an msdn ref for doing this:
http://msdn.microsoft.com/en-us/library/ms174887(SQL.105).aspx
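For what it's worth, the Deployment Wizard can also be driven from the command line once the project has been built in BIDS. This is only a sketch: the install path shown is the usual SQL Server 2008 R2 location, and the project and log paths are made up.

```
:: Run the Deployment Wizard silently against the project's build output (.asdatabase).
:: /s writes progress to a log file; using /o:file.xmla together with /d would instead
:: write out the deployment XMLA script without deploying, so it can be run later.
"C:\Program Files (x86)\Microsoft SQL Server\100\Tools\Binn\VSShell\Common7\IDE\Microsoft.AnalysisServices.Deployment.exe" ^
  "C:\Projects\SENCubes\bin\SENCubes.asdatabase" /s:"C:\Logs\deploy.log"
```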
June 13th, 2012 9:38am
Whoah,
I didn't realise there was anything more to this than just adding the Analysis Services tasks.
I've read through the link and, to be honest, I can't get my head round it at all. Is there an easier guide to work with anywhere? That's beyond me. (I've always found the MSDN user guides completely user-unfriendly.)
I don't have to manually process the cube in Analysis Services every time, do I?
Debbie
June 13th, 2012 10:13am
Debbie--
Here's another link that might be more straightforward with a series of practical steps:
http://www.learnmicrosoftbi.com/Forums/tabid/79/forumid/3/postid/591/scope/posts/Default.aspx
June 13th, 2012 11:03am
Now that looks more my style.
I will get cracking on it to see if I can get it sorted.
At least I now understand why it works nicely in Analysis Services but fails in Integration Services: I'd missed a step.
Thanks for that!
June 13th, 2012 11:10am
I don't believe it; even this has confused the hell out of me.
The initial issue says that the user knows how to refresh the cube in IS but not deploy that cube (or create the reports from the cube)... No idea what that means.
" There are a couple of ways to process a cube or dimension automatically
1 Use SSIS.
There are two choices here ; Use the Analysis Services Processing Task to specify the Cube or Dimensions you want to process
or
Alternatively - Add the XMLA Script to an Analysis Services Execute DDL task
Either way you can use SQL Agent to schedule the Package"
I already do this section. I use Processing Tasks to process the dimensions and cubes.
"2. Use an XMLA script and add to sqlAgent Job as an Analysis Services Command. The schedule using SQL Agent."
Because it says there are two ways of processing, using SSIS or using an XMLA script, and I already do 1, I can ignore this section (2).
Automated Reports in SSRS
This bit doesn't concern what I am doing at the moment.
So after reading it I'm left with nothing I can do, since I'm already doing option 1, which is to use Analysis Services tasks to process the data. There is nothing in the help file that deals with deployment.
Back to the drawing board. Am I just not getting it?
Debbie
June 13th, 2012 11:28am
Debbie--
What a frustrating day for you! :-\ Let's take SSIS out of the equation for a minute and use the SQL Server Agent instead. Try scheduling the SSAS processing as a job; you'll have to fill in the XMLA command as a job step. If your cube isn't giant,
let's go for gold and process it all. You can always fiddle around with generating a more specific script later by right-clicking on the dimension or cube in SSMS and choosing Process. If this works for you, disable your SSAS tasks
in SSIS and try launching the scheduled SSAS job using the SQL Agent task.
Here's an example of the XMLA:
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
      <Object>
        <DatabaseID>DebbiesCubeDBName</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
      <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
  </Parallel>
</Batch>
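If you later want a narrower script than processing the whole database, a single-object version looks like this. It is a sketch: "DebbiesCubeDBName" and "DimDate" are placeholder IDs, and the real ones come from scripting the Process dialog in SSMS.

```xml
<!-- Sketch: process a single dimension instead of the whole database.
     The DatabaseID and DimensionID are placeholders; generate the real values
     by right-clicking the object in SSMS -> Process -> Script. -->
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Process>
    <Object>
      <DatabaseID>DebbiesCubeDBName</DatabaseID>
      <DimensionID>DimDate</DimensionID>
    </Object>
    <Type>ProcessUpdate</Type>
  </Process>
</Batch>
```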
I'm not sure how often you use SQL Agent jobs, but here are 2 URLs with some of the basic steps for setting that up for an SSAS cube build.
Official:
http://msdn.microsoft.com/en-us/library/ff929186.aspx
Informal:
http://dwbi1.wordpress.com/2010/06/16/scheduled-deploy-in-ssas/
June 13th, 2012 4:53pm
Oh no, this gets worse.
Unfortunately our DBA team has full control of the SQL Server Agent and I cannot do anything with it, or even see it. I've been asking them for years for my own test server but have been refused every time.
Straight away I know that they are not going to be able to do this, or have the time to do this for me. (This is after years of frustration, dealing with DBAs with little experience of Analysis Services.)
If there is no other way to get it sorted, I'll have to flag this up to my manager and process everything in Analysis Services manually. Gutting. I really don't understand how it's not working when I add Analysis Services tasks to an IS package, though.
It did work last night, though, I think:
Executed as user: DERBYSHIRE\DL_V-CQ10_SQL_Sv_Agt. Microsoft (R) SQL Server Execute Package Utility Version 10.50.2500.0 for 64-bit Copyright (C) Microsoft Corporation 2010. All rights reserved. Started: 9:00:01 PM DTExec: The package execution returned
DTSER_SUCCESS (0). Started: 9:00:01 PM Finished: 9:51:02 PM Elapsed: 3060.86 seconds. The package executed successfully. The step succeeded.
I've checked my audit trail and it's all in there too. Whether it's actually processed anything at the moment is anyone's guess.
It's all very confusing. Is it sorted now? I just don't know.
Debbie
June 14th, 2012 5:31am
The package ran successfully, so there were no errors during processing.
How long did the job run?
You can query a DMV to find the last processed date:
http://social.msdn.microsoft.com/Forums/en-US/sqlanalysisservices/thread/0d436dc4-fd49-4909-9356-39733d5a2505
MCTS, MCITP - Please mark posts as answered where appropriate.
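For example, run as an MDX query against the SSAS database in SSMS (a sketch; the comparison you want is between this timestamp and the time the overnight package claimed success):

```sql
-- Shows when each cube's data was last refreshed.
SELECT CUBE_NAME, LAST_DATA_UPDATE
FROM $SYSTEM.MDSCHEMA_CUBES
```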
June 14th, 2012 5:56am
It seems to have run in around the same time I would expect.
Part of Process Section:

Task                                                      Date_Time                Seconds  Minutes of whole process
Analysis Services SEN Contacts LIO                        2012-06-13 21:43:52.930        5  8
Analysis Services SEN Process Contacts Health Department  2012-06-13 21:43:57.817        4  8
Analysis Services SEN Process Attendance                  2012-06-13 21:44:01.870      172  8
Analysis Services SEN Process Date Time                   2012-06-13 21:46:53.930        3  8
Analysis Services SEN Process Special Needs               2012-06-13 21:46:56.647        2  8
Analysis Services SEN Process SEN                         2012-06-13 21:46:58.983       72  8
Analysis Services SEN Process Age                         2012-06-13 21:48:10.137       11  8
Analysis Services SEN Process SEN Cube                    2012-06-13 21:48:21.743       23  8
Analysis Services SEN Process SEN Workflow Cube           2012-06-13 21:48:44.003      137  8
Analysis Services SEN End                                 2012-06-13 21:51:01.797        0  8
It does seem that the only thing missing is the deployment of the cube when it's changed. Processing is fine if nothing has happened.
I'll keep processing and make sure I know what is different when it fails again (but works in Analysis Services).
June 14th, 2012 6:07am
As expected, it failed again last night:
Source: Analysis Services Processing Task SEN Cube (Analysis Services Execute DDL Task):
  Internal error: The operation terminated unsuccessfully.
  Errors in the OLAP storage engine: An error occurred while processing the 'SEN Fact' partition of the 'COP Stage Pupil Business Measures' measure group for the 'SEN' cube from the CAYA_DataWarehouse_Dev database.
  Error: 2012-06-14 21:48:17.20, Code: 0xC11F0006: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
  Error: 2012-06-14 21:48:17.20, Code: 0xC11C0002: Server: The operation has been cancelled.

Source: Analysis Services Processing Task SEN Workflow Steps (Analysis Services Execute DDL Task):
  Error: 2012-06-14 21:48:44.49, Code: 0xC1000007: Internal error: The operation terminated unsuccessfully.
  Error: 2012-06-14 21:48:44.49, Code: 0xC11F0006: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
  Error: 2012-06-14 21:48:44.49, Code: 0xC11F000E: Errors in the OLAP storage engine: An error occurred while processing the 'SEN Workflow Steps' partition of the 'SEN Workflow Steps' measure group for the 'SEN Workflow Steps' cube from the CAYA_DataWarehouse_Dev database.
  Error: 2012-06-14 21:48:44.49, Code: 0xC11F0006: Errors in the OLAP storage engine: The process operation ended because the number of errors encountered during processing reached the defined limit of allowable errors for the operation.
  Error: 2012-06-14 21:48:44.49, Code: 0xC11C0002: Server: The operation has been cancelled.

DTExec: The package execution returned DTSER_FAILURE (1). Started: 9:00:01 PM. Finished: 9:48:44 PM. Elapsed: 2923.79 seconds. The package execution failed. The step failed.
So if I now go into Analysis Services, deploy, and run everything myself, it will be fine.
There must be an easy way of adding the deploy step to the Integration Services package so the cube deploys first and then the dimensions and cubes are processed?
June 15th, 2012 5:31am
There's no reason to do a deployment on the server.
You only deploy when there are metadata changes.
The reason BIDS deploys every time you process the database is that you are doing this inside the development environment, so it makes sense to assume there could be changes.
If you process interactively in SSMS, there is no deployment either.
Regarding the errors: the most irritating and frustrating aspect of SSAS is that the actual error is not in the errors, but in some warning.
You need to get this actual error; otherwise it's impossible to debug. I'm not really sure that if you log the SSIS warnings you will get this warning, but it's worth checking out.
MCTS, MCITP - Please mark posts as answered where appropriate.
June 15th, 2012 5:34am
Check what out?
When I process in Analysis Services it works. When it processes through the job afterwards, it works.
When it's left for a while and then does the overnight process, it fails.
Where do I get these error messages from?
The errors I sent were from the report that the DBA team set up for me to show what happened during the job.
The more I look into this, the more confusing it gets.
June 15th, 2012 5:42am
If it fails, there are errors, and you need to check them out :)
The error report from the DBA team is the job history for that specific run of the job step.
But I hope you configured logging in the SSIS package, and that errors and warnings are logged to a file or table. If not, enable it.
It's not normal that processing in BIDS works and in SSIS it doesn't. There must be some difference between those two, and we need to find out what.
MCTS, MCITP - Please mark posts as answered where appropriate.
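If changing the package's logging configuration needs the DBA team's involvement, one alternative sketch is to capture dtexec's own console reporting when the package runs. The package and log paths here are hypothetical:

```
:: /Rep EW limits console reporting to errors and warnings; redirecting it to a
:: file preserves the full messages that the job history view truncates.
dtexec /F "C:\Packages\SENProcessing.dtsx" /Rep EW > "C:\Logs\SENProcessing.txt"
```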
June 15th, 2012 5:52am
http://msdn.microsoft.com/en-us/library/ms141212.aspx?ppud=4
http://stackoverflow.com/questions/5772667/how-to-create-an-error-log-or-custom-error-log-within-an-ssis-package
Right, I think I have something to work with there. I will get onto it!
Debbie
June 15th, 2012 5:57am
I've been pulled over to something else this week, but I still have exactly the same problem.
The Analysis Services section of the IS package fails every night; I have to run it manually on Full, which works fine.
My user tried to open a report this morning and got the error:
An error occurred during local report processing.
Query execution failed for data set......
The SEN cube either does not exist or has not been processed.
When I have finished what I'm doing at the moment I can get cracking again and try to get it sorted, but I'm worried that it's going to be beyond my expertise, which is ridiculous, since it should be easy to set up.
June 21st, 2012 6:17am
Hi Debbie--
I'm sorry this problem still persists. I am wondering if any of the processing prior to this task might be making the difference between the failed package and the manual success. I'm assuming that when you run this process from SSMS you are just focused
on processing the cube. What happens in your SSIS package prior to initiating the cube processing? For example, could you be manipulating related attributes, assigning or reassigning parent relationships on the data that you're later working with, perhaps
in parallel?
I know that we've exchanged a few lines of text about possibly running this cube as a standalone job through SQL Agent. If you aren't able to get a new job set up by your DBA crew, perhaps you can alter this SSIS package to add a new SSAS
task that does nothing more than launch the cube... before... any of your other processing begins. If the package doesn't error out on your newly added task, but does error out on the second launch (the one you have now), you'll know that
it isn't the relationship between Integration Services and SSAS per se, but something done in the body of the data processing flow.
June 21st, 2012 7:36pm
Finally got back to having a look at this, thankfully.
Currently I still have to process it myself in Analysis Services, so it's not fit for purpose at all.
It still does the same thing: it errors unless you run it first in Analysis Services.
There isn't anything else happening during the Analysis Services sections; everything should be OK.
There isn't an Analysis Services Processing Task to just launch the cube, so I'm not sure how to do this...
I'm going through all the old posts on this to see if there is anything else I can do in regard to the error logging.
July 16th, 2012 10:39am