I'm hoping someone out there knows this error or can help me find the cause.
We have a SQL Server 2008 relational database as the source for a SQL Server 2008 Analysis Services cube. The cube is processed by a SQL Server Agent job using the following XMLA command:
<Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Parallel>
    <Process xmlns:xsd="http://www.w3.org/2001/XMLSchema"
             xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
             xmlns:ddl2="http://schemas.microsoft.com/analysisservices/2003/engine/2"
             xmlns:ddl2_2="http://schemas.microsoft.com/analysisservices/2003/engine/2/2"
             xmlns:ddl100_100="http://schemas.microsoft.com/analysisservices/2008/engine/100/100">
      <Object>
        <DatabaseID>MyDbName</DatabaseID>
      </Object>
      <Type>ProcessFull</Type>
      <WriteBackTableCreation>UseExisting</WriteBackTableCreation>
    </Process>
  </Parallel>
</Batch>
Our environment is:
Windows Server 2008 SP2, x64
MS SQL Server 2008 Enterprise SP1
VMware virtual machine
Processor: AMD Opteron 8381 HE (Quad-Core, 2.5 GHz)
Memory: 8 GB
The problem is: sometimes (not always) the processing job step hangs until we restart the Analysis Services service.
The CPU is not busy, there are no error entries in the Windows event log, no entries in msmdsrv.log (the Analysis Services log), and no activity on Analysis Services traced by Profiler…
And to be honest, I have no further idea where I could trace the cause of the error.
What we have already tried:
- Changed the XMLA script to sequential processing
- Limited the Analysis Services memory usage (LowMemoryLimit = 30, TotalMemoryLimit = 60)
- Limited the SQL Server memory usage to a maximum of 70%
- Set CommitTimeout to 10 minutes (hoping the processing would be aborted after hanging for 10 minutes)
My feeling is that the processing itself completes, and that it hangs while committing the changes (before the new cube files are put into use). In the cube file folder, all files have been refreshed; but if I try to access the cube using Excel, the cube is not yet refreshed.
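In case it helps anyone diagnose this: next time it hangs, I could query the SSAS 2008 dynamic management views from an MDX query window in Management Studio to see whether the commit is waiting on a lock. This is only a sketch based on the documented $SYSTEM schema rowsets, not something I have captured during a hang yet:

-- Sessions currently connected to the Analysis Services instance
SELECT * FROM $SYSTEM.DISCOVER_SESSIONS

-- Commands currently executing (the hanging ProcessFull should show up here)
SELECT * FROM $SYSTEM.DISCOVER_COMMANDS

-- Locks held or requested; a commit blocked by another session should be visible here
SELECT * FROM $SYSTEM.DISCOVER_LOCKS

If DISCOVER_LOCKS shows the processing transaction waiting on a lock held by another session (for example a long-running query), that would support the theory that the hang happens in the commit phase.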
So, maybe someone knows this issue or could help me find the source.
Thank you in advance for your help
Anatoli.