The job is failing and I see this error. Does anyone have any ideas why we would be seeing this?
My colleague has fixed this issue in our TEST environment by following the steps listed in this thread:
BR
Alex
Thanks, Alexander. I have started trying to run this PowerShell script; are you able to tell me why I get this error?
Has anyone else got any advice on this? The transform job is still failing and is therefore affecting our ability to obtain reports. I have been following this link to try to resolve the issues explained above, but I received the above errors when running the PowerShell commands.
Have you run this command directly on the Data Warehouse Management Server or from the normal Management Server?
Otherwise, try running Get-SCDWWatermark with the -ComputerName switch like this:
Get-SCDWWatermark -ComputerName "DWHMS.contoso.com"
where DWHMS.contoso.com is the FQDN of the Data Warehouse server in your environment.
By the way: all SCSM services are started and running properly, I suppose?
Cheers
Alex
- Edited by alexander.markel Tuesday, May 28, 2013 2:07 PM
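If it helps, here is one quick way to check the relevant services from PowerShell. The service names below (OMSDK, OMCFG, HealthService) are the usual internal names for the System Center Data Access, Management Configuration, and Management services, but that is an assumption on my side; verify them against services.msc in your own environment:

```powershell
# Check the core System Center services. The service names are the usual
# ones for SCSM 2012 but may differ; confirm them in services.msc first.
Get-Service -Name 'OMSDK', 'OMCFG', 'HealthService' -ErrorAction SilentlyContinue |
    Format-Table Name, DisplayName, Status -AutoSize
```

Any service not shown as Running is worth restarting before digging further into the job itself.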
Thanks for your help Alex, I am getting somewhere now.
So I guess the next step is to run Set-SCDWWatermark. In my instance, I guess it should be:
Set-SCDWWatermark -ComputerName GBPRDSCSM02 -EntityName Relationship -WaterMarkValue 18/5/2013
Before I run this I would greatly appreciate it if you could confirm that you believe this to be correct.
Hi,
this would be partially correct.
I suppose the command above would fail, saying that 18/05/2013 is not a valid DateTime...
Therefore I recommend doing the following:
$CustomDate = Get-Date 18.05.2013 -Hour 7 -Minute 10
Set-SCDWWatermark -ComputerName GBPRDSCSM02 -EntityName Relationship -WaterMarkValue $CustomDate
This ensures that the value you're passing to the -WaterMarkValue parameter is indeed a DateTime.
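If the regional settings on the server parse dates differently, a culture-independent variant (just a sketch; the format string assumes day/month/year input) would be:

```powershell
# Parse the date with an explicit format so the result does not depend
# on the server's regional settings.
$CustomDate = [datetime]::ParseExact(
    '18/05/2013 07:10',
    'dd/MM/yyyy HH:mm',
    [System.Globalization.CultureInfo]::InvariantCulture)

Set-SCDWWatermark -ComputerName GBPRDSCSM02 -EntityName Relationship -WaterMarkValue $CustomDate
```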
BR
Alex
Thank you. I followed that command and the job seemed to move on, but it still won't complete and is showing as failed with a start time of 18/05/2013 00:00:00.
This is what I am seeing in the jobs now; anything you can recommend on this, Alex? The TransformEntityRelatesToEntityFact job seems to have completed; it's the jobs after that that seem to be the problem now.
What I can see from that picture is that those jobs have a lower batch ID than the first job (TransformIncidentDim with ID 6390). And although the TransformEntityRelatesToEntityFact job says it has completed successfully, it still shows error messages.
Please try to sort the view by the batch ID and see if the jobs with the highest batch IDs are still failing.
If I remember correctly, we had to wait quite a while until the Transform job became "alive" again. What's the general status of the entire Transform job?
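To sort the modules by batch ID from PowerShell rather than in the console view, something like the following should work. Note that the exact output property names (Name, BatchId, Status) are assumptions on my part; pipe the output through Get-Member to confirm them in your version:

```powershell
# List the modules of the Transform.Common job ordered by batch ID.
# Property names are assumed; verify with:
#   Get-SCDWJobModule -JobName 'Transform.Common' | Get-Member
Get-SCDWJobModule -JobName 'Transform.Common' -ComputerName 'DWHMS.contoso.com' |
    Sort-Object BatchId |
    Format-Table Name, BatchId, Status -AutoSize
```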
Everything with an ID higher than 5656 completed. Basically, it looks fairly intermittent whether the jobs from ID 2897 to 1946 completed or failed. The job is in a failed status and shows a last run date of 18/05/2013 00:00:00 and a next run of 18/05/2013 00:30:00.
It is set to run every 30 minutes; it starts running every 30 minutes and then goes back to failed. Even though it goes to running every 30 minutes, the run date and next run date stay as 18/05/2013.
Okay, this seems to be a subsequent problem... Do you see anything in the Operations Manager log on the Data Warehouse Management Server while the transform job is running?
Another option would be to try running the ETL PowerShell script Travis Wright once wrote (http://gallery.technet.microsoft.com/PowerShell-Script-to-Run-a4a2081c/view/Discussions). This script ensures that all jobs run in the correct order. But if you see error messages in the logs, I guess those have to be sorted out first before the script will complete successfully.
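To pull the most recent Data Warehouse errors from that log without clicking through Event Viewer, something along these lines should work when run on the DW management server (the provider name "Data Warehouse" matches the Source field of the events quoted later in this thread):

```powershell
# Show the 20 most recent error-level events from the Data Warehouse
# source in the Operations Manager log (Level 2 = Error).
Get-WinEvent -FilterHashtable @{
    LogName      = 'Operations Manager'
    ProviderName = 'Data Warehouse'
    Level        = 2
} -MaxEvents 20 | Format-Table TimeCreated, Id, Message -Wrap
```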
Cheers
Alex
Hello, I have the same issue. I ran the ETL script from Travis Wright and it also failed.
PS C:\ETL> .\Run-ETL.ps1
Starting the Extract_SCSM Job...
Waiting 30 seconds
Extract_SCSM Job Status: Running
Waiting 30 seconds
Extract_SCSM Job Status: Running
Waiting 30 seconds
Extract_SCSM Job Status: Running
Waiting 30 seconds
Extract_SCSM Job Status: Not Started
Starting the Extract_DW_SCSM Job...
Waiting 30 seconds
Extract_DW_SCSM Job Status: Running
Waiting 30 seconds
Extract_DW_SCSM Job Status: Running
Waiting 30 seconds
Extract_DW_SCSM Job Status: Not Started
Exiting since the job is in an unexpected status
The error given is: "Cannot find either column "etl" or the user-defined function or aggregate "ETL.CanContinueExecution", or the name is ambiguous."
Hi,
Has anyone got a solution for this issue? I am getting the same issue. I am using SCSM 2012 SP1 for FIM 2010 reporting. I have provided the error for your reference.
I am not getting any data in my FIM reports. The Transform.Common job fails every time. The modules that are failing are "TransformEntityRelatesToEntityFact" and "TransformEntityManagedTypeFact".
Log Name: Operations Manager
Source: Data Warehouse
Date: 1/6/2014 10:05:17 AM
Event ID: 33502
Task Category: None
Level: Error
Keywords: Classic
User: N/A
Computer: ******
Description:
ETL Module Execution failed:
ETL process type: Transform
Batch ID: 19151
Module name: TransformEntityRelatesToEntityFact
Message: Cannot find either column "ETL" or the user-defined function or aggregate "ETL.CanContinueExecution", or the name is ambiguous.
Stack: at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
at System.Data.SqlClient.SqlDataReader.ConsumeMetaData()
at System.Data.SqlClient.SqlDataReader.get_MetaData()
at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)
at System.Data.SqlClient.SqlCommand.ExecuteReader()
at Microsoft.SystemCenter.Warehouse.Utility.SqlHelper.ExecuteReader(SqlConnection sqlCon, CommandType cmdType, String cmdText, SqlParameter[] parameters)
at Microsoft.SystemCenter.Warehouse.Etl.StoredProcedure.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser)
at Microsoft.SystemCenter.Warehouse.Etl.TransformModule.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser, Int32 batchSize)
at Microsoft.SystemCenter.Warehouse.Etl.TransformModule.Execute(IXPathNavigable config, Watermark wm, DomainUser sourceConnectionUser, DomainUser destinationConnectionUser)
at Microsoft.SystemCenter.Etl.ETLModule.OnDataItem(DataItemBase dataItem, DataItemAcknowledgementCallback acknowledgedCallback, Object acknowledgedState, DataItemProcessingCompleteCallback completionCallback, Object completionState)
The following query resolved my problem. Please use at your own risk.
USE [DWRepository]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
CREATE FUNCTION [etl].[CanContinueExecution] (
    @transformName VARCHAR(256),
    @executionStartTime DATETIME
)
RETURNS BIT
AS BEGIN
    DECLARE @canContinue BIT = 1,
            @executionTimeLimit INT = 30 -- minutes

    -- do not continue if the time we have consumed is
    -- more than or equal to the amount of time that we are
    -- allowed to take for this batch
    IF (ABS(DATEDIFF(mi, @executionStartTime, GETUTCDATE())) >= @executionTimeLimit)
    BEGIN
        SET @canContinue = 0
    END

    RETURN @canContinue;
END
GO
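As a sanity check after running the script above, you can confirm from PowerShell that the function now exists in DWRepository. This assumes the SQL Server PowerShell module providing Invoke-Sqlcmd is installed, and "DWSQL" is a placeholder you would replace with your own DW database server instance:

```powershell
# Returns a non-NULL FunctionId once etl.CanContinueExecution exists.
# 'DWSQL' is a placeholder; substitute your DW database server instance.
Invoke-Sqlcmd -ServerInstance 'DWSQL' -Database 'DWRepository' `
    -Query "SELECT OBJECT_ID('etl.CanContinueExecution', 'FN') AS FunctionId"
```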
- Proposed as answer by Enayathulla Tuesday, January 07, 2014 7:41 AM
Thank you, Dimitar; it sorted out my problem as well (which was identical to Enayathulla's).
Carol