Quick, easy way to see if data in SQL Server has changed

I have an automated test environment with hundreds of test scripts that run against a build.  Some may change data, some may not.  Currently, between each test I pull down the backup and restore the database.  If no data in the db has changed, I really don't need to restore the db.  I am looking for a quick way to check that (via script) and, if data has changed, do a restore; if not, proceed to the next test.

June 24th, 2015 5:41pm

Hello!

If any data has changed, you will get a different result from this query:

SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) as result FROM Table1;

declare @table table
(
    id int,
    name varchar(20)
)

insert into @table
values
(1, 'Josh'),
(2, 'Brian')

-- Baseline checksum for the table variable
SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) as result FROM @table

-- Updating a value produces a different checksum
update @table
set name = 'Bob'
where id = 1

SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) as result FROM @table

-- Deleting a row changes it again
delete from @table where id = 1

SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) as result FROM @table


June 24th, 2015 6:13pm

I do not think there is any easy way to check this unless the source system has something like change tracking enabled, so that you can check whether there is any version difference between runs (a sketch of that approach follows).
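For example, a minimal sketch of the change tracking approach, assuming you can enable it on the test database (the database and table names here are placeholders, and each tracked table needs a primary key):

-- Enable change tracking on the test database and on the tables of interest
ALTER DATABASE TestDb
    SET CHANGE_TRACKING = ON (CHANGE_RETENTION = 2 DAYS, AUTO_CLEANUP = ON)

ALTER TABLE dbo.Customers ENABLE CHANGE_TRACKING

-- Record the database-wide version before the test ...
DECLARE @before BIGINT = CHANGE_TRACKING_CURRENT_VERSION()

-- ... run the test here ...

-- ... and compare afterwards: the version only moves when a tracked table is modified
IF CHANGE_TRACKING_CURRENT_VERSION() <> @before
    PRINT 'Data changed - restore the backup'
ELSE
    PRINT 'No tracked changes - skip the restore'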

You can use data compare tools to find this, but I do not think you need to (depending on your requirements).

A couple of ways to get around it:

1. If you have some kind of audit log table, you can query that table and see if there are any new/modified dates (see the sketch after this list).

2. If you take log backups, there is an unsupported way to read the log records and track events such as update, delete, exec, insert, and drop. Since you can consider all of these as changes, you would then restore the database. I do not suggest reading the actual log file, because it is unsupported.

3. You can also use schema compare in SSDT (SQL Server Data Tools). I have not used it for this, but I believe it should work.
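For option 1, a minimal sketch, assuming a hypothetical audit table dbo.AuditLog with a ModifiedDate column:

-- Note the time before the test starts
DECLARE @testStart DATETIME2 = SYSDATETIME()

-- ... run the test here ...

-- If any audit rows were written while the test ran, data was changed
IF EXISTS (SELECT 1 FROM dbo.AuditLog WHERE ModifiedDate >= @testStart)
    PRINT 'Data changed - restore the backup'
ELSE
    PRINT 'No audit entries - skip the restore'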

But isn't it always good to restore the backups? That way you are actually testing your backups as well.

June 24th, 2015 6:22pm

Possibly you could record the value of

   SELECT MAX([Current LSN]) FROM sys.fn_dblog(NULL, NULL)

and if this value is the same after the tests, no updates have been performed. However, I suspect that it could change due to background operations, even if your tests do not perform any updates, so it may not be reliable.
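For example, a minimal sketch of the record-and-compare approach (with the caveat above that background activity may also move the LSN):

-- Record the highest LSN in the log before the test
DECLARE @lsnBefore NVARCHAR(50)
SELECT @lsnBefore = MAX([Current LSN]) FROM sys.fn_dblog(NULL, NULL)

-- ... run the test here ...

-- Compare with the highest LSN after the test
DECLARE @lsnAfter NVARCHAR(50)
SELECT @lsnAfter = MAX([Current LSN]) FROM sys.fn_dblog(NULL, NULL)

IF @lsnAfter <> @lsnBefore
    PRINT 'New log records - restore the backup'
ELSE
    PRINT 'No new log records - skip the restore'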

June 24th, 2015 10:05pm

In my recent research, I found a very informative article that comes close to what you are asking.

It lets you track every login change, operation performed, and change made, using a dedicated agent, by server, database, objects, operations, and users: http://sqlserverauditing.blogspot.in/2014/02/track-all-critical-and-granular-changes.html

June 25th, 2015 6:54am

You can record, at each backup, a checksum of your database by summing the checksum of each table: SUM(tblCheckSum).

Before restoring, compare the recorded backup checksum with the new one. If they are equal, you don't need to restore.

-- Compute a per-table checksum for every table and sum them into one database-wide value
CREATE TABLE DBCheckSum (tblCheckSum BIGINT)
EXEC sp_MSforeachtable 'INSERT INTO DBCheckSum SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM ?'
SELECT SUM(tblCheckSum) FROM DBCheckSum
DROP TABLE DBCheckSum
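A minimal sketch of the compare step, assuming the total computed at backup time was saved somewhere (dbo.BackupCheckSum is a hypothetical bookkeeping table):

-- Recompute the current database-wide total
CREATE TABLE #DBCheckSum (tblCheckSum BIGINT)
EXEC sp_MSforeachtable 'INSERT INTO #DBCheckSum SELECT CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM ?'

DECLARE @currentTotal BIGINT
SELECT @currentTotal = SUM(tblCheckSum) FROM #DBCheckSum
DROP TABLE #DBCheckSum

-- @savedTotal is the value recorded when the backup was taken
DECLARE @savedTotal BIGINT
SELECT @savedTotal = SavedTotal FROM dbo.BackupCheckSum

IF @currentTotal IS NULL OR @savedTotal IS NULL OR @currentTotal <> @savedTotal
    PRINT 'Checksums differ - restore the backup'
ELSE
    PRINT 'Checksums match - skip the restore'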

Hope this helps!

June 25th, 2015 7:48am

@Berimi, I found a query quite similar to yours that I was playing with, but it does not seem to work.  Here is my sample script.  I capture a before and after checksum for each table.  In between, I have a placeholder select that stands in for running the tests.  No data is changed in the db, but the checksums are different.

--Before

CREATE  TABLE #DBCheckSum (tbl nvarchar(100), before_chksm int, after_chksm int)
EXEC SP_MSFOREACHTABLE 'INSERT INTO #DBCheckSum (tbl, before_chksm) SELECT "?", CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM ?'
SELECT * from #DBCheckSum
SELECT SUM(before_chksm) as before_totalchksm from #DBCheckSum

--Run Tests With No Data Change
Select 'Run Test With No Data Changed...' as Test

--After

EXEC sp_MSforeachtable @command1 = 'Update #DBCheckSum set after_chksm = (select CHECKSUM_AGG(BINARY_CHECKSUM(*)) as hs from ? where tbl=CAST("?" as nvarchar(100)))'
SELECT * from #DBCheckSum
SELECT SUM(before_chksm) as before_totalchksm, SUM(after_chksm)  as after_totalchksm from #DBCheckSum

DROP TABLE #DBCheckSum




June 25th, 2015 8:47am

I executed your script.

I've noticed that the column "after_chksm" is only updated for the last table. That's why the results are different.

If you replace the update with an insert, you will get the same results (see the sketch below).

Don't forget to replace INT with BIGINT, so the summed checksums don't overflow.
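For example, the insert-based version might look like this (a sketch only, not tested against your environment):

CREATE TABLE #DBCheckSum (tbl NVARCHAR(200), phase VARCHAR(10), chksm BIGINT)

--Before
EXEC sp_MSforeachtable 'INSERT INTO #DBCheckSum SELECT ''?'', ''before'', CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM ?'

--Run Tests With No Data Change
SELECT 'Run Test With No Data Changed...' as Test

--After: insert a second row per table instead of updating
EXEC sp_MSforeachtable 'INSERT INTO #DBCheckSum SELECT ''?'', ''after'', CHECKSUM_AGG(BINARY_CHECKSUM(*)) FROM ?'

--Compare the before and after totals
SELECT SUM(CASE WHEN phase = 'before' THEN chksm END) as before_totalchksm,
       SUM(CASE WHEN phase = 'after'  THEN chksm END) as after_totalchksm
FROM #DBCheckSum

DROP TABLE #DBCheckSum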

Hope it helps!

June 25th, 2015 10:02am

You can record, at each backup, a checksum of your database by summing the checksum of each table: SUM(tblCheckSum).

But that will only tell you with certainty that the database has changed if they are different. If they are the same, you don't know. The checksum algorithm is not particularly sophisticated, so different values could easily have the same checksum.

And since it requires scanning all tables, it seems easier to restore the backup anyway...

June 25th, 2015 6:18pm

I will agree with Erland if it's about one table or only a few tables. But if the database contains many tables, the probability that the sums of the per-table checksums come out equal is very, very small.
June 25th, 2015 6:51pm
