Get-md5hash not working on remote server

You are not passing anything into the function.  The function is not aware of the pipeline.

Being fancy here is just getting in the way.  Try this:

$blok = {
    $md5 = New-Object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
    Get-ChildItem f:\data_protector_7_0 -Recurse -Force | ?{ !$_.PSIsContainer } |
        Select-Object FullName,
            @{
               N='Hash';
               E={
                 # ReadAllBytes wants a path string, not a FileInfo object
                 [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($_.FullName)))
               }
            }
}
February 16th, 2015 9:18am

Hi,
the code isn't working. I get no results at all, running on the local server.
I used the hash function before without problems (just locally) and it works fine. Passing the commands through Invoke-Command or via a PSSession is the only issue, and it fails only on huge files.

I used the function before, as you can see here:
https://social.technet.microsoft.com/Forums/windowsserver/en-US/e0fc5b7e-c182-4212-85ce-17ddff70c2cd/delete-duplicate-files-using-md5-and-group-object?forum=winserverpowershell

PS C:\>  $md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
PS C:\>     get-childitem f:\data_up\UICT\data_protector_7_0 -Recurse -force |?{!$_.Psiscontainer}|
>>          select fullname ,
>>                 @{
>>                    N='Hash';
>>                    E={
>>                      [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($_)))
>>                    }
>>                 }
>>

FullName                                                    Hash
--------                                                    ----
F:\data_up\UICT\data_protector_7_0\DPWINBDL_00703.EXE
F:\data_up\UICT\data_protector_7_0\ESD_HP_DP_7.00_for_Wi...


PS C:\>
PS C:\> function get-hash
>> {
>>  $md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
>>  return $hash = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($_.fullname)))
>> }
>> get-childitem f:\data_up\UICT\data_protector_7_0 -Recurse -force |?{!$_.Psiscontainer}| select fullname ,@{n="Hash";e={$_ |get-hash}}
>>

FullName                                                    Hash
--------                                                    ----
F:\data_up\UICT\data_protector_7_0\DPWINBDL_00703.EXE       1E-3F-98-31-DD-9B-32-EE-96-53-A9-32-CF-9D-9B-91
F:\data_up\UICT\data_protector_7_0\ESD_HP_DP_7.00_for_Wi... 00-38-7B-7C-9D-A1-2C-43-5F-AB-31-6E-1A-9C-5D-8B


PS C:\>

I've searched through my repository and found the same function with a little tweak.

$someFilePath = "F:\DATA_UP\UICT\data_protector_7_0\ESD_HP_DP_7.00_for_Windows_TD586-15003.01.zip"
$md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
$hash = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($someFilePath)))
$hash

It works locally; remotely it throws an error:

Exception calling "ReadAllBytes" with "1" argument(s): "Exception of type 'System.OutOfMemoryException' was thrown."
At line:3 char:88
+ $hash = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes <<<< ($someFilePath)))
    + CategoryInfo          : NotSpecified: (:) [], MethodInvocationException
    + FullyQualifiedErrorId : DotNetMethodException

Now the question is how to check the validity of these big files... :( I had the idea of comparing MD5 hashes, but it looks like I couldn't get it to work properly.
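For what it's worth, `ReadAllBytes` loads the entire 1.5 GB file into memory before hashing; passing an open stream to `ComputeHash` instead keeps memory use small regardless of file size. A minimal sketch (the path below is a placeholder):

```powershell
# Sketch: hash a file by streaming it through the MD5 provider, so only
# a small internal buffer is held in memory (the path is a placeholder).
$md5    = [System.Security.Cryptography.MD5]::Create()
$stream = [System.IO.File]::OpenRead('F:\some\big\file.zip')
try {
    $hash = [System.BitConverter]::ToString($md5.ComputeHash($stream))
}
finally {
    $stream.Close()
}
$hash
```

This may still run into the remoting memory quota for other reasons, but it at least avoids allocating a byte array the size of the file.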

February 16th, 2015 9:49am

Before, you had errors in the script; now you are getting the true error, which is 'System.OutOfMemoryException'.

That cannot be fixed with a script. It can only be fixed by increasing the amount of memory in the remote session.  You will need to customize the endpoint.

February 16th, 2015 9:53am

Well, 16 GB on the remote server is quite OK for a file server.

And yet the script was working fine (no obvious errors in the script, because it works for smaller files... please).

Why should I increase RAM on the remote server, when I can run this script directly on the remote server but cannot do it via a PSSession? Does remoting have increased memory requirements or what?


February 16th, 2015 10:04am

The endpoint restricts how much memory each remote shell may use (the MaxMemoryPerShellMB quota, whose default is low). All remoting is restricted by default to reduce impact. What you are doing can have a large impact on a production server, so WinRM throttles you to keep the system safe.
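For reference, the per-shell quota can be inspected and, if policy allows, raised through the WSMan: drive on the remote server (run elevated; 2048 MB is just an example value):

```powershell
# Inspect the current per-shell memory quota (in MB)
Get-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB

# Raise it if your policy allows (requires elevation); 2048 MB is an example
Set-Item WSMan:\localhost\Shell\MaxMemoryPerShellMB 2048
Restart-Service WinRM
```

This is the same setting the winrm.cmd command line exposes as winrm/config/winrs MaxMemoryPerShellMB.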

February 16th, 2015 10:07am

So, any other idea how to distinguish the bad file? The length is the same...
February 16th, 2015 11:42am

I am sorry, but I don't think you understand what "out of memory" means. It means that you cannot do this with any large files on the remote system without reconfiguring the remoting endpoint. It is not a bad file; it is an out-of-memory condition.

February 16th, 2015 12:18pm

Hello,
I'm not able to get an MD5 hash of big files (1.5 GB) via a scriptblock.

$blok = {

function get-hash
{
 $md5 = new-object -TypeName System.Security.Cryptography.MD5CryptoServiceProvider
 return $hash = [System.BitConverter]::ToString($md5.ComputeHash([System.IO.File]::ReadAllBytes($_.fullname)))
}
get-childitem f:\data_protector_7_0 -Recurse -force |?{!$_.Psiscontainer}| select fullname ,@{n="Hash";e={$_ |get-hash}}
}

I receive only FullName; the Hash property is empty.

FullName : F:\data_protector_7_0\ESD_HP_DP_7.00_for_Windows_TD586-15003.01.zip
Hash     :

When I run the code locally on the server, it works fine.

The code itself works remotely; I can get hashes of other files with no problem:
G:\log\110829081433share69974.txt          46-1D-E2-6F-28-E5-F1-BC-F3-96-11-5F-7B-EC-A6-B9
G:\log\110829081433vysledekGroup69974.csv  C5-A8-E8-50-5A-72-88-98-97-43-B7-F2-E8-77-3A-71

I only have an issue with these big files. It's like getting an OutOfMemoryException without any error.

The same issue occurs when I run the code via Enter-PSSession ... still not working on these big files.

I need to check the validity of some ZIP files.

Server1 (good):
Name   : ESD_HP_DP_7.00_for_Windows_TD586-15003.01.zip
Length : 1692480938
Hash   : 00-38-7B-7C-9D-A1-2C-43-5F-AB-31-6E-1A-9C-5D-8B

Server2 (bad):
Name   : ESD_HP_DP_7.00_for_Windows_TD586-15003.01.zip
Length : 1692480938
Hash   : 96-9B-B3-37-3F-59-D2-49-19-D8-CB-25-97-D1-1E-80




February 16th, 2015 1:37pm

OMG.. I'm not as stupid as you think.

1. The ZIP file on server1 cannot be opened, because it's corrupted.
2. The ZIP file on server2 is OK; that's why they have different MD5s, as I posted above (before you tell me how stupid I am, read my posts and try to understand them - not everybody speaks fluent English, but I try my best).
3. Both files have the same length (allocated space), so I cannot easily say which one can be unzipped and which one throws an error. That's why I wanted to check MD5.
4. I understand that it's a memory problem (given the scenario) and I can't run MD5 checking on large files remotely (that's why I marked your post as helpful).
5. In my last post I asked for another solution I could use to identify the corrupted file.

Yes, one of them is corrupted. Copying large files over a low-bandwidth WLAN with some network devices in the route can sometimes be tricky and have unexpected results. Imagine: I've copied this 1.5 GB install file to 70 servers, and 12 .zip files cannot be opened - I need to delete them and rerun the distribution (using robocopy).

February 16th, 2015 1:52pm

There is no way to uncorrupt a file remotely with a script. Most ZIP utilities can validate an archive without unzipping it; check your ZIP vendor's instructions.

An MD5 would work, assuming there is enough allocated memory, which there obviously is not. Again - not a scripting issue.
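If the servers have .NET 4.5, the built-in System.IO.Compression classes can do a rough integrity check without any third-party tool: opening the archive and reading every entry to the end will throw on most kinds of corruption. A sketch (the function name and path are made up for illustration):

```powershell
# Sketch: detect a corrupted zip by reading every entry through .NET
# (requires .NET 4.5; the function name and path are placeholders).
Add-Type -AssemblyName System.IO.Compression.FileSystem

function Test-ZipFile([string]$path) {
    try {
        $zip = [System.IO.Compression.ZipFile]::OpenRead($path)
        try {
            foreach ($entry in $zip.Entries) {
                $s = $entry.Open()
                try {
                    $buffer = New-Object byte[] 81920
                    # read each entry to the end; corruption throws here
                    while ($s.Read($buffer, 0, $buffer.Length) -gt 0) { }
                }
                finally { $s.Close() }
            }
        }
        finally { $zip.Dispose() }
        $true
    }
    catch { $false }
}

Test-ZipFile 'F:\some\file.zip'
```

This reads the whole file but never holds more than one buffer in memory, so it should not trip the remoting quota the way ReadAllBytes does.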

February 16th, 2015 3:06pm

Your suggestion requires a 3rd-party utility (a zip utility), which is really a no-go. I cannot install 3rd-party utilities on production servers; it's company policy.

I don't need to open the corrupted zip file, I just want to know IF it is corrupted. MD5 checking was the "easy" way to go...

If remoting has some restrictions, I assume I should go this way: run the script via schtasks and obtain the required info.

It wasn't a scripting issue; I just didn't know why it didn't work. You have good posts, but sometimes you offer pears when apples were required.

The script works just fine on the local server; checking MD5 takes about 3 seconds (not that much, IMO). Only calling that script remotely fails (as you think [know], it's some kind of restriction).
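The schtasks route mentioned above could look roughly like this: the task runs the hash script locally on the file server, outside the WinRM quota, and writes its results somewhere collectable. Server name, task name, script path, and output path are all placeholders:

```powershell
# Sketch: run the hash check locally on the remote server via a scheduled
# task (all names and paths below are placeholders).
schtasks /create /s SERVER01 /ru SYSTEM /tn "HashCheck" /sc once /st 23:00 `
    /tr "powershell.exe -NoProfile -File C:\scripts\get-hashes.ps1"
schtasks /run /s SERVER01 /tn "HashCheck"
# get-hashes.ps1 would export its results to a share you can read afterwards,
# e.g.  ... | Export-Csv \\SERVER01\d$\hashes.csv -NoTypeInformation
```

The script then runs in a normal local session on the server, with no remoting memory limits.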

February 16th, 2015 3:39pm

I am sorry if you don't understand what "not enough memory" means. You need to increase the endpoint memory limits; just look up how to configure PS remoting to see how to do it. If you cannot do that, then you will have to devise other methods.

You asked why. I explained. Why is that a problem?

February 16th, 2015 4:32pm


winrm get winrm/config/winrs
Winrs
    MaxMemoryPerShellMB = 150

winrm set winrm/config/winrs `@`{MaxMemoryPerShellMB=`"2048`"`}
Winrs
    MaxMemoryPerShellMB = 2048

Script is working fine now. Thx Jrv


  • Marked as answer by Mekac Tuesday, February 17, 2015 7:16 AM
February 17th, 2015 2:22am


This topic is archived. No further replies will be accepted.
