Help Needed in Reading Lines

Hi,

I am trying to read lines from a text file and insert them into a SQL Server table. I am facing a tricky issue: the code reads the file, but inserts NULL values.

Table structure:

CREATE TABLE Sample (Id INT PRIMARY KEY IDENTITY(1,1), Name VARCHAR(100), Age INT)

Sample code that loops through the file:

$datatable = New-Object System.Data.DataTable
$datatable.Columns.Add("Name", [string])
$datatable.Columns.Add("Age", [int])

# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {

    $data = $line.Split("|")
    $row = $datatable.NewRow()
    $row.Item("Name") = $data[0]
    $row.Item("Age") = [int]$data[1]
    $datatable.Rows.Add($row)

    # Once you reach your batch size, write to the db,
    # then clear the datatable from memory
    $i++; if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($datatable)
        Write-Output "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $datatable.Clear()
    }
}

Sample data:

Name1|10
Name2|20
Name3|30
Output from the run:

Script started...
Property 'EnableStreaming' cannot be found on this object; make sure it exists and is settable.
At E:\Sample\PowerShell\Test.PS1:4
+ $bulkcopy. <<<< EnableStreaming = 1
    + CategoryInfo          : InvalidOperation
    + FullyQualifiedErrorId : PropertyAssignmentException



Dumping the DataTable columns shows:

AllowDBNull        : True
AutoIncrement      : False
AutoIncrementSeed  : 0
AutoIncrementStep  : 1
Caption            : Name
ColumnName         : Name
Prefix             :
DataType           : System.String
DateTimeMode       : UnspecifiedLocal
DefaultValue       :
Expression         :
ExtendedProperties : {}
MaxLength          : -1
Namespace          :
Ordinal            : 0
ReadOnly           : False
Table              : {}
Unique             : False
ColumnMapping      : Element
Site               :
Container          :
DesignMode         : False

AllowDBNull        : True
AutoIncrement      : False
AutoIncrementSeed  : 0
AutoIncrementStep  : 1
Caption            : Age
ColumnName         : Age
Prefix             :
DataType           : System.Int32
DateTimeMode       : UnspecifiedLocal
DefaultValue       :
Expression         :
ExtendedProperties : {}
MaxLength          : -1
Namespace          :
Ordinal            : 1
ReadOnly           : False
Table              : {}
Unique             : False
ColumnMapping      : Element
Site               :
Container          :
DesignMode         : False

The age values are going into the Name column, and the Age column ends up NULL. I am not sure why it skips the Name column. Any suggestions, please?

September 5th, 2015 2:01am

While you can use PowerShell to read a text file and import it into a SQL Server table, it is not the recommended approach if you already know the format of the files. SQL Server Integration Services (SSIS) is the recommended tool for this.

But in case you still want to use PowerShell, refer to this article for a different approach:

https://www.mssqltips.com/sqlservertip/3208/automating-flat-file-sql-server-imports-with-powershell/

September 5th, 2015 2:30am

Your code does not match the error message.

Here is a better way to bulk load a table.

$csv=Import-Csv somefile.txt -Delimiter '|' -Header Name, Age

Now we have an object.

$dt=Out-DataTable $csv

Now we have a bulk loadable data table.

See this for some assistance: https://gallery.technet.microsoft.com/scriptcenter/4208a159-a52e-4b99-83d4-8048468d29dd
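
In case the gallery script is unavailable, here is a minimal sketch of what an Out-DataTable-style function does. This is an assumption about its shape, not the gallery code itself, and it types every column as string (the gallery version also infers column types):

function Out-DataTable {
    param([Parameter(ValueFromPipeline=$true)] $InputObject)
    begin { $dt = New-Object System.Data.DataTable }
    process {
        foreach ($obj in @($InputObject)) {
            # Create the columns from the first object's properties
            if ($dt.Columns.Count -eq 0) {
                foreach ($p in $obj.PSObject.Properties) {
                    [void]$dt.Columns.Add($p.Name, [string])
                }
            }
            $row = $dt.NewRow()
            foreach ($p in $obj.PSObject.Properties) {
                $row[$p.Name] = [string]$p.Value
            }
            $dt.Rows.Add($row)
        }
    }
    end { ,$dt }   # the comma keeps the DataTable from unrolling into rows
}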

Here is a sample driver:

$csvfile = '.\MyCsv.csv'
$sqlTable = 'MyDataTable'
$DataSource = 'MyServer\MyInstance'
$DataBase = 'db1'

$ConnectionString = 'Data Source={0}; Database={1}; Trusted_Connection=True;' -f $DataSource, $DataBase
$csv = Import-Csv $csvfile -Delimiter '|' -Header Name, Age
$csvDataTable = $csv | Out-DataTable
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($ConnectionString)
$bulkCopy.DestinationTableName = $sqlTable
$bulkCopy.WriteToServer($csvDataTable)

You can add options to the bulk copy, or you can custom map the columns. Note that when no mappings are specified, SqlBulkCopy pairs columns by ordinal position, not by name; with your Sample table that pairs source column 0 (Name) with destination column 0 (Id), which is exactly the shifted-data symptom you described.
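
For example, a minimal sketch of explicit mappings against the Sample table from the original post (reusing $ConnectionString and $csvDataTable from the driver above):

# Without mappings, SqlBulkCopy pairs columns by ordinal, so source
# column 0 (Name) would be loaded into destination column 0 (Id).
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($ConnectionString)
$bulkCopy.DestinationTableName = 'Sample'
[void]$bulkCopy.ColumnMappings.Add('Name', 'Name')   # source -> destination
[void]$bulkCopy.ColumnMappings.Add('Age', 'Age')
$bulkCopy.WriteToServer($csvDataTable)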

September 5th, 2015 3:30am

Hello Guys,

Thanks for the input; I am experimenting with PowerShell. Below is the full code I tried. I need to hash the Name value, so I am using SHA256Managed. I wrote a function to do the hashing, but it seems it is not hashing and NULL values are inserted. Please let me know if I am doing the hashing wrong.

Also, the process became slow after I added the hashing. Without hashing it took less than half a minute to insert 500,000 records, but with the hash it took around 5 minutes. What is the best way to speed this up?

 
# Database variables 
$sqlserver = "Server" 
$database = "Sample" 
$table = "Test"
$UserName = 'sa'
$Password = 'Test'
 
# CSV variables;  
$csvfile = "D:\PowerShell\Data.txt" 
$csvdelimiter = "|" 
$firstrowcolumnnames = $false 

function Hash($textToHash)
{
    $hasher = new-object System.Security.Cryptography.SHA256Managed
    $toHash = [System.Text.Encoding]::UTF8.GetBytes($textToHash)
    $hashByteArray = $hasher.ComputeHash($toHash)    
    # Bug: a byte[] has no ToBase64String member, so this returns $null (see the reply below)
    return $hashByteArray.ToBase64String;
}
 

 
Write-Output "Script started..." 
$elapsed = [System.Diagnostics.Stopwatch]::StartNew() 
 
# 100k worked fastest and kept memory usage to a minimum 
$batchsize = 100000 
 
# Build the sqlbulkcopy connection, and set the timeout to infinite 
$connectionstring = "Server=$sqlserver;Database=$database;User ID=$UserName;Password=$Password;Trusted_Connection=false;"
# $connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;" 
#$bulkcopy = new-object ("Data.SqlClient.Sqlbulkcopy")  
$bulkCopy = New-Object Data.SqlClient.SqlBulkCopy($connectionstring,
        [System.Data.SqlClient.SqlBulkCopyOptions]::KeepIdentity)
$bulkcopy.DestinationTableName = $table 
$bulkcopy.bulkcopyTimeout = 0 
$bulkcopy.batchsize = $batchsize 
$bulkcopy.EnableStreaming = 1   # requires .NET 4.5; older versions raise the 'cannot be found' error shown above
  
# Create the datatable and define the columns
$datatable = New-Object System.Data.DataTable
$datatable.Columns.Add("Name") | Out-Null
$datatable.Columns.Add("Age", [int]) | Out-Null

# Open the text file from disk
$reader = New-Object System.IO.StreamReader($csvfile)
# NOTE: this discards the first line; it should only be skipped when the
# file has a header row ($firstrowcolumnnames is $false here)
$line = $reader.ReadLine()
#$columns = $line.Split($csvdelimiter)

# Map the source columns to the destination columns explicitly
$ColumnMap1 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping("Name", "Name")
$ColumnMap2 = New-Object System.Data.SqlClient.SqlBulkCopyColumnMapping("Age", "Age")
[void]$bulkcopy.ColumnMappings.Add($ColumnMap1)
[void]$bulkcopy.ColumnMappings.Add($ColumnMap2)
# Read in the data, line by line
while (($line = $reader.ReadLine()) -ne $null) {

    $data = $line.Split("|")
    $row = $datatable.NewRow()
    $row.Item("Name") = Hash($data[0])
    $row.Item("Age") = [int]$data[1]
    $datatable.Rows.Add($row)

    # Once you reach your batch size, write to the db,
    # then clear the datatable from memory
    $i++; if (($i % $batchsize) -eq 0) {
        $bulkcopy.WriteToServer($datatable)
        Write-Output "$i rows have been inserted in $($elapsed.Elapsed.ToString())."
        $datatable.Clear()
    }
}
 
# Close the CSV file 
$reader.Close() 
 
# Add in all the remaining rows since the last clear
if ($datatable.Rows.Count -gt 0) {
    $bulkcopy.WriteToServer($datatable)
    $datatable.Clear()
}
 
# Sometimes the Garbage Collector takes too long. 
[System.GC]::Collect()     
 
Write-Output "Script complete. $i rows have been inserted into the database." 
Write-Output "Total Elapsed Time: $($elapsed.Elapsed.ToString())"

PowerShell vs. C#: which one would be the better choice for this operation? Assume that in my real scenario I need to do this with 5 million records.

September 5th, 2015 1:05pm

Without your full system it is not possible to determine what is happening.  You are asking a lot of fairly vague questions that seem unrelated.

I will say this: your hash function returns $null, because a byte array has no ToBase64String member; that is why NULLs are being inserted. It will also eat up memory because you keep recreating the hasher object on every row.

Don't use a function. Create the hasher once and just add the lines that do the work to the loop:

$hasher = New-Object System.Security.Cryptography.SHA256Managed   # create once, before the loop

$toHash = [System.Text.Encoding]::UTF8.GetBytes($data[0])
$hashByteArray = $hasher.ComputeHash($toHash)
$row.Item("Name") = [System.Convert]::ToBase64String($hashByteArray)
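
Dropped into the loop from your script, the revised body looks roughly like this (a sketch; $hasher is the instance created once before the loop, and the batch-size flush is omitted for brevity):

while (($line = $reader.ReadLine()) -ne $null) {
    $data = $line.Split("|")
    $row = $datatable.NewRow()
    $toHash = [System.Text.Encoding]::UTF8.GetBytes($data[0])
    $row.Item("Name") = [System.Convert]::ToBase64String($hasher.ComputeHash($toHash))
    $row.Item("Age") = [int]$data[1]
    $datatable.Rows.Add($row)
}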

September 5th, 2015 2:09pm

Thank you, JRV. I am able to achieve it now. Appreciated.
September 5th, 2015 9:34pm

This topic is archived. No further replies will be accepted.
