SQL Server 2014 and VS 2013 - Data Flow Task: read CSV files and insert data into SQL tables

Hello everyone,

I was assigned a work item wherein I have a Data Flow Task inside a For Each Loop container in the control flow of an SSIS package. The For Each Loop container reads the CSV files from a specified location one by one and populates a variable with the current file name. Note that the tables into which I would like to push the data from each CSV file have the same names as the CSV files.

In the Data Flow Task, I have a Flat File source component that uses the above variable to read the data of the current file. Now here is my question: how can I move the data to the destination SQL table using the same variable?
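For reference, the Flat File connection manager's ConnectionString is driven by a property expression along these lines (User::FolderPath is just an illustrative name for my folder variable; User::FileName is the variable the loop populates):

    ConnectionString:  @[User::FolderPath] + "\\" + @[User::FileName] + ".csv"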

I have tried to set up the OLE DB destination component dynamically, but it executes well only the first time: it does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns, and these files need to be migrated to SQL tables in the optimal way.

Does anybody know the best way to set up the Data Flow Task for this requirement?

Also, I cannot use the Bulk Insert Task here, as we would like to keep a log of corrupted rows.

Any help would be much appreciated. It's very urgent.

April 29th, 2015 2:53am

For the Flat File source, you get the file name from a variable.

I guess you can set the same variable in the OLE DB Destination, which would supply the table name.
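If you build or tweak the package programmatically, the equivalent settings are two custom properties on the stock OLE DB Destination. A fragment as a sketch (destDesign stands for the destination component's design-time wrapper, and AccessMode 1 corresponds to the "Table name or view name variable" data access mode in the destination editor):

    // Point the OLE DB Destination at the variable holding the table name.
    destDesign.SetComponentProperty("AccessMode", 1);
    destDesign.SetComponentProperty("OpenRowsetVariable", "User::FileName");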

April 29th, 2015 3:16am

Hello Vaibhav,

Thank you for the quick response.

As I specified in my post, it does not allow me to use the same variable for the OLE DB destination. SSIS does not support dynamic column mapping at runtime, so I'm unable to change the column mappings for the second file in the loop.

April 29th, 2015 3:29am


You can generate the data flow task dynamically if your file metadata changes from file to file.

See: http://www.selectsifiso.net/?p=288
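To give an idea of what that involves, here is a minimal C# sketch against the SSIS runtime and pipeline-wrapper APIs. It is not the article's exact code: the connection strings, paths, and table name are placeholders, and the flat-file column definition and name-based column mapping are only outlined in comments.

    using Microsoft.SqlServer.Dts.Runtime;
    using Wrapper = Microsoft.SqlServer.Dts.Pipeline.Wrapper;

    class DynamicDataFlowBuilder
    {
        static void Main()
        {
            Package package = new Package();

            // Connection managers (placeholder connection strings).
            ConnectionManager csvCm = package.Connections.Add("FLATFILE");
            csvCm.ConnectionString = @"C:\Input\Customers.csv";
            csvCm.Properties["Format"].SetValue(csvCm, "Delimited");
            csvCm.Properties["ColumnNamesInFirstDataRow"].SetValue(csvCm, true);

            ConnectionManager dbCm = package.Connections.Add("OLEDB");
            dbCm.ConnectionString = "Provider=SQLNCLI11;Data Source=.;" +
                                    "Initial Catalog=Staging;Integrated Security=SSPI;";

            // Add a Data Flow Task and grab its pipeline.
            TaskHost dft = (TaskHost)package.Executables.Add("STOCK:PipelineTask");
            Wrapper.MainPipe pipeline = (Wrapper.MainPipe)dft.InnerObject;

            // Flat File source (the ProgID resolves to the installed version).
            Wrapper.IDTSComponentMetaData100 source = pipeline.ComponentMetaDataCollection.New();
            source.ComponentClassID = "DTSAdapter.FlatFileSource";
            Wrapper.CManagedComponentWrapper srcDesign = source.Instantiate();
            srcDesign.ProvideComponentProperties();
            source.RuntimeConnectionCollection[0].ConnectionManager =
                DtsConvert.GetExtendedInterface(csvCm);
            source.RuntimeConnectionCollection[0].ConnectionManagerID = csvCm.ID;
            // NOTE: before refreshing metadata, define the flat-file columns on
            // csvCm from the CSV header row (elided here).
            srcDesign.AcquireConnections(null);
            srcDesign.ReinitializeMetaData();
            srcDesign.ReleaseConnections();

            // OLE DB destination pointed at the same-named table.
            Wrapper.IDTSComponentMetaData100 dest = pipeline.ComponentMetaDataCollection.New();
            dest.ComponentClassID = "DTSAdapter.OLEDBDestination";
            Wrapper.CManagedComponentWrapper destDesign = dest.Instantiate();
            destDesign.ProvideComponentProperties();
            dest.RuntimeConnectionCollection[0].ConnectionManager =
                DtsConvert.GetExtendedInterface(dbCm);
            dest.RuntimeConnectionCollection[0].ConnectionManagerID = dbCm.ID;
            destDesign.SetComponentProperty("AccessMode", 3); // table or view, fast load
            destDesign.SetComponentProperty("OpenRowset", "[dbo].[Customers]");

            // Wire source to destination, then refresh the destination metadata.
            Wrapper.IDTSPath100 path = pipeline.PathCollection.New();
            path.AttachPathAndPropagateNotifications(
                source.OutputCollection[0], dest.InputCollection[0]);
            destDesign.AcquireConnections(null);
            destDesign.ReinitializeMetaData();
            destDesign.ReleaseConnections();
            // NOTE: mapping input columns to table columns by name via
            // SetUsageType/MapInputColumn is elided; the article covers it.

            new Application().SaveToXml(@"C:\Input\Generated.dtsx", package, null);
        }
    }

Run that once per CSV file (looping over the folder) and you get one generated data flow per file, each with its own metadata, which is what sidesteps the fixed-mapping limitation.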

April 29th, 2015 4:04am

Hello Visakh,

Thank you for sharing the link.

Is this approach quick enough to move the data from the CSVs to their respective SQL tables for all 50 files?

April 29th, 2015 4:12am

Since the native SSIS Data Flow Task (DFT) does not support dynamic metadata, can't you create 50 data flows, each for its own CSV? Or are multiple DFTs not permitted?
April 29th, 2015 11:20am

The standard Data Flow Task supports only static metadata, defined at design time. I would recommend you check the commercial COZYROC Data Flow Task Plus. It is an extension of the standard Data Flow Task and supports dynamic metadata at runtime, so you can process all your input CSV files with a single Data Flow Task Plus. No programming skills are required.
April 29th, 2015 8:11pm

Hello COZYROC, thank you for sharing the link.

I would like to share this with the client and discuss it with them.

April 30th, 2015 2:52am

This topic is archived. No further replies will be accepted.
