SQL Server 2014 and VS 2013 - Data Flow task: read CSV files and insert the data into SQL tables

Hello everyone,

I was assigned a work item in which I have a Data Flow task inside a Foreach Loop container on the control flow of an SSIS package. The Foreach Loop container reads the CSV files from a specified location one by one and populates a variable with the current file name. Note that the destination tables have the same names as the CSV files.

In the Data Flow task, I have a Flat File Source component that uses the above variable to read the data of the current file. Here is my question: how can I move the data to the destination SQL table using that same variable?

I've tried to set up the OLE DB Destination component dynamically, but it executes correctly only the first time; it does not change the mappings to match the columns of the second CSV file. There are around 50 CSV files, each with a different set of columns, and they all need to be migrated to SQL tables in the most efficient way.

Does anybody know the best way to set up the Data Flow task for this requirement?

Also, I cannot use the Bulk Insert task here, as we would like to keep a log of corrupted rows.
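To illustrate the per-file logic I'm after, here is a rough sketch of it outside SSIS, in Python with pyodbc. The connection string, folder path, and single-header-row file layout are all assumptions, and every name is illustrative; the point is just that the column list comes from each file's header at runtime, and rejected rows get logged:

import csv
from pathlib import Path

import pyodbc  # assumes an ODBC driver for SQL Server is installed

# Hypothetical connection details and source folder.
CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes"
)
CSV_DIR = Path(r"C:\Imports")
ERROR_LOG = CSV_DIR / "rejected_rows.csv"

# Autocommit keeps one bad row from affecting the others; slower but simple.
conn = pyodbc.connect(CONN_STR, autocommit=True)
cur = conn.cursor()

with open(ERROR_LOG, "w", newline="") as log:
    bad = csv.writer(log)
    for path in sorted(CSV_DIR.glob("*.csv")):
        table = path.stem  # the table name matches the file name
        with open(path, newline="") as f:
            reader = csv.reader(f)
            header = next(reader)  # column list comes from the file itself
            cols = ", ".join(f"[{c}]" for c in header)
            marks = ", ".join("?" * len(header))
            sql = f"INSERT INTO [{table}] ({cols}) VALUES ({marks})"
            for row in reader:
                try:
                    cur.execute(sql, row)
                except pyodbc.Error as exc:
                    # Keep the corrupted row plus the reason, which the
                    # Bulk Insert task cannot do.
                    bad.writerow([path.name, str(exc)] + row)

cur.close()
conn.close()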

Any help would be much appreciated. It's very urgent.

April 29th, 2015 2:52am

For the Flat File Source, you get the file name from a variable.

I guess you can set the same variable in the OLE DB Destination, which would hold the table name.

April 29th, 2015 3:17am

Hello Vaibhav,

Thank you for the quick response.

As I mentioned in my post, using the same variable for the OLE DB Destination does not work: SSIS does not support dynamic column mapping at runtime. Hence, I'm unable to change the column mappings for the second file in the loop.
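For now, the best I can do up front is verify each file's header against its matching table before any load runs. A rough sketch, with the same assumed connection details and folder as above:

import csv
from pathlib import Path

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=localhost;DATABASE=Staging;Trusted_Connection=yes"
)  # hypothetical, same as the earlier sketch
CSV_DIR = Path(r"C:\Imports")

conn = pyodbc.connect(CONN_STR)
cur = conn.cursor()

for path in sorted(CSV_DIR.glob("*.csv")):
    with open(path, newline="") as f:
        header = [c.lower() for c in next(csv.reader(f))]
    cur.execute(
        "SELECT COLUMN_NAME FROM INFORMATION_SCHEMA.COLUMNS "
        "WHERE TABLE_NAME = ? ORDER BY ORDINAL_POSITION",
        path.stem,
    )
    table_cols = [row[0].lower() for row in cur.fetchall()]
    if header != table_cols:
        # Names and order must line up; flag the file before loading.
        print(f"{path.name}: header does not match table {path.stem}")

conn.close()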

April 29th, 2015 3:30am
