Scenario:
I have created a transformation to load data from a CSV file into a table. The CSV file has the following columns:
However, a user may supply an input file with the same columns in a different (random) order, such as:
So if I try to load a file whose columns are in a random order, will Kettle still load the correct values into each column, matching by column name?
You can use the Select / Rename values step to remove any field from the record stream.
Using ETL Metadata Injection, you can use a transformation like this to either normalize the data or store it in your database:
Then you just need to send the correct metadata to that transformation. You can read the header line from the CSV and use a Row Normaliser step to convert it to the format used by ETL Metadata Injection.
I have included a quick example here: csv_inject on Dropbox. If you build something like this and run it once per CSV file, it should work.
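Outside of Kettle, the idea behind this answer can be sketched in plain Python: instead of reading values by column position, key each row by its header name, so the input column order no longer matters. This is a hedged illustration of the by-name mapping that metadata injection achieves, not a PDI feature; the field names (`id`, `name`, `amount`) are hypothetical.

```python
import csv
import io

# Hypothetical fixed set of target columns the table expects.
EXPECTED_FIELDS = ["id", "name", "amount"]

def load_rows(fileobj):
    """Read rows keyed by header name, so input column order does not matter."""
    reader = csv.DictReader(fileobj)
    missing = set(EXPECTED_FIELDS) - set(reader.fieldnames or [])
    if missing:
        raise ValueError(f"missing columns: {sorted(missing)}")
    # Re-emit values in the fixed target order, regardless of input order.
    return [[row[f] for f in EXPECTED_FIELDS] for row in reader]

# Same data, two different column orders -> identical output.
a = io.StringIO("id,name,amount\n1,Ann,10\n")
b = io.StringIO("amount,id,name\n10,1,Ann\n")
print(load_rows(a) == load_rows(b))  # True: both map to [['1', 'Ann', '10']]
```

In Kettle terms, the injected metadata plays the role of `reader.fieldnames` here: it tells the template transformation which field sits in which position of this particular file.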
Ooh, that's some nasty JavaScript!
The way to do this is with metadata injection. Look at the samples: basically you need a template transformation which reads the file and writes it back out. You then use a parent transformation to figure out the headings, configure that template, and execute it.
There are samples in the PDI samples folder, and also take a look at the "figuring out file format" example in Matt Casters' blueprints project on GitHub.
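The "figure out the headings" step in the parent transformation amounts to reading just the first line of the file and turning it into (position, field name) rows, which is the shape a Row Normaliser step would feed to ETL Metadata Injection. A minimal sketch of that step, assuming a comma-delimited header line:

```python
# Hedged sketch: normalise one CSV header line into (position, field_name)
# rows, the per-field metadata a parent transformation would inject into
# the template. The header value below is example data, not from the post.
def normalise_header(line, delimiter=","):
    """Split a header line into (position, field_name) pairs."""
    return [(i, name.strip()) for i, name in enumerate(line.split(delimiter))]

header = "amount,id,name"  # first line of the incoming CSV
print(normalise_header(header))
# -> [(0, 'amount'), (1, 'id'), (2, 'name')]
```

The template transformation stays generic; only these injected (position, name) pairs change from file to file.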