I think importing with one of the methods already mentioned is ideal if it truly is a large file, but you can also use Excel to generate the INSERT statements:
="INSERT INTO table_name VALUES('"&A1&"','"&B1&"','"&C1&"')"
In MS SQL you can use:
SET NOCOUNT ON
to suppress all of the '1 row affected' messages. And if you are inserting a lot of rows and the script errors out, put a GO between statements every so often to split the script into batches.
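A minimal sketch of what the finished script might look like, assuming a hypothetical three-column table_name:

-- suppress the '1 row affected' message for each statement
SET NOCOUNT ON
INSERT INTO table_name VALUES('a1','b1','c1')
INSERT INTO table_name VALUES('a2','b2','c2')
-- GO ends the batch, so earlier batches survive if a later one fails
GO
INSERT INTO table_name VALUES('a3','b3','c3')
GO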
There is a handy tool that saves a lot of time at
http://tools.perceptus.ca/text-wiz.php?ops=7
You just feed in the table name, the field names, and the data (tab-separated), then hit Go!
You can create an appropriate table through the Management Studio interface and insert the data with plain INSERT statements, as sketched below. It may take some time depending on the amount of data, but it is very handy.
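A minimal sketch of that approach; the table design and values here are placeholders for illustration, not from the original answer:

-- create the table to match your spreadsheet's columns
CREATE TABLE sales (id INT, region VARCHAR(50), amount DECIMAL(10,2))
-- then insert the data row by row
INSERT INTO sales VALUES (1, 'North', 100.00)
INSERT INTO sales VALUES (2, 'South', 250.50)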
You can use the following Excel formula:
="INSERT INTO table_name(`"&$A$1&"`,`"&$B$1&"`,`"&$C$1&"`, `"&$D$1&"`) VALUES('"&SUBSTITUTE(A2, "'", "\'")&"','"&SUBSTITUTE(B2, "'", "\'")&"','"&SUBSTITUTE(C2, "'", "\'")&"', "&D2&");"
This improves upon Hart CO's answer: it takes column names into account and avoids errors caused by single quotes in the data (note that the backtick-quoted identifiers and backslash escaping are MySQL syntax). The final column is an example of a numeric column, emitted without quotes.
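Assuming, for illustration, headers name, city, note, and age in row 1 and a data row in row 2, the formula emits something like:

INSERT INTO table_name(`name`,`city`,`note`, `age`) VALUES('O\'Brien','Paris','said hello', 42);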
Depending on the database, you can export to CSV and then use its bulk-import method (a sketch follows the links below).
MySQL - http://dev.mysql.com/doc/refman/5.1/en/load-data.html
PostgreSQL - http://www.postgresql.org/docs/8.2/static/sql-copy.html
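As a rough sketch, assuming a hypothetical file /tmp/data.csv and table my_table (check the linked docs for the exact options your version supports):

-- MySQL: bulk-load a CSV file
LOAD DATA INFILE '/tmp/data.csv'
INTO TABLE my_table
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\n';

-- PostgreSQL: the equivalent COPY command
COPY my_table FROM '/tmp/data.csv' WITH CSV;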
You could use VB to write something that outputs to a file row by row, wrapping the appropriate SQL statements around your data. I have done this before.