I have a bunch of pretty large CSV (comma-separated values) files that I want to analyze. SQL queries are ideal for this. So far I have been using MS Access to import the CSV files and execute queries on them. However, in addition to having a very bad SQL editor and stupid arbitrary rules about when a query/table can be opened/edited/deleted, it is extremely slow. I can live with some initial slowness when executing a query, but I really hate that it seems to re-execute the query whenever I sort the table on another column, wait five minutes, or try to scroll.
Are there any better (free) tools for the job? (I would be willing to manually write a script to transform the CSV into something else if required)
Thanks!
Edit: Thanks for all the answers! I'm going to try out SQL Server and if it works out, I'll accept that answer. Just a clarification: importing the data file is not necessarily a problem for me (although easy ways to do this are of course welcome). What I need is a program that subsequently allows me to quickly and efficiently execute (reasonably complex) queries on it. FileHelpers isn't going to help me, because first of all, I don't know any .NET languages, and second of all, I had the data in Java, but I thought analyzing it would be a lot easier with SQL. Thanks for the answer though!
Why not just import them into MySQL? It's easy.
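First create a table to load into. A minimal sketch, with placeholder column names and types (adjust them to match your CSV; field3 is assumed numeric here so the example query below works):

CREATE TABLE some_table (
    field1 VARCHAR(255),
    field2 VARCHAR(255),
    field3 INT
);

Then load the file: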
LOAD DATA LOCAL INFILE 'file.csv'
INTO TABLE some_table
FIELDS TERMINATED BY ','
  OPTIONALLY ENCLOSED BY '"'  -- handles quoted fields, if your CSV has them
LINES TERMINATED BY '\n'
IGNORE 1 LINES                -- skip the header row, if there is one
(field1, field2, field3);
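(Note: LOAD DATA LOCAL requires the local_infile option to be enabled on both the server and the client, and it is typically far faster than inserting rows one by one.)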
And then you can easily run queries of any complexity...
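For example, a grouped aggregate over the placeholder columns from the sketch above (the filter value is just illustrative):

SELECT field1,
       COUNT(*)    AS n,
       AVG(field3) AS avg_field3
FROM some_table
WHERE field2 = 'some_value'   -- placeholder condition
GROUP BY field1
ORDER BY n DESC;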