
MySQL CSV import: datetime value

Tags:

mysql

csv

I have been banging my head against a wall trying to import datetime values from a .csv file.

Here's the import statement.

LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,@last_date)
SET last_date = STR_TO_DATE(@last_date, '%Y-%m-%d %H:%i:%s')

Here's a sample of the data:

4108,48.74,"2013-09-16 16:15:04"
4249,8.1,"2013-09-16 16:15:04"
4197,3.81,"2013-09-16 17:20:00"
4139,26.81,"2013-09-16 16:15:04"
4218,24.83,"2013-09-16 17:20:00"
4260,79.72,"2013-09-16 16:15:04"
4270,450.12,"2013-09-16 17:20:00"
4242,30.38,"2013-09-16 16:15:04"
4193,1.42,"2013-09-16 16:15:04"
4134,3.77,"2013-09-16 16:15:04"

I am able to import date values using STR_TO_DATE(), but I am not able to get datetime values to import. I have tried several date formats other than '%Y-%m-%d %H:%i:%s', and I always get a null datetime (0000-00-00 00:00:00). I have also tried omitting STR_TO_DATE() entirely, since the string is already in the default MySQL datetime format.

Any help will be appreciated.

Binary Alchemist asked Sep 16 '13 21:09

4 Answers

I ran into the same problem. I fixed it by changing the format for the date column in my CSV file to match the MySQL datetime format.

  1. Open CSV in Excel.
  2. Highlight the column.
  3. Right-click on the column.
  4. Click on Format Cells.
  5. Pick Custom.
  6. Use yyyy/mm/dd hh:mm:ss in the Type field.
  7. Click OK.

My CSV successfully imported after I changed the datetime format as above.
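If you leave the slashed yyyy/mm/dd format from step 6 in place, the format string passed to STR_TO_DATE() has to match it. A sketch, assuming the CSV now holds values like 2013/09/16 16:15:04:

LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,@last_date)
SET last_date = STR_TO_DATE(@last_date, '%Y/%m/%d %H:%i:%s');

Note the slashes in the format string, matching the delimiters Excel wrote out.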

Here Iam answered Oct 14 '22 03:10


The date in your data file is already in a format MySQL should natively understand. It's just enclosed in double quotes. You need to tell LOAD DATA INFILE how to deal with the quotes. Try something like this:

LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,last_date)

Update:

Since you've said it doesn't work, I created a test table and verified that it does work. Here's the proof:

I highlighted your csv data from the question and pasted it into a new file called myData.csv in my system's /tmp folder. Then I connected to the mysql console, switched to the test database, and ran the following:

mysql> create table equity_last_import (equity int, last decimal(10,2), last_date datetime) engine=innodb;
Query OK, 0 rows affected (0.02 sec)

mysql> LOAD DATA LOCAL INFILE '/tmp/myData.csv'
    -> INTO TABLE equity_last_import
    -> FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
    -> LINES TERMINATED BY '\n'
    -> (equity,last,last_date);
Query OK, 10 rows affected (0.00 sec)
Records: 10  Deleted: 0  Skipped: 0  Warnings: 0

mysql> select * from equity_last_import;
+--------+--------+---------------------+
| equity | last   | last_date           |
+--------+--------+---------------------+
|   4108 |  48.74 | 2013-09-16 16:15:04 |
|   4249 |   8.10 | 2013-09-16 16:15:04 |
|   4197 |   3.81 | 2013-09-16 17:20:00 |
|   4139 |  26.81 | 2013-09-16 16:15:04 |
|   4218 |  24.83 | 2013-09-16 17:20:00 |
|   4260 |  79.72 | 2013-09-16 16:15:04 |
|   4270 | 450.12 | 2013-09-16 17:20:00 |
|   4242 |  30.38 | 2013-09-16 16:15:04 |
|   4193 |   1.42 | 2013-09-16 16:15:04 |
|   4134 |   3.77 | 2013-09-16 16:15:04 |
+--------+--------+---------------------+
10 rows in set (0.00 sec)

See? It works perfectly.

Another Update:

You've specified that you're getting the following error now:

Out of range value for column 'last_date' at row 1

Does your CSV file have a header? If so, you may want to add IGNORE 1 LINES to your LOAD DATA INFILE command to tell MySQL to skip over the header.
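If that's the case, the full command might look like this (same statement as above, just skipping the first line of the file):

LOAD DATA LOCAL INFILE '/tmp/myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(equity,last,last_date);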

Asaph answered Oct 14 '22 05:10


I pulled my hair out over this as well, because I'm not importing in the way suggested above.

Workaround: create a temporary field temp_date of type VARCHAR on your import table, and the data will load without problems. You can then run an update to copy it into your real date column (using the table and column names from the question):

    UPDATE equity_last_import
    SET last_date = temp_date;
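Spelled out end to end against the table from the question, the whole workaround might look like this (a sketch; temp_date is the throwaway VARCHAR column, and the STR_TO_DATE() call makes the conversion explicit rather than relying on an implicit cast):

ALTER TABLE equity_last_import ADD COLUMN temp_date VARCHAR(32);

LOAD DATA LOCAL INFILE 'myData.csv'
INTO TABLE equity_last_import
FIELDS OPTIONALLY ENCLOSED BY '"' TERMINATED BY ','
LINES TERMINATED BY '\n'
(equity,last,temp_date);

UPDATE equity_last_import
SET last_date = STR_TO_DATE(temp_date, '%Y-%m-%d %H:%i:%s');

ALTER TABLE equity_last_import DROP COLUMN temp_date;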
user1746679 answered Oct 14 '22 04:10


I was having the same trouble, and here's what I discovered; I think it might help you.

The problem was a GMT conflict: the database I was extracting the .csv file from was on GMT+00:00, so there was no daylight saving time.

My local server (the one I was trying to load the .csv file into) was running on my computer's system GMT by default. In my case that was GMT-3, which is affected by daylight saving time.

MySQL has a special way of dealing with daylight saving time: it treats the skipped hour as nonexistent in the local time zone. In my case, daylight saving time started on October 20, and all the records between 00:00 and 01:00 AM of that day simply weren't recognized by the server, even though the format was correct, because they simply 'didn't exist'. This ended up affecting all the timestamp records, not only the ones inside the skipped hour.

My solution was to set the local server to GMT+00:00 before creating the new table and importing the .csv, using

SET time_zone='+00:00';

And then when I imported the .csv file, all the records were read properly. I think setting the time zone to match the one that generated the .csv file should work!
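To see what your session is currently using before changing anything, you can inspect both time zone variables and then switch just the session (a sketch):

SELECT @@global.time_zone, @@session.time_zone;

-- Align the session with the source database before the import
SET time_zone = '+00:00';

Setting the session variable only affects your current connection, so the server's global configuration is left untouched.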

pablonaze answered Oct 14 '22 03:10