I have a .txt file that has a bunch of formatted data in it that looks like the following:
...
1 75175.18 95128.46
1 790890.89 795829.16
1 875975.98 880914.25
8 2137704.37 2162195.53
8 2167267.27 2375275.28
10 2375408.74 2763997.33
14 2764264.26 2804437.77
15 2804504.50 2881981.98
16 2882048.72 2887921.25
16 2993093.09 2998031.36
19 3004104.10 3008041.37
...
I am trying to load each row as an entry into a table in my database, where each column is a different field. I am having trouble getting MySQL to separate the data properly. I think the issue is that the numbers are not all separated by the same amount of whitespace.
Here are two queries I have tried so far (I have also tried several variations of these queries):
LOAD DATA LOCAL INFILE
'/some/Path/segmentation.txt'
INTO TABLE clip (slideNum, startTime, endTime)
SET presID = 1;
LOAD DATA LOCAL INFILE
'/some/Path/segmentation.txt'
INTO TABLE clip
FIELDS TERMINATED BY ' '
LINES TERMINATED BY '\n'
(slideNum, startTime, endTime)
SET presID = 1;
Any ideas how to get this to work?
These are what we call "fixed-width" records, and LOAD DATA doesn't play well with them. One option is to read each line into a single user variable, then use SUBSTR() and TRIM() to slice out the columns you need into the final table:
LOAD DATA LOCAL INFILE
'/some/Path/segmentation.txt'
INTO TABLE clip
(@row)                                      -- the whole line lands in @row; the file contains no tabs, the default field terminator
SET slideNum  = TRIM(SUBSTR(@row,1,4)),     -- characters 1-4
    startTime = TRIM(SUBSTR(@row,5,13)),    -- characters 5-17
    endTime   = TRIM(SUBSTR(@row,18,13))    -- characters 18-30
;
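For reference, here is a minimal sketch of a clip table that the statement above could load into. The column names come from the question, but the types are assumptions; adjust them to your actual schema:

-- Hypothetical table definition (column types are assumed, not from the original post)
CREATE TABLE clip (
    presID    INT,             -- set separately, e.g. SET presID = 1 as in the question
    slideNum  INT,
    startTime DECIMAL(12,2),   -- sample values have two decimal places
    endTime   DECIMAL(12,2)
);

After loading, a quick SELECT slideNum, startTime, endTime FROM clip LIMIT 10; will confirm the columns were split at the expected offsets.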