I have a database on my local machine and I want to import its data into the database on my hosting. Both databases are identical: the same table names, column names, etc.
When I export a table from my local database through phpMyAdmin and import it through phpMyAdmin on my hosting, an error pops up telling me that there are duplicate entries for the primary key, and the whole operation stops.
How can I import the data through phpmyadmin, skip the duplicate entries, and display a list of the duplicates at the end of the process?
One solution I could implement is to fetch all the primary key values from the database on my hosting and filter out the duplicates before importing. BUT I am wondering if there is a quicker way to do this with phpMyAdmin?
In phpMyAdmin, in the Settings tab, you can try checking the following values:

If you are using CSV format: [screenshot of the CSV import options]

If you are using SQL format: [screenshot of the SQL import options]
There are a couple of ways to do what you want:
The brutal way:
TRUNCATE TABLE yourTbl; -- empties out the table

Then import. But you might lose data this way, so at the very least create a backup table first, as sketched below. All things considered, just don't do this; use one of the alternatives further down instead.
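A minimal backup sketch, assuming MySQL (yourTbl_backup is a hypothetical name):

CREATE TABLE yourTbl_backup LIKE yourTbl;          -- same structure, including keys
INSERT INTO yourTbl_backup SELECT * FROM yourTbl;  -- copy every current row
TRUNCATE TABLE yourTbl;                            -- now empty the original before importing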
Write your own INSERT query, with an IGNORE clause:
INSERT IGNORE INTO yourTbl -- as shown in the linked duplicate
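Fleshed out a little, as a sketch only (field1 and field2 are the same placeholder columns used in the LOAD DATA example below):

INSERT IGNORE INTO yourTbl (`field1`, `field2`)
VALUES (1, 'foo'),
       (2, 'bar');  -- rows whose primary key already exists are skipped silently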
But since you are importing a file, the query will most likely be a LOAD DATA [LOCAL] INFILE. As you can see in the manual, you can easily add an IGNORE to that query, too:
LOAD DATA LOCAL INFILE '/path/to/files/filename1.csv'
IGNORE -- IGNORE goes here
INTO TABLE your_db.your_tbl
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(`field1`,`field2`);
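Note that IGNORE skips the duplicates silently; it won't produce the list of duplicates asked for above. A workaround sketch, assuming a hypothetical staging table yourTbl_staging and a hypothetical primary key column id, is to load the file into an empty copy of the table and compare keys:

CREATE TABLE yourTbl_staging LIKE yourTbl;  -- empty copy with the same keys
LOAD DATA LOCAL INFILE '/path/to/files/filename1.csv'
INTO TABLE yourTbl_staging
FIELDS TERMINATED BY ';'
OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(`field1`,`field2`);
-- list the incoming rows whose primary key already exists on the host
SELECT s.* FROM yourTbl_staging AS s JOIN yourTbl AS t ON t.id = s.id;
-- copy over only the non-duplicates, then clean up
INSERT IGNORE INTO yourTbl SELECT * FROM yourTbl_staging;
DROP TABLE yourTbl_staging;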
That's it. Don't worry if you're not too comfortable writing your own queries like this; there are other ways of doing what you want to do:
The CLI way:
mysqlimport -i dbname fileToImport
# or, equivalently:
mysqlimport --ignore dbname fileToImport
Also on the CLI: create a file containing the LOAD DATA query above, then run:

$ mysql -u root -p
Enter password: ***********
mysql> source /path/to/queryFile.sql
This requires you to have access to the command line to run these commands; see the MySQL manual for details.
Using phpMyAdmin, when importing, you'll find a checkbox saying "Ignore duplicates"; check that and import.
You could also choose to check "Ignore errors", but that's another brute-force approach, and I wouldn't recommend it.