
How to skip duplicate records when importing in phpMyAdmin


I have a DB on my local machine and I want to import its data into the DB on my hosting. Both DBs are identical: same table names, column names, etc.

When I export a table from my local DB through phpMyAdmin and import it through phpMyAdmin on my hosting, an error pops up telling me that there are duplicate entries for the primary key, and the whole operation stops.

How can I import the data through phpmyadmin, skip the duplicate entries, and display a list of the duplicates at the end of the process?

One solution I could implement is to fetch all the primary key values from the DB at my hosting and filter out the duplicates before importing. BUT I am wondering whether there is a quick solution for this within phpMyAdmin?
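For reference, the manual "filter before import" fallback described above can be sketched as follows. This is a hypothetical example using an in-memory SQLite table in place of the hosted MySQL table; the table and column names (`users`, `id`, `name`) are invented for illustration.

```python
import sqlite3

# In-memory SQLite table standing in for the hosted MySQL table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)", [(1, "ann"), (2, "bob")])

# Rows exported from the local database; id 2 already exists on the host.
local_rows = [(2, "bob"), (3, "cho"), (4, "dee")]

# Fetch the primary keys already present on the host.
existing = {pk for (pk,) in conn.execute("SELECT id FROM users")}

# Split the export into new rows and duplicates.
new_rows = [r for r in local_rows if r[0] not in existing]
duplicates = [r for r in local_rows if r[0] in existing]

conn.executemany("INSERT INTO users VALUES (?, ?)", new_rows)
print("imported:", new_rows)              # imported: [(3, 'cho'), (4, 'dee')]
print("skipped duplicates:", duplicates)  # skipped duplicates: [(2, 'bob')]
```

This also gives you the list of skipped duplicates at the end, which the built-in phpMyAdmin options below do not report.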

Jo E. asked Aug 28 '13 10:08

People also ask

How can we avoid creating duplicates while importing data?

While importing records, you can use the Skip or Overwrite option to avoid creating duplicate records. The duplicate records are identified based on a particular field for each type of record.

How will you avoid duplicating records in a query?

The go to solution for removing duplicate rows from your result sets is to include the distinct keyword in your select statement. It tells the query engine to remove duplicates to produce a result set in which every row is unique.
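A minimal sketch of `SELECT DISTINCT`, using an in-memory SQLite database; the `orders` table and its data are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT)")
conn.executemany("INSERT INTO orders VALUES (?)",
                 [("ann",), ("bob",), ("ann",), ("ann",)])

# Without DISTINCT the query returns one row per order; with DISTINCT,
# one row per unique customer.
rows = [c for (c,) in conn.execute(
    "SELECT DISTINCT customer FROM orders ORDER BY customer")]
print(rows)  # ['ann', 'bob']
```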

How do I avoid inserting duplicate records in Node MySQL?

Check the fields you send to the API against the records in the user table and make sure they do not already exist. For a client-side app, it is recommended to check the database table for an existing record before doing the insert.
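The "check before insert" pattern described above can be sketched like this, with SQLite standing in for MySQL; the `user` table and its fields are assumptions made for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (email TEXT PRIMARY KEY, name TEXT)")

def insert_if_absent(conn, email, name):
    """Insert only when no row with this email exists; return True if inserted."""
    exists = conn.execute(
        "SELECT 1 FROM user WHERE email = ?", (email,)
    ).fetchone()
    if exists:
        return False
    conn.execute("INSERT INTO user VALUES (?, ?)", (email, name))
    return True

print(insert_if_absent(conn, "a@x.io", "Ann"))  # True  (new record)
print(insert_if_absent(conn, "a@x.io", "Ann"))  # False (duplicate skipped)
```

Note that a check-then-insert is racy under concurrent writers; in production you would still want a unique constraint plus `INSERT IGNORE` (or `ON DUPLICATE KEY UPDATE`) as a backstop.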


2 Answers

In phpMyAdmin, on the Settings tab, you can try checking the following options:

  • Settings -> SQL Queries -> Ignore multiple statement errors

If you are using CSV format:

  • Settings -> Import -> CSV -> Do not abort on INSERT error

If you are using SQL format:

  • Settings -> Export -> SQL -> Use ignore inserts
nl-x answered Oct 13 '22 04:10


There are a couple of ways to do what you want:

The brutal way:

TRUNCATE TABLE yourTbl; -- empties out the table 

Then import. But you might lose data, so perhaps create a backup table first. All things considered, just don't do this; check the alternatives listed below.

Write your own INSERT query, with IGNORE clause:

INSERT IGNORE INTO yourTbl -- as shown in the linked duplicate 
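The effect of MySQL's `INSERT IGNORE` can be demonstrated with SQLite's equivalent syntax, `INSERT OR IGNORE`: rows whose primary key already exists are silently skipped instead of aborting the whole statement. The table and values below are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE yourTbl (id INTEGER PRIMARY KEY, val TEXT)")
conn.execute("INSERT INTO yourTbl VALUES (1, 'already here')")

# id 1 collides with the existing row; ids 2 and 3 are new.
# In MySQL this would be: INSERT IGNORE INTO yourTbl VALUES ...
conn.executemany("INSERT OR IGNORE INTO yourTbl VALUES (?, ?)",
                 [(1, "dup"), (2, "new"), (3, "new")])

print(conn.execute("SELECT id, val FROM yourTbl ORDER BY id").fetchall())
# [(1, 'already here'), (2, 'new'), (3, 'new')]
```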

But, since you are importing a file, the query will most likely be a LOAD DATA [LOCAL] INFILE statement. As you can see in the manual, you can easily add an IGNORE to that query, too:

LOAD DATA LOCAL INFILE '/path/to/files/filename1.csv'
    IGNORE -- IGNORE goes here
    INTO TABLE your_db.your_tbl
    FIELDS TERMINATED BY ';'
        OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    (`field1`,`field2`);

That's it. Don't worry if you're not too comfortable writing your own queries like this, there are other ways of doing what you want to do:
The CLI way:

mysqlimport -i dbname fileToImport
# Or: mysqlimport --ignore dbname fileToImport

Also CLI, create a file containing the LOAD DATA query above, then:

$ mysql -u root -p
Enter password: ***********
mysql> source /path/to/queryFile.sql

This requires you to have access to the command line to run these commands. See the MySQL manual page for details.

Using phpMyAdmin, when importing you'll find a checkbox saying "Ignore duplicates"; check that and import. Here's a page with screenshots.
You could also choose to check "Ignore errors", but that's another brute-force approach, and I wouldn't recommend it.

Elias Van Ootegem answered Oct 13 '22 03:10