
How can I import a JSON file into PostgreSQL?

For example, I have a file customers.json which is an array of strictly formed objects, and it's pretty plain (no nested objects), like this (importantly, it already includes ids):

[
  {
    "id": 23635,
    "name": "Jerry Green",
    "comment": "Imported from facebook."
  },
  {
    "id": 23636,
    "name": "John Wayne",
    "comment": "Imported from facebook."
  }
]

And I want to import them all into a table customers in my Postgres database.

I found some pretty involved approaches where you first import the file as a json-typed column into a staging table (e.g. imported_json with a column named data holding the objects), then use SQL to extract those values and insert them into the real table.

But is there a simple way to import JSON into Postgres without touching SQL at all?

Jerry Green asked Aug 30 '16 09:08

People also ask

How do I import a JSON file into pgAdmin 4?

Using the pgAdmin 4 GUI: in the Import/Export Servers dialog, use the Import/Export field to choose the Server Groups/Servers to be imported or exported, and use the Filename field to select the JSON file to import servers from (or to create the new file, in the case of Export, into which the servers are written in JSON format). The "Remove all the existing servers?" option controls whether existing server definitions are replaced.

Can we store JSON in PostgreSQL?

PostgreSQL offers two types for storing JSON data: json and jsonb. To implement efficient query mechanisms for these data types, PostgreSQL also provides the jsonpath data type described in Section 8.14.7. The json and jsonb data types accept almost identical sets of values as input.

Should I use JSON in Postgres?

Follow these guidelines when you consider using JSON in PostgreSQL: Don't use JSON for data that can easily be stored in database tables. Avoid large JSON objects if you want to modify individual attributes. Don't use JSON if you want to use attributes in complicated WHERE conditions.


2 Answers

You can feed the JSON into a SQL statement that extracts the information and inserts it into the table. If the JSON attributes have exactly the same names as the table columns, you can do something like this:

with customer_json (doc) as (
   values
    ('[
      {
        "id": 23635,
        "name": "Jerry Green",
        "comment": "Imported from facebook."
      },
      {
        "id": 23636,
        "name": "John Wayne",
        "comment": "Imported from facebook."
      }
    ]'::json)
)
insert into customer (id, name, comment)
select p.*
from customer_json l
  cross join lateral json_populate_recordset(null::customer, doc) as p
on conflict (id) do update
  set name = excluded.name,
      comment = excluded.comment;

New customers will be inserted and existing ones updated. The "magic" part is json_populate_recordset(null::customer, doc), which generates a relational representation of the JSON objects.


The above assumes a table definition like this:

create table customer
(
  id        integer primary key,
  name      text not null,
  comment   text
);
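To see what json_populate_recordset is doing server-side, here is a minimal client-side sketch in Python (a hypothetical helper, not part of the answer's SQL): it maps each JSON object onto the assumed column list of the customer table, filling missing attributes with None, just as the function fills missing attributes with NULL.

```python
import json

# Assumed to match the customer table definition above.
COLUMNS = ("id", "name", "comment")

def populate_recordset(doc: str, columns=COLUMNS):
    """Turn a JSON array of objects into row tuples, one per object,
    with missing attributes mapped to None (NULL)."""
    return [tuple(obj.get(col) for col in columns) for obj in json.loads(doc)]

doc = '''[
  {"id": 23635, "name": "Jerry Green", "comment": "Imported from facebook."},
  {"id": 23636, "name": "John Wayne", "comment": "Imported from facebook."}
]'''

rows = populate_recordset(doc)
# rows == [(23635, 'Jerry Green', 'Imported from facebook.'),
#          (23636, 'John Wayne', 'Imported from facebook.')]
```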

If the data is provided as a file, you need to first put that file into some table in the database. Something like this:

create unlogged table customer_import (doc json); 

Then upload the file into a single row of that table, e.g. using the \copy command in psql (or whatever your SQL client offers):

\copy customer_import from 'customers.json' .... 
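One caveat when using \copy here: in COPY's default text format every newline ends a row, so a pretty-printed multi-line JSON file won't load as a single row as-is. A small sketch (assuming Python is available; the function name is made up) that flattens the document onto one line and escapes the characters COPY's text format treats specially:

```python
import json

def to_copy_text_line(json_text: str) -> str:
    r"""Collapse a (possibly multi-line) JSON document onto one line and
    escape the characters COPY's text format treats specially, so that
    \copy can load the whole document into a single row."""
    one_line = json.dumps(json.loads(json_text))  # validates and removes newlines
    # In COPY's default text format, backslash and tab are special.
    return one_line.replace("\\", "\\\\").replace("\t", "\\t")

# Example: prepare customers.json for the staging table.
doc = """[
  {"id": 23635, "name": "Jerry Green", "comment": "Imported from facebook."}
]"""
prepared = to_copy_text_line(doc)
# 'prepared' is a single line; write it to a file and \copy that file instead.
```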

Then you can use the above statement, just remove the CTE and use the staging table:

insert into customer (id, name, comment)
select p.*
from customer_import l
  cross join lateral json_populate_recordset(null::customer, doc) as p
on conflict (id) do update
  set name = excluded.name,
      comment = excluded.comment;
a_horse_with_no_name answered Sep 18 '22 13:09


It turns out there's an easy way to import a multi-line JSON object into a JSON column in a Postgres database using the command-line psql tool, without needing to explicitly embed the JSON in the SQL statement. The technique is documented in the PostgreSQL docs, but it's a bit hidden.

The trick is to load the JSON into a psql variable using backticks. For example, given a multi-line JSON file in /tmp/test.json such as:

{   "dog": "cat",   "frog": "frat" } 

We can use the following SQL to load it into a temporary table:

sql> \set content `cat /tmp/test.json`
sql> create temp table t ( j jsonb );
sql> insert into t values (:'content');
sql> select * from t;

which gives the result:

               j
────────────────────────────────
 {"dog": "cat", "frog": "frat"}
(1 row)

You can also perform operations on the data directly:

sql> select :'content'::jsonb -> 'dog';
 ?column?
──────────
 "cat"
(1 row)

Under the covers this is just embedding the JSON in the SQL, but it's a lot neater to let psql perform the interpolation itself.
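What that interpolation amounts to can be sketched in a few lines of Python (a rough illustration, not psql's actual implementation; it ignores client-encoding corner cases): :'content' wraps the variable's value in single quotes, doubling any embedded single quotes so the result is a valid SQL string literal.

```python
def quote_literal(text: str) -> str:
    """A rough sketch of what psql's :'content' interpolation does:
    wrap the value in single quotes, doubling embedded single quotes.
    (psql also handles encoding details; this ignores those.)"""
    return "'" + text.replace("'", "''") + "'"

content = '{\n  "dog": "cat",\n  "frog": "frat"\n}'
sql = "insert into t values (" + quote_literal(content) + ");"
# sql now embeds the JSON as a properly quoted SQL string literal,
# newlines and all.
```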

Doctor Eval answered Sep 19 '22 13:09