I've imported a JSON array in a text field and want to convert this to JSONB.
I get the following error
SQL Error [54000]: ERROR: total size of jsonb array elements exceeds the maximum of 268435455 bytes
The pg_column_size() function returns a size of 59,596,497 bytes for the text column.
The size of the JSON data was 200 MB on disk before I used the COPY command to import the data into the Postgres database.
How can I get around this? E.g. by calculating the resulting size from my JSON file on disk before importing?
Maximum size Postgres jsonb can support: as per the Postgres source code, the maximum is 255 MB (268,435,455 bytes) per jsonb document. Please see the source code here.
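The 268435455 in the error message is not arbitrary: jsonb stores each element's offset/length in a 28-bit field (the `JENTRY_OFFLENMASK` constant, `0x0FFFFFFF`, in the Postgres source), which is where the hard cap comes from. A quick sanity check of that arithmetic:

```python
# jsonb element offsets/lengths fit in 28 bits (JENTRY_OFFLENMASK = 0x0FFFFFFF
# in src/include/utils/jsonb.h), which yields the limit seen in the error.
JSONB_MAX_BYTES = 0x0FFFFFFF

print(JSONB_MAX_BYTES)             # 268435455, matching the error message
print(JSONB_MAX_BYTES == 2**28 - 1)  # True: it is exactly 2^28 - 1, i.e. ~255 MB
```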
The jsonb datatype is an advanced binary storage format with full processing, indexing and searching capabilities, and as such pre-processes the JSON data into an internal format, which keeps only a single value per key and isn't sensitive to extra whitespace or indentation.
The json data type stores an exact copy of the input text, which processing functions must reparse on each execution; while jsonb data is stored in a decomposed binary format that makes it slightly slower to input due to added conversion overhead, but significantly faster to process, since no reparsing is needed.
JSONB supports indexing the JSON data, and is very efficient at parsing and querying the JSON data. In most cases, when you work with JSON in PostgreSQL, you should be using JSONB.
Text data will grow when stored as jsonb in Postgres tables. As a rule of thumb (from a 2ndQuadrant benchmark), you can approximate the resulting size as DATA_SIZE * 1.28.
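The question asked how to predict the resulting size from the JSON file on disk before importing. A minimal pre-flight sketch, assuming the approximate 1.28 growth factor above (it is a rule of thumb, not an exact formula) and the 268,435,455-byte limit from the error message:

```python
import os

JSONB_MAX = 268_435_455   # jsonb per-document limit, from the error message
JSONB_OVERHEAD = 1.28     # approximate text -> jsonb growth factor (rule of thumb)

def estimated_jsonb_bytes(raw_bytes: int) -> int:
    """Rough estimate of the binary jsonb size for raw JSON text of raw_bytes."""
    return int(raw_bytes * JSONB_OVERHEAD)

def fits_in_jsonb(path: str) -> bool:
    """Pre-flight check: is the JSON file at `path` likely to fit in one jsonb value?"""
    return estimated_jsonb_bytes(os.path.getsize(path)) <= JSONB_MAX
```

Note that a 200 MB file estimates to roughly 256 MB, i.e. right at the limit, which is consistent with the error the asker hit.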
Also, before loading your data,
select pg_column_size('[DATA]'::jsonb);
will give you a hint of the expected size. (Note: casting the text to ::jsonb parses it; to_jsonb() on a text value would instead wrap the whole text as a single JSON string.)