 

Handling Unicode sequences in PostgreSQL

I have some JSON data stored in a JSON (not JSONB) column in my PostgreSQL database (9.4.1). Some of these JSON structures contain Unicode escape sequences in their attribute values. For example:

{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" } 

When I try to query this JSON column (even if I'm not directly trying to access the device_name attribute), I get the following error:

ERROR: unsupported Unicode escape sequence
Detail: \u0000 cannot be converted to text.

You can recreate this error by executing the following command on a PostgreSQL server:

select '{"client_id": 1, "device_name": "FooBar\ufffd\u0000\ufffd\u000f\ufffd" }'::json->>'client_id' 

The error makes sense to me - there is simply no way to represent the Unicode NULL character in a textual result.

Is there any way for me to query the same JSON data without having to perform "sanitization" on the incoming data? These JSON structures change regularly, so scanning a specific attribute (device_name in this case) would not be a good solution, since other attributes could easily hold similar data.


After some more investigation, it seems that this behavior is new in version 9.4.1, as mentioned in the changelog:

...Therefore \u0000 will now also be rejected in json values when conversion to de-escaped form is required. This change does not break the ability to store \u0000 in json columns so long as no processing is done on the values...

Was this really the intention? Is a downgrade to pre-9.4.1 a viable option here?
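For reference, the distinction the changelog draws between storing and processing can be reproduced with a throwaway table (the table name here is just an example):

create table unicode_test (data json);

-- Storing the escape sequence still works, since nothing is de-escaped on insert:
insert into unicode_test values ('{"client_id": 1, "device_name": "FooBar\u0000"}');

-- Returning the raw column also works - the stored text is passed through verbatim:
select data from unicode_test;

-- But anything that needs the de-escaped value fails on 9.4.1:
select data->>'device_name' from unicode_test;  -- ERROR: unsupported Unicode escape sequence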


As a side note, this property is taken from the name of the client's mobile device - it's the user that entered this text into the device. How on earth did a user insert NULL and REPLACEMENT CHARACTER values?!

asked Jul 28 '15 by Lix




2 Answers

\u0000 is the one Unicode code point which is not valid in a string. I see no other way than to sanitize the string.

Since json is just a string in a specific format, you can use the standard string functions, without worrying about the JSON structure. A one-line sanitizer to remove the code point would be:

SELECT (regexp_replace(the_string::text, '\\u0000', '', 'g'))::json; 

But you can also substitute any character of your liking, which would be useful if the zero code point is used as some form of delimiter.
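For instance, a minimal sketch that swaps the escaped NUL for the Unicode REPLACEMENT CHARACTER instead of dropping it (this assumes a UTF-8 database; any other placeholder character would do):

SELECT (regexp_replace(the_string::text, '\\u0000', E'\ufffd', 'g'))::json;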

Note also the subtle difference between what is stored in the database and how it is presented to the user. You can store the code point in a JSON string, but you have to pre-process it to some other character before processing the value as a json data type.
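A query-time variant of the same idea, which leaves the stored value untouched and only de-escapes a cleaned-up copy, could look like this (the table and column names are placeholders):

SELECT (regexp_replace(data::text, '\\u0000', '', 'g'))::json ->> 'device_name'
FROM my_json_table;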

answered Sep 24 '22 by Patrick


Patrick's solution didn't work for me out of the box; an error was still thrown regardless. I then researched a little more and was able to write a small custom function that fixed the issue for me.

First I could reproduce the error by writing:

select json '{ "a":  "null \u0000 escape" }' ->> 'a' as fails 

Then I added a custom function which I used in my query:

CREATE OR REPLACE FUNCTION null_if_invalid_string(json_input JSON, record_id UUID)
  RETURNS JSON AS $$
DECLARE
  json_value JSON DEFAULT NULL;
BEGIN
  BEGIN
    -- Probe an attribute to force de-escaping; adapt 'location' to a key in your data.
    json_value := json_input ->> 'location';
  EXCEPTION WHEN OTHERS THEN
    -- Any error (such as the \u0000 escape) is caught here and NULL is returned instead.
    RAISE NOTICE 'Invalid json value: "%". Returning NULL.', record_id;
    RETURN NULL;
  END;
  RETURN json_input;
END;
$$ LANGUAGE plpgsql;

To call the function, do this. You should not receive an error:

select null_if_invalid_string('{ "a":  "null \u0000 escape" }', id) from my_table 

Whereas this should return the json as expected:

select null_if_invalid_string('{ "a":  "null" }', id) from my_table 
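
In practice you would pass the json column itself rather than a literal, something like this (assuming a hypothetical my_table with a json column payload and a uuid column id):

select null_if_invalid_string(payload, id) from my_table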
answered Sep 22 '22 by Hendrik