I have a table in a PostgreSQL 9.4 database with a jsonb column called receivers. Some example rows:
[{"id": "145119603", "name": "145119603", "type": 2}] [{"id": "1884595530", "name": "1884595530", "type": 1}] [{"id": "363058213", "name": "363058213", "type": 1}] [{"id": "1427965764", "name": "1427965764", "type": 1}] [{"id": "193623800", "name": "193623800", "type": 0}, {"id": "419955814", "name": "419955814", "type": 0}] [{"id": "624635532", "name": "624635532", "type": 0}, {"id": "1884595530", "name": "1884595530", "type": 1}] [{"id": "791712670", "name": "791712670", "type": 0}] [{"id": "895207852", "name": "895207852", "type": 0}] [{"id": "144695994", "name": "144695994", "type": 0}, {"id": "384217055", "name": "384217055", "type": 0}] [{"id": "1079725696", "name": "1079725696", "type": 0}]
I have a list of id values and want to select any row whose jsonb array contains an object with any of those values.
Is that possible? Is there a GIN index I can make that will speed this up?
There is no single operation that can help you here, but you have a few options:
1. If you have a small (and fixed) number of ids to query, you can use multiple containment operators @> combined with or; for example:
where data @> '[{"id": "1884595530"}]' or data @> '[{"id": "791712670"}]'
A simple GIN index on the data column can help you here.
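For reference, such an index could look like this (a minimal sketch, using the table name jsonbtest from the examples below). The jsonb_path_ops operator class supports only the @> operator, but typically yields a smaller and faster index:

create index on jsonbtest using gin (data);

-- or, supporting only @> but usually smaller and faster:
create index on jsonbtest using gin (data jsonb_path_ops);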
2. If you have a variable number of ids (or a lot of them), you can use json[b]_array_elements() to extract each element of the array, build up an id list and then query it with the any-containment operator ?|:
select * from jsonbtest where to_json(array(select jsonb_array_elements(data) ->> 'id'))::jsonb ?| array['1884595530', '791712670'];
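For example, for the sixth sample row above, the inner expression builds the jsonb array ["624635532", "1884595530"]; ?| then checks whether any of the listed ids appear in it, so that row matches on '1884595530'.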
Unfortunately, you cannot index an expression that has a sub-query in it. If you want to index it, you need to wrap it in a function:
create function idlist_jsonb(jsonbtest) returns jsonb language sql strict immutable as $func$
  select to_json(array(select jsonb_array_elements($1.data) ->> 'id'))::jsonb
$func$;

create index on jsonbtest using gin (idlist_jsonb(jsonbtest));
After this, you can query ids like this:
select *, jsonbtest.idlist_jsonb from jsonbtest where jsonbtest.idlist_jsonb ?| array['193623800', '895207852'];
Note: I used the dot notation / computed-field syntax here, but you don't have to; jsonbtest.idlist_jsonb and idlist_jsonb(jsonbtest) are equivalent.
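The same query with ordinary function-call syntax (a sketch against the same table and function as above):

select * from jsonbtest where idlist_jsonb(jsonbtest) ?| array['193623800', '895207852'];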
3. But at this point, you don't have to stick with json[b] at all: what you really have is a simple text array, which PostgreSQL supports natively too.
create function idlist_array(jsonbtest) returns text[] language sql strict immutable as $func$
  select array(select jsonb_array_elements($1.data) ->> 'id')
$func$;

create index on jsonbtest using gin (idlist_array(jsonbtest));
And query this computed field with the array overlap operator &&:
select *, jsonbtest.idlist_array from jsonbtest where jsonbtest.idlist_array && array['193623800', '895207852'];
Note: In my testing, the planner estimates a higher cost for this latter solution than for the jsonb variant, but in practice it runs a little faster. If performance really matters to you, test both against your own data.
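One way to compare them (a sketch; the actual timings matter more than the estimated costs, as noted above):

explain analyze select * from jsonbtest where jsonbtest.idlist_jsonb ?| array['193623800', '895207852'];
explain analyze select * from jsonbtest where jsonbtest.idlist_array && array['193623800', '895207852'];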