I have a table with JSON array data I'd like to search.
CREATE TABLE data (id SERIAL, json JSON);
INSERT INTO data (id, json)
VALUES (1, '[{"name": "Value A", "value": 10}]');
INSERT INTO data (id, json)
VALUES (2, '[{"name": "Value B1", "value": 5}, {"name": "Value B2", "value": 15}]');
As described in this answer, I created a function that also allows an index to be created on the array data (important).
CREATE OR REPLACE FUNCTION json_val_arr(_j json, _key text)
RETURNS text[] AS
$$
SELECT array_agg(elem->>_key)
FROM json_array_elements(_j) AS x(elem)
$$
LANGUAGE sql IMMUTABLE;
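For completeness, here is the index I create on top of that function (the index name is my choice; a GIN index with the default array operator class supports the `<@` containment operator used below):

```sql
-- GIN index over the text[] produced by the function;
-- this is what makes containment searches on 'name' indexable.
CREATE INDEX data_json_name_idx ON data USING gin (json_val_arr(json, 'name'));
```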
This works nicely if I want to find an entire value (e.g. "Value B1"):
SELECT *
FROM data
WHERE '{"Value B1"}'::text[] <@ (json_val_arr(json, 'name'));
Now my questions:
Is it possible to find values with a wildcard (e.g. "Value*")? Something like the following (naive) approach:
...
WHERE '{"Value%"}'::text[] <@ (json_val_arr(json, 'name'));
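A workaround I can get working for the wildcard case (though I don't expect it to use the array index) is to unnest the function's result and apply LIKE per element:

```sql
-- Matches rows where ANY element's 'name' starts with 'Value';
-- this sidesteps containment, so the GIN index is not used here.
SELECT *
FROM data
WHERE EXISTS (
    SELECT 1
    FROM unnest(json_val_arr(json, 'name')) AS n(name)
    WHERE n.name LIKE 'Value%'
);
```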
Is it possible to find numeric values with comparison operators (e.g. >= 10)? Again, a naive and obviously wrong approach:
...
WHERE '{10}'::int[] >= (json_val_arr(json, 'value'));
I tried to create a new function returning int[], but that did not work.
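For illustration, the int[] variant I have in mind looks like this (the `::int` cast raises an error if any 'value' is not numeric, which may be why my attempt failed), together with a per-element comparison:

```sql
-- Hypothetical integer variant of json_val_arr
CREATE OR REPLACE FUNCTION json_val_arr_int(_j json, _key text)
RETURNS int[] AS
$$
SELECT array_agg((elem->>_key)::int)
FROM json_array_elements(_j) AS x(elem)
$$
LANGUAGE sql IMMUTABLE;

-- Rows where ANY element's 'value' is >= 10
SELECT *
FROM data
WHERE EXISTS (
    SELECT 1
    FROM unnest(json_val_arr_int(json, 'value')) AS v(val)
    WHERE v.val >= 10
);
```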
I created a SQL Fiddle to illustrate my problem.
Or would it be better to use a different approach like the following working queries:
SELECT *
FROM data,
json_array_elements(json) jsondata
WHERE jsondata ->> 'name' LIKE 'Value%';
and
...
WHERE cast(jsondata ->> 'value' as integer) <= 10;
However, for these queries, I was not able to create any index that was actually picked up by the queries.
Also, I'd like to implement all this in PostgreSQL 9.4 with jsonb eventually, but I think for the questions above this should not be an issue.
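If it helps, my understanding is that the eventual move to jsonb should be a one-line conversion (this rewrites the table, so it locks it for the duration):

```sql
-- In-place conversion of the column from json to jsonb (PostgreSQL 9.4+)
ALTER TABLE data ALTER COLUMN json TYPE jsonb USING json::jsonb;
```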
Thank you very much!
I know it's been a while, but I was just chugging away on something similar (using wildcards to query JSON datatypes) and thought I'd share what I found.
Firstly, this was a huge point in the right direction: http://schinckel.net/2014/05/25/querying-json-in-postgres/
The takeaway is that your method of exploding the JSON element into something else (a record set) is the way to go. It lets you query the JSON elements with normal Postgres operators.
In my case:
# Table: test
ID | jsonb_column
1 | {"name": "", "value": "reserved", "expires_in": 13732}
2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
3 | {"name": "dog", "value": "{\"ns\":[\"woof.\"]}", "expires_in": 4554}
Example Query
select * from test where jsonb_column->>'name' like '%o%';
# => Returns
# 2 | {"name": "poop", "value": "{\"ns\":[\"Whaaat.\"]}", "expires_in": 4554}
And to answer your question about jsonb: it looks like jsonb is the better route MOST of the time. It has more methods and faster reads (but slower writes).
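One jsonb bonus worth mentioning: a plain GIN index on the whole column makes exact-match containment queries indexable out of the box (wildcards still need the `->>` / LIKE route; the index name and match value below are just examples):

```sql
-- Default jsonb_ops GIN index; supports the @> containment operator
CREATE INDEX test_jsonb_column_idx ON test USING gin (jsonb_column);

-- Exact match on a top-level key/value pair; can use the index
SELECT * FROM test WHERE jsonb_column @> '{"name": "dog"}';
```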
Happy hunting!